Sample records for CERN accelerator logging

  1. Scale out databases for CERN use cases

    NASA Astrophysics Data System (ADS)

    Baranowski, Zbigniew; Grzybek, Maciej; Canali, Luca; Lanza Garcia, Daniel; Surdy, Kacper

    2015-12-01

    Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log database.
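
    As an illustration only of the kind of offloaded query workload described above (this is not code from the paper), the sketch below queries a hypothetical accelerator-logging table through Impala's Python DB-API client; the host, table and column names are invented for the example.

```python
# Minimal sketch, assuming the impyla package and an Impala daemon reachable on
# the default port 21050; the schema below is hypothetical, not CERN's actual one.
from impala.dbapi import connect

conn = connect(host="impala.example.org", port=21050)
cur = conn.cursor()

# Hourly average of a logged accelerator signal over one day of data.
cur.execute("""
    SELECT variable_name,
           hour(utc_stamp) AS hr,
           avg(value)      AS avg_value,
           count(*)        AS n_samples
    FROM   accelerator_logging
    WHERE  utc_stamp >= cast('2015-06-01' AS timestamp)
      AND  utc_stamp <  cast('2015-06-02' AS timestamp)
    GROUP  BY variable_name, hour(utc_stamp)
    ORDER  BY variable_name, hr
""")

for variable_name, hr, avg_value, n_samples in cur.fetchall():
    print(variable_name, hr, avg_value, n_samples)
```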

  2. Integration of Oracle and Hadoop: Hybrid Databases Affordable at Scale

    NASA Astrophysics Data System (ADS)

    Canali, L.; Baranowski, Z.; Kothuri, P.

    2017-10-01

    This work reports on the activities aimed at integrating Oracle and Hadoop technologies for the use cases of CERN database services, and in particular on the development of solutions for offloading data and queries from Oracle databases into Hadoop-based systems. The goal of this investigation is to increase the scalability and optimize the cost/performance footprint of some of our largest Oracle databases. These concepts have been applied, among others, to build offline copies of CERN accelerator controls and logging databases. The tested solution allows reports to be run on the controls data offloaded to Hadoop without affecting the critical production database, providing both performance benefits and cost reduction for the underlying infrastructure. Other use cases discussed include building hybrid database solutions with Oracle and Hadoop, offering the combined advantages of a mature relational database system and a scalable analytics engine.
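
    A schematic sketch (under stated assumptions, not the production tooling described in this record) of one way a slice of an Oracle logging table could be copied into Parquet files for a Hadoop-side engine; the connection string, table and column names are placeholders.

```python
# Offload sketch: read one day of an Oracle table in chunks and write Parquet files.
# Assumes cx_Oracle, pandas and pyarrow are installed; all names are placeholders.
import datetime
import cx_Oracle
import pandas as pd

conn = cx_Oracle.connect("reader", "secret", "oracle-db.example.org/LOGDB")

query = """
    SELECT variable_id, utc_stamp, value
    FROM   logging_data
    WHERE  utc_stamp >= :start_ts AND utc_stamp < :end_ts
"""
params = {"start_ts": datetime.datetime(2017, 1, 1),
          "end_ts": datetime.datetime(2017, 1, 2)}

# Chunking keeps memory bounded for large tables; each chunk becomes one Parquet
# file that Impala, Hive or Spark can query directly.
for i, df in enumerate(pd.read_sql(query, conn, params=params, chunksize=500_000)):
    df.to_parquet(f"logging_data_20170101_part{i:04d}.parquet", index=False)

conn.close()
```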

  3. Introduction to CERN

    ScienceCinema

    Heuer, R.-D.

    2018-02-19

    Summer Student Lecture Programme Introduction. The mission of CERN: to push back the frontiers of knowledge, e.g. the secrets of the Big Bang (what was the matter like within the first moments of the Universe's existence?). This requires developing new technologies for accelerators and detectors, as well as for information technology (the Web and the Grid) and for medicine (diagnosis and therapy). There are three key technology areas at CERN: accelerating, particle detection and large-scale computing.

  4. The Proton Synchrotron (PS): At the Core of the CERN Accelerators

    NASA Astrophysics Data System (ADS)

    Cundy, Donald; Gilardoni, Simone

    The following sections are included: * Introduction * Extraction: Getting the Beam to Leave the Accelerator * Acceleration and Bunch Gymnastics * Boosting PS Beam Intensity * Capacitive Energy Storage Replaces Flywheel * Taking the Neutrinos by the Horns * OMEGA: Towards the Electronic Bubble Chamber * ISOLDE: Targeting a New Era in Nuclear Physics * The CERN n_TOF Facility: Catching Neutrons on the Fly * References

  5. CERN Collider, France-Switzerland

    NASA Image and Video Library

    2013-08-23

    This image, acquired by NASA's Terra spacecraft, shows the CERN Large Hadron Collider, the world's largest and highest-energy particle accelerator, lying beneath the French-Swiss border northwest of Geneva (yellow circle).

  6. Radiation protection challenges in the management of radioactive waste from high-energy accelerators.

    PubMed

    Ulrici, Luisa; Algoet, Yvon; Bruno, Luca; Magistris, Matteo

    2015-04-01

    The European Laboratory for Particle Physics (CERN) has operated high-energy accelerators for fundamental physics research for nearly 60 y. The side-product of this activity is the radioactive waste, which is mainly generated as a result of preventive and corrective maintenance, upgrading activities and the dismantling of experiments or accelerator facilities. Prior to treatment and disposal, it is common practice to temporarily store radioactive waste on CERN's premises and it is a legal requirement that these storage facilities are safe and secure. Waste treatment typically includes sorting, segregation, volume and size reduction and packaging, which will depend on the type of component, its chemical composition, residual activity and possible surface contamination. At CERN, these activities are performed in a dedicated waste treatment centre under the supervision of the Radiation Protection Group. This paper gives an overview of the radiation protection challenges in the conception of a temporary storage and treatment centre for radioactive waste in an accelerator facility, based on the experience gained at CERN. The CERN approach consists of the classification of waste items into 'families' with similar radiological and physical-chemical properties. This classification allows the use of specific, family-dependent techniques for radiological characterisation and treatment, which are simultaneously efficient and compliant with best practices in radiation protection. The storage was planned on the basis of radiological and other possible hazards such as toxicity, pollution and fire load. Examples are given of technical choices for the treatment and radiological characterisation of selected waste families, which could be of interest to other accelerator facilities.

  7. Medical Applications at CERN and the ENLIGHT Network

    PubMed Central

    Dosanjh, Manjit; Cirilli, Manuela; Myers, Steve; Navin, Sparsh

    2016-01-01

    State-of-the-art techniques derived from particle accelerators, detectors, and physics computing are routinely used in clinical practice and medical research centers: from imaging technologies to dedicated accelerators for cancer therapy and nuclear medicine, simulations, and data analytics. Principles of particle physics themselves are the foundation of a cutting edge radiotherapy technique for cancer treatment: hadron therapy. This article is an overview of the involvement of CERN, the European Organization for Nuclear Research, in medical applications, with specific focus on hadron therapy. It also presents the history, achievements, and future scientific goals of the European Network for Light Ion Hadron Therapy, whose co-ordination office is at CERN. PMID:26835422

  8. Medical Applications at CERN and the ENLIGHT Network.

    PubMed

    Dosanjh, Manjit; Cirilli, Manuela; Myers, Steve; Navin, Sparsh

    2016-01-01

    State-of-the-art techniques derived from particle accelerators, detectors, and physics computing are routinely used in clinical practice and medical research centers: from imaging technologies to dedicated accelerators for cancer therapy and nuclear medicine, simulations, and data analytics. Principles of particle physics themselves are the foundation of a cutting edge radiotherapy technique for cancer treatment: hadron therapy. This article is an overview of the involvement of CERN, the European Organization for Nuclear Research, in medical applications, with specific focus on hadron therapy. It also presents the history, achievements, and future scientific goals of the European Network for Light Ion Hadron Therapy, whose co-ordination office is at CERN.

  9. Preparation of a primary argon beam for the CERN fixed target physics.

    PubMed

    Küchler, D; O'Neil, M; Scrivens, R; Thomae, R

    2014-02-01

    The fixed target experiment NA61 in the North Area of the Super Proton Synchrotron is studying phase transitions in strongly interacting matter. Up to now they used the primary beams available from the CERN accelerator complex (protons and lead ions) or fragmented beams created from the primary lead ion beam. To explore a wider range of energies and densities a request was made to provide primary argon and xenon beams. This paper describes the results of the setting up and 10-week test run of the Ar11+ beam from the 14.5 GHz ECR ion source and the linear accelerator (Linac3) at CERN.

  10. Accelerating hydrodynamic description of pseudorapidity density and the initial energy density in p+p, Cu+Cu, Au+Au, and Pb+Pb collisions at energies available at the BNL Relativistic Heavy Ion Collider and the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Ze-Fang, Jiang; Chun-Bin, Yang; Csanád, Máté; Csörgő, Tamás

    2018-06-01

    A known class of analytic, exact, accelerating solutions of perfect relativistic hydrodynamics with longitudinal acceleration is utilized to describe results on the pseudorapidity distributions for different collision systems. These results include dN/dη measured in p+p, Cu+Cu, Au+Au, and Pb+Pb collisions at the BNL Relativistic Heavy Ion Collider and the CERN Large Hadron Collider, in a broad centrality range. Going beyond the traditional Bjorken model, from the accelerating hydrodynamic description we determine the initial energy density and other thermodynamic quantities in those collisions.
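
    For orientation only (a textbook relation, not a result quoted from this record), the traditional Bjorken estimate that these accelerating solutions generalise is

```latex
\varepsilon_{\mathrm{Bj}} \;=\; \frac{1}{A_\perp\,\tau_0}\,
    \left.\frac{\mathrm{d}E_T}{\mathrm{d}y}\right|_{y \approx 0},
```

    where A_perp is the transverse overlap area and tau_0 the proper thermalisation time; in the accelerating solutions this estimate acquires a correction factor, larger than unity, governed by the longitudinal acceleration parameter fitted to the measured dN/dη distributions.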

  11. LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN

    NASA Astrophysics Data System (ADS)

    Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor

    2017-12-01

    The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and has since 2011 been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualised applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.

  12. International Workshop on Linear Colliders 2010

    ScienceCinema

    Lebrun, Ph.

    2018-06-20

    IWLC2010, the International Workshop on Linear Colliders 2010, ECFA-CLIC-ILC joint meeting: Monday 18 October - Friday 22 October 2010. Venue: CERN and CICG (International Conference Centre Geneva, Switzerland). This year, the International Workshop on Linear Colliders organized by the European Committee for Future Accelerators (ECFA) will study the physics, detectors and accelerator complex of a linear collider covering both CLIC and ILC options. Contact: Workshop Secretariat. IWLC2010 is hosted by CERN.

  13. International Workshop on Linear Colliders 2010

    ScienceCinema

    Yamada, Sakue

    2018-05-24

    IWLC2010, the International Workshop on Linear Colliders 2010, ECFA-CLIC-ILC joint meeting: Monday 18 October - Friday 22 October 2010. Venue: CERN and CICG (International Conference Centre Geneva, Switzerland). This year, the International Workshop on Linear Colliders organized by the European Committee for Future Accelerators (ECFA) will study the physics, detectors and accelerator complex of a linear collider covering both CLIC and ILC options. Contact: Workshop Secretariat. IWLC2010 is hosted by CERN.

  14. 1987 Nuclear Science Symposium, 34th, and 1987 Symposium on Nuclear Power Systems, 19th, San Francisco, CA, Oct. 21-23, 1987, Proceedings

    NASA Astrophysics Data System (ADS)

    Armantrout, Guy A.

    1988-02-01

    The present conference considers topics in radiation detectors, advanced electronic circuits, data acquisition systems, radiation detector systems, high-energy and nuclear physics radiation detection, spaceborne instrumentation, health physics and environmental radiation detection, nuclear medicine, nuclear well logging, and nuclear reactor instrumentation. Attention is given to the response of scintillators to heavy ions, phonon-mediated particle detection, ballistic deficits in pulse-shaping amplifiers, fast analog ICs for particle physics, logic cell arrays, the CERN host interface, high performance data buses, a novel scintillating glass for high-energy physics applications, background events in microchannel plates, a tritium accelerator mass spectrometer, a novel positron tomograph, advancements in PET, cylindrical positron tomography, nuclear techniques in subsurface geology, REE borehole neutron activation, and a continuous tritium monitor for aqueous process streams.

  15. QM2017: Status and Key open Questions in Ultra-Relativistic Heavy-Ion Physics

    NASA Astrophysics Data System (ADS)

    Schukraft, Jurgen

    2017-11-01

    Almost exactly 3 decades ago, in the fall of 1986, the era of experimental ultra-relativistic (E/m ≫ 1) heavy-ion physics started simultaneously at the SPS at CERN and the AGS at Brookhaven with first beams of light oxygen ions at fixed-target energies of 200 GeV/A and 14.6 GeV/A, respectively. The event was announced by CERN [CERN's subatomic particle accelerators: Set up world-record in energy and break new ground for physics (CERN-PR-86-11-EN) (1986) 4 p, issued on 29 September 1986. URL: http://cds.cern.ch/record/855571].

  16. Commissioning results of CERN HIE-ISOLDE and INFN ALPI cryogenic control systems

    NASA Astrophysics Data System (ADS)

    Inglese, V.; Pezzetti, M.; Calore, A.; Modanese, P.; Pengo, R.

    2017-02-01

    The cryogenic systems of both accelerators, namely HIE ISOLDE (High Intensity and Energy Isotope Separator On Line DEvice) at CERN and ALPI (Acceleratore Lineare Per Ioni) at LNL, have been refurbished. HIE ISOLDE is a major upgrade of the existing ISOLDE facilities, which required the construction of a superconducting linear accelerator consisting of six cryomodules, each containing five superconducting RF cavities and superconducting solenoids. The ALPI linear accelerator, similar to HIE ISOLDE, is located at the Legnaro National Laboratories (LNL) and became operational in the early 1990s. It is composed of 74 superconducting RF cavities, assembled inside 22 cryostats. The new control systems are equipped with PLCs, developed on the CERN UNICOS framework, which include Schneider and Siemens PLCs and various fieldbuses (Profibus DP and PA, WorldFIP). The control systems were developed in synergy between CERN and LNL in order to build, effectively and with an optimized use of resources, control systems that enhance ease of operation, maintainability and long-term availability. This paper describes (i) the cryogenic systems, with special focus on the design of the control systems hardware and software, (ii) the strategy adopted in order to achieve a synergic approach, and (iii) the commissioning results after the cool-down to 4.5 K of the cryomodules.

  17. CERN and 60 years of science for peace

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heuer, Rolf-Dieter, E-mail: Rolf.Heuer@cern.ch

    2015-02-24

    This paper presents CERN as it celebrates the 60th anniversary of its founding. The presentation first discusses the mission of CERN and its role as an inter-governmental organization. The paper also reviews aspects of the particle physics research programme, looking at both current and future accelerator-based facilities at the high-energy and intensity frontiers. Finally, the paper considers issues beyond fundamental research, such as capacity-building and the interface between Art and Science.

  18. Knowledge and Technology: Sharing With Society

    NASA Astrophysics Data System (ADS)

    Benvenuti, Cristoforo; Sutton, Christine; Wenninger, Horst

    The following sections are included: * A Core Mission of CERN * Medical Accelerators: A Tool for Tumour Therapy * Medipix: The Image is the Message * Crystal Clear: From Higgs to PET * Solar Collectors: When Nothing is Better * The TARC Experiment at CERN: Modern Alchemy * A CLOUD Chamber with a Silvery Lining * References

  19. Contextualized Magnetism in Secondary School: Learning from the LHC (CERN)

    ERIC Educational Resources Information Center

    Cid, Ramon

    2005-01-01

    Physics teachers in secondary schools usually mention the world's largest particle physics laboratory--CERN (European Organization for Nuclear Research)--only because of the enormous size of the accelerators and detectors used there, the number of scientists involved in their activities and also the necessary international scientific…

  20. Upgrade of the cryogenic CERN RF test facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pirotte, O.; Benda, V.; Brunner, O.

    2014-01-29

    With the large number of superconducting radiofrequency (RF) cryomodules to be tested for the former LEP and the present LHC accelerator, an RF test facility was erected in the early 1990s in the largest cryogenic test facility at CERN, located at Point 18. This facility consisted of four vertical test stands for single cavities and originally one, and then two, horizontal test benches for RF cryomodules operating at 4.5 K in saturated helium. CERN is presently working on the upgrade of its accelerator infrastructure, which requires new superconducting cavities operating below 2 K in saturated superfluid helium. Consequently, the RF test facility has been renewed in order to allow efficient cavity and cryomodule tests in superfluid helium and to improve its thermal performance. The new RF test facility is described and its performance is presented.

  1. Big data analytics as a service infrastructure: challenges, desired properties and solutions

    NASA Astrophysics Data System (ADS)

    Martín-Márquez, Manuel

    2015-12-01

    CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has limited the necessary collaboration and, more relevantly, cross data analytics over different domains. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient day-to-day accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs for CERN's research and engineering community; (3) deliver real-time, batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.

  2. A new approach to characterize very-low-level radioactive waste produced at hadron accelerators.

    PubMed

    Zaffora, Biagio; Magistris, Matteo; Chevalier, Jean-Pierre; Luccioni, Catherine; Saporta, Gilbert; Ulrici, Luisa

    2017-04-01

    Radioactive waste is produced as a consequence of preventive and corrective maintenance during the operation of high-energy particle accelerators, or during associated dismantling campaigns. Its radiological characterization must be performed to ensure appropriate disposal in the receiving disposal facilities. The radiological characterization of waste includes the establishment of the list of produced radionuclides, called the "radionuclide inventory", and the estimation of their activity. The present paper describes the process adopted at CERN to characterize very-low-level radioactive waste, with a focus on activated metals. The characterization method consists of measuring and estimating the activity of produced radionuclides either by experimental methods or by statistical and numerical approaches. We adapted the so-called Scaling Factor (SF) and Correlation Factor (CF) techniques to the needs of hadron accelerators, and applied them to very-low-level metallic waste produced at CERN. For each type of metal we calculated the radionuclide inventory and identified the radionuclides that contribute most to hazard factors. The methodology proposed is of general validity, can be extended to other activated materials and can be used for the characterization of waste produced in particle accelerators and research centres where the activation mechanisms are comparable to the ones occurring at CERN.
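
    A minimal numerical sketch of the generic Scaling Factor idea mentioned above (the nuclide names and sample activities are invented for illustration; this is not CERN's characterisation code): the activity of a difficult-to-measure (DTM) nuclide is estimated from an easy-to-measure 'key' nuclide through a factor derived from samples where both were measured.

```python
# Illustrative Scaling Factor (SF) sketch; all numbers are invented.
from statistics import geometric_mean

# Measured activity pairs (Bq/g) on characterisation samples:
# (key nuclide, e.g. Co-60 by gamma spectrometry; DTM nuclide, e.g. Ni-63 by radiochemistry)
samples = [(12.0, 3.1), (8.5, 2.0), (20.0, 5.4), (15.2, 3.9)]

# The SF is commonly taken as the geometric mean of the measured activity ratios.
sf = geometric_mean([dtm / key for key, dtm in samples])

# Apply the SF to a waste item where only the key nuclide could be measured.
key_activity = 9.7                  # Bq/g, gamma-spectrometry result
dtm_estimate = sf * key_activity    # Bq/g, inferred DTM activity
print(f"SF = {sf:.3f}  ->  estimated DTM activity = {dtm_estimate:.2f} Bq/g")
```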

  3. Beam experiments with the Grenoble test electron cyclotron resonance ion source at iThemba LABS.

    PubMed

    Thomae, R; Conradie, J; Fourie, D; Mira, J; Nemulodi, F; Kuechler, D; Toivanen, V

    2016-02-01

    At iThemba Laboratory for Accelerator Based Sciences (iThemba LABS) an electron cyclotron resonance ion source was installed and commissioned. This source is a copy of the Grenoble Test Source (GTS) for the production of highly charged ions. The source is similar to the GTS-LHC at CERN and is named GTS2. A collaboration between the Accelerators and Beam Physics Group of CERN and the Accelerator and Engineering Department of iThemba LABS was proposed in which the development of high intensity argon and xenon beams is envisaged. In this paper, we present beam experiments with the GTS2 at iThemba LABS, in which the results of continuous wave and afterglow operation of xenon ion beams with oxygen as support gas are presented.

  4. Numerical simulations of energy deposition caused by 50 MeV—50 TeV proton beams in copper and graphite targets

    NASA Astrophysics Data System (ADS)

    Nie, Y.; Schmidt, R.; Chetvertkova, V.; Rosell-Tarragó, G.; Burkart, F.; Wollmann, D.

    2017-08-01

    The conceptual design of the Future Circular Collider (FCC) is being carried out actively in an international collaboration hosted by CERN, for the post-Large Hadron Collider (LHC) era. The target center-of-mass energy of proton-proton collisions for the FCC is 100 TeV, nearly an order of magnitude higher than for LHC. The existing CERN accelerators will be used to prepare the beams for FCC. Concerning beam-related machine protection of the whole accelerator chain, it is critical to assess the consequences of beam impact on various accelerator components in the cases of controlled and uncontrolled beam losses. In this paper, we study the energy deposition of protons in solid copper and graphite targets, since the two materials are widely used in magnets, beam screens, collimators, and beam absorbers. Nominal injection and extraction energies in the hadron accelerator complex at CERN were selected in the range of 50 MeV-50 TeV. Three beam sizes were studied for each energy, corresponding to typical values of the betatron function. Specifically for thin targets, comparisons between FLUKA simulations and analytical Bethe equation calculations were carried out, which showed that the damage potential of a few-millimeter-thick graphite target and submillimeter-thick copper foil can be well estimated directly by the Bethe equation. The paper provides a valuable reference for the quick evaluation of potential damage to accelerator elements over a large range of beam parameters when beam loss occurs.
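
    For reference, the standard (textbook) Bethe stopping-power formula behind such thin-target estimates is

```latex
-\left\langle \frac{\mathrm{d}E}{\mathrm{d}x} \right\rangle
  \;=\; K z^{2}\,\frac{Z}{A}\,\frac{1}{\beta^{2}}
  \left[ \tfrac{1}{2}\ln\!\frac{2 m_e c^{2}\beta^{2}\gamma^{2}T_{\max}}{I^{2}}
         \;-\; \beta^{2} \;-\; \frac{\delta(\beta\gamma)}{2} \right],
```

    with K ≈ 0.307 MeV mol^-1 cm^2, z the projectile charge, Z/A the target's charge-to-mass-number ratio, I the mean excitation energy, T_max the maximum energy transfer in a single collision and δ the density-effect correction; multiplying this mass stopping power by the material density and the target thickness gives the simple deposited-energy estimate that the record compares with FLUKA.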

  5. A Bonner Sphere Spectrometer with extended response matrix

    NASA Astrophysics Data System (ADS)

    Birattari, C.; Dimovasili, E.; Mitaroff, A.; Silari, M.

    2010-08-01

    This paper describes the design, calibration and applications at high-energy accelerators of an extended-range Bonner Sphere neutron Spectrometer (BSS). The BSS was designed using the FLUKA Monte Carlo code, investigating several combinations of materials and diameters of the moderators for the high-energy channels. The system was calibrated at PTB in Braunschweig, Germany, using monoenergetic neutron beams in the energy range 144 keV-19 MeV. It was subsequently tested with Am-Be source neutrons and in the simulated workplace neutron field at CERF (the CERN-EU high-energy reference field facility). Since 2002, it has been employed for neutron spectral measurements around CERN accelerators.

  6. AT2 DS II - Accelerator System Design (Part II) - CCC Video Conference

    ScienceCinema

    None

    2017-12-09

    Discussion Session - Accelerator System Design (Part II). Tutors: C. Darve, J. Weisend II, Ph. Lebrun, A. Dabrowski, U. Raich. Video Conference with the CERN Control Center. Experts in the field of accelerator science will be available to answer the students' questions. This session will link the CCC and SA (using Codec VC).

  7. Beam experiments with the Grenoble test electron cyclotron resonance ion source at iThemba LABS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomae, R., E-mail: rthomae@tlabs.ac.za; Conradie, J.; Fourie, D.

    2016-02-15

    At iThemba Laboratory for Accelerator Based Sciences (iThemba LABS) an electron cyclotron resonance ion source was installed and commissioned. This source is a copy of the Grenoble Test Source (GTS) for the production of highly charged ions. The source is similar to the GTS-LHC at CERN and is named GTS2. A collaboration between the Accelerators and Beam Physics Group of CERN and the Accelerator and Engineering Department of iThemba LABS was proposed in which the development of high intensity argon and xenon beams is envisaged. In this paper, we present beam experiments with the GTS2 at iThemba LABS, in which the results of continuous wave and afterglow operation of xenon ion beams with oxygen as support gas are presented.

  8. The trigger system for K0 → 2π0 decays of the NA48 experiment at CERN

    NASA Astrophysics Data System (ADS)

    Mikulec, I.

    1998-02-01

    A fully pipelined 40 MHz "dead-time-free" trigger system for neutral K0 decays for the NA48 experiment at CERN is described. The NA48 experiment studies CP violation using the high intensity beam of the CERN SPS accelerator. The trigger system sums, digitises, filters and processes signals from 13 340 channels of the liquid krypton electromagnetic calorimeter. In 1996 the calorimeter and part of the trigger electronics were installed and tested. In 1997 the system was completed and prepared to be used in the first NA48 physics data taking period. Cagliari, Cambridge, CERN, Dubna, Edinburgh, Ferrara, Firenze, Mainz, Orsay, Perugia, Pisa, Saclay, Siegen, Torino, Warszawa, Wien Collaboration.

  9. AT2 DS II - Accelerator System Design (Part II) - CCC Video Conference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Discussion Session - Accelerator System Design (Part II). Tutors: C. Darve, J. Weisend II, Ph. Lebrun, A. Dabrowski, U. Raich. Video Conference with the CERN Control Center. Experts in the field of accelerator science will be available to answer the students' questions. This session will link the CCC and SA (using Codec VC).

  10. First experience with carbon stripping foils for the 160 MeV H- injection into the CERN PSB

    NASA Astrophysics Data System (ADS)

    Weterings, Wim; Bracco, Chiara; Jorat, Louise; Noulibos, Remy; van Trappen, Pieter

    2018-05-01

    A 160 MeV H- beam will be delivered from the new CERN linear accelerator (Linac4) to the Proton Synchrotron Booster (PSB), using an H- charge-exchange injection system. A 200 µg/cm2 carbon stripping foil will convert H- into protons by stripping off the electrons. The H- charge-exchange injection principle will be used for the first time in the CERN accelerator complex and involves many challenges. In order to gain experience with the foil changing mechanism and the very fragile foils, a stripping foil test stand was installed in the Linac4 transfer line in 2016, prior to the installation in the PSB. In addition, parts of the future PSB injection equipment are also temporarily installed in the Linac4 transfer line for tests with a 160 MeV H- commissioning proton beam. This paper describes the foil changing mechanism and control system, summarizes the practical experience of gluing and handling these foils and reports on the first results with beam.

  11. How to create successful Open Hardware projects — About White Rabbits and open fields

    NASA Astrophysics Data System (ADS)

    van der Bij, E.; Arruat, M.; Cattin, M.; Daniluk, G.; Gonzalez Cobas, J. D.; Gousiou, E.; Lewis, J.; Lipinski, M. M.; Serrano, J.; Stana, T.; Voumard, N.; Wlostowski, T.

    2013-12-01

    CERN's accelerator control group has embraced "Open Hardware" (OH) to facilitate peer review, avoid vendor lock-in and make support tasks scalable. A web-based tool for easing collaborative work was set up and the CERN OH Licence was created. New ADC, TDC, fine delay and carrier cards based on VITA and PCI-SIG standards were designed and drivers for Linux were written. Often industry was paid for developments, while quality and documentation were controlled by CERN. An innovative timing network was also developed with the OH paradigm. Industry now sells and supports these designs, which find their way into new fields.

  12. Study of muon-induced neutron production using accelerator muon beam at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakajima, Y.; Lin, C. J.; Ochoa-Ricoux, J. P.

    2015-08-17

    Cosmogenic muon-induced neutrons are one of the most problematic backgrounds for various underground experiments for rare event searches. In order to accurately understand such backgrounds, experimental data with high statistics and well-controlled systematics are essential. We performed a test experiment to measure the muon-induced neutron production yield and energy spectrum using a high-energy accelerator muon beam at CERN. We successfully observed neutrons from 160 GeV/c muon interactions on lead, and measured kinetic energy distributions for various production angles. Work towards evaluation of the absolute neutron production yield is underway. This work also demonstrates that the setup is feasible for a future large-scale experiment for a more comprehensive study of muon-induced neutron production.

  13. Optical fibres in the radiation environment of CERN

    NASA Astrophysics Data System (ADS)

    Guillermain, E.

    2017-11-01

    CERN, the European Organization for Nuclear Research (in Geneva, Switzerland), is home to a complex scientific instrument: the 27-kilometre Large Hadron Collider (LHC) collides beams of high-energy particles at close to the speed of light. Optical fibres are widely used at CERN, both in surface areas (e.g. for inter-building IT networks) and in the accelerator complex underground (e.g. for cryogenics, vacuum, safety systems). Optical fibres in the accelerator are exposed to mixed radiation fields (mainly composed of protons, pions, neutrons and other hadrons, gamma rays and electrons), with dose rates depending on the particular installation zone, and with radiation levels often significantly higher than those encountered in space. In the LHC and its injector chain, radiation levels range from relatively low annual doses of a few Gy up to hundreds of kGy. Optical fibres suffer from Radiation Induced Attenuation (RIA, expressed in dB per unit length), which affects light transmission and depends on the irradiation conditions (e.g. dose rate, total dose, temperature). In the CERN accelerator complex, the failure of an optical link can affect the proper functionality of control or monitoring systems and induce the interruption of accelerator operation. The qualification of optical fibres for installation in critical radiation areas is therefore crucial. Thus, all optical fibre types installed in radiation areas at CERN are subject to laboratory irradiation tests, in order to evaluate their RIA at different total doses and dose rates. This allows the selection of the appropriate optical fibre type (conventional or radiation resistant) compliant with the requirements of each installation. Irradiation tests are performed in collaboration with Fraunhofer INT (irradiation facilities and expert team in Euskirchen, Germany). Conventional off-the-shelf optical fibres can be installed for optical links exposed to low radiation levels (i.e. annual doses typically below a few kGy). Nevertheless, the conventional optical fibres must be carefully qualified, as a spread in RIA of a factor of 10 is observed among optical fibres of different types and dopants. In higher radiation areas, special radiation resistant optical fibres are installed. For total doses above 1 kGy, the RIA of these special optical fibres is at least 10 times lower than that of the conventional optical fibres under the same irradiation conditions. 2400 km of these special radiation resistant optical fibres were recently procured at CERN. As part of this procurement process, a quality assurance plan including the irradiation testing of all 65 produced batches was set up. This presentation will review the selection process of the appropriate optical fibre types to be installed in the radiation environment of CERN. The methodology for choosing the irradiation parameters for the laboratory tests will be discussed, together with an overview of the RIA of different optical fibre types under several irradiation conditions.
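
    For clarity, RIA as quoted above is simply the radiation-induced increase of the fibre attenuation, normalised to the irradiated length (standard definition, not specific to this record):

```latex
\mathrm{RIA}(t,\lambda) \;=\; \frac{10}{L}\,
   \log_{10}\!\frac{P_{\mathrm{out}}^{\,\mathrm{pre}}(\lambda)}{P_{\mathrm{out}}(t,\lambda)}
   \qquad [\mathrm{dB\ per\ unit\ length}],
```

    where L is the fibre length under irradiation and P_out is the transmitted power at wavelength λ before and during (or after) irradiation, for constant injected power.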

  14. INTEGRATED OPERATIONAL DOSIMETRY SYSTEM AT CERN.

    PubMed

    Dumont, Gérald; Pedrosa, Fernando Baltasar Dos Santos; Carbonez, Pierre; Forkel-Wirth, Doris; Ninin, Pierre; Fuentes, Eloy Reguero; Roesler, Stefan; Vollaire, Joachim

    2017-04-01

    CERN, the European Organization for Nuclear Research, upgraded its operational dosimetry system in March 2013 to be prepared for the first Long Shutdown of CERN's facilities. The new system allows the immediate and automatic checking and recording of the dosimetry data before and after interventions in radiation areas. To facilitate the analysis of the data in the context of CERN's approach to As Low As Reasonably Achievable (ALARA), this new system is interfaced to the Intervention Management Planning and Coordination Tool (IMPACT). IMPACT is a web-based application widely used in all CERN's accelerators and their associated technical infrastructures for the planning, the coordination and the approval of interventions (work permit principle). The coupling of the operational dosimetry database with the IMPACT repository allows a direct and almost immediate comparison of the actual dose with the estimations, in addition to enabling the configuration of alarm levels in the dosemeter as a function of the intervention to be performed.

  15. Accelerator controls at CERN: Some converging trends

    NASA Astrophysics Data System (ADS)

    Kuiper, B.

    1990-08-01

    CERN's growing services to the high-energy physics community, delivered with frozen resources, have led to the implementation of "Technical Boards", mandated to assist the management by making recommendations for rationalizations in various technological domains. The Board on Process Control and Electronics for Accelerators, TEBOCO, has emphasized four main lines which might yield economy in resources. First, a common architecture for accelerator controls has been agreed between the three accelerator divisions. Second, a common hardware/software kit has been defined, from which the large majority of future process interfacing may be composed. A support service for this kit is an essential part of the plan. Third, high-level protocols have been developed for standardizing access to process devices. They derive from agreed standard models of the devices and involve a standard control message. This should ease application development and the mobility of equipment. Fourth, a common software engineering methodology and a commercial package of application development tools have been adopted. Some rationalization in the field of the man-machine interface and in matters of synchronization is also under way.

  16. Techniques for hazard analysis and their use at CERN.

    PubMed

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, thus the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have proved to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  17. Preparation of a primary argon beam for the CERN fixed target physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Küchler, D., E-mail: detlef.kuchler@cern.ch; O’Neil, M.; Scrivens, R.

    2014-02-15

    The fixed target experiment NA61 in the North Area of the Super Proton Synchrotron is studying phase transitions in strongly interacting matter. Up to now they used the primary beams available from the CERN accelerator complex (protons and lead ions) or fragmented beams created from the primary lead ion beam. To explore a wider range of energies and densities a request was made to provide primary argon and xenon beams. This paper describes the results of the setting up and 10-week test run of the Ar11+ beam from the 14.5 GHz ECR ion source and the linear accelerator (Linac3) at CERN.

  18. Application of accelerator sources for pulsed neutron logging of oil and gas wells

    NASA Astrophysics Data System (ADS)

    Randall, R. R.

    1985-05-01

    Dresser Atlas introduced the first commercial pulsed neutron oil well log in the early 1960s. This log had the capability of differentiating oil from salt water in a completed well. In the late 1970s the first continuous carbon/oxygen (C/O) log capable of differentiating oil from fresh water was introduced. The sources used in these commercial logs are radial-geometry deuterium-tritium reaction devices with Cockcroft-Walton voltage multipliers providing the accelerator voltage. The commercial logging tools using these accelerators comprise scintillation detectors, power supplies, line drivers and receivers, and various timing and communications electronics. They are used to measure either the time decay or energy spectra of neutron-induced gamma events. The time decay information is useful in determining the neutron capture cross section, and the energy spectra are used to characterize inelastic neutron events.
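
    As a brief aside on the physics behind the time-decay measurement (a standard, idealised logging relation, not taken from this record): after each neutron burst the thermal-neutron population decays roughly exponentially, and the formation's macroscopic capture cross section follows from the measured decay time,

```latex
N(t) \;\propto\; e^{-t/\tau}, \qquad
\Sigma \;=\; \frac{1}{v_{\mathrm{th}}\,\tau}
  \;\approx\; \frac{4550}{\tau\,[\mu\mathrm{s}]}\ \text{capture units},
```

    with v_th ≈ 2200 m/s the thermal-neutron speed and 1 capture unit = 10^-3 cm^-1; borehole and diffusion effects complicate this single-exponential picture in practice.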

  19. Space Radiation Effects Laboratory

    NASA Technical Reports Server (NTRS)

    1969-01-01

    The SREL User's Handbook is designed to provide information needed by those who plan experiments involving the accelerators at this laboratory. Thus the Handbook will contain information on the properties of the machines, the beam parameters, the facilities and services provided for experimenters, etc. This information will be brought up to date as new equipment is added and modifications accomplished. This Handbook is influenced by the many excellent models prepared at other accelerator laboratories. In particular, the CERN Synchrocyclotron User's Handbook (November 1967) is closely followed in some sections, since the SREL Synchrocyclotron is a duplicate of the CERN machine. We wish to thank Dr. E. G. Michaelis for permission to draw so heavily on his work, particularly in Section II of this Handbook. We hope that the Handbook will prove useful, and will welcome suggestions and criticism.

  20. Shielding design for the front end of the CERN SPL.

    PubMed

    Magistris, Matteo; Silari, Marco; Vincke, Helmut

    2005-01-01

    CERN is designing a 2.2-GeV Superconducting Proton Linac (SPL) with a beam power of 4 MW, to be used for the production of a neutrino superbeam. The SPL front end will initially accelerate 2 × 10^14 negative hydrogen ions per second up to an energy of 120 MeV. The FLUKA Monte Carlo code was employed for shielding design. The proposed shielding is a combined iron-concrete structure, which also takes into consideration the required RF wave-guide ducts and access labyrinths to the machine. Two beam-loss scenarios were investigated: (1) constant beam loss of 1 W/m over the whole accelerator length and (2) full beam loss occurring at various locations. A comparison with results based on simplified approaches is also presented.

  1. [The CERN and the megascience].

    PubMed

    Aguilar Peris, José

    2006-01-01

    In this work we analyse the biggest particle accelerator in the world: the LHC (Large Hadron Collider). The ring-shaped tunnel is 27 km long and is buried over 110 meters underground, straddling the border between France and Switzerland at the CERN laboratory near Geneva. Its mission is to recreate the conditions that existed shortly after the Big Bang and to look for the hypothesised Higgs particle. The LHC will accelerate protons to near the speed of light and collide them head on at an energy of up to 14 TeV (1 TeV = 10^12 eV). Keeping such high energy in the proton beams requires enormous magnetic fields, which are generated by superconducting electromagnets chilled to less than two degrees above absolute zero. It is expected that the LHC will be inaugurated in summer 2007.

  2. Effects of Horizontal Acceleration on Human Visual Acuity and Stereopsis

    PubMed Central

    Horng, Chi-Ting; Hsieh, Yih-Shou; Tsai, Ming-Ling; Chang, Wei-Kang; Yang, Tzu-Hung; Yauan, Chien-Han; Wang, Chih-Hung; Kuo, Wu-Hsien; Wu, Yi-Chang

    2015-01-01

    The effect of horizontal acceleration on human visual acuity and stereopsis is demonstrated in this study. Twenty participants (mean age 22.6 years) were enrolled in the experiment. Acceleration from two different directions was performed at the Taiwan High-Speed Rail Laboratory. Gx and Gy (< and >0.1 g) were produced on an accelerating platform where the subjects stood. The visual acuity and stereopsis of the right eye were measured before and during the acceleration. Acceleration <0.1 g in the X- or Y-axis did not affect dynamic vision and stereopsis. Vision decreased (mean from 0.02 logMAR to 0.25 logMAR) and stereopsis declined significantly (mean from 40 s to 60.2 s of arc) when Gx > 0.1 g. Visual acuity worsened (mean from 0.02 logMAR to 0.19 logMAR) and poor stereopsis was noted (mean from 40 s to 50.2 s of arc) when Gy > 0.1 g. The effect of acceleration from the X-axis on the visual system was higher than that from the Y-axis. During acceleration, most subjects complained of ocular strain when reading. To our knowledge, this study is the first to report the exact levels of visual function loss during Gx and Gy. PMID:25607601

  3. Overview of LHC physics results at ICHEP

    ScienceCinema

    Mangano, Michelangelo

    2018-06-20

    This month's LHC physics day will review the physics results presented by the LHC experiments at the 2010 ICHEP in Paris. The experimental presentations will be preceded by the bi-weekly LHC accelerator status report. The meeting will be broadcast via EVO (detailed info will appear at the time of the meeting in the "Video Services" item on the left menu bar). For those attending, information on accommodation, access to CERN and laptop registration is available from http://cern.ch/lpcc/visits

  4. Overview of LHC physics results at ICHEP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2011-02-25

    This month's LHC physics day will review the physics results presented by the LHC experiments at the 2010 ICHEP in Paris. The experimental presentations will be preceded by the bi-weekly LHC accelerator status report. The meeting will be broadcast via EVO (detailed info will appear at the time of the meeting in the "Video Services" item on the left menu bar). For those attending, information on accommodation, access to CERN and laptop registration is available from http://cern.ch/lpcc/visits

  5. High Energy Electron Detection with ATIC

    NASA Technical Reports Server (NTRS)

    Chang, J.; Schmidt, W. K. H.; Adams, James H., Jr.; Ahn, H.; Ampe, J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The ATIC (Advanced Thin Ionization Calorimeter) balloon-borne ionization calorimeter is well suited to record and identify high energy cosmic ray electrons. The instrument was exposed to high-energy beams at the CERN H2 beamline in September 1999. We have simulated the performance of the instrument, and compare the simulations with actual high energy electron exposures at the CERN accelerator. Simulations and measurements do not agree exactly in detail, but overall the simulations have predicted the actual measured behavior quite well.

  6. Streamlining CASTOR to manage the LHC data torrent

    NASA Astrophysics Data System (ADS)

    Lo Presti, G.; Espinal Curull, X.; Cano, E.; Fiorini, B.; Ieri, A.; Murray, S.; Ponce, S.; Sindrilaru, E.

    2014-06-01

    This contribution describes the evolution of the main CERN storage system, CASTOR, as it manages the bulk data stream of the LHC and other CERN experiments, achieving over 90 PB of stored data by the end of LHC Run 1. This evolution was marked by the introduction of policies to optimize the tape sub-system throughput, going towards a cold storage system where data placement is managed by the experiments' production managers. More efficient tape migrations and recalls have been implemented and deployed where bulk meta-data operations greatly reduce the overhead due to small files. A repack facility is now integrated in the system and it has been enhanced in order to automate the repacking of several tens of petabytes, required in 2014 in order to prepare for the next LHC run. Finally the scheduling system has been evolved to integrate the internal monitoring. To efficiently manage the service a solid monitoring infrastructure is required, able to analyze the logs produced by the different components (about 1 kHz of log messages). A new system has been developed and deployed, which uses a transport messaging layer provided by the CERN-IT Agile Infrastructure and exploits technologies including Hadoop and HBase. This enables efficient data mining by making use of MapReduce techniques, and real-time data aggregation and visualization. The outlook for the future is also presented. Directions and possible evolution will be discussed in view of the restart of data taking activities.
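
    To give a feel for the log-analysis workload mentioned above (about 1 kHz of log messages), here is a toy, self-contained sketch of the per-component aggregation that a MapReduce job over the message stream would perform; the log format and daemon names are invented and this is not CASTOR's actual monitoring code.

```python
# Toy aggregation over CASTOR-style log lines; the format and names are invented.
from collections import Counter

log_lines = [
    "2014-03-01T12:00:01 stagerd     LVL=Info  MSG='file opened'",
    "2014-03-01T12:00:01 tapeserverd LVL=Error MSG='mount failed'",
    "2014-03-01T12:00:02 stagerd     LVL=Info  MSG='file closed'",
]

# "Map": emit a (daemon, level) pair per line; "reduce": count occurrences.
counts = Counter(
    (line.split()[1], line.split()[2].split("=", 1)[1])
    for line in log_lines
)

for (daemon, level), n in counts.most_common():
    print(f"{daemon:12s} {level:6s} {n}")
```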

  7. Accelerator Test of an Imaging Calorimeter

    NASA Technical Reports Server (NTRS)

    Christl, Mark J.; Adams, James H., Jr.; Binns, R. W.; Derrickson, J. H.; Fountain, W. F.; Howell, L. W.; Gregory, J. C.; Hink, P. L.; Israel, M. H.; Kippen, R. M.

    2001-01-01

    The Imaging Calorimeter for ACCESS (ICA) utilizes a thin sampling calorimeter concept for direct measurements of high-energy cosmic rays. The ICA design uses arrays of small scintillating fibers to measure the energy and trajectory of the produced cascades. A test instrument has been developed to study the performance of this concept at accelerator energies and for comparison with simulations. Two test exposures have been completed using a CERN test beam. Some results from the accelerator tests are presented.

  8. NA61/SHINE facility at the CERN SPS: beams and detector system

    NASA Astrophysics Data System (ADS)

    Abgrall, N.; Andreeva, O.; Aduszkiewicz, A.; Ali, Y.; Anticic, T.; Antoniou, N.; Baatar, B.; Bay, F.; Blondel, A.; Blumer, J.; Bogomilov, M.; Bogusz, M.; Bravar, A.; Brzychczyk, J.; Bunyatov, S. A.; Christakoglou, P.; Cirkovic, M.; Czopowicz, T.; Davis, N.; Debieux, S.; Dembinski, H.; Diakonos, F.; Di Luise, S.; Dominik, W.; Drozhzhova, T.; Dumarchez, J.; Dynowski, K.; Engel, R.; Efthymiopoulos, I.; Ereditato, A.; Fabich, A.; Feofilov, G. A.; Fodor, Z.; Fulop, A.; Gaździcki, M.; Golubeva, M.; Grebieszkow, K.; Grzeszczuk, A.; Guber, F.; Haesler, A.; Hasegawa, T.; Hierholzer, M.; Idczak, R.; Igolkin, S.; Ivashkin, A.; Jokovic, D.; Kadija, K.; Kapoyannis, A.; Kaptur, E.; Kielczewska, D.; Kirejczyk, M.; Kisiel, J.; Kiss, T.; Kleinfelder, S.; Kobayashi, T.; Kolesnikov, V. I.; Kolev, D.; Kondratiev, V. P.; Korzenev, A.; Koversarski, P.; Kowalski, S.; Krasnoperov, A.; Kurepin, A.; Larsen, D.; Laszlo, A.; Lyubushkin, V. V.; Maćkowiak-Pawłowska, M.; Majka, Z.; Maksiak, B.; Malakhov, A. I.; Maletic, D.; Manglunki, D.; Manic, D.; Marchionni, A.; Marcinek, A.; Marin, V.; Marton, K.; Mathes, H.-J.; Matulewicz, T.; Matveev, V.; Melkumov, G. L.; Messina, M.; Mrówczyński, St.; Murphy, S.; Nakadaira, T.; Nirkko, M.; Nishikawa, K.; Palczewski, T.; Palla, G.; Panagiotou, A. D.; Paul, T.; Peryt, W.; Petukhov, O.; Pistillo, C.; Płaneta, R.; Pluta, J.; Popov, B. A.; Posiadala, M.; Puławski, S.; Puzovic, J.; Rauch, W.; Ravonel, M.; Redij, A.; Renfordt, R.; Richter-Was, E.; Robert, A.; Röhrich, D.; Rondio, E.; Rossi, B.; Roth, M.; Rubbia, A.; Rustamov, A.; Rybczyński, M.; Sadovsky, A.; Sakashita, K.; Savic, M.; Schmidt, K.; Sekiguchi, T.; Seyboth, P.; Sgalaberna, D.; Shibata, M.; Sipos, R.; Skrzypczak, E.; Słodkowski, M.; Sosin, Z.; Staszel, P.; Stefanek, G.; Stepaniak, J.; Stroebele, H.; Susa, T.; Szuba, M.; Tada, M.; Tereshchenko, V.; Tolyhi, T.; Tsenov, R.; Turko, L.; Ulrich, R.; Unger, M.; Vassiliou, M.; Veberic, D.; Vechernin, V. V.; Vesztergombi, G.; Vinogradov, L.; Wilczek, A.; Włodarczyk, Z.; Wojtaszek-Szwarz, A.; Wyszyński, O.; Zambelli, L.; Zipper, W.

    2014-06-01

    NA61/SHINE (SPS Heavy Ion and Neutrino Experiment) is a multi-purpose experimental facility to study hadron production in hadron-proton, hadron-nucleus and nucleus-nucleus collisions at the CERN Super Proton Synchrotron. It recorded the first physics data with hadron beams in 2009 and with ion beams (secondary 7Be beams) in 2011. NA61/SHINE has greatly profited from the long development of the CERN proton and ion sources and the accelerator chain as well as the H2 beamline of the CERN North Area. The latter has recently been modified to also serve as a fragment separator as needed to produce the Be beams for NA61/SHINE. Numerous components of the NA61/SHINE set-up were inherited from its predecessors, in particular, the last one, the NA49 experiment. Important new detectors and upgrades of the legacy equipment were introduced by the NA61/SHINE Collaboration. This paper describes the state of the NA61/SHINE facility — the beams and the detector system — before the CERN Long Shutdown I, which started in March 2013.

  9. Sharing scientific discovery globally: toward a CERN virtual visit service

    NASA Astrophysics Data System (ADS)

    Goldfarb, S.; Hatzifotiadou, D.; Lapka, M.; Papanestis, A.

    2017-10-01

    The installation of virtual visit services by the LHC collaborations began shortly after the first high-energy collisions were provided by the CERN accelerator in 2010. The experiments: ATLAS [1], CMS [2], LHCb [3], and ALICE [4] have all joined in this popular and effective method to bring the excitement of scientific exploration and discovery into classrooms and other public venues around the world. Their programmes, which use a combination of video conference, webcast, and video recording to communicate with remote audiences have already reached tens of thousands of viewers, and the demand only continues to grow. Other venues, such as the CERN Control Centre, are also considering similar permanent installations. We present a summary of the development of the various systems in use around CERN today, including the technology deployed and a variety of use cases. We then lay down the arguments for the creation of a CERN-wide service that would support these programmes in a more coherent and effective manner. Potential services include a central booking system and operational management similar to what is currently provided for the common CERN video conference facilities. Certain choices in technology could be made to support programmes based on popular tools including (but not limited to) Skype™ [5], Google Hangouts [6], Facebook Live [7], and Periscope [8]. Successful implementation of the project, which relies on close partnership between the experiments, CERN IT CDA [9], and CERN IR ECO [10], has the potential to reach an even larger, global audience, more effectively than ever before.

  10. A microprocessor-based system for continuous monitoring of radiation levels around the CERN PS and PSB accelerators

    NASA Astrophysics Data System (ADS)

    Agoritsas, V.; Beck, F.; Benincasa, G. P.; Bovigny, J. P.

    1986-06-01

    This paper describes a new beam loss monitor system which has been installed in the PS and PSB machines, replacing an earlier system. The new system is controlled by a microprocessor which can operate independently of the accelerator control system, though setting up and central display are usually done remotely, using the standard control system facilities.

  11. The CERN-EU high-energy Reference Field (CERF) facility: applications and latest developments

    NASA Astrophysics Data System (ADS)

    Silari, Marco; Pozzi, Fabio

    2017-09-01

    The CERF facility at CERN provides an almost unique high-energy workplace reference radiation field for the calibration and test of radiation protection instrumentation employed at high-energy accelerator facilities and for aircraft and space dosimetry. This paper describes the main features of the facility and supplies a non-exhaustive list of recent (as of 2005) applications for which CERF is used. Upgrade work started in 2015 to provide the scientific and industrial communities with a state-of-the-art reference facility is also discussed.

  12. Windows Terminal Servers Orchestration

    NASA Astrophysics Data System (ADS)

    Bukowiec, Sebastian; Gaspar, Ricardo; Smith, Tim

    2017-10-01

    Windows Terminal Servers provide application gateways for various parts of the CERN accelerator complex, used by hundreds of CERN users every day. The combination of new tools such as Puppet, HAProxy and Microsoft System Center suite enable automation of provisioning workflows to provide a terminal server infrastructure that can scale up and down in an automated manner. The orchestration does not only reduce the time and effort necessary to deploy new instances, but also facilitates operations such as patching, analysis and recreation of compromised nodes as well as catering for workload peaks.

  13. CERN openlab: Engaging industry for innovation in the LHC Run 3-4 R&D programme

    NASA Astrophysics Data System (ADS)

    Girone, M.; Purcell, A.; Di Meglio, A.; Rademakers, F.; Gunne, K.; Pachou, M.; Pavlou, S.

    2017-10-01

    LHC Run 3 and Run 4 represent an unprecedented challenge for HEP computing in terms of both data volume and complexity. New approaches are needed for how data is collected and filtered, processed, moved, stored and analysed if these challenges are to be met with a realistic budget. To develop innovative techniques we are fostering relationships with industry leaders. CERN openlab is a unique resource for public-private partnership between CERN and leading Information and Communication Technology (ICT) companies. Its mission is to accelerate the development of cutting-edge solutions to be used by the worldwide HEP community. In 2015, CERN openlab started its phase V with a strong focus on tackling the upcoming LHC challenges. Several R&D programmes are ongoing in the areas of data acquisition, networks and connectivity, data storage architectures, computing provisioning, computing platforms and code optimisation, and data analytics. This paper gives an overview of the various innovative technologies that are currently being explored by CERN openlab V and discusses the long-term strategies that are pursued by the LHC communities, with the help of industry, in closing the technological gap in processing and storage needs expected in Run 3 and Run 4.

  14. Measurements and FLUKA simulations of bismuth and aluminium activation at the CERN Shielding Benchmark Facility (CSBF)

    NASA Astrophysics Data System (ADS)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.

    2018-03-01

    The CERN High Energy AcceleRator Mixed field facility (CHARM) is located in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, 5 × 10¹¹ protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7 × 10¹⁰ p/s, which then impacts on the CHARM target. The shielding of the CHARM facility also includes the CERN Shielding Benchmark Facility (CSBF), situated laterally above the target. This facility consists of 80 cm of cast iron and 360 cm of concrete, with barite concrete in some places. Activation samples of bismuth and aluminium were placed in the CSBF and in the CHARM access corridor in July 2015. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields for these samples. The results estimated by FLUKA Monte Carlo simulations are compared to activation measurements of these samples. The comparison between FLUKA simulations and the measured values from γ-spectrometry shows agreement to better than a factor of 2.
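
    For readers unfamiliar with how a production yield relates to the activity measured by γ-spectrometry, the sketch below applies the standard activation/decay relation A = R·(1 − e^(−λ·t_irr))·e^(−λ·t_cool). The production rate, half-life, irradiation and cooling times are placeholder values for illustration, not numbers from the CSBF campaign.

        import math

        # Standard activation relation: activity after irradiation time t_irr and cooling
        # time t_cool, for a constant production rate R (atoms/s) and decay constant lam.
        # All input values below are placeholders.
        def induced_activity(R, half_life_s, t_irr_s, t_cool_s):
            lam = math.log(2) / half_life_s
            return R * (1.0 - math.exp(-lam * t_irr_s)) * math.exp(-lam * t_cool_s)

        # e.g. a nuclide with a 15 h half-life, produced at 1e3 atoms/s,
        # irradiated for 24 h and measured 6 h after the end of irradiation
        A = induced_activity(R=1.0e3, half_life_s=15 * 3600, t_irr_s=24 * 3600, t_cool_s=6 * 3600)
        print(f"{A:.1f} Bq")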

  15. TOWARDS A NOVEL MODULAR ARCHITECTURE FOR CERN RADIATION MONITORING.

    PubMed

    Boukabache, Hamza; Pangallo, Michel; Ducos, Gael; Cardines, Nicola; Bellotta, Antonio; Toner, Ciarán; Perrin, Daniel; Forkel-Wirth, Doris

    2017-04-01

    The European Organization for Nuclear Research (CERN) has the legal obligation to protect the public and the people working on its premises from any unjustified exposure to ionising radiation. In this context, radiation monitoring is one of the main concerns of the Radiation Protection Group. After 30 y of reliable service, the ARea CONtroller (ARCON) system is approaching the end of its lifecycle, which raises the need for new, more efficient radiation monitors with a high level of modularity to ensure better maintainability. Based on these two main principles, new detectors are currently being developed that will be capable of measuring very low dose rates down to 50 nSv h⁻¹, whilst being able to measure radiation over an extensive range of 8 decades without any auto scaling. To reach these performances, CERN Radiation MOnitoring Electronics (CROME), the new generation of CERN radiation monitors, is based on a versatile architecture that includes new read-out electronics developed by the Instrumentation and Logistics section of the CERN Radiation Protection Group as well as a reconfigurable system on chip capable of performing complex processing calculations. Besides the capability of CROME to continuously measure the ambient dose rate, the system generates radiation alarms, provides interlock signals, drives alarm display units through a fieldbus and provides long-term, permanent and reliable data logging. The measurement tests performed during the first phase of the development show very promising results that pave the way to the second phase: the certification. © The Author 2016. Published by Oxford University Press.

  17. Air liquide 1.8 K refrigeration units for CERN LHC project

    NASA Astrophysics Data System (ADS)

    Hilbert, Benoît; Gistau-Baguer, Guy M.; Caillaud, Aurélie

    2002-05-01

    The Large Hadron Collider (LHC) will be CERN's next research instrument for high energy physics. This 27 km long circular accelerator will make intensive use of superconducting magnets, operated below 2.0 K. It will thus require high capacity refrigeration below 2.0 K [1, 2]. Coupled to a refrigerator providing 18 kW equivalent at 4.5 K [3], these systems will be able to absorb a cryogenic power of 2.4 kW at 1.8 K in nominal conditions. Air Liquide has designed one Cold Compressor System (CCS) pre-series unit for CERN, preceding three more of them (among eight in total located around the machine). These systems, making use of cryogenic centrifugal compressors in a series arrangement coupled to room temperature screw compressors, are presented. Key component characteristics are given.

  18. Data acquisition software for DIRAC experiment

    NASA Astrophysics Data System (ADS)

    Olshevsky, V.; Trusov, S.

    2001-08-01

    The structure and basic processes of the data acquisition software of the DIRAC experiment for the measurement of the π⁺π⁻ atom lifetime are described. The experiment runs on the PS accelerator of CERN. The developed software allows one to accept, record and distribute up to 3 Mbytes of data to consumers in one accelerator supercycle of 14.4 s duration. The described system has been in successful use in the experiment since its startup in 1998.

  19. Monitoring Evolution at CERN

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Fiorini, B.; Murphy, S.; Pigueiras, L.; Santos, M.

    2015-12-01

    Over the past two years, the operation of the CERN Data Centres went through significant changes with the introduction of new mechanisms for hardware procurement, new services for cloud provisioning and configuration management, among other improvements. These changes resulted in an increase of resources being operated in a more dynamic environment. Today, the CERN Data Centres provide over 11000 multi-core processor servers, 130 PB of disk servers, 100 PB of tape robots, and 150 high performance tape drives. To cope with these developments, an evolution of the data centre monitoring tools was also required. This modernisation was based on a number of guiding rules: sustain the increase of resources, adapt to the new dynamic nature of the data centres, make monitoring data easier to share, give more flexibility to Service Managers in how they publish and consume monitoring metrics and logs, establish a common repository of monitoring data, optimise the handling of monitoring notifications, and replace the previous toolset with new open source technologies with large adoption and community support. This contribution describes how these improvements were delivered, presents the architecture and technologies of the new monitoring tools, and reviews the experience of their production deployment.

  20. Accounting for measurement error in log regression models with applications to accelerated testing.

    PubMed

    Richardson, Robert; Tolley, H Dennis; Evenson, William E; Lunt, Barry M

    2018-01-01

    In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.
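
    The weighted-regression approximation described above is estimated with Iteratively Re-weighted Least Squares. As a generic and heavily simplified reminder of how IRLS works, the sketch below alternates between solving a weighted least-squares problem and recomputing the weights from the residuals; the weight function, data and parameter names are placeholders, not the authors' model.

        import numpy as np

        def irls(X, y, weight_fn, n_iter=25, tol=1e-8):
            """Generic IRLS: solve a weighted least-squares problem, update the
            weights from the residuals, and repeat until the coefficients converge."""
            beta = np.linalg.lstsq(X, y, rcond=None)[0]          # ordinary LS start
            for _ in range(n_iter):
                w = weight_fn(y - X @ beta)                      # observation weights
                Xw = X * w[:, None]
                beta_new = np.linalg.solve(X.T @ Xw, Xw.T @ y)
                if np.max(np.abs(beta_new - beta)) < tol:
                    return beta_new
                beta = beta_new
            return beta

        # Toy usage: a linear fit of log-lifetime versus a stress covariate,
        # with Huber-style down-weighting of large residuals.
        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, 1.0, 200)
        X = np.column_stack([np.ones_like(x), x])
        y = 2.0 - 1.5 * x + rng.normal(0.0, 0.1, 200)
        huber = lambda r, c=0.1: np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))
        print(irls(X, y, huber))     # approximately [2.0, -1.5]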

  1. Commissioning of the helium cryogenic system for the HIE- ISOLDE accelerator upgrade at CERN

    NASA Astrophysics Data System (ADS)

    Delruelle, N.; Inglese, V.; Leclercq, Y.; Pirotte, O.; Williams, L.

    2015-12-01

    The High Intensity and Energy ISOLDE (HIE-ISOLDE) project is a major upgrade of the existing ISOLDE and REX-ISOLDE facilities at CERN. The most significant improvement will come from replacing the existing REX accelerating structure by a superconducting linear accelerator (SC linac) composed ultimately of six cryo-modules installed in series, each containing superconducting RF cavities and solenoids operated at 4.5 K. In order to provide the cooling capacity at all temperature levels between 300 K and 4.5 K for the six cryo-modules, an existing helium refrigerator, manufactured in 1986 and previously used to cool the ALEPH magnet during LEP operation from 1989 to 2000, has been refurbished, reinstalled and recommissioned in a dedicated building located next to the HIE-ISOLDE experimental hall. This helium refrigerator has been connected to a new cryogenic distribution line, consisting of a 30-meter long vacuum-insulated transfer line, a 2000-liter storage dewar and six interconnecting valve boxes, one for each cryo-module. This paper describes the whole cryogenic system and presents the commissioning results including the preliminary operation at 4.5 K of the first cryo-module in the experimental hall.

  2. Target R and D for high power proton beam applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fabich, A.

    High power targets are one of the major issues in an accelerator complex for future HEP physics studies. The paper will review the status of studies worldwide. It will focus on the status of the MERIT mercury-jet target experiment at CERN.

  3. The management of large cabling campaigns during the Long Shutdown 1 of LHC

    NASA Astrophysics Data System (ADS)

    Meroli, S.; Machado, S.; Formenti, F.; Frans, M.; Guillaume, J. C.; Ricci, D.

    2014-03-01

    The Large Hadron Collider at CERN entered its first 18-month-long shutdown period in February 2013. During this period the entire CERN accelerator complex will undergo major consolidation and upgrade works, preparing the machines for LHC operation at nominal energy (7 TeV/beam). One of the most challenging activities concerns the cabling infrastructure (copper and optical fibre cables) serving the CERN data acquisition, networking and control systems. About 1000 kilometres of cables, distributed in different machine areas, will be installed, representing an investment of about 15 MCHF. This implies an extraordinary challenge in terms of project management, including resource and activity planning, work execution and quality control. The preparation phase of this project started well before its implementation, by defining technical solutions and setting financial plans for staff recruitment and material supply. Enhanced task coordination was further implemented by deploying selected competences to form a central support team.

  4. Underground neutrino detectors for particle and astroparticle Science: The Giant Liquid Argon Charge Imaging ExpeRiment (GLACIER)

    NASA Astrophysics Data System (ADS)

    Rubbia, André

    2009-06-01

    The current focus of the CERN program is the Large Hadron Collider (LHC); however, CERN is engaged in long baseline neutrino physics with the CNGS project and supports T2K as recognized CERN RE13, and for good reasons: a number of observed phenomena in high-energy physics and cosmology lack their resolution within the Standard Model of particle physics; these puzzles include the origin of neutrino masses, CP-violation in the leptonic sector, and the baryon asymmetry of the Universe. They will only partially be addressed at the LHC. A positive measurement of sin²2θ₁₃ > 0.01 would certainly give a tremendous boost to neutrino physics by opening the possibility to study CP violation in the lepton sector and the determination of the neutrino mass hierarchy with upgraded conventional super-beams. These experiments (so-called 'Phase II') require, in addition to an upgraded beam power, next generation very massive neutrino detectors with excellent energy resolution and high detection efficiency in a wide neutrino energy range, to cover the 1st and 2nd oscillation maxima, and excellent particle identification and π⁰ background suppression. Two generations of large water Cherenkov detectors at Kamioka (Kamiokande and Super-Kamiokande) have been extremely successful, and there are good reasons to consider a third generation water Cherenkov detector with an order of magnitude larger mass than Super-Kamiokande for both non-accelerator (proton decay, supernovae, ...) and accelerator-based physics. On the other hand, a very massive underground liquid Argon detector of about 100 kton could represent a credible alternative for the precision measurements of 'Phase II' and aim at significantly new results in neutrino astroparticle and non-accelerator-based particle physics (e.g. proton decay).

  5. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; hide

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code's physics models with a particular emphasis on the hadronic and nuclear sector.

  6. Build your own

    NASA Astrophysics Data System (ADS)

    Moniz, Ernest; McAndrew, Elizabeth; Chan, Albert; Eggleton, David

    2015-01-01

    In reply to the physicsworld.com blog post "Build your own LEGO particle collider" (2 December 2014, http://ow.ly/Fe3Vy, see also p3) which described a campaign to get the popular plastic-bricks firm to make a building set based on a particle accelerator, such as the Large Hadron Collider at CERN.

  7. Disturbance-mediated accelerated succession in two Michigan forest types

    USGS Publications Warehouse

    Abrams, Marc D.; Scott, Michael L.

    1989-01-01

    In northern lower Michigan, logging accelerated sugar maple (Acer saccharum) dominance in a northern white cedar (Thuja occidentalis) community, and clear-cutting and burning quickly converted certain sites dominated by mature jack pine (Pinus banksiana) to early-successional hardwoods, including Prunus, Populus, and Quercus. In both forest types the succeeding hardwoods should continue to increase in the future at the expense of the pioneer conifer species. In the cedar example, sugar maple was also increasing in an undisturbed, old-growth stand, but at a much reduced rate compared with the logged stand. Traditionally, disturbance was thought to set back succession to some earlier stage. However, our study sites and at least several other North American forest communities exhibited accelerated succession following a wide range of disturbances, including logging, fire, ice storms, wind-throw, disease, insect attack, and herbicide spraying.

  8. The IPHI Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferdinand, Robin; Beauvais, Pierre-Yves

    High Power Proton Accelerators (HPPAs) are studied for several projects based on high-flux neutron sources driven by proton or deuteron beams. Since the front end is considered the most critical part of such accelerators, the two French national research agencies CEA and CNRS decided in 1997 to collaborate on the study and construction of a High-Intensity Proton Injector (IPHI). The main objective of this project is to master the complex technologies used and the concepts of manufacturing and controlling HPPAs. Recently, a collaboration agreement was signed with CERN and led to some evolutions in the design and in the schedule. The IPHI design current was maintained at 100 mA in Continuous Wave mode. This choice should allow the production of a high-reliability beam at reduced intensity (typically 30 mA), tending to fulfill the Accelerator Driven System requirements. The output energy of the Radio Frequency Quadrupole (RFQ) was reduced from 5 to 3 MeV, allowing the addition and the test, in pulsed operation, of a chopper line developed by CERN for the Superconducting Proton Linac (SPL). In a final step, the IPHI RFQ and the chopper line should become parts of the SPL injector. In this paper, the IPHI project and the recent evolutions are reported together with the construction and operation schedule.

  9. Big data analytics for the Future Circular Collider reliability and availability studies

    NASA Astrophysics Data System (ADS)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN's accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data of sufficient quality to automate the data quality annotation and the calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.
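
    As a trivial example of the kind of reliability quantity such an analytics environment derives from operation data, the sketch below estimates a mean time between failures from a list of fault timestamps and evaluates the corresponding exponential reliability function. The timestamps are invented and the exponential model is only the simplest possible assumption.

        import numpy as np

        # Hypothetical fault timestamps (hours into a run), as they might be extracted
        # from operation logs after data-quality filtering.
        fault_times = np.array([12.0, 110.5, 230.2, 480.9, 760.4])

        uptimes = np.diff(fault_times)              # time between consecutive faults
        mtbf = uptimes.mean()                       # MLE of the mean for an exponential model
        reliability = lambda t: np.exp(-t / mtbf)   # R(t): probability of running t hours fault-free

        print(f"MTBF = {mtbf:.1f} h, R(24 h) = {reliability(24.0):.3f}")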

  10. Simon van der Meer (1925-2011):. A Modest Genius of Accelerator Science

    NASA Astrophysics Data System (ADS)

    Chohan, Vinod C.

    2011-02-01

    Simon van der Meer was a brilliant scientist and a true giant of accelerator science. His seminal contributions to accelerator science have been essential to this day in our quest for satisfying the demands of modern particle physics. Whether we talk of long base-line neutrino physics or antiproton-proton physics at Fermilab or proton-proton physics at LHC, his techniques and inventions have been a vital part of the modern day successes. Simon van der Meer and Carlo Rubbia were the first CERN scientists to become Nobel laureates in Physics, in 1984. Van der Meer's lesser-known contributions spanned a whole range of subjects in accelerator science, from magnet design to power supply design, beam measurements, slow beam extraction, sophisticated programs and controls.

  11. Cryogenic studies for the proposed CERN large hadron electron collider (LHEC)

    NASA Astrophysics Data System (ADS)

    Haug, F.; LHeC Study Team, The

    2012-06-01

    The LHeC (Large Hadron electron Collider) is a proposed future colliding beam facility for lepton-nucleon scattering particle physics at CERN. A new 60 GeV electron accelerator will be added to the existing 27 km circumference 7 TeV LHC for collisions of electrons with protons and heavy ions. Two basic design options are being pursued. The first is a circular accelerator housed in the existing LHC tunnel, which is referred to as the "Ring-Ring" version. Low field normal conducting magnets guide the particle beam, while superconducting (SC) RF cavities cooled to 2 K are installed at two opposite locations in the LHC tunnel to accelerate the beams. For this version, in addition, a 10 GeV re-circulating SC injector will be installed. In total four refrigerators with cooling capacities between 1.2 kW and 3 kW @ 4.5 K are needed. The second option, referred to as the "Linac-Ring" version, consists of a race-track re-circulating energy-recovery type machine with two 1 km long straight acceleration sections. The 944 high field 2 K SC cavities dissipate 30 kW in CW operation. Eight 10 kW @ 4.5 K refrigerators are proposed. The particle detector contains a combined SC solenoid and dipole forming the cold mass and an independent liquid argon calorimeter. Cooling is done with two individual small-sized cryoplants: a 4.5 K helium plant and an 87 K liquid nitrogen plant.

  12. Physics at the SPS.

    PubMed

    Gatignon, L

    2018-05-01

    The CERN Super Proton Synchrotron (SPS) has delivered a variety of beams to a vigorous fixed target physics program since 1978. In this paper, we restrict ourselves to the description of a few illustrative examples in the ongoing physics program at the SPS. We will outline the physics aims of the COmmon Muon Proton Apparatus for Structure and Spectroscopy (COMPASS), north area 64 (NA64), north area 62 (NA62), north area 61 (NA61), and advanced proton driven plasma wakefield acceleration experiment (AWAKE). COMPASS studies the structure of the proton and more specifically of its spin. NA64 searches for the dark photon A', which is the messenger for interactions between normal and dark matter. The NA62 experiment aims at a 10% precision measurement of the very rare decay K⁺ → π⁺νν. As this decay mode can be calculated very precisely in the Standard Model, it offers a very good opportunity to look for new physics beyond the Standard Model. The NA61/SHINE experiment studies the phase transition to Quark Gluon Plasma, a state in which the quarks and gluons that form the proton and the neutron are de-confined. Finally, AWAKE investigates proton-driven wake field acceleration: a promising technique to accelerate electrons with very high accelerating gradients. The Physics Beyond Colliders study at CERN is paving the way for a significant and diversified continuation of this already rich and compelling physics program that is complementary to the one at the big colliders like the Large Hadron Collider.

  14. In Appreciation: The Depth and Breadth of John Bell's Physics

    NASA Astrophysics Data System (ADS)

    Jackiw, Roman; Shimony, Abner

    This essay surveys the work of John Stewart Bell, one of the great physicists of the twentieth century. Section 1 is a brief biography, tracing his career from working-class origins and undergraduate training in Belfast, Northern Ireland, to research in accelerator and nuclear physics in the British national laboratories at Harwell and Malvern, to his profound research on elementary particle physics as a member of the Theory Group at CERN and his equally profound "hobby" of investigating the foundations of quantum mechanics. Section 2 concerns this hobby, which began in his discontent with Bohr's and Heisenberg's analyses of the measurement process. He was attracted to the program of hidden variables interpretations, but he revolutionized the foundations of quantum mechanics by a powerful negative result: that no hidden variables theory that is "local" (in a clear and well-motivated sense) can agree with all the correlations predicted by quantum mechanics regarding well-separated systems. He further deepened the foundations of quantum mechanics by penetrating conceptual analyses of results concerning measurement theory of von Neumann, de Broglie and Bohm, Gleason, Jauch and Piron, Everett, and Ghirardi-Rimini-Weber. Bell's work in particle theory (Section 3) began with a proof of the CPT theorem in his doctoral dissertation, followed by investigations of the phenomenology of CP-violating experiments. At CERN Bell investigated the commutation relations in current algebras from various standpoints. The failure of current algebra combined with partially conserved current algebra to permit the experimentally observed decay of the neutral pi-meson into two photons stimulated the discovery by Bell and Jackiw of anomalous or quantal symmetry breaking, which has numerous implications for elementary particle phenomena. Other late investigations of Bell on elementary particle physics were bound states in quantum chromodynamics (in collaboration with Bertlmann) and estimates for the anomalous magnetic moment of the muon (in collaboration with de Rafael). Section 4 concerns accelerators, starting at Harwell with the algebra of strong focusing and the stability of orbits in linear accelerators and synchrotrons. At CERN he continued to contribute to accelerator physics, and with his wife Mary Bell he wrote on electron cooling and Beamstrahlung. A spectacular late achievement in accelerator physics was the demonstration (in collaboration with Leinaas) that the effective black-body radiation seen by an accelerated observer in an electromagnetic vacuum - the "Unruh effect" - had already been observed experimentally in the partial depolarization of electrons traversing circular orbits.

  15. AliEn—ALICE environment on the GRID

    NASA Astrophysics Data System (ADS)

    Saiz, P.; Aphecetche, L.; Bunčić, P.; Piskač, R.; Revsbech, J.-E.; Šego, V.; Alice Collaboration

    2003-04-01

    AliEn (http://alien.cern.ch) (ALICE Environment) is a Grid framework built on top of the latest Internet standards for information exchange and authentication (SOAP, PKI) and common Open Source components. AliEn provides a virtual file catalogue that allows transparent access to distributed datasets and a number of collaborating Web services which implement authentication, job execution, file transport, performance monitoring and event logging. In the paper we will present the architecture and components of the system.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, V.E.; Carmony, D.D.; Garfinkel, A.F.

    This report discusses: The CDF for p̄-p Collisions at FNAL; The L3 Detector for e⁺e⁻ Collisions at CERN; The SDC Detector for pp Collisions at the SSCL (calorimeters); The SDC Detector for pp Collisions at the SSCL (muon detector); The CO experiment for p̄-p Collisions at FNAL; and Accelerator Physics at Fermilab.

  17. Fermilab Heroes of the LHC: Joel Butler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, Joel

    2017-08-23

    Particle physics research is both international and collaborative, with large national laboratories working together to most efficiently advance science. Joel Butler, Distinguished Scientist at Fermi National Accelerator Laboratory, is the leader of the Compact Muon Solenoid experiment at the CERN laboratory in Europe. In this video, Joel tells us a bit about what it's like.

  18. Electronic Desorption of gas from metals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molvik, A W; Kollmus, H; Mahner, E

    During heavy ion operation in several particle accelerators world-wide, dynamic pressure rises of orders of magnitude were triggered by lost beam ions that bombarded the vacuum chamber walls. This ion-induced molecular desorption, observed at CERN, GSI, and BNL, can seriously limit the ion beam lifetime and intensity of the accelerator. From dedicated test stand experiments we have discovered that heavy-ion induced gas desorption scales with the electronic energy loss (dE_e/dx) of the ions slowing down in matter, but varies only weakly with the ion impact angle, unlike electronic sputtering.

  19. Postfire logging: is it beneficial to a forest?

    Treesearch

    Sally Duncan

    2002-01-01

    Public debate on postfire logging has intensified in recent years, particularly since passage of the "salvage rider" in 1995, directing accelerated harvest of dead trees in the western United States. Supporters of postfire logging argue that it is part of a suite of restoration techniques, and that removal of timber means reduction of fuels for...

  20. Accelerators for America's Future

    NASA Astrophysics Data System (ADS)

    Bai, Mei

    2016-03-01

    The particle accelerator, a powerful tool to energize beams of charged particles to a desired speed and energy, has been the workhorse for investigating the fundamental structure of matter and the fundamental laws of nature. The best known examples are the 2-mile long Stanford Linear Accelerator at SLAC, the high energy proton and anti-proton collider Tevatron at FermiLab, and the Large Hadron Collider that is currently under operation at CERN. During the less than a century of development of accelerator science and technology that led to a dazzling list of discoveries, particle accelerators have also found various applications beyond particle and nuclear physics research, and have become an indispensable part of the economy. Today, one can find a particle accelerator at almost every corner of our lives, ranging from the x-ray machine at airport security to radiation diagnostics and therapy in hospitals. This presentation will give a brief introduction to the applications of this powerful tool in fundamental research as well as in industry. Challenges in accelerator science and technology will also be briefly presented.

  1. Path to AWAKE: Evolution of the concept

    DOE PAGES

    Caldwell, A.; Adli, E.; Amorim, L.; ...

    2016-01-02

    This study describes the conceptual steps in reaching the design of the AWAKE experiment currently under construction at CERN. We start with an introduction to plasma wakefield acceleration and the motivation for using proton drivers. We then describe the self-modulation instability – a key to an early realization of the concept. This is then followed by the historical development of the experimental design, where the critical issues that arose and their solutions are described. We conclude with the design of the experiment as it is being realized at CERN and some words on the future outlook. A summary of the AWAKE design and construction status as presented in this conference is given in Gschwendtner et al. [1].

  2. The high Beta cryo-modules and the associated cryogenic system for the HIE-ISOLDE upgrade at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delruelle, N.; Leclercq, Y.; Pirotte, O.

    2014-01-29

    The major upgrade of the energy and intensity of the existing ISOLDE and REX-ISOLDE radioactive ion beam facilities at CERN requires the replacement of most of the existing ISOLDE post-acceleration equipment by a superconducting linac based on quarter-wave resonators housed together with superconducting solenoids in a series of four high-β and two low-β cryo-modules. As well as providing optimum conditions for physics, the cryo-modules need to function under stringent vacuum and cryogenic conditions. We present the detailed design and expected cryogenic performance of the high-β cryo-module together with the cryogenic supply and distribution system destined to service the complete superconducting linac.

  3. 40th Anniversary of the First Proton-Proton Collisions in the CERN Intersecting Storage Rings (ISR)

    ScienceCinema

    None

    2018-06-20

    Welcome, Luigi di Lella and Rolf Heuer; Design and Construction of the ISR, Kurt Hubner; Physics at small angles, Ugo Amaldi (TERA Foundation); The Impact of the ISR on Accelerator Physics and Technology, Philip J. Bryant; Physics at high transverse momentum, Pierre Darriulat (VATLY-Hanoi); Concluding remarks, Rolf Heuer.

  4. How to Create Black Holes on Earth

    ERIC Educational Resources Information Center

    Bleicher, Marcus

    2007-01-01

    We present a short overview on the ideas of large extra dimensions and their implications for the possible production of micro black holes in the next generation particle accelerator at CERN (Geneva, Switzerland) from this year on. In fact, the possibility of black hole production on Earth is currently one of the most exciting predictions for the…

  5. Applied geodesy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, S.

    1987-01-01

    This volume is based on the proceedings of the CERN Accelerator School's course on Applied Geodesy for Particle Accelerators held in April 1986. The purpose was to record and disseminate the knowledge gained in recent years on the geodesy of accelerators and other large systems. The latest methods for positioning equipment to sub-millimetric accuracy in deep underground tunnels several tens of kilometers long are described, as well as such sophisticated techniques as the Navstar Global Positioning System and the Terrameter. Automation of better known instruments such as the gyroscope and Distinvar is also treated along with the highly evolved treatment of components in a modern accelerator. Use of the methods described can be of great benefit in many areas of research and industrial geodesy such as surveying, nautical and aeronautical engineering, astronomical radio-interferometry, metrology of large components, deformation studies, etc.

  6. Heavy-ion induced electronic desorption of gas from metals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molvik, A W; Kollmus, H; Mahner, E

    During heavy ion operation in several particle accelerators world-wide, dynamic pressure rises of orders of magnitude were triggered by lost beam ions that bombarded the vacuum chamber walls. This ion-induced molecular desorption, observed at CERN, GSI, and BNL, can seriously limit the ion beam lifetime and intensity of the accelerator. From dedicated test stand experiments we have discovered that heavy-ion induced gas desorption scales with the electronic energy loss (dE_e/dx) of the ions slowing down in matter, but varies only weakly with the ion impact angle, unlike electronic sputtering.

  7. Robert R. Wilson Prize III: Applications of Intrabeam Scattering Formulae to a Myriad of Accelerator Systems

    NASA Astrophysics Data System (ADS)

    Mtingwa, Sekazi K.

    2017-01-01

    We discuss our entree into accelerator physics and the problem of intrabeam scattering in particular. We focus on the historical importance of understanding intrabeam scattering for the successful operation of Fermilab's Accumulator and Tevatron and the subsequent hunt for the top quark, and its importance for successful operation of CERN's Large Hadron Collider that discovered the Higgs boson. We provide details on intrabeam scattering formalisms for hadron and electron beams at high energies, concluding with an Ansatz by Karl Bane that has applications to electron damping rings and synchrotron light sources.

  8. Future HEP Accelerators: The US Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhat, Pushpalatha; Shiltsev, Vladimir

    2015-11-02

    Accelerator technology has advanced tremendously since the introduction of accelerators in the 1930s, and particle accelerators have become indispensable instruments in high energy physics (HEP) research to probe Nature at smaller and smaller distances. At present, accelerator facilities can be classified into Energy Frontier colliders that enable direct discoveries and studies of high mass scale particles and Intensity Frontier accelerators for exploration of extremely rare processes, usually at relatively low energies. The near term strategies of the global energy frontier particle physics community are centered on fully exploiting the physics potential of the Large Hadron Collider (LHC) at CERN through its high-luminosity upgrade (HL-LHC), while the intensity frontier HEP research is focused on studies of neutrinos at the MW-scale beam power accelerator facilities, such as Fermilab Main Injector with the planned PIP-II SRF linac project. A number of next generation accelerator facilities have been proposed and are currently under consideration for the medium- and long-term future programs of accelerator-based HEP research. In this paper, we briefly review the post-LHC energy frontier options, both for lepton and hadron colliders in various regions of the world, as well as possible future intensity frontier accelerator facilities.

  9. The GBAR experiment: gravitational behaviour of antihydrogen at rest

    NASA Astrophysics Data System (ADS)

    Perez, P.; Sacquin, Y.

    2012-09-01

    The recently recommended experiment GBAR is foreseen to run at CERN at the AD/ELENA antiproton source. It aims at performing the first measurement of the Earth's gravitational acceleration on antimatter by observing the free-fall of antihydrogen atoms. This requires creating anti-atoms at an unprecedentedly low energy. The different steps of the experiment and their present status are reviewed.

  10. Plans for an ERL Test Facility at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Erik; Bruning, O S; Calaga, Buchi Rama Rao

    2014-12-01

    The baseline electron accelerator for LHeC and one option for FCC-he is an Energy Recovery Linac. To prepare and study the necessary key technologies, CERN has started – in collaboration with JLAB and Mainz University – the conceptual design of an ERL Test Facility (ERL-TF). Staged construction will allow the study under different conditions with up to 3 passes, beam energies of up to about 1 GeV and currents of up to 50 mA. The design and development of superconducting cavity modules, including coupler and HOM damper designs, are also of central importance for other existing and future accelerators, and their tests are at the heart of the current ERL-TF goals. However, the ERL-TF could also provide a unique infrastructure for several applications that go beyond developing and testing the ERL technology at CERN. In addition to experimental studies of beam dynamics, operational and reliability issues in an ERL, it could equally serve for quench tests of superconducting magnets, as a physics experimental facility in its own right or as a test stand for detector developments. This contribution will describe the goals and the concept of the facility and the status of the R&D.

  11. CERN-derived analysis of lunar radiation backgrounds

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.; Svoboda, Robert

    1993-01-01

    The Moon produces radiation which background-limits scientific experiments there. Early analyses of these backgrounds have either failed to take into consideration the effect of charm in particle physics (because they pre-dated its discovery), or have used branching ratios which are no longer strictly valid (due to new accelerator data). We are presently investigating an analytical program for deriving muon and neutrino spectra generated by the Moon, converting an existing CERN computer program known as GEANT which does the same for the Earth. In so doing, this will (1) determine an accurate prompt neutrino spectrum produced by the lunar surface; (2) determine the lunar subsurface particle flux; (3) determine the consequence of charm production physics upon the lunar background radiation environment; and (4) provide an analytical tool for the NASA astrophysics community with which to begin an assessment of the Moon as a scientific laboratory versus its particle radiation environment. This will be done on a recurring basis with the latest experimental results of the particle data groups at Earth-based high-energy accelerators, in particular with the latest branching ratios for charmed meson decay. This will be accomplished for the first time as a full 3-dimensional simulation.

  12. Space charge problems in high intensity RFQs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weiss, M.

    1996-06-01

    Measurements were made to check the performance of the CERN high intensity RFQs (RFQ2A and RFQ2B) and assess the validity of the design approach; the study of space charge effects was undertaken in this context. RFQ2A and RFQ2B are 200 mA, 750 keV proton accelerators, operating at 202.56 MHz. Since the beginning of 1993, RFQ2B has served as injector to the CERN 50 MeV Alvarez linac (Linac 2). In 1992, both RFQs were on the test stand to undergo a series of beam measurements, which were compared with computations. The studies concerning the RFQ2A were more detailed and they are reported in this paper. © 1996 American Institute of Physics.

  13. SLHC, the High-Luminosity Upgrade (public event)

    ScienceCinema

    None

    2017-12-09

    On the morning of June 23rd a public event is organised in CERN's Council Chamber with the aim of providing the particle physics community with up-to-date information about the strategy for the LHC luminosity upgrade and to describe the current status of preparation work. The presentations will provide an overview of the various accelerator sub-projects, the LHC physics prospects and the upgrade plans of ATLAS and CMS. This event is organised in the framework of the SLHC-PP project, which receives funding from the European Commission for the preparatory phase of the LHC High Luminosity Upgrade project. Informing the public is among the objectives of this EU-funded project. A simultaneous transmission of this meeting will be broadcast, available at the following address: http://webcast.cern.ch/

  14. Estimating two indirect logging costs caused by accelerated erosion.

    Treesearch

    Glen O. Klock

    1976-01-01

    In forest areas where high soil erosion potential exists, a comparative yarding cost estimate, including the indirect costs determined by methods proposed here, shows that the total cost of using "advanced" logging methods may be less than that of "traditional" systems.

  15. Elementary Particle Physics and High Energy Phenomena: Final Report for FY2010-13

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cumalat, John P.; de Alwis, Senarath P.; DeGrand, Thomas A.

    2013-06-27

    The work under this grant consists of experimental, theoretical, and phenomenological research on the fundamental properties of high energy subnuclear particles. The work is conducted at the University of Colorado, the European Organization for Nuclear Research (CERN), the Japan Proton Accelerator Research Complex (J-PARC), Fermi National Accelerator Laboratory (FNAL), SLAC National Accelerator Laboratory (SLAC), Los Alamos National Laboratory (LANL), and other facilities, employing neutrino-beam experiments, test beams of various particles, and proton-proton collider experiments. It emphasizes mass generation and symmetry-breaking, neutrino oscillations, bottom particle production and decay, detector development, supergravity, supersymmetry, superstrings, quantum chromodynamics, nonequilibrium statistical mechanics, cosmology, phase transitions, lattice gauge theory, and anomaly-free theories. The goals are to improve our understanding of the basic building blocks of matter and their interactions. Data from the Large Hadron Collider at CERN have revealed new interactions responsible for particle mass, and perhaps will lead to a more unified picture of the forces among elementary material constituents. To this end our research includes searches for manifestations of theories such as supersymmetry and new gauge bosons, as well as the production and decay of heavy-flavored quarks. Our current work at J-PARC, and future work at new facilities currently under conceptual design, investigate the specifics of how the neutrinos change flavor. The research is integrated with the training of students at all university levels, benefiting both the manpower and intellectual base for future technologies.

  16. Autopilot regulation for the Linac4 H- ion source

    NASA Astrophysics Data System (ADS)

    Voulgarakis, G.; Lettry, J.; Mattei, S.; Lefort, B.; Costa, V. J. Correia

    2017-08-01

    Linac4 is a 160 MeV H- linear accelerator, part of the upgrade of the LHC injector chain. Its cesiated surface H- source is designed to provide a beam intensity of 40-50 mA. It is operated with periodical Cs-injection at typically 30-day intervals [1], and this implies that the beam parameters will slowly evolve during operation. Autopilot is a control software package extending the CERN-developed Inspector framework. The aim of Autopilot is to automate the mandatory optimization and cesiation processes and to derive performance indicators, thus keeping human intervention minimal. Autopilot has been developed by capitalizing on the experience from manually operating the source. It comprises various algorithms running in real-time, which have been devised to: • Optimize the ion source performance by regulation of H2 injection, RF power and frequency. • Describe the performance of the source with performance indicators, which can be easily understood by operators. • Identify failures, try to recover the nominal operation and send warnings in case of deviation from nominal operation. • Make the performance indicators remotely available through Web pages. Autopilot is at the same level of hierarchy as an operator in the CERN infrastructure. This allows the combination of all ion source devices, providing the required flexibility. Autopilot is executed in a dedicated server, ensuring unique and centralized control, yet allowing multiple operators to interact at runtime, always coordinating between them. Autopilot aims at flexibility, adaptability, portability and scalability, and can be extended to other components of CERN's accelerators. In this paper, a detailed description of the Autopilot algorithms is presented, along with first results of operating the Linac4 H- Ion Source with Autopilot.
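
    The abstract does not spell out the regulation algorithms, but the flavour of such an autopilot loop can be illustrated with a deliberately simplified proportional correction of a single source parameter towards a target beam current. Every name, gain and limit in the sketch below is hypothetical and not taken from the paper.

        # Deliberately simplified regulation step: nudge a hypothetical H2 flow setpoint
        # towards a target beam current and clamp it to a safe operating window.
        def regulate_h2(measured_ma, setpoint_sccm, target_ma=45.0, gain=0.01,
                        lo_sccm=0.8, hi_sccm=1.6):
            error = target_ma - measured_ma                    # mA below (or above) target
            proposed = setpoint_sccm + gain * error            # proportional correction
            return min(max(proposed, lo_sccm), hi_sccm)        # never leave the safe window

        print(regulate_h2(measured_ma=41.5, setpoint_sccm=1.20))   # low beam -> raise flow slightly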

  17. Power Supplies for High Energy Particle Accelerators

    NASA Astrophysics Data System (ADS)

    Dey, Pranab Kumar

    2016-06-01

    The ongoing research and development projects with the Large Hadron Collider at CERN, Geneva, Switzerland have generated enormous enthusiasm and interest in the ultimate findings on the `God's Particle'. This paper attempts to unfold the power supply requirements and the methodology adopted to meet the stringent demands of such high energy particle accelerators during the initial stages of the search for the ultimate particles. An attempt has also been made to highlight the present status of power supply requirements in some high energy accelerators, so that precautionary measures can be drawn from earlier experience during design and development, which will be of help for the proposed third generation synchrotron to be installed in India at a huge cost.

  18. A measurement of hadron production cross sections for the simulation of accelerator neutrino beams and a search for muon-neutrino to electron-neutrino oscillations in the Δm² ≈ 1 eV² region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmitz, David W.

    2008-01-01

    A measurement of hadron production cross-sections for the simulation of accelerator neutrino beams and a search for muon neutrino to electron neutrino oscillations in the Δm² ~ 1 eV² region. This dissertation presents measurements from two different high energy physics experiments with a very strong connection: the Hadron Production (HARP) experiment located at CERN in Geneva, Switzerland, and the Mini Booster Neutrino Experiment (Mini-BooNE) located at Fermilab in Batavia, Illinois.

  19. Schubert Review 2017 2-page summary of AmBe project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, A.

    2017-04-04

    Accelerator-based neutron sources to replace Americium Beryllium (AmBe) radiological sources used for oil well logging are needed for safety and security purposes. DT neutron generators have successfully been used in the past for some measurements, but are less sensitive to rock porosity than the AmBe spectrum is. Additionally, the well-logging industry has decades of data calibrated to the AmBe neutron spectrum. Ideally, if this industry were required to use an accelerator source, they would like a similar neutron spectrum to the AmBe source, with a yield of at least 1 × 10⁷ n/s.

  20. Preliminary Results From The First Flight of ATIC

    NASA Technical Reports Server (NTRS)

    Seo, E. S.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The Advanced Thin Ionization Calorimeter (ATIC) instrument is designed to measure the composition and energy spectra of Z = 1 to 28 cosmic rays over the energy range approximately 10 GeV - 100 TeV. The instrument was calibrated in September 1999 at CERN using accelerated electron, proton and pion beams. ATIC was launched as a long duration balloon test flight on 12/28/00 local time from McMurdo, Antarctica. After flying successfully for about 16 days the payload was recovered in excellent condition. Absolute calibration of the detector response was made using cosmic-ray muons. The data analysis algorithm which was developed with Monte Carlo simulations and validated with the CERN beam test will be used for the flight data analysis. Preliminary results of the proton and helium spectra will be reported in this paper.

  1. Preliminary Results From the First Flight of ATIC

    NASA Technical Reports Server (NTRS)

    Seo, E. S.; Adams, James H., Jr.; Ahn, H.; Ampe, J.; Bashindzhagyan, G.; Case, G.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The Advanced Thin Ionization Calorimeter (ATIC) instrument is designed to measure the composition and energy spectra of Z = 1 to 28 cosmic rays over the energy range approximately 10 GeV - 100 TeV. The instrument was calibrated in September 1999 at CERN using accelerated electron, proton and pion beams. ATIC was launched as a long duration balloon test flight on 12/28/00 local time from McMurdo, Antarctica. After flying successfully for about 16 days the payload was recovered in excellent condition. Absolute calibration of the detector response was made using cosmic-ray muons. The data analysis algorithm which was developed with Monte Carlo simulations and validated with the CERN beam test will be used for the flight data analysis. Preliminary results of the proton and helium spectra will be reported in this paper.

  2. Analysis of SEL on Commercial SRAM Memories and Mixed-Field Characterization of a Latchup Detection Circuit for LEO Space Applications

    NASA Astrophysics Data System (ADS)

    Secondo, R.; Alía, R. Garcia; Peronnard, P.; Brugger, M.; Masi, A.; Danzeca, S.; Merlenghi, A.; Vaillé, J.-R.; Dusseau, L.

    2017-08-01

    A single event latchup (SEL) experiment based on commercial static random access memory (SRAM) memories has recently been proposed in the framework of the European Organization for Nuclear Research (CERN) Latchup Experiment and Student Satellite nanosatellite low Earth orbit (LEO) space mission. SEL characterization of three commercial SRAM memories has been carried out at the Paul Scherrer Institut (PSI) facility, using monoenergetic focused proton beams and different acquisition setups. The best target candidate was selected and a circuit for SEL detection has been proposed and tested at CERN, in the CERN High Energy AcceleRator Mixed-field facility (CHARM). Measurements were carried out at test locations representative of the LEO environment, thus providing a full characterization of the SRAM cross sections, together with the analysis of the single-event effect and total ionizing dose of the latchup detection circuit in relation to the particle spectra expected during the mission. The setups used for SEL monitoring are described, and details of the proposed circuit components and topology are presented. Experimental results obtained both at PSI and at CHARM facilities are discussed.
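
    A device cross section of the kind characterized here is simply the number of observed events divided by the particle fluence delivered to the device. The numbers in the sketch below are placeholders for illustration, not results from the paper.

        # SEL cross section from a beam test: sigma = N_events / fluence.
        # The counts and fluence below are placeholders only.
        n_sel = 42                    # latchup events observed during the run
        fluence = 3.0e10              # protons per cm^2 delivered to the device
        sigma = n_sel / fluence       # cm^2 per device
        print(f"SEL cross section = {sigma:.2e} cm^2")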

  3. Transaction aware tape-infrastructure monitoring

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Fotios; Kruse, Daniele Francesco

    2014-06-01

    Administering a large scale, multi-protocol, hierarchical tape infrastructure like the CERN Advanced STORage manager (CASTOR)[2], which now stores 100 PB (growing by 25 PB per year), requires an adequate monitoring system for quick spotting of malfunctions, easier debugging and on-demand report generation. The main challenges for such a system are: coping with CASTOR's log format diversity and its information scattered among several log files, the need for long-term information archival, the strict reliability requirements and the group-based GUI visualization. For this purpose, we have designed, developed and deployed a centralized system consisting of four independent layers: the Log Transfer layer for collecting log lines from all tape servers to a single aggregation server, the Data Mining layer for combining log data into transaction context, the Storage layer for archiving the resulting transactions and finally the Web UI layer for accessing the information. With flexibility, extensibility and maintainability in mind, each layer is designed to work as a message broker for the next layer, providing a clean and generic interface while ensuring consistency, redundancy and ultimately fault tolerance. This system unifies information previously dispersed over several monitoring tools into a single user interface, using Splunk, which also allows us to provide information visualization based on access control lists (ACL). Since its deployment, it has been successfully used by CASTOR tape operators for a quick overview of transactions, performance evaluation and malfunction detection, and by managers for report generation.
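
    The Data Mining layer described above combines scattered log lines into per-transaction records. Purely as an illustration of that idea (not CASTOR's actual code), the following Python sketch groups parsed log lines by a hypothetical transaction identifier; all field names are assumptions.

        from collections import defaultdict

        def group_into_transactions(log_lines):
            """Group parsed log lines into per-transaction records.

            Each element of log_lines is assumed to be a dict with at least
            a 'tx_id' key (hypothetical transaction identifier) plus other
            fields such as 'time', 'host' and 'message'.
            """
            transactions = defaultdict(list)
            for line in log_lines:
                transactions[line["tx_id"]].append(line)
            # One consolidated, time-ordered record per transaction.
            return {tx_id: sorted(lines, key=lambda l: l["time"])
                    for tx_id, lines in transactions.items()}

        # Toy usage
        lines = [
            {"tx_id": "t1", "time": 2, "host": "tapesrv01", "message": "mount ok"},
            {"tx_id": "t1", "time": 1, "host": "tapesrv01", "message": "mount requested"},
            {"tx_id": "t2", "time": 1, "host": "tapesrv02", "message": "read started"},
        ]
        print(group_into_transactions(lines))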

  4. The ALICE DAQ infoLogger

    NASA Astrophysics Data System (ADS)

    Chapeland, S.; Carena, F.; Carena, W.; Chibante Barroso, V.; Costa, F.; Dénes, E.; Divià, R.; Fuchs, U.; Grigore, A.; Ionita, C.; Delort, C.; Simonetti, G.; Soós, C.; Telesca, A.; Vande Vyvre, P.; Von Haller, B.; Alice Collaboration

    2014-04-01

    ALICE (A Large Ion Collider Experiment) is a heavy-ion experiment studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). The ALICE DAQ (Data Acquisition System) is based on a large farm of commodity hardware consisting of more than 600 devices (Linux PCs, storage, network switches). The DAQ reads the data transferred from the detectors through 500 dedicated optical links at an aggregated and sustained rate of up to 10 Gigabytes per second and stores it at up to 2.5 Gigabytes per second. The infoLogger is the log system which centrally collects the messages issued by the thousands of processes running on the DAQ machines. It allows errors to be reported on the fly and keeps a trace of runtime execution for later investigation. More than 500,000 messages are stored every day in a MySQL database, in a structured table that keeps track of 16 indexing fields for each message (e.g. time, host, user, ...). The total amount of logs for 2012 exceeds 75 GB of data and 150 million rows. We present in this paper the architecture and implementation of this distributed logging system, consisting of a client programming API, local data collector processes, a central server, and interactive human interfaces. We review the operational experience during the 2012 run, in particular the actions taken to ensure shifters receive manageable and relevant content from the main log stream. Finally, we present the performance of this log system and its future evolution.
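
    Each message is stored with 16 indexing fields (time, host, user, ...). The snippet below is a minimal, hypothetical sketch of such a structured store, using SQLite instead of MySQL and only a handful of illustrative columns; the column names are assumptions, not the actual infoLogger schema.

        import sqlite3
        import time

        # Illustrative subset of indexing fields; the real table has 16.
        conn = sqlite3.connect(":memory:")
        conn.execute(
            "CREATE TABLE messages ("
            " timestamp REAL, hostname TEXT, username TEXT,"
            " severity TEXT, facility TEXT, message TEXT)"
        )

        def log_message(host, user, severity, facility, text):
            conn.execute(
                "INSERT INTO messages VALUES (?, ?, ?, ?, ?, ?)",
                (time.time(), host, user, severity, facility, text),
            )

        log_message("ldc01", "daqop", "ERROR", "readout", "link 42 timeout")
        for row in conn.execute("SELECT * FROM messages WHERE severity = 'ERROR'"):
            print(row)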

  5. Smashing Protons to Smithereens

    ScienceCinema

    Pleier, Marc-André

    2018-01-05

    Pleier discusses the extraordinary research taking place at the Large Hadron Collider (LHC) — the world’s newest, biggest, and highest energy particle accelerator located at CERN. Pleier is one of hundreds of researchers from around the world working on ATLAS, a seven-story particle detector positioned at a point where the LHC’s oppositely circulating beams of protons slam into one another head-on.

  6. Design, construction and tests of a 3 GHz proton linac booster (LIBO) for cancer therapy

    NASA Astrophysics Data System (ADS)

    Berra, Paolo

    2007-12-01

    In the last ten years the use of proton beams in radiation therapy has become a clinical tool for treatment of deep-seated tumours. LIBO is a compact, low-cost RF proton linear accelerator (of SCL type) for hadrontherapy. It was conceived by the TERA Foundation as a 3 GHz Linac Booster, to be mounted downstream of an existing cyclotron in order to boost the energy of the proton beam up to 200 MeV, needed for deep treatment (~25 cm) in the human body. With this solution it is possible to transform a low energy commercial cyclotron, normally used for eye melanoma therapy, isotope production and nuclear physics research, into an accelerator for deep-seated tumours. A prototype module of LIBO has been built and successfully tested with full RF power at CERN and with proton beam at INFN Laboratori Nazionali del Sud (LNS) in Catania, within an international collaboration between the TERA Foundation, CERN, and the university and INFN groups of Milan and Naples. The mid-term aim of the project is the technology transfer of the accumulated know-how to a consortium of companies and to bring this novel medical tool to hospitals. The design, construction and tests of the LIBO prototype are described in detail.

  7. Predicting landslides in clearcut patches

    Treesearch

    Raymond M. Rice; Norman H. Pillsbury

    1982-01-01

    Abstract - Accelerated erosion in the form of landslides can be an undesirable consequence of clearcut logging on steep slopes. Forest managers need a method of predicting the risk of such erosion. Data collected after logging in a granitic area of northwestern California were used to develop a predictive equation. A linear discriminant function was developed that...

  8. Machine for row-mulching logging slash to enhance site- a concept

    Treesearch

    P. Koch; D.W. McKenzie

    1977-01-01

    Proposes that stumps, tops, and branches residual after logging pine plantations be hogged to build mulch beds spaced on about 2.5-m centers, thereby eliminating pile and burn operations. Growth of seedlings planted through mulch beds should be accelerated because of moisture conservation, weed suppression, and minimum disturbance of topsoil.

  9. Machine for row-mulching logging slash to enhance site-a concept

    Treesearch

    Peter Koch; Dan W. McKenzie

    1975-01-01

    Proposes that stumps, tops, and branches residual after logging pine plantations be hogged to build mulch beds spaced on about 2.5-m centers, thereby eliminating pile and burn operations. Growth of seedlings planted through mulch beds should be accelerated because of moisture conservation, weed suppression, and minimum disturbance of topsoil.

  10. Reliability and degradation of oxide VCSELs due to reaction to atmospheric water vapor

    NASA Astrophysics Data System (ADS)

    Dafinca, Alexandru; Weidberg, Anthony R.; McMahon, Steven J.; Grillo, Alexander A.; Farthouat, Philippe; Ziolkowski, Michael; Herrick, Robert W.

    2013-03-01

    850 nm oxide-aperture VCSELs are susceptible to premature failure if operated while exposed to atmospheric water vapor and not protected by hermetic packaging. The ATLAS detector in CERN's Large Hadron Collider (LHC) has had approximately 6000 channels of Parallel Optic VCSELs fielded under well-documented ambient conditions. Exact time-to-failure data have been collected on this large sample, providing for the first time actual failure data at use conditions. In addition, the same VCSELs were tested under a variety of accelerated conditions to allow us to construct a more accurate acceleration model. Failure analysis information will also be presented to show what we believe causes corrosion-related failure for such VCSELs.
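
    The abstract does not give the form of the acceleration model. For humidity-driven failures a commonly used choice is Peck's relation, quoted below purely as a generic illustration (with humidity exponent n and activation energy E_a left as parameters to be fitted), not as the model actually constructed by the authors:

        AF = \left(\frac{\mathrm{RH}_{\mathrm{test}}}{\mathrm{RH}_{\mathrm{use}}}\right)^{n} \exp\!\left[\frac{E_a}{k_B}\left(\frac{1}{T_{\mathrm{use}}}-\frac{1}{T_{\mathrm{test}}}\right)\right]

    Here AF is the factor by which the time-to-failure shortens at the accelerated (test) humidity and temperature relative to use conditions.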

  11. Upgrade of the cryogenic infrastructure of SM18, CERN main test facility for superconducting magnets and RF cavities

    NASA Astrophysics Data System (ADS)

    Perin, A.; Dhalla, F.; Gayet, P.; Serio, L.

    2017-12-01

    SM18 is CERN's main facility for testing superconducting accelerator magnets and superconducting RF cavities. Its cryogenic infrastructure will have to be significantly upgraded in the coming years, starting in 2019, to meet the testing requirements for the LHC High Luminosity project and for the R&D program for superconducting magnets and RF equipment until 2023 and beyond. This article presents the assessment of the cryogenic needs based on the foreseen test program and on past testing experience. The current configuration of the cryogenic infrastructure is presented and several possible upgrade scenarios are discussed. The chosen upgrade configuration is then described and the characteristics of the main newly required cryogenic equipment, in particular a new 35 g/s helium liquefier, are presented. The upgrade implementation strategy and plan to meet the required schedule are then described.

  12. Design approach for the development of a cryomodule for compact crab cavities for Hi-Lumi LHC

    NASA Astrophysics Data System (ADS)

    Pattalwar, Shrikant; Jones, Thomas; Templeton, Niklas; Goudket, Philippe; McIntosh, Peter; Wheelhouse, Alan; Burt, Graeme; Hall, Ben; Wright, Loren; Peterson, Tom

    2014-01-01

    A prototype Superconducting RF (SRF) cryomodule, comprising multiple compact crab cavities, is foreseen to realise a local crab crossing scheme for the "Hi-Lumi LHC", a project launched by CERN to increase the luminosity performance of LHC. A cryomodule with two cavities will be initially installed and tested on the SPS drive accelerator at CERN to evaluate performance with high-intensity proton beams. A series of boundary conditions influences the design of the cryomodule prototype, arising from: the complexity of the cavity design, the requirement for multiple RF couplers, the close proximity to the second LHC beam pipe and the tight space constraints in the SPS and LHC tunnels. As a result, the design of the helium vessel and the cryomodule has become extremely challenging. This paper assesses some of the critical cryogenic and engineering design requirements and describes an optimised cryomodule solution for the evaluation tests on SPS.

  13. Towards a Future Linear Collider and The Linear Collider Studies at CERN

    ScienceCinema

    Heuer, Rolf-Dieter

    2018-06-15

    During the week 18-22 October, more than 400 physicists will meet at CERN and in the CICG (International Conference Centre Geneva) to review the global progress towards a future linear collider. The 2010 International Workshop on Linear Colliders will study the physics, detectors and accelerator complex of a linear collider covering both the CLIC and ILC options. Among the topics presented and discussed will be the progress towards the CLIC Conceptual Design Report in 2011, the ILC Technical Design Report in 2012, physics and detector studies linked to these reports, and an increasing number of common working group activities. The seminar will give an overview of these topics and also CERN’s linear collider studies, focusing on current activities and initial plans for the period 2011-16. N.B.: The Council Chamber is also reserved for this colloquium with a live transmission from the Main Auditorium.

  14. High energy beam impact tests on a LHC tertiary collimator at the CERN high-radiation to materials facility

    NASA Astrophysics Data System (ADS)

    Cauchi, Marija; Aberle, O.; Assmann, R. W.; Bertarelli, A.; Carra, F.; Cornelis, K.; Dallocchio, A.; Deboy, D.; Lari, L.; Redaelli, S.; Rossi, A.; Salvachua, B.; Mollicone, P.; Sammut, N.

    2014-02-01

    The correct functioning of a collimation system is crucial to safely operate highly energetic particle accelerators, such as the Large Hadron Collider (LHC). The requirements to handle high intensity beams can be demanding. In this respect, investigating the consequences of LHC particle beams hitting tertiary collimators (TCTs) in the experimental regions is a fundamental issue for machine protection. An experimental test was designed to investigate the robustness and effects of beam accidents on a fully assembled collimator, based on accident scenarios in the LHC. This experiment, carried out at the CERN High-Radiation to Materials (HiRadMat) facility, involved 440 GeV proton beam impacts of different intensities on the jaws of a horizontal TCT. This paper presents the experimental setup and the preliminary results obtained, together with some first outcomes from visual inspection and a comparison of such results with numerical simulations.

  15. CVD diamond detectors for ionizing radiation

    NASA Astrophysics Data System (ADS)

    Friedl, M.; Adam, W.; Bauer, C.; Berdermann, E.; Bergonzo, P.; Bogani, F.; Borchi, E.; Brambilla, A.; Bruzzi, M.; Colledani, C.; Conway, J.; Dabrowski, W.; Delpierre, P.; Deneuville, A.; Dulinski, W.; van Eijk, B.; Fallou, A.; Fizzotti, F.; Foulon, F.; Gan, K. K.; Gheeraert, E.; Grigoriev, E.; Hallewell, G.; Hall-Wilton, R.; Han, S.; Hartjes, F.; Hrubec, J.; Husson, D.; Kagan, H.; Kania, D.; Kaplon, J.; Karl, C.; Kass, R.; Knöpfle, K. T.; Krammer, M.; Logiudice, A.; Lu, R.; Manfredi, P. F.; Manfredotti, C.; Marshall, R. D.; Meier, D.; Mishina, M.; Oh, A.; Pan, L. S.; Palmieri, V. G.; Pernegger, H.; Pernicka, M.; Peitz, A.; Pirollo, S.; Polesello, P.; Pretzl, K.; Re, V.; Riester, J. L.; Roe, S.; Roff, D.; Rudge, A.; Schnetzer, S.; Sciortino, S.; Speziali, V.; Stelzer, H.; Stone, R.; Tapper, R. J.; Tesarek, R.; Thomson, G. B.; Trawick, M.; Trischuk, W.; Vittone, E.; Walsh, A. M.; Wedenig, R.; Weilhammer, P.; Ziock, H.; Zoeller, M.; RD42 Collaboration

    1999-10-01

    In future HEP accelerators, such as the LHC (CERN), detectors and electronics in the vertex region of the experiments will suffer from extreme radiation. Thus radiation hardness is required for both detectors and electronics to survive in this harsh environment. CVD diamond, which is investigated by the RD42 Collaboration at CERN, can meet these requirements. Samples of up to 2×4 cm^2 have been grown and refined for better charge collection properties, which are measured with a β source or in a testbeam. A large number of diamond samples has been irradiated with hadrons to fluences of up to 5×10^15 cm^-2 to study the effects of radiation. Both strip and pixel detectors were prepared in various geometries. Samples with strip metallization have been tested with both slow and fast readout electronics, and the first diamond pixel detector proved fully functional with LHC electronics.

  16. Measurements of 55Fe activity in activated steel samples with GEMPix

    NASA Astrophysics Data System (ADS)

    Curioni, A.; Dinar, N.; La Torre, F. P.; Leidner, J.; Murtas, F.; Puddu, S.; Silari, M.

    2017-03-01

    In this paper we present a novel method, based on the recently developed GEMPix detector, to measure the 55Fe content in samples of metallic material activated during operation of CERN accelerators and experimental facilities. The GEMPix, a gas detector with highly pixelated read-out, has been obtained by coupling a triple Gas Electron Multiplier (GEM) to a quad Timepix ASIC. Sample preparation, measurements performed on 45 samples and data analysis are described. The calibration factor (counts per second per unit specific activity) has been obtained via measurements of the 55Fe activity determined by radiochemical analysis of the same samples. The detection limit and the sensitivity to the current Swiss exemption limit are calculated. Comparison with radiochemical analysis shows an inconsistency in sensitivity for only two samples, most likely due to underestimated uncertainties of the GEMPix analysis. An operative test phase of this technique is already planned at CERN.

  17. Towards a Future Linear Collider and The Linear Collider Studies at CERN

    ScienceCinema

    Stapnes, Steinar

    2017-12-18

    During the week 18-22 October, more than 400 physicists will meet at CERN and in the CICG (International Conference Centre Geneva) to review the global progress towards a future linear collider. The 2010 International Workshop on Linear Colliders will study the physics, detectors and accelerator complex of a linear collider covering both the CLIC and ILC options. Among the topics presented and discussed will be the progress towards the CLIC Conceptual Design Report in 2011, the ILC Technical Design Report in 2012, physics and detector studies linked to these reports, and an increasing number of common working group activities. The seminar will give an overview of these topics and also CERN’s linear collider studies, focusing on current activities and initial plans for the period 2011-16. N.B.: The Council Chamber is also reserved for this colloquium with a live transmission from the Main Auditorium.

  18. Mapping of the thermal neutron distribution in the lead block assembly of the PS-211 experiment at CERN, using thermoluminescence and nuclear track detectors.

    PubMed

    Savvidis, E; Eleftheriadis, C A; Kitis, G

    2002-01-01

    The main purpose of the TARC (Transmutation by Adiabatic Resonance Crossing) experiment (PS-211) was to demonstrate the possibility of efficiently destroying Long-Lived Fission Fragments (LLFF) in Accelerator Driven Systems (ADS). The experimental set-up, which consisted of a lead block with dimensions 3.3 x 3.3 x 3 m^3, was installed in a CERN Proton Synchrotron (PS) beam line. The proton beam, at 2.5 GeV/c and 3.5 GeV/c, was incident on the centre of the lead block assembly, producing neutrons via spallation reactions. In this study, neutron flux measurements in the lead block assembly using thermoluminescence and nuclear track detectors are presented. The results are in good agreement with Monte Carlo calculations as well as with the results of the other methods used in the framework of the TARC experiment.

  19. openSE: a Systems Engineering Framework Particularly Suited to Particle Accelerator Studies and Development Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonnal, P.; Féral, B.; Kershaw, K.

    Particle accelerator projects share many characteristics with industrial projects. However, experience has shown that best practice of industrial project management is not always well suited to particle accelerator projects. Major differences include the number and complexity of technologies involved, the importance of collaborative work, development phases that can last more than a decade, and the importance of telerobotics and remote handling to address future preventive and corrective maintenance requirements due to induced radioactivity, to cite just a few. The openSE framework is a systems engineering and project management framework specifically designed for scientific facilities’ systems and equipment studies and development projects. Best practices in project management, in systems and requirements engineering, in telerobotics and remote handling and in radiation safety management were used as sources of inspiration, together with analysis of current practices surveyed at CERN, GSI and ESS.

  20. Nuclear-Structure Physics with MINIBALL at HIE-ISOLDE

    NASA Astrophysics Data System (ADS)

    Reiter, P.; MINIBALL Collaboration

    2018-02-01

    The MINIBALL spectrometer successfully utilizes a variety of post-accelerated radioactive ion beams provided by the new HIE-ISOLDE accelerator at CERN. In-beam γ-ray spectroscopy after Coulomb excitation (CE) or transfer reactions is performed with optimized setups of ancillary detectors for particle detection. The physics program covers a wide range of shell model investigations. Exotic heavy ion beams will enable unique studies of collective properties up to the actinide region. First data taking with HIE-ISOLDE beams started recently. The higher energies and intensities of the new post-accelerator provide a promising perspective for a new generation of MINIBALL experiments. Intriguing first results were obtained by employing beams of 74,76,78Zn, 110,132Sn and 144Xe with beam energies in the range of 4.0 - 5.5 MeV/u for CE experiments at ‘safe’ energies. In all cases, first results for various B(Eλ) values were obtained for these isotopes.

  1. New vertical cryostat for the high field superconducting magnet test station at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vande Craen, A.; Atieh, S.; Bajko, M.

    2014-01-29

    In the framework of the R&D program for new superconducting magnets for the Large Hadron Collider accelerator upgrades, CERN is building a new vertical test station to test high field superconducting magnets of unprecedented large size. This facility will allow testing of magnets by vertical insertion in a pressurized liquid helium bath, cooled to a controlled temperature between 4.2 K and 1.9 K. The dimensions of the cryostat will allow testing magnets of up to 2.5 m in length with a maximum diameter of 1.5 m and a mass of 15 tons. To allow for a faster insertion and removal of the magnets and to reduce the risk of helium leaks, all cryogenics supply lines are foreseen to remain permanently connected to the cryostat. A specifically designed 100 W heat exchanger is integrated in the cryostat helium vessel for a controlled cooling of the magnet from 4.2 K down to 1.9 K in a 3 m^3 helium bath. This paper describes the cryostat and its main functions, focusing on features specifically developed for this project. The status of the construction and the plans for assembly and installation at CERN are also presented.

  2. Feasibility study for a biomedical experimental facility based on LEIR at CERN.

    PubMed

    Abler, Daniel; Garonna, Adriano; Carli, Christian; Dosanjh, Manjit; Peach, Ken

    2013-07-01

    In light of the recent European developments in ion beam therapy, there is a strong interest from the biomedical research community to have more access to clinically relevant beams. Beamtime for pre-clinical studies is currently very limited and a new dedicated facility would allow extensive research into the radiobiological mechanisms of ion beam radiation and the development of more refined techniques of dosimetry and imaging. This basic research would support the current clinical efforts of the new treatment centres in Europe (for example HIT, CNAO and MedAustron). This paper presents first investigations on the feasibility of an experimental biomedical facility based on the CERN Low Energy Ion Ring LEIR accelerator. Such a new facility could provide beams of light ions (from protons to neon ions) in a collaborative and cost-effective way, since it would rely partly on CERN's competences and infrastructure. The main technical challenges linked to the implementation of a slow extraction scheme for LEIR and to the design of the experimental beamlines are described and first solutions presented. These include introducing new extraction septa into one of the straight sections of the synchrotron, changing the power supply configuration of the magnets, and designing a new horizontal beamline suitable for clinical beam energies, and a low-energy vertical beamline for particular radiobiological experiments.

  3. Feasibility study for a biomedical experimental facility based on LEIR at CERN

    PubMed Central

    Abler, Daniel; Garonna, Adriano; Carli, Christian; Dosanjh, Manjit; Peach, Ken

    2013-01-01

    In light of the recent European developments in ion beam therapy, there is a strong interest from the biomedical research community to have more access to clinically relevant beams. Beamtime for pre-clinical studies is currently very limited and a new dedicated facility would allow extensive research into the radiobiological mechanisms of ion beam radiation and the development of more refined techniques of dosimetry and imaging. This basic research would support the current clinical efforts of the new treatment centres in Europe (for example HIT, CNAO and MedAustron). This paper presents first investigations on the feasibility of an experimental biomedical facility based on the CERN Low Energy Ion Ring LEIR accelerator. Such a new facility could provide beams of light ions (from protons to neon ions) in a collaborative and cost-effective way, since it would rely partly on CERN's competences and infrastructure. The main technical challenges linked to the implementation of a slow extraction scheme for LEIR and to the design of the experimental beamlines are described and first solutions presented. These include introducing new extraction septa into one of the straight sections of the synchrotron, changing the power supply configuration of the magnets, and designing a new horizontal beamline suitable for clinical beam energies, and a low-energy vertical beamline for particular radiobiological experiments. PMID:23824122

  4. Measurements and FLUKA Simulations of Bismuth, Aluminium and Indium Activation at the upgraded CERN Shielding Benchmark Facility (CSBF)

    NASA Astrophysics Data System (ADS)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.; Yashima, H.

    2018-06-01

    The CERN High energy AcceleRator Mixed field (CHARM) facility is situated in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, with 5·10^11 protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7·10^10 protons per second. The extracted proton beam impacts on a cylindrical copper target. The shielding of the CHARM facility includes the CERN Shielding Benchmark Facility (CSBF) situated laterally above the target, which allows deep shielding penetration benchmark studies of various shielding materials. This facility has been significantly upgraded during the extended technical stop at the beginning of 2016. It now consists of 40 cm of cast iron shielding, a 200 cm long removable sample holder concrete block with 3 inserts for activation samples, and a material test location that is used for the measurement of the attenuation length for different shielding materials as well as for sample activation at different thicknesses of the shielding materials. Activation samples of bismuth, aluminium and indium were placed in the CSBF in September 2016 to characterize the upgraded version of the CSBF. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields of bismuth isotopes (206Bi, 205Bi, 204Bi, 203Bi, 202Bi, 201Bi) from 209Bi, of 24Na from 27Al and of 115mIn from 115In for these samples. The production yields estimated by FLUKA Monte Carlo simulations are compared to the production yields obtained from γ-spectroscopy measurements of the samples, taking the beam intensity profile into account. The agreement between FLUKA predictions and γ-spectroscopy measurements for the production yields is at the level of a factor of 2.
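
    Relating a measured activity to a specific production yield requires correcting for buildup during irradiation and for decay before counting. A minimal sketch of the standard relation, assuming a constant proton rate (whereas the analysis above used the full beam intensity profile), is

        A(t_c) = Y \, \dot{N}_p \left(1 - e^{-\lambda t_{\mathrm{irr}}}\right) e^{-\lambda t_c}

    where Y is the production yield per incident proton, \dot{N}_p the proton rate, \lambda the decay constant of the isotope, t_irr the irradiation time and t_c the cooling time before the γ-spectroscopy measurement.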

  5. The muon component in extensive air showers and new p+C data in fixed target experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meurer, C.; Bluemer, J.; Engel, R.

    2007-03-19

    One of the most promising approaches to determine the energy spectrum and composition of the cosmic rays with energies above 10^15 eV is the measurement of the number of electrons and muons produced in extensive air showers (EAS). Therefore, simulations of air showers using electromagnetic and hadronic interaction models are necessary. These simulations show uncertainties which come mainly from the hadronic interaction models. One aim of this work is to specify the low energy hadronic interactions which are important for the muon production in EAS. Therefore we simulate extensive air showers with a modified version of the simulation package CORSIKA. In particular we investigate in detail the energy and the phase space regions of secondary particle production, which are most important for muon production. This phase space region is covered by fixed target experiments at CERN. In the second part of this work we present preliminary momentum spectra of secondary π+ and π- in p+C collisions at 12 GeV/c measured with the HARP spectrometer at the PS accelerator at CERN. In addition we use the new p+C NA49 data at 158 GeV/c to check the reliability of hadronic interaction models for muon production in EAS. Finally, possibilities to measure relevant quantities of hadron production in existing and planned accelerator experiments are discussed.

  6. Revised LHC deal quiets congress

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawler, A.

    The roughest part of the ride may be over for U.S. physicists who want to participate in the Large Hadron Collider (LHC), the $5 billion accelerator planned for CERN in Geneva. They have found themselves on a political roller coaster for the past few months. This week, U.S. and European negotiators were putting the final touches on a revamped agreement that should pave the way for the United States to help pay for construction of the accelerator and its two main detectors, and guarantee U.S. scientists a role in research on the machine. The trouble began in March, when Representative Joe Barton (R-TX) declared war on a proposed $530 million U.S. contribution to the new facility, slated for completion in 2005. Barton and many other members of Congress were still smarting from what they said was a lack of European support for the canceled Superconducting Super Collider that was being built in Barton's backyard. Representative James Sensenbrenner (R-WI), who chairs the House Science Committee, led the charge to alter a draft agreement initialed this winter by Department of Energy (DOE) and CERN officials that spelled out the details of U.S. participation. After hurried negotiations, both sides have sharpened the agreement to address the lawmakers' concerns. The new deal, says Energy Secretary Federico Pena, "has made that project even better."

  7. Fabrication Technologies of the High Gradient Accelerator Structures at 100 MV/m Range

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Juwen; /SLAC; Lewandowski, James

    A CERN-SLAC-KEK collaboration on high gradient X-band structure research has been established in order to demonstrate the feasibility of the CLIC baseline design for the main linac stably operating at more than 100 MV/m loaded accelerating gradient. Several prototype CLIC structures were successfully fabricated and high power tested. They operated at 105 MV/m with a breakdown rate that meets the CLIC linear collider specifications of < 5 x 10^-7/pulse/m. This paper summarizes the fabrication technologies including the mechanical design, precision machining, chemical cleaning, diffusion bonding as well as vacuum baking and all related assembly technologies. Also, the tolerance control, tuning and RF characterization will be discussed.

  8. Femtosecond Intrastromal Lenticular Implantation Combined With Accelerated Collagen Cross-Linking for the Treatment of Keratoconus--Initial Clinical Result in 6 Eyes.

    PubMed

    Ganesh, Sri; Brar, Sheetal

    2015-10-01

    To evaluate the initial outcomes of femtosecond intrastromal lenticular implantation (FILI) combined with accelerated collagen cross-linking in patients with progressive keratoconus. In this interventional, prospective, exploratory case series, patients with progressive keratoconus and contact lens intolerance were included. All eyes underwent femtosecond laser-enabled placement of stromal donor tissue and simultaneous accelerated collagen cross-linking. Follow-up of patients was conducted for a mean period of 190 ± 13 days (range, 177-193 days). Six eyes from 6 patients were included in the study. Based on values before and 6 months after the procedure, clinical improvement was noted in uncorrected distance visual acuity (1.06 ± 0.48 logMAR vs. 0.38 ± 0.27 logMAR), corrected distance visual acuity (0.51 ± 0.20 logMAR vs. 0.20 ± 0.24 logMAR), and manifest spherical equivalent (-3.47 ± 1.15 D vs. -1.77 ± 1.7 D). There was flattening of mean keratometry in 3-mm and 5-mm zones by 3.42 ± 2.09 D and 1.70 ± 1.31 D, respectively. Mean pachymetry in the central and midperipheral zones increased by 18.3 ± 7.3 μm and 33.0 ± 8.8 μm, respectively. All eyes had reduction in higher-order aberrations, specifically coma. No eye lost lines of corrected distance visual acuity. No adverse events such as haze, infection, or allogeneic graft rejection were observed. Initial experience with this small number of eyes suggests that the combination of tissue addition and accelerated collagen cross-linking may be a feasible option for low to moderate keratoconus. A larger cohort and longer follow-up are required to validate our results and establish long-term safety and efficacy of the procedure.

  9. ENLIGHT: European network for Light ion hadron therapy.

    PubMed

    Dosanjh, Manjit; Amaldi, Ugo; Mayer, Ramona; Poetter, Richard

    2018-04-03

    The European Network for Light Ion Hadron Therapy (ENLIGHT) was established in 2002 following various European particle therapy network initiatives during the 1980s and 1990s (e.g. EORTC task group, EULIMA/PIMMS accelerator design). ENLIGHT started its work on major topics related to hadron therapy (HT), such as patient selection, clinical trials, technology, radiobiology, imaging and health economics. It was initiated through CERN and ESTRO and dealt with various disciplines such as (medical) physics and engineering, radiation biology and radiation oncology. ENLIGHT was funded until 2005 through the EC FP5 programme. A regular annual meeting structure was started in 2002 and continues until today, bringing together the various disciplines, projects and institutions in the field of HT at different European locations for a regular exchange of information on best practices and on research and development. Starting in 2006, ENLIGHT coordination was continued through CERN in collaboration with ESTRO and other partners involved in HT. Major projects within the EC FP7 programme (2008-2014) were launched for R&D and transnational access (ULICE, ENVISION) and for education and training networks (Marie Curie ITNs: PARTNER, ENTERVISION). These projects were instrumental in strengthening the field of hadron therapy. With the start of four European carbon ion and proton centres and the numerous upcoming European proton therapy centres, the future scope of ENLIGHT will focus on strengthening current and developing European particle therapy research, multidisciplinary education and training, and general R&D in technology and biology, with annual meetings and continuously strong CERN support. Collaboration with the European Particle Therapy Network (EPTN) and other similar networks will be pursued. Copyright © 2018 CERN. Published by Elsevier B.V. All rights reserved.

  10. A high-speed scintillation-based electronic portal imaging device to quantitatively characterize IMRT delivery.

    PubMed

    Ranade, Manisha K; Lynch, Bart D; Li, Jonathan G; Dempsey, James F

    2006-01-01

    We have developed an electronic portal imaging device (EPID) employing a fast scintillator and a high-speed camera. The device is designed to accurately and independently characterize the fluence delivered by a linear accelerator during intensity modulated radiation therapy (IMRT) with either step-and-shoot or dynamic multileaf collimator (MLC) delivery. Our aim is to accurately obtain the beam shape and fluence of all segments delivered during IMRT, in order to study the nature of discrepancies between the plan and the delivered doses. A commercial high-speed camera was combined with a terbium-doped gadolinium-oxy-sulfide (Gd2O2S:Tb) scintillator to form an EPID for the unaliased capture of two-dimensional fluence distributions of each beam in an IMRT delivery. The high-speed EPID was synchronized to the accelerator pulse-forming network and gated to capture every possible pulse emitted from the accelerator, with an approximate frame rate of 360 frames per second (fps). A 62-segment beam from a head-and-neck IMRT treatment plan requiring 68 s to deliver was recorded with our high-speed EPID, producing approximately 6 Gbytes of imaging data. The EPID data were compared with the MLC instruction files and the MLC controller log files. The frames were binned to provide a frame rate of 72 fps with a signal-to-noise ratio that was sufficient to resolve leaf positions and segment fluence. The fractional fluence from the log files and EPID data agreed well. An ambiguity in the motion of the MLC during beam-on was resolved. The log files reported leaf motions at the end of 33 of the 42 segments, while the EPID observed leaf motions in only 7 of the 42 segments. The static IMRT segment shapes observed by the high-speed EPID were in good agreement with the shapes reported in the log files. The leaf motions observed during beam-on for step-and-shoot delivery were not temporally resolved by the log files.
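
    As a minimal sketch of the temporal binning step described above (assumed array shapes, not the authors' analysis code), consecutive raw frames can be summed in groups of five to go from roughly 360 fps to 72 fps:

        import numpy as np

        def bin_frames(frames, factor=5):
            """Sum consecutive frames in groups of `factor`.

            `frames` is assumed to have shape (n_frames, height, width);
            binning 360 fps data by 5 yields an effective 72 fps sequence.
            """
            n = (frames.shape[0] // factor) * factor  # drop incomplete tail group
            grouped = frames[:n].reshape(-1, factor, *frames.shape[1:])
            return grouped.sum(axis=1)

        # Toy example: 20 random 4x4 "frames" binned 5:1 -> 4 frames
        raw = np.random.poisson(10.0, size=(20, 4, 4)).astype(float)
        print(raw.shape, "->", bin_frames(raw).shape)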

  11. Full steam ahead

    NASA Astrophysics Data System (ADS)

    Heuer, Rolf-Dieter

    2008-03-01

    When the Economist recently reported the news of Rolf-Dieter Heuer's appointment as the next director-general of CERN, it depicted him sitting cross-legged in the middle of a circular track steering a model train around him - smiling. It was an apt cartoon for someone who is about to take charge of the world's most powerful particle accelerator: the 27 km-circumference Large Hadron Collider (LHC), which is nearing completion at the European laboratory just outside Geneva. What the cartoonist did not know is that model railways are one of Heuer's passions.

  12. Recent Results from ISOLDE and HIE-ISOLDE

    NASA Astrophysics Data System (ADS)

    Borge, María J. G.

    2018-02-01

    ISOLDE is the CERN facility dedicated to the production of rare ion beams for many different experiments in the fields of nuclear and atomic physics, materials science and life sciences. HIE-ISOLDE, the Higher Intensity and Energy upgrade, has finished its stage 1, dedicated to raising the beam energy up to 5.5 MeV/u, and produced the first radioactive beams at this energy on 9 September 2016. Recent results from the low energy and post-accelerated beams are given in this contribution.

  13. Accelerator Tests of the KLEM Prototypes

    NASA Technical Reports Server (NTRS)

    Bashindzhagyan, G.; Adams, J. H.; Bashindzhagyan, P.; Baranova, N.; Christl, M.; Chilingarian, A.; Chupin, I.; Derrickson, J.; Drury, L.; Egorov, N.

    2003-01-01

    The Kinematic Lightweight Energy Meter (KLEM) device is planned for direct measurement of the elemental energy spectra of high-energy (10^11-10^16 eV) cosmic rays. The first KLEM prototype was tested at CERN with a 180 GeV pion beam in 2001. A modified KLEM prototype will be tested in proton and heavy ion beams to give more experimental data on the energy resolution and charge resolution achievable with the KLEM method. The first test results are presented and compared with simulations.

  14. Using Limes and Synthetic Psoralens to Enhance Solar Disinfection of Water (SODIS): A Laboratory Evaluation with Norovirus, Escherichia coli, and MS2

    PubMed Central

    Harding, Alexander S.; Schwab, Kellogg J.

    2012-01-01

    We investigated the use of psoralens and limes to enhance solar disinfection of water (SODIS) in UV lamp and natural sunlight experiments. SODIS conditions were replicated using sunlight, 2 L polyethylene terephthalate (PET) bottles, and tap water with Escherichia coli, MS2 bacteriophage, and murine norovirus (MNV). Psoralens and lime acidity both interact synergistically with UV radiation to accelerate inactivation of microbes. Escherichia coli was reduced by > 6.1 logs by SODIS + Lime Slurry and 5.6 logs by SODIS + Lime Juice in 30-minute solar exposures, compared with a 1.5 log reduction with SODIS alone (N = 3; P < 0.001). MS2 was inactivated > 3.9 logs by SODIS + Lime Slurry, 1.9 logs by SODIS + Lime Juice, and 1.4 logs by SODIS in 2.5-hour solar exposures (N = 3; P < 0.05). MNV was resistant to SODIS, with < 2 log reductions after 6 hours. Efficacy of SODIS against human norovirus should be investigated further. PMID:22492137
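
    For reference, a reduction of k logs corresponds to a surviving fraction of 10^-k of the initial population N_0, so the figures above can be read as

        \text{log reduction} = \log_{10}\!\left(\frac{N_0}{N}\right), \qquad 6.1\ \text{logs} \;\Rightarrow\; \frac{N}{N_0} < 10^{-6.1} \approx 8\times10^{-7}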

  15. The SHIP facility at CERN

    NASA Astrophysics Data System (ADS)

    De Lellis, Giovanni

    2016-04-01

    Searches for new physics with accelerators are being performed at the LHC, looking for high-mass particles coupled to matter with ordinary strength. A new experimental facility meant to search for very weakly coupled particles in the few GeV mass domain has recently been proposed. The existence of such particles, foreseen in different theoretical models beyond the Standard Model, is largely unexplored from the experimental point of view. A beam dump facility built at CERN in the North Area, using 400 GeV protons, is a copious factory of charmed hadrons and could be used to probe the existence of such particles. The beam dump is also an ideal source of tau neutrinos, the least known particle in the Standard Model. In particular, tau anti-neutrinos have not been directly observed so far. We report the physics potential of such an experiment and outline the performance of a detector operating at the same facility for the search for the τ → μμμ decay.

  16. Performance of the first short model 150 mm aperture Nb3Sn Quadrupole MQXFS for the High-Luminosity LHC upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chlachidze, G.; et al.

    2016-08-30

    The US LHC Accelerator Research Program (LARP) and CERN combined their efforts in developing Nb3Sn magnets for the High-Luminosity LHC upgrade. The ultimate goal of this collaboration is to fabricate large aperture Nb3Sn quadrupoles for the LHC interaction regions (IR). These magnets will replace the present 70 mm aperture NbTi quadrupole triplets for an expected increase of the LHC peak luminosity by a factor of 5. Over the past decade LARP successfully fabricated and tested short and long models of 90 mm and 120 mm aperture Nb3Sn quadrupoles. Recently the first short model of the 150 mm diameter quadrupole MQXFS was built with coils fabricated by both LARP and CERN. The magnet performance was tested at Fermilab’s vertical magnet test facility. This paper reports the test results, including the quench training at 1.9 K, ramp rate and temperature dependence studies.

  17. The effects of log erosion barriers on post-fire hydrologic response and sediment yield in small forested watersheds, southern California

    Treesearch

    Peter M. Wohlgemuth; Ken R. Hubbert; Peter R. Robichaud

    2001-01-01

    Wildfire usually promotes flooding and accelerated erosion in upland watersheds. In the summer of 1999, a high-severity wildfire burned a series of mixed pine/oak headwater catchments in the San Jacinto Mountains of southern California. Log erosion barriers (LEBs) were constructed across much of the burned area as an erosion control measure. We built debris basins in...

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yokosawa, A.

    Spin physics activities at medium and high energies became significantly active when polarized targets and polarized beams became accessible for hadron-hadron scattering experiments. My overview of spin physics will be inclined to the study of strong interaction using facilities at Argonne ZGS, Brookhaven AGS (including RHIC), CERN, Fermilab, LAMPF, and SATURNE. In 1960 accelerator physicists had already been convinced that the ZGS could be unique in accelerating a polarized beam; polarized beams were being accelerated through linear accelerators elsewhere at that time. However, there was much concern about going ahead with the construction of a polarized beam because (i) the source intensity was not high enough for acceleration in the accelerator, (ii) the use of the accelerator would be limited to only polarized-beam physics, that is, proton-proton interaction, and (iii) p-p elastic scattering was not the most popular topic in high-energy physics. In fact, within spin physics, π-nucleon physics looked attractive, since the determination of spin and parity of possible πp resonances attracted much attention. To proceed we needed more data besides total cross sections and elastic differential cross sections; measurements of polarization and other parameters were urgently needed. Polarization measurements had traditionally been performed by analyzing the spin of recoil protons. The drawbacks of this technique are: (i) it involves double scattering, resulting in poor accuracy of the data, and (ii) a carbon analyzer can only be used for a limited region of energy.

  19. Production of negatively charged radioactive ion beams

    DOE PAGES

    Liu, Y.; Stracener, D. W.; Stora, T.

    2017-02-15

    Beams of short-lived radioactive nuclei are needed for frontier experimental research in nuclear structure, reactions, and astrophysics. Negatively charged radioactive ion beams have unique advantages and allow for the use of a tandem accelerator for post-acceleration, which can provide the highest beam quality and continuously variable energies. Negative ion beams can be obtained with high intensity, and some unique beam purification techniques based on differences in electronegativity and chemical reactivity can be used to provide beams with high purity. This article describes the production of negative radioactive ion beams at the former Holifield Radioactive Ion Beam Facility at Oak Ridge National Laboratory and at the CERN ISOLDE facility, with emphasis on the development of the negative ion sources employed at these two facilities.

  20. Measurement of Beam Tunes in the Tevatron Using the BBQ System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edstrom, Dean R.; /Indiana U.

    Measuring the betatron tunes in any synchrotron is of critical importance to ensuring the stability of beam in the synchrotron. The Base Band Tune, or BBQ, measurement system was developed by Marek Gasior of CERN and has been installed at Brookhaven and Fermilab as a part of the LHC Accelerator Research Program, or LARP. The BBQ was installed in the Tevatron to evaluate its effectiveness at reading proton and antiproton tunes at its flattop energy of 980 GeV. The primary objectives of this thesis are to examine the methods used to measure the tune using the BBQ tune measurement system, to incorporate the system into the Fermilab accelerator controls system, ACNET, and to compare the BBQ to existing tune measurement systems in the Tevatron.

  1. Improving linear accelerator service response with a real-time electronic event reporting system.

    PubMed

    Hoisak, Jeremy D P; Pawlicki, Todd; Kim, Gwe-Ya; Fletcher, Richard; Moore, Kevin L

    2014-09-08

    To track linear accelerator performance issues, an online event recording system was developed in-house for use by therapists and physicists to log the details of technical problems arising on our institution's four linear accelerators. In use since October 2010, the system was designed so that all clinical physicists would receive email notification when an event was logged. Starting in October 2012, we initiated a pilot project in collaboration with our linear accelerator vendor to explore a new model of service and support, in which event notifications were also sent electronically directly to dedicated engineers at the vendor's technical help desk, who then initiated a response to technical issues. Previously, technical issues were reported by telephone to the vendor's call center, which then disseminated information and coordinated a response with the Technical Support help desk and local service engineers. The purpose of this work was to investigate the improvements to clinical operations resulting from this new service model. The new and old service models were quantitatively compared by reviewing event logs and the oncology information system database in the nine months prior to and after initiation of the project. Here, we focus on events that resulted in an inoperative linear accelerator ("down" machine). Machine downtime, vendor response time, treatment cancellations, and event resolution were evaluated and compared over two equivalent time periods. In 389 clinical days, there were 119 machine-down events: 59 events before and 60 after introduction of the new model. In the new model, median time to service response decreased from 45 to 8 min, service engineer dispatch time decreased 44%, downtime per event decreased from 45 to 20 min, and treatment cancellations decreased 68%. The decreased vendor response time and reduced number of on-site visits by a service engineer resulted in decreased downtime and decreased patient treatment cancellations.
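
    As an illustration of the kind of bookkeeping behind the figures above (not the institution's actual system), the Python sketch below computes median response time and downtime from a list of hypothetical machine-down event records; the field names are assumptions.

        from dataclasses import dataclass
        from statistics import median

        @dataclass
        class DownEvent:
            """Hypothetical machine-down event record (times in minutes from report)."""
            machine: str
            responded: float   # minutes from report to service response
            resolved: float    # minutes from report to machine back up

        def summarize(events):
            return {
                "events": len(events),
                "median_response_min": median(e.responded for e in events),
                "median_downtime_min": median(e.resolved for e in events),
            }

        # Toy data for two events
        events = [DownEvent("LA1", responded=8.0, resolved=20.0),
                  DownEvent("LA2", responded=10.0, resolved=25.0)]
        print(summarize(events))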

  2. Magnetic Measurements of the First Nb3Sn Model Quadrupole (MQXFS) for the High-Luminosity LHC

    DOE PAGES

    DiMarco, J.; Ambrosio, G.; Chlachidze, G.; ...

    2016-12-12

    The US LHC Accelerator Research Program (LARP) and CERN are developing high-gradient Nb 3Sn magnets for the High Luminosity LHC interaction regions. Magnetic measurements of the first 1.5 m long, 150 mm aperture model quadrupole, MQXFS1, were performed during magnet assembly at LBNL, as well as during cryogenic testing at Fermilab’s Vertical Magnet Test Facility. This paper reports on the results of these magnetic characterization measurements, as well as on the performance of new probes developed for the tests.

  3. Results from the HARP Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Catanesi, M. G.

    2008-02-21

    Hadron production is a key ingredient in many aspects of ν physics. Precise prediction of atmospheric ν fluxes, characterization of accelerator ν beams, quantification of π production and capture for ν-factory designs, all of these would profit from hadron production measurements. HARP at the CERN PS was the first hadron production experiment designed on purpose to match all these requirements. It combines a large, full phase space acceptance with low systematic errors and high statistics. HARP was operated in the range from 3 GeV to 15 GeV. We briefly describe here the most recent results.

  4. Physics in 'Real Life': Accelerator-based Research with Undergraduates

    NASA Astrophysics Data System (ADS)

    Klay, J. L.

    All undergraduates in physics and astronomy should have access to significant research experiences. When given the opportunity to tackle challenging open-ended problems outside the classroom, students build their problem-solving skills in ways that better prepare them for the workplace or future research in graduate school. Accelerator-based research on fundamental nuclear and particle physics can provide a myriad of opportunities for undergraduate involvement in hardware and software development as well as 'big data' analysis. The collaborative nature of large experiments exposes students to scientists of every culture and helps them begin to build their professional network even before they graduate. This paper presents an overview of my experiences - the good, the bad, and the ugly - engaging undergraduates in particle and nuclear physics research at the CERN Large Hadron Collider and the Los Alamos Neutron Science Center.

  5. Simulation and measurements of the response of an air ionisation chamber exposed to a mixed high-energy radiation field.

    PubMed

    Vincke, Helmut; Forkel-Wirth, Doris; Perrin, Daniel; Theis, Chris

    2005-01-01

    CERN's radiation protection group operates a network of simple and robust ionisation chambers that are installed inside CERN's accelerator tunnels. These ionisation chambers are used for the remote reading of ambient dose equivalent rates inside the machines during beam-off periods. This Radiation Protection Monitor for dose rates due to Induced Radioactivity ('PMI', trade name: PTW, Type 34031) is a non-confined air ionisation plastic chamber which is operated under atmospheric pressure. Besides its current field of operation it is planned to extend the use of this detector in the Large Hadron Collider to measure radiation under beam operation conditions to obtain an indication of the machine performance. Until now, studies of the PMI detector have been limited to the response to photons. In order to evaluate its response to other radiation components, this chamber type was tested at CERF, the high-energy reference field facility at CERN. Six PMI detectors were installed around a copper target being irradiated by a mixed hadron beam with a momentum of 120 GeV/c. Each of the chosen detector positions was defined by a different radiation field, varying in type and energy of the incident particles. For all positions, detailed measurements and FLUKA simulations of the detector response were performed. This paper presents the promising comparison between the measurements and simulations and analyses the influence of the different particle types on the resulting detector response.

  6. Nuclear data activities at the n_TOF facility at CERN

    NASA Astrophysics Data System (ADS)

    Gunsing, F.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bécares, V.; Bacak, M.; Balibrea-Correa, J.; Barbagallo, M.; Barros, S.; Bečvář, F.; Beinrucker, C.; Belloni, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brugger, M.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Castelluccio, D. M.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Colonna, N.; Cortés-Giraldo, M. A.; Cortés, G.; Cosentino, L.; Damone, L. A.; Deo, K.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Frost, R. J. W.; Furman, V.; Ganesan, S.; García, A. R.; Gawlik, A.; Gheorghe, I.; Glodariu, T.; Gonçalves, I. F.; González, E.; Goverdovski, A.; Griesmayer, E.; Guerrero, C.; Göbel, K.; Harada, H.; Heftrich, T.; Heinitz, S.; Hernández-Prieto, A.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Katabuchi, T.; Kavrigin, P.; Ketlerov, V.; Khryachkov, V.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lerendegui, J.; Licata, M.; Lo Meo, S.; Lonsdale, S. J.; Losito, R.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Mingrone, F.; Mirea, M.; Montesano, S.; Musumarra, A.; Nolte, R.; Oprea, A.; Palomo-Pinto, F. R.; Paradela, C.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Quesada, J. M.; Rajeev, K.; Rauscher, T.; Reifarth, R.; Riego-Perez, A.; Robles, M.; Rout, P.; Radeck, D.; Rubbia, C.; Ryan, J. A.; Sabaté-Gilarte, M.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Sedyshev, P.; Smith, A. G.; Stamatopoulos, A.; Suryanarayana, S. V.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tarrío, D.; Tassan-Got, L.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Vlachoudis, V.; Vlastou, R.; Wallner, A.; Warren, S.; Weigand, M.; Weiss, C.; Wolf, C.; Woods, P. J.; Wright, T.; Žugec, P.

    2016-10-01

    Nuclear data in general, and neutron-induced reaction cross sections in particular, are important for a wide variety of research fields. They play a key role in the safety and criticality assessment of nuclear technology, not only for existing power reactors but also for radiation dosimetry, medical applications, the transmutation of nuclear waste, accelerator-driven systems, fuel cycle investigations and future reactor systems such as Generation IV. Applications of nuclear data are also related to research fields such as the study of nuclear level densities and stellar nucleosynthesis. Simulations and calculations of nuclear technology applications largely rely on evaluated nuclear data libraries. The evaluations in these libraries are based both on experimental data and theoretical models. Experimental nuclear reaction data are compiled on a worldwide basis by the international network of Nuclear Reaction Data Centres (NRDC) in the EXFOR database. The EXFOR database forms an important link between nuclear data measurements and the evaluated data libraries. CERN's neutron time-of-flight facility n_TOF has produced a considerable amount of experimental data since it became fully operational with the start of the scientific measurement programme in 2001. While for a long period a single measurement station (EAR1) located at 185 m from the neutron production target was available, the construction of a second beam line at 20 m (EAR2) in 2014 has substantially increased the measurement capabilities of the facility. An outline of the experimental nuclear data activities at CERN's neutron time-of-flight facility n_TOF will be presented.

  7. High duty factor plasma generator for CERN's Superconducting Proton Linac.

    PubMed

    Lettry, J; Kronberger, M; Scrivens, R; Chaudet, E; Faircloth, D; Favre, G; Geisser, J-M; Küchler, D; Mathot, S; Midttun, O; Paoluzzi, M; Schmitzer, C; Steyaert, D

    2010-02-01

    CERN's Linac4 is a 160 MeV linear accelerator currently under construction. It will inject negatively charged hydrogen ions into CERN's PS-Booster. Its ion source is a non-cesiated, rf-driven H(-) volume source directly inspired by the DESY source and is designed to deliver pulses of 80 mA of H(-) for 0.4 ms at a 2 Hz repetition rate. The Superconducting Proton Linac (SPL) project is part of the luminosity upgrade of the Large Hadron Collider. It consists of an extension of Linac4 up to 5 GeV and is foreseen to deliver protons to a future 50 GeV synchrotron (PS2). For the SPL high-power option (HP-SPL), the ion source would deliver pulses of 80 mA of H(-) for 1.2 ms and operate at a 50 Hz repetition rate. This significant upgrade motivates the design of the new water-cooled plasma generator presented in this paper. Its engineering is based on the results of a finite element thermal study of the Linac4 H(-) plasma generator that identified critical components and thermal barriers. A cooling system is proposed which achieves the required heat dissipation and maintains the original functionality. Materials with higher thermal conductivity are selected and, wherever possible, thermal barriers resulting from low-pressure contacts are removed by brazing metals on insulators. The AlN plasma chamber cooling circuit is inspired by the approach chosen for the cesiated high duty factor rf H(-) source operating at SNS.

  8. Beyond the Large Hadron Collider: A First Look at Cryogenics for CERN Future Circular Colliders

    NASA Astrophysics Data System (ADS)

    Lebrun, Philippe; Tavian, Laurent

    Following the first experimental discoveries at the Large Hadron Collider (LHC) and the recent update of the European strategy in particle physics, CERN has undertaken an international study of possible future circular colliders beyond the LHC. The study, conducted with the collaborative participation of interested institutes world-wide, considers several options for very high energy hadron-hadron, electron-positron and hadron-electron colliders to be installed in a quasi-circular underground tunnel in the Geneva basin, with a circumference of 80 km to 100 km. All these machines would make intensive use of advanced superconducting devices, i.e. high-field bending and focusing magnets and/or accelerating RF cavities, thus requiring large helium cryogenic systems operating at 4.5 K or below. Based on preliminary sets of parameters and layouts for the particle colliders under study, we discuss the main challenges of their cryogenic systems and present first estimates of the cryogenic refrigeration capacities required, with emphasis on the qualitative and quantitative steps to be accomplished with respect to the present state-of-the-art.

  9. Summary of Test Results of MQXFS1 - The First Short Model 150 mm Aperture Nb3Sn Quadrupole for the High-Luminosity LHC Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoynev, S.; et al.

    The development of Nb3Sn quadrupole magnets for the High-Luminosity LHC upgrade is a joint venture between the US LHC Accelerator Research Program (LARP) and CERN with the goal of fabricating large-aperture quadrupoles for the LHC interaction regions (IR). The inner triplet (low-β) NbTi quadrupoles in the IR will be replaced by stronger Nb3Sn magnets, supporting the LHC programme goal of a 10-fold increase in integrated luminosity after the foreseen upgrades. Previously, LARP conducted successful tests of short and long models with up to 120 mm aperture. The first short 150 mm aperture quadrupole model, MQXFS1, was assembled with coils fabricated by both CERN and LARP. The magnet demonstrated strong performance at Fermilab's vertical magnet test facility, reaching the LHC operating limits. This paper reports the latest results from MQXFS1 tests with changed pre-stress levels. The overall magnet performance, including quench training and memory, ramp rate and temperature dependence, is also summarized.

  10. Experimental Results Obtained with Air Liquide Cold Compression System: CERN LHC and SNS Projects

    NASA Astrophysics Data System (ADS)

    Delcayre, F.; Courty, J.-C.; Hamber, F.; Hilbert, B.; Monneret, E.; Toia, J.-L.

    2006-04-01

    Large-scale collider facilities will make intensive use of superconducting magnets operating below 2.0 K. This dictates high-capacity refrigeration systems operating below 2.0 K. These systems, making use of cryogenic centrifugal compressors in a series arrangement with room-temperature screw compressors, will be coupled to a refrigerator providing a certain power at 4.5 K. A first Air Liquide Cold Compression System (CCS) unit was built and delivered to CERN in 2001. Installed at the beginning of 2002, it was commissioned and tested successfully during 2002. A series of four sets of identical CCS were then tested in 2004. Another set of four cryogenic centrifugal compressors (CCC) was delivered to Thomas Jefferson National Accelerator Facility (JLAB) for the Spallation Neutron Source (SNS) in 2002. These compressors were tested and commissioned from December 2004 to July 2005. The experimental results obtained with these systems will be presented and discussed: the characteristics of the CCC and the principles of control for the CCC operated in series will be detailed.

  11. HiRadMat at CERN SPS - A test facility with high intensity beam pulses to material samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charitonidis, N.; Fabich, A.; Efthymiopoulos, I.

    2015-07-01

    HiRadMat (High Irradiation to Materials) is a facility at CERN designed to provide high-intensity pulsed beams to an irradiation area where material samples as well as accelerator component assemblies (e.g. vacuum windows, shock tests on high power targets, collimators) can be tested. The beam parameters (SPS 440 GeV protons with a pulse energy of up to 3.4 MJ, or alternatively lead/argon ions at the proton-equivalent energy) can be tuned to match the needs of each experiment. It is a test area designed to perform single-pulse experiments to evaluate the effect of high-intensity pulsed beams on materials in a dedicated environment, excluding long-term irradiation studies. The facility is designed for a maximum of 10^16 protons per year, in order to limit the activation to acceptable levels for human intervention. This paper will demonstrate the possibilities for research using this facility and show examples of upcoming experiments scheduled in the beam period 2014/2015.
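
    As a quick cross-check of the beam figures quoted above (440 GeV protons, up to 3.4 MJ per pulse, 10^16 protons per year), the short Python sketch below converts the pulse energy into an approximate number of protons per pulse. It is purely illustrative arithmetic based on the numbers in the abstract, not an official HiRadMat specification.

      # Back-of-the-envelope check of the HiRadMat beam figures quoted above:
      # how many 440 GeV protons correspond to a 3.4 MJ pulse?
      # Illustrative only; the numbers come from the abstract, not an official spec.

      GEV_TO_JOULE = 1.602176634e-10  # 1 GeV in joules

      def protons_per_pulse(pulse_energy_joule: float, proton_energy_gev: float) -> float:
          """Number of protons needed to carry the given pulse energy."""
          return pulse_energy_joule / (proton_energy_gev * GEV_TO_JOULE)

      if __name__ == "__main__":
          n_p = protons_per_pulse(3.4e6, 440.0)
          print(f"protons per pulse ~ {n_p:.2e}")                 # ~4.8e13 protons
          print(f"pulses per year at 1e16 p/year ~ {1e16 / n_p:.0f}")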

  12. Status and Planned Experiments of the Hiradmat Pulsed Beam Material Test Facility at CERN SPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charitonidis, Nikolaos; Efthymiopoulos, Ilias; Fabich, Adrian

    2015-06-01

    HiRadMat (High Irradiation to Materials) is a facility at CERN designed to provide high-intensity pulsed beams to an irradiation area where material samples as well as accelerator component assemblies (e.g. vacuum windows, shock tests on high power targets, collimators) can be tested. The beam parameters (SPS 440 GeV protons with a pulse energy of up to 3.4 MJ, or alternatively lead/argon ions at the proton-equivalent energy) can be tuned to match the needs of each experiment. It is a test area designed to perform single-pulse experiments to evaluate the effect of high-intensity pulsed beams on materials in a dedicated environment, excluding long-term irradiation studies. The facility is designed for a maximum of 10^16 protons per year, in order to limit the activation of the irradiated samples to acceptable levels for human intervention. This paper will demonstrate the possibilities for research using this facility and go through examples of upcoming experiments scheduled in the beam period 2015/2016.

  13. Installation and management of the SPS and LEP control system computers

    NASA Astrophysics Data System (ADS)

    Bland, Alastair

    1994-12-01

    Control of the CERN SPS and LEP accelerators and service equipment on the two CERN main sites is performed via workstations, file servers, Process Control Assemblies (PCAs) and Device Stub Controllers (DSCs). This paper describes the methods and tools that have been developed to manage the file servers, PCAs and DSCs since the LEP startup in 1989. There are five operational DECstation 5000s used as file servers and boot servers for the PCAs and DSCs. The PCAs consist of 90 SCO Xenix 386 PCs, 40 LynxOS 486 PCs and more than 40 older NORD 100s. The DSCs consist of 90 OS-9 68030 VME crates and 10 LynxOS 68030 VME crates. In addition there are over 100 development systems. The controls group is responsible for installing the computers, starting all the user processes and ensuring that the computers and the processes run correctly. The operators in the SPS/LEP control room and the Services control room have a Motif-based X window program which gives them, in real time, the state of all the computers and allows them to solve problems or reboot them.
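
    The record above describes, among other tools, a display that shows the operators the state of all control computers in real time. The following Python sketch is a purely hypothetical illustration of the polling side of such a monitor; the host names, port and polling interval are invented, and the actual SPS/LEP tools were Motif/X-based programs that are not reproduced here.

      # Hypothetical sketch of the polling side of a control-computer monitor:
      # check whether each host answers on a TCP port and report its state.
      # Host names, port and interval are illustrative, not the SPS/LEP configuration.
      import socket
      import time

      HOSTS = ["fileserver1", "pca-042", "dsc-vme-07"]  # placeholder names
      PORT = 22            # any service port expected to be open
      TIMEOUT_S = 2.0
      POLL_INTERVAL_S = 30

      def host_is_up(host: str) -> bool:
          try:
              with socket.create_connection((host, PORT), timeout=TIMEOUT_S):
                  return True
          except OSError:
              return False

      def poll_once() -> dict:
          return {h: ("UP" if host_is_up(h) else "DOWN") for h in HOSTS}

      if __name__ == "__main__":
          for _ in range(3):  # poll a few times for demonstration only
              print(time.strftime("%H:%M:%S"), poll_once())
              time.sleep(POLL_INTERVAL_S)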

  14. The SHiP experiment at CERN

    NASA Astrophysics Data System (ADS)

    De Lellis, G.; SHiP Collaboration

    2017-04-01

    The discovery of the Higgs boson has fully confirmed the Standard Model of particles and fields. Nevertheless, there are still fundamental phenomena, like the existence of dark matter and the baryon asymmetry of the Universe, which deserve an explanation that could come from the discovery of new particles. Searches for new physics with accelerators are performed at the LHC, looking for high-mass particles coupled to matter with ordinary strength. A new experiment at CERN, meant to search for very weakly coupled particles in the few-GeV mass domain, has recently been proposed. The existence of such particles, foreseen in different theoretical models beyond the Standard Model, is largely unexplored. A beam dump facility using high-intensity 400 GeV protons is a copious source of such unknown particles in the GeV mass range. The beam dump is also a copious source of neutrinos and in particular it is an ideal source of tau neutrinos, the least known particle in the Standard Model. Indeed, tau anti-neutrinos have not been directly observed so far. We report the physics potential of such an experiment.

  15. The SHiP experiment at CERN

    NASA Astrophysics Data System (ADS)

    Bonivento, Walter M.

    2017-07-01

    The discovery of the Higgs boson has fully confirmed the Standard Model of particles and fields. Nevertheless, there are still fundamental phenomena, like the existence of dark matter and the baryon asymmetry of the Universe, deserving an explanation that could come from the discovery of new particles. Searches for new physics with accelerators are performed at the LHC, looking for high-mass particles coupled to matter with ordinary strength. A new experiment at CERN, meant to search for very weakly coupled particles in the few-GeV mass domain, has recently been proposed. The existence of such particles, foreseen in different theoretical models beyond the Standard Model, is largely unexplored. A beam dump facility using high-intensity 400 GeV protons is a copious source of such unknown particles in the GeV mass range. The beam dump is also a copious source of neutrinos and in particular it is an ideal source of tau neutrinos, the least known particle in the Standard Model. The neutrino detector can also search for dark matter through its scattering off electrons. We report the physics potential of the SHiP experiment.

  16. Assembly Tests of the First Nb 3 Sn Low-Beta Quadrupole Short Model for the Hi-Lumi LHC

    DOE PAGES

    Pan, H.; Felice, H.; Cheng, D. W.; ...

    2016-01-18

    In preparation for the high-luminosity upgrade of the Large Hadron Collider (LHC), the LHC Accelerator Research Program (LARP) in collaboration with CERN is pursuing the development of MQXF: a 150-mm-aperture high-field Nb3Sn quadrupole magnet. The development phase starts with the fabrication and test of several short models (1.2-m magnetic length) and will continue with the development of several long prototypes. All of them are mechanically supported using a shell-based support structure, which has been extensively demonstrated on several R&D models within LARP. The first short model MQXFS-AT has been assembled at LBNL with coils fabricated by LARP and CERN. In this paper, we summarize the assembly process and show how it relies strongly on experience acquired during the LARP 120-mm-aperture HQ magnet series. We also present a comparison between strain-gauge data and finite-element model analysis. Finally, we present the implications of the MQXFS-AT experience for the design of the long prototype support structure.

  17. Transverse beam splitting made operational: Key features of the multiturn extraction at the CERN Proton Synchrotron

    NASA Astrophysics Data System (ADS)

    Huschauer, A.; Blas, A.; Borburgh, J.; Damjanovic, S.; Gilardoni, S.; Giovannozzi, M.; Hourican, M.; Kahle, K.; Le Godec, G.; Michels, O.; Sterbini, G.; Hernalsteens, C.

    2017-06-01

    Following a successful commissioning period, the multiturn extraction (MTE) at the CERN Proton Synchrotron (PS) has been applied for the fixed-target physics programme at the Super Proton Synchrotron (SPS) since September 2015. This exceptional extraction technique was proposed to replace the long-serving continuous transfer (CT) extraction, which has the drawback of inducing high activation in the ring. MTE exploits the principles of nonlinear beam dynamics to perform loss-free beam splitting in the horizontal phase space. Over multiple turns, the resulting beamlets are then transferred to the downstream accelerator. The operational deployment of MTE was rendered possible by the full understanding and mitigation of different hardware limitations and by redesigning the extraction trajectories and nonlinear optics, which was required due to the installation of a dummy septum to reduce the activation of the magnetic extraction septum. This paper focuses on these key features including the use of the transverse damper and the septum shadowing, which allowed a transition from the MTE study to a mature operational extraction scheme.

  18. PREFACE: International Workshop on Discovery Physics at the LHC (Kruger2012)

    NASA Astrophysics Data System (ADS)

    Cleymans, Jean

    2013-08-01

    The second conference on 'Discovery Physics at the LHC' was held on 3-7 December 2012 at the Kruger Gate Hotel in South Africa. In total there were 110 participants from Armenia, Belgium, Brazil, Canada, Czech Republic, France, Germany, Greece, Israel, Italy, Norway, Poland, USA, Russia, Slovakia, Spain, Sweden, United Kingdom, Switzerland and South Africa. The latest results from the Large Hadron Collider, Brookhaven National Laboratory, Jefferson Laboratory and BABAR experiments, as well as the latest theoretical insights, were presented. Set against the backdrop of the majestic Kruger National Park, a very stimulating conference with many exchanges took place. The proceedings reflect the high standard of the conference. The financial contributions from the National Institute for Theoretical Physics (NITHeP), the SA-CERN programme, the UCT-CERN Research Centre, the University of Johannesburg, the University of the Witwatersrand and iThemba LABS (Laboratory for Accelerator Based Science) are gratefully acknowledged. Jean Cleymans, Chair of the Local Organizing Committee. Local Organizing Committee: Oana Boeriu, Jean Cleymans, Simon H Connell, Alan S Cornell, William A Horowitz, Andre Peshier, Trevor Vickey, Zeblon Z Vilakazi.

  19. The measurement programme at the neutron time-of-flight facility n_TOF at CERN

    NASA Astrophysics Data System (ADS)

    Gunsing, F.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bécares, V.; Bacak, M.; Balibrea-Correa, J.; Barbagallo, M.; Barros, S.; Bečvář, F.; Beinrucker, C.; Belloni, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Brugger, M.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Castelluccio, D. M.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Colonna, N.; Cortés-Giraldo, M. A.; Cortés, G.; Cosentino, L.; Damone, L. A.; Deo, K.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Frost, R. J. W.; Furman, V.; Ganesan, S.; García, A. R.; Gawlik, A.; Gheorghe, I.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Goverdovski, A.; Griesmayer, E.; Guerrero, C.; Göbel, K.; Harada, H.; Heftrich, T.; Heinitz, S.; Hernández-Prieto, A.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Katabuchi, T.; Kavrigin, P.; Ketlerov, V.; Khryachkov, V.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lerendegui, J.; Licata, M.; Meo, S. Lo; Lonsdale, S. J.; Losito, R.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Mingrone, F.; Mirea, M.; Montesano, S.; Musumarra, A.; Nolte, R.; Negret, A.; Oprea, A.; Palomo-Pinto, F. R.; Paradela, C.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Quesada, J. M.; Radeck, D.; Rajeev, K.; Rauscher, T.; Reifarth, R.; Riego-Perez, A.; Robles, M.; Rout, P.; Rubbia, C.; Ryan, J. A.; Sabaté-Gilarte, M.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Sedyshev, P.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Suryanarayana, S. V.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tarrío, D.; Tassan-Got, L.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Vlachoudis, V.; Vlastou, R.; Wallner, A.; Warren, S.; Weigand, M.; Weiss, C.; Wolf, C.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    Neutron-induced reaction cross sections are important for a wide variety of research fields, ranging from the study of nuclear level densities and nucleosynthesis to applications of nuclear technology such as the design, criticality and safety assessment of existing and future nuclear reactors, radiation dosimetry, medical applications, nuclear waste transmutation, accelerator-driven systems and fuel cycle investigations. Simulations and calculations of nuclear technology applications largely rely on evaluated nuclear data libraries. The evaluations in these libraries are based both on experimental data and theoretical models. CERN's neutron time-of-flight facility n_TOF has produced a considerable amount of experimental data since it became fully operational with the start of its scientific measurement programme in 2001. While for a long period a single measurement station (EAR1), located at 185 m from the neutron production target, was available, the construction of a second beam line at 20 m (EAR2) in 2014 has substantially increased the measurement capabilities of the facility. An outline of the experimental nuclear data activities at n_TOF will be presented.

  20. Light ion production for a future radiobiological facility at CERN: Preliminary studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stafford-Haworth, Joshua, E-mail: Joshua.Stafford-Haworth@cern.ch; John Adams Institute at Royal Holloway, University of London, Egham, Surrey TW20 0EX; Bellodi, Giulia

    2014-02-15

    Recent medical applications of ions such as carbon and helium have proved extremely effective for the treatment of human patients. However, a comprehensive study of the effects of different light ions on organic targets has not yet been completed. There is a strong desire for a dedicated facility which can produce ions in the range of protons to neon in order to perform this study. This paper will present the proposal and preliminary investigations into the production of light ions, and the development of a radiobiological research facility at CERN. The aims of this project will be presented along with the modifications required to the existing linear accelerator (Linac3), and the foreseen facility, including the requirements for an ion source in terms of some of the specification parameters and the flexibility of operation for different ion types. Preliminary results from beam transport simulations will be presented, in addition to some planned tests required to produce some of the required light ions (lithium, boron) to be conducted in collaboration with the Helmholtz-Zentrum für Materialien und Energie, Berlin.

  1. RF low-level control for the Linac4 H− source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butterworth, A., E-mail: andrew.butterworth@cern.ch; Grudiev, A.; Lettry, J.

    2015-04-08

    The H− source for the Linac4 accelerator at CERN uses an RF-driven plasma for the production of H−. The RF is supplied by a 2 MHz RF tube amplifier with a maximum power output of 100 kW and a pulse duration of up to 2 ms. The low-level RF signal generation and measurement system has been developed using standard CERN controls electronics in the VME form factor. The RF frequency and amplitude reference signals are generated using separate arbitrary waveform generator channels. The frequency and amplitude are both freely programmable over the duration of the RF pulse, which allows fine-tuning of the excitation. Measurements of the forward and reverse RF power signals are performed via directional couplers using high-speed digitizers, and permit the estimation of the plasma impedance and deposited power via an equivalent circuit model. The low-level RF hardware and software implementations are described, and experimental results obtained with the Linac4 ion sources in the test stand are presented.
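
    The abstract above mentions that forward and reverse RF power, read out via directional couplers and digitizers, are used to estimate the power deposited in the plasma through an equivalent circuit model. The Python sketch below shows only the simplest part of that bookkeeping (net power and reflection-coefficient magnitude from coupler readings); the equivalent-circuit fit itself is not reproduced, and the sample values are invented.

      # Simplest version of the power bookkeeping described above: from forward and
      # reverse power samples, estimate the reflection-coefficient magnitude and the
      # net power delivered toward the plasma. The full equivalent-circuit model of
      # the paper (matching network + plasma impedance) is not reproduced here.
      import numpy as np

      def net_power(p_fwd_w: np.ndarray, p_rev_w: np.ndarray) -> np.ndarray:
          """Net (deposited + circuit-loss) power in watts."""
          return p_fwd_w - p_rev_w

      def reflection_magnitude(p_fwd_w: np.ndarray, p_rev_w: np.ndarray) -> np.ndarray:
          """|Gamma| = sqrt(P_rev / P_fwd), valid where P_fwd > 0."""
          return np.sqrt(np.clip(p_rev_w, 0, None) / np.clip(p_fwd_w, 1e-9, None))

      # Example with made-up digitizer samples taken during an RF pulse
      p_fwd = np.array([80e3, 90e3, 95e3])   # W
      p_rev = np.array([8e3, 6e3, 5e3])      # W
      print(net_power(p_fwd, p_rev))          # [72000. 84000. 90000.]
      print(reflection_magnitude(p_fwd, p_rev))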

  2. Final Report for U.S. DOE GRANT No. DEFG02-96ER41015 November 1, 2010 - April 30, 2013 entitled HIGH ENERGY ACCELERATOR AND COLLIDING BEAM USER GROUP at the UNIVERSITY of MARYLAND

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadley, Nicholas; Jawahery, Abolhassan; Eno, Sarah C

    2013-07-26

    We have finished the third year of a three-year grant cycle with the U.S. Department of Energy, for which we were given a five-month extension (U.S. D.O.E. Grant No. DEFG02-96ER41015). This document is the final report for this grant and covers the period from November 1, 2010 to April 30, 2013. The Maryland program is administered as a single task with Professor Nicholas Hadley as Principal Investigator. The Maryland experimental HEP group is focused on two major research areas. We are members of the CMS experiment at the LHC at CERN working on the physics of the Energy Frontier. We are also analyzing the data from the Babar experiment at SLAC while doing design work and R&D towards a Super B experiment as part of the Intensity Frontier. We have recently joined the LHCb experiment at CERN. We concluded our activities on the D0 experiment at Fermilab in 2009.

  3. Upgrade of the beam extraction system of the GTS-LHC electron cyclotron resonance ion source at CERN.

    PubMed

    Toivanen, V; Bellodi, G; Dimov, V; Küchler, D; Lombardi, A M; Maintrot, M

    2016-02-01

    Linac3 is the first accelerator in the heavy ion injector chain of the Large Hadron Collider (LHC), providing multiply charged heavy ion beams for the CERN experimental program. The ion beams are produced with GTS-LHC, a 14.5 GHz electron cyclotron resonance ion source, operated in afterglow mode. Improvement of the GTS-LHC beam formation and beam transport along Linac3 is part of the upgrade program of the injector chain in preparation for the future high luminosity LHC. A mismatch between the ion beam properties in the ion source extraction region and the acceptance of the following Low Energy Beam Transport (LEBT) section has been identified as one of the factors limiting the Linac3 performance. The installation of a new focusing element, an einzel lens, into the GTS-LHC extraction region is foreseen as a part of the Linac3 upgrade, as well as a redesign of the first section of the LEBT. Details of the upgrade and results of a beam dynamics study of the extraction region and LEBT modifications will be presented.

  4. Heavy-ion physics with the ALICE experiment at the CERN Large Hadron Collider.

    PubMed

    Schukraft, J

    2012-02-28

    After close to 20 years of preparation, the dedicated heavy-ion experiment A Large Ion Collider Experiment (ALICE) took first data at the CERN Large Hadron Collider (LHC) accelerator with proton collisions at the end of 2009 and with lead nuclei at the end of 2010. After a short introduction into the physics of ultra-relativistic heavy-ion collisions, this article recalls the main design choices made for the detector and summarizes the initial operation and performance of ALICE. Physics results from this first year of operation concentrate on characterizing the global properties of typical, average collisions, both in proton-proton (pp) and nucleus-nucleus reactions, in the new energy regime of the LHC. The pp results differ, to a varying degree, from most quantum chromodynamics-inspired phenomenological models and provide the input needed to fine tune their parameters. First results from Pb-Pb are broadly consistent with expectations based on lower energy data, indicating that high-density matter created at the LHC, while much hotter and larger, still behaves like a very strongly interacting, almost perfect liquid.

  5. Light ion production for a future radiobiological facility at CERN: preliminary studies.

    PubMed

    Stafford-Haworth, Joshua; Bellodi, Giulia; Küchler, Detlef; Lombardi, Alessandra; Röhrich, Jörg; Scrivens, Richard

    2014-02-01

    Recent medical applications of ions such as carbon and helium have proved extremely effective for the treatment of human patients. However, a comprehensive study of the effects of different light ions on organic targets has not yet been completed. There is a strong desire for a dedicated facility which can produce ions in the range of protons to neon in order to perform this study. This paper will present the proposal and preliminary investigations into the production of light ions, and the development of a radiobiological research facility at CERN. The aims of this project will be presented along with the modifications required to the existing linear accelerator (Linac3), and the foreseen facility, including the requirements for an ion source in terms of some of the specification parameters and the flexibility of operation for different ion types. Preliminary results from beam transport simulations will be presented, in addition to some planned tests required to produce some of the required light ions (lithium, boron) to be conducted in collaboration with the Helmholtz-Zentrum für Materialien und Energie, Berlin.

  6. Electronic neutron sources for compensated porosity well logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, A. X.; Antolak, A. J.; Leung, K. -N.

    2012-08-01

    The viability of replacing Americium–Beryllium (Am–Be) radiological neutron sources in compensated porosity nuclear well logging tools with D–T or D–D accelerator-driven neutron sources is explored. The analysis consisted of developing a model for a typical well-logging borehole configuration and computing the helium-3 detector response to varying formation porosities using three different neutron sources (Am–Be, D–D, and D–T). The results indicate that, when normalized to the same source intensity, the use of a D–D neutron source has greater sensitivity for measuring the formation porosity than either an Am–Be or D–T source. The results of the study provide operational requirements that enable compensated porosity well logging with a compact, low-power D–D neutron generator, which the current state of the art indicates is technically achievable.

  7. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    PubMed

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits set by the national authority for the waste disposal facility. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential not only to assess the acceptability of the waste in the final repository but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of produced radionuclides either by experimental methods or statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful to generate random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach for the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
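
    The methodology summarized above propagates uncertain inputs (for example trace-element concentrations that cannot be measured) through the activity calculation using a random vector of input parameters. The sketch below is a generic Monte Carlo propagation in that spirit; the activity model and the input distributions are placeholder assumptions, not CERN's characterization code.

      # Generic Monte Carlo propagation in the spirit of the approach described above:
      # sample uncertain inputs (here, a trace-element mass fraction and an irradiation
      # scaling factor), push them through a placeholder activity model, and report
      # the spread. Distributions and the activity function are illustrative only.
      import numpy as np

      rng = np.random.default_rng(42)
      N = 100_000

      def specific_activity(trace_fraction, scaling):
          """Placeholder activity model (Bq/g); not CERN's actual calculation."""
          return 1.2e3 * trace_fraction * scaling

      # Uncertain inputs: lognormal trace concentration, normal irradiation scaling
      trace = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=N)
      scale = rng.normal(loc=1.0, scale=0.1, size=N)

      activity = specific_activity(trace, scale)
      print(f"median   = {np.median(activity):.3g} Bq/g")
      print(f"95th pct = {np.percentile(activity, 95):.3g} Bq/g")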

  8. Design study of beam transport lines for BioLEIR facility at CERN

    NASA Astrophysics Data System (ADS)

    Ghithan, S.; Roy, G.; Schuh, S.

    2017-09-01

    The biomedical community has asked CERN to investigate the possibility of transforming the Low Energy Ion Ring (LEIR) accelerator into a multidisciplinary, biomedical research facility (BioLEIR) that could provide ample, high-quality beams of a range of light ions suitable for clinically oriented, fundamental research on cell cultures and for radiation instrumentation development. The present LEIR machine uses fast beam extraction to the next accelerator in the chain, eventually leading to the Large Hadron Collider (LHC). To provide beam for a biomedical research facility, a new slow extraction system must be installed. Two horizontal and one vertical experimental beamlines were designed for transporting the extracted beam to three experimental end-stations. The vertical beamline (pencil beam) was designed for a maximum energy of 75 MeV/u for low-energy radiobiological research, while the two horizontal beamlines could deliver up to 440 MeV/u. One horizontal beamline shall be used preferentially for biomedical experiments and shall provide a pencil beam and a homogeneous broad beam, covering an area of 5 × 5 cm² with a beam homogeneity of ±5%. The second horizontal beamline will provide a pencil beam only and is intended for hardware developments in the fields of (micro-)dosimetry and detector development. The minimum full aperture of the beamlines is approximately 100 mm at all magnetic elements, to accommodate the expected beam envelopes. Seven dipoles and twenty quadrupoles are needed for a total of 65 m of beamlines to provide the specified beams. In this paper we present the optical design for the three beamlines.

  9. Development of the cryogenic system of AEgIS at CERN

    NASA Astrophysics Data System (ADS)

    Derking, J. H.; Bremer, J.; Burghart, G.; Doser, M.; Dudarev, A.; Haider, S.

    2014-01-01

    The AEgIS (Antimatter Experiment: Gravity, Interferometry, Spectroscopy) experiment is located at the antiproton decelerator complex of CERN. The main goal of the experiment is to perform the first direct measurement of the Earth's gravitational acceleration on antihydrogen atoms within 1% precision. The antihydrogen is produced in a cylindrical Penning trap by combining antiprotons with positrons. To reach the precision of 1%, the antihydrogen has to be cooled to 100 mK to reduce its random velocity. A dilution refrigerator is selected to deliver the necessary cooling capacity of 100 μW at 50 mK. The AEgIS cryogenic system basically consists of cryostats for a 1-T and for a 5-T superconducting magnet, a central region cryostat, a dilution refrigerator cryostat and a measurement cryostat with a Moiré deflectometer to measure the gravitational acceleration. In autumn 2012, the 1-T cryostat, 5-T cryostat and central region cryostat were assembled and commissioned. The apparatus is cooled down in eight days using 2500 L of liquid helium and liquid nitrogen. During operation, the average consumption is 150 L/day of liquid helium and 5 L/day of liquid nitrogen. The temperature sensors at the Penning traps measured 12 K to 18 K, which is higher than expected. Simulations show that this is caused by poor thermalization of the trap wiring. The implementation of the sub-kelvin region is foreseen for mid-2015. The antihydrogen will be cooled down to 100 mK in an ultra-cold trap consisting of multiple high-voltage electrodes made of sapphire with gold-plated electrode sectors.

  10. Development of a subway operation incident delay model using accelerated failure time approaches.

    PubMed

    Weng, Jinxian; Zheng, Yang; Yan, Xuedong; Meng, Qiang

    2014-12-01

    This study aims to develop a subway operational incident delay model using the parametric accelerated failure time (AFT) approach. Six parametric AFT models, including log-logistic, lognormal and Weibull models with fixed and random parameters, are built based on Hong Kong subway operation incident data from 2005 to 2012. In addition, the Weibull model with gamma heterogeneity is also considered to compare model performance. The goodness-of-fit test results show that the log-logistic AFT model with random parameters is most suitable for estimating the subway incident delay. The results show that a longer subway operation incident delay is highly correlated with the following factors: power cable failure, signal cable failure, turnout communication disruption and crashes involving a casualty. Vehicle failure has the least impact on the increment of subway operation incident delay. According to these results, several possible measures, such as the use of short-distance and wireless communication technology (e.g., Wifi and Zigbee), are suggested to shorten the delay caused by subway operation incidents. Finally, the temporal transferability test results show that the developed log-logistic AFT model with random parameters is stable over time. Copyright © 2014 Elsevier Ltd. All rights reserved.
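
    For readers unfamiliar with accelerated failure time models, the sketch below fits a fixed-parameter log-logistic AFT model to a toy incident-delay table, assuming the third-party Python package lifelines is available. The column names and data are invented, and the random-parameter and gamma-heterogeneity variants used in the study go beyond this minimal example.

      # Hedged sketch: fit a fixed-parameter log-logistic AFT model to toy delay data,
      # assuming the lifelines package. Data and column names are placeholders.
      import pandas as pd
      from lifelines import LogLogisticAFTFitter

      df = pd.DataFrame({
          "delay_min": [12, 35, 8, 60, 25, 90, 15, 40, 22, 55],  # delay (minutes)
          "observed":  [1,  1,  1, 1,  1,  0,  1,  1,  1,  1],   # 1 = fully observed
          "power_failure": [0, 1, 0, 1, 0, 1, 0, 1, 0, 1],       # example covariate
      })

      aft = LogLogisticAFTFitter()
      aft.fit(df, duration_col="delay_min", event_col="observed")
      aft.print_summary()  # coefficients act multiplicatively on the delay-time scale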

  11. The ISOLDE facility and the HIE-ISOLDE project: Recent highlights

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borge, M. J. G.

    2014-07-23

    The ISOLDE facility at CERN has as its objective the production and study of nuclei far from stability. The facility provides low-energy radioactive beams and post-accelerated beams. In the last 45 years the ISOLDE facility has gathered unique expertise in research with radioactive beams. Over 700 isotopes of more than 70 elements have been used in a wide range of research domains, including cutting-edge studies in nuclear structure, atomic physics, nuclear astrophysics, and fundamental interactions. These nuclear probes are also used to do frontier research in solid state and life sciences. There is an on-going upgrade of the facility, the HIE-ISOLDE project, which aims to improve the ISOLDE capabilities on a wide front, from an energy increase of the post-accelerated beams to improvements in beam quality and beam purity. The first phase of HIE-ISOLDE will start physics in the autumn of 2015 with an energy upgrade for all post-accelerated ISOLDE beams up to 5.5 MeV/u. In this contribution the most recent highlights of the facility are presented.

  12. Negative ion source development at the cooler synchrotron COSY/Jülich

    NASA Astrophysics Data System (ADS)

    Felden, O.; Gebel, R.; Maier, R.; Prasuhn, D.

    2013-02-01

    The Nuclear Physics Institute at the Forschungszentrum Jülich, a member of the Helmholtz Association, conducts experimental and theoretical basic research in the field of hadron, particle, and nuclear physics. It operates the cooler synchrotron COSY, an accelerator and storage ring, which provides unpolarized and polarized proton and deuteron beams with beam momenta of up to 3.7 GeV/c. Main activities of the accelerator division are the design and construction of the high energy storage ring HESR, a synchrotron and part of the international FAIR project, and the operation and development of COSY with injector cyclotron and ion sources. Filament-driven volume sources and a charge exchange colliding beams source, based on a nuclear polarized atomic beam source, provide unpolarized and polarized H- or D- beams routinely for more than 6500 hours/year. Within the Helmholtz Association's initiative Accelerator Research and Development (ARD), the existing sources at COSY, as well as new sources for future programs, are investigated and developed. The paper reports on these plans, on improved pulsed beams from the volume sources, and on the preparation of a source for the ELENA project at CERN.

  13. Theoretical and Computational Investigation of High-Brightness Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chiping

    Theoretical and computational investigations of adiabatic thermal beams have been carried out in parameter regimes relevant to the development of advanced high-brightness, high-power accelerators for high-energy physics research and for various applications such as light sources. Most accelerator applications require high-brightness beams. This is true for high-energy accelerators such as linear colliders. It is also true for energy recovery linacs (ERLs) and free electron lasers (FELs) such as x-ray free electron lasers (XFELs). The breakthroughs and highlights in our research in the period from February 1, 2013 to November 30, 2013 were: a) completion of a preliminary theoretical and computational study of adiabatic thermal Child-Langmuir flow (Mok, 2013); and b) presentation of an invited paper entitled "Adiabatic Thermal Beams in a Periodic Focusing Field" at the Space Charge 2013 Workshop, CERN, April 16-19, 2013 (Chen, 2013). In this report, an introductory background for the research project is provided. Basic theory of adiabatic thermal Child-Langmuir flow is reviewed. Results of simulation studies of adiabatic thermal Child-Langmuir flows are discussed.

  14. The effect of MLC speed and acceleration on the plan delivery accuracy of VMAT

    PubMed Central

    Park, J M; Wu, H-G; Kim, J H; Carlson, J N K

    2015-01-01

    Objective: To determine a new metric utilizing multileaf collimator (MLC) speeds and accelerations to predict the plan delivery accuracy of volumetric modulated arc therapy (VMAT). Methods: To verify VMAT delivery accuracy, gamma evaluations, analysis of mechanical parameter differences between plans and log files, and analysis of changes in dose-volumetric parameters between plans and plans reconstructed from log files were performed for 40 VMAT plans. The average proportion of leaf speeds ranging from l to h cm s^-1 (S_l-h, with l-h = 0-0.4, 0.4-0.8, 0.8-1.2, 1.2-1.6 and 1.6-2.0), as well as the mean and standard deviation of MLC speeds, were calculated for each VMAT plan. The same was carried out for accelerations in cm s^-2 (A_l-h, with l-h = 0-4, 4-8, 8-12, 12-16 and 16-20). The correlations of these indicators with plan delivery accuracy were analysed with Spearman's correlation coefficient (r_s). Results: S_1.2-1.6 and the mean acceleration of the MLCs showed generally higher correlations with plan delivery accuracy than the other indicators. The highest r_s values were observed between S_1.2-1.6 and global 1%/2 mm (r_s = -0.698 with p < 0.001) as well as between the mean acceleration and global 1%/2 mm (r_s = -0.650 with p < 0.001). As the proportion of MLC speeds above 0.4 cm s^-1 and accelerations above 4 cm s^-2 increased, the plan delivery accuracy of VMAT decreased. Conclusion: The variations in MLC speeds and accelerations showed considerable correlations with VMAT delivery accuracy. Advances in knowledge: As MLC speeds and accelerations increased, VMAT delivery accuracy was reduced. PMID:25734490
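
    The metric described above is built from MLC log files: leaf speeds and accelerations are derived from sampled positions, binned, and correlated with gamma passing rates using Spearman's coefficient. The Python sketch below illustrates that kind of calculation under assumed values (sampling interval, synthetic positions and passing rates); it is not the authors' code.

      # Illustrative log-file statistics: derive leaf speeds from sampled positions,
      # compute a speed-bin proportion, and correlate a per-plan metric with a gamma
      # passing rate. Sampling interval and all data are made-up examples.
      import numpy as np
      from scipy.stats import spearmanr

      DT = 0.05  # s, assumed log-file sampling interval

      def speed_fraction(positions_cm, lo, hi):
          # Fraction of samples with leaf speed in [lo, hi) cm/s
          speed = np.abs(np.diff(positions_cm)) / DT
          return np.mean((speed >= lo) & (speed < hi))

      # Synthetic position trace for one leaf, then its S_1.2-1.6 proportion
      positions = np.cumsum(np.random.default_rng(0).uniform(0.0, 0.1, size=200))
      print(f"S_1.2-1.6 for this leaf: {speed_fraction(positions, 1.2, 1.6):.2f}")

      # Correlate a per-plan metric with a gamma passing rate (made-up values)
      s_12_16     = np.array([0.05, 0.12, 0.20, 0.08, 0.15])
      gamma_1_2mm = np.array([96.0, 91.5, 88.0, 94.2, 90.1])   # % passing, 1%/2 mm
      rho, p = spearmanr(s_12_16, gamma_1_2mm)
      print(f"Spearman r_s = {rho:.2f} (p = {p:.3f})")  # expect a negative correlation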

  15. A polyvalent harmonic coil testing method for small-aperture magnets

    NASA Astrophysics Data System (ADS)

    Arpaia, Pasquale; Buzio, Marco; Golluccio, Giancarlo; Walckiers, Louis

    2012-08-01

    A method to characterize permanent and fast-pulsed iron-dominated magnets with small apertures is presented. The harmonic coil measurement technique is enhanced specifically for small-aperture magnets by (1) in situ calibration, to compensate for search-coil production inaccuracies, (2) rotating the magnet around its axis, to correct for systematic effects, and (3) measuring magnetic flux with stationary coils at different angular positions, to characterize fast-pulsed magnets. This method allows a quadrupole magnet for particle accelerators to be characterized completely, by assessing the multipole field components, the magnetic axis position, and the field direction. In this paper, the metrological problems arising from testing small-aperture magnets are first highlighted. Then, the basic ideas of the proposed method and the architecture of the corresponding measurement system are illustrated. Finally, experimental validation results are shown for small-aperture permanent and fast-ramped quadrupole magnets for the new linear accelerator Linac4 at CERN (European Organization for Nuclear Research).
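
    At the core of the harmonic (rotating) coil technique referenced above, the flux linked by a coil rotating in the magnet aperture is sampled versus angle and Fourier-analysed to obtain the multipole content. The sketch below illustrates only that step, on a synthetic quadrupole-plus-sextupole signal, with all geometry and sensitivity factors set to one for simplicity.

      # Core of the rotating-coil analysis: Fourier decomposition of the flux signal
      # sampled versus rotation angle. Coil sensitivity and radius factors are set
      # to 1, and the flux signal is synthetic.
      import numpy as np

      N_SAMPLES = 512
      theta = np.linspace(0, 2 * np.pi, N_SAMPLES, endpoint=False)

      # Synthetic flux: dominant quadrupole (n = 2) plus a small sextupole (n = 3)
      flux = 1.0 * np.cos(2 * theta) + 0.01 * np.cos(3 * theta + 0.3)

      # Complex Fourier coefficients; |c_n| is proportional to the 2n-pole strength
      c = np.fft.rfft(flux) / (N_SAMPLES / 2)
      main = np.abs(c[2])                       # quadrupole amplitude
      print(f"quadrupole amplitude: {main:.4f}")
      print(f"sextupole / quadrupole (units of 1e-4): {np.abs(c[3]) / main * 1e4:.1f}")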

  16. High energy physics in cosmic rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Lawrence W.

    2013-02-07

    In the first half-century of cosmic ray physics, the primary research focus was on elementary particles; the positron, pi-mesons, mu-mesons, and hyperons were discovered in cosmic rays. Much of this research was carried out at mountain elevations; Pic du Midi in the Pyrenees, Mt. Chacaltaya in Bolivia, and Mt. Evans/Echo Lake in Colorado, among other sites. In the 1960s, claims of the observation of free quarks, and satellite measurements of a significant rise in p-p cross sections, plus the delay in initiating accelerator construction programs for energies above 100 GeV, motivated the Michigan-Wisconsin group to undertake a serious cosmic ray program at Echo Lake. Subsequently, with the succession of higher energy accelerators and colliders at CERN and Fermilab, cosmic ray research has increasingly focused on cosmology and astrophysics, although some groups continue to study cosmic ray particle interactions in emulsion chambers.

  17. Using the FLUKA Monte Carlo Code to Simulate the Interactions of Ionizing Radiation with Matter to Assist and Aid Our Understanding of Ground Based Accelerator Testing, Space Hardware Design, and Secondary Space Radiation Environments

    NASA Technical Reports Server (NTRS)

    Reddell, Brandon

    2015-01-01

    Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground-based particle accelerators can be used to test for exposure to the radiation environment, one species at a time; however, the actual space environment cannot be duplicated because of the range of energies and the isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for the last 40+ years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground-based radiation testing for NASA to improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.

  18. Coulomb Excitation of the N = 50 nucleus 80Zn

    NASA Astrophysics Data System (ADS)

    van de Walle, J.; Aksouh, F.; Ames, F.; Behrens, T.; Bildstein, V.; Blazhev, A.; Cederkäll, J.; Clément, E.; Cocolios, T. E.; Davinson, T.; Delahaye, P.; Eberth, J.; Ekström, A.; Fedorov, D. V.; Fedosseev, V. N.; Fraile, L. M.; Franchoo, S.; Gernhauser, R.; Georgiev, G.; Habs, D.; Heyde, K.; Huber, G.; Huyse, M.; Ibrahim, F.; Ivanov, O.; Iwanicki, J.; Jolie, J.; Kester, O.; Köster, U.; Kröll, T.; Krücken, R.; Lauer, M.; Lisetskiy, A. F.; Lutter, R.; Marsh, B. A.; Mayet, P.; Niedermaier, O.; Nilsson, T.; Pantea, M.; Perru, O.; Raabe, R.; Reiter, P.; Sawicka, M.; Scheit, H.; Schrieder, G.; Schwalm, D.; Seliverstov, M. D.; Sieber, T.; Sletten, G.; Smirnova, N.; Stanoiu, M.; Stefanescu, I.; Thomas, J.-C.; Valiente-Dobón, J. J.; van Duppen, P.; Verney, D.; Voulot, D.; Warr, N.; Weisshaar, D.; Wenander, F.; Wolf, B. H.; Zielińska, M.

    2008-05-01

    Neutron-rich zinc isotopes, including the N = 50 nucleus 80Zn, were produced and post-accelerated at the Radioactive Ion Beam (RIB) facility REX-ISOLDE (CERN). Low-energy Coulomb excitation was induced on these isotopes after post-acceleration, yielding B(E2) strengths to the first excited 2+ states. For the first time, an excited state in 80Zn was observed and the 2_1^+ state in 78Zn was established. The measured B(E2; 2_1^+ → 0_1^+) values are compared to two sets of large-scale shell model calculations. Both calculations reproduce the observed B(E2) systematics for the full zinc isotopic chain. The results for the N = 50 isotones indicate a good N = 50 shell closure and a strong Z = 28 proton core polarization. The new results serve as benchmarks to establish theoretical models predicting the nuclear properties of the doubly magic nucleus 78Ni.

  19. A New Concept of Controller for Accelerators' Magnet Power Supplies

    NASA Astrophysics Data System (ADS)

    Visintini, Roberto; Cleva, Stefano; Cautero, Marco; Ciesla, Tomasz

    2016-04-01

    The complexity of a particle accelerator implies the remote control of very large numbers of devices of many different types, either distributed along the accelerator or concentrated in locations often far away from each other. Local and global control systems handle the devices through dedicated communication channels and interfaces. Each controlled device is practically a "smart node" performing a specific task. In addition, very often, those tasks are managed in real-time mode. The performance required of the control interface influences the cost of the distributed nodes as well as their hardware and software implementation. In large facilities (e.g. CERN) the "smart nodes" derive from specific in-house developments. Alternatively, it is possible to find commercial devices on the market whose performance (and price) spans a broad range, from proprietary designs (customizable to the user's needs) to open-source designs. In this paper, we will describe some applications of smart nodes in the particle accelerator field, with special focus on the power supplies for magnets. In modern accelerators, in fact, magnets and their associated power supplies constitute systems distributed along the accelerator itself, strongly interfaced with the remote control system as well as with more specific (and often more demanding) orbit/trajectory feedback systems. We will give examples of actual systems, installed and operational on two light sources, Elettra and FERMI, located in the Elettra Research Center in Trieste, Italy.

  20. Thin Film Approaches to the SRF Cavity Problem: Fabrication and Characterization of Superconducting Thin Films

    NASA Astrophysics Data System (ADS)

    Beringer, Douglas B.

    Superconducting Radio Frequency (SRF) cavities are responsible for the acceleration of charged particles to relativistic velocities in most modern linear accelerators, such as those employed at high-energy research facilities like Thomas Jefferson National Laboratory's CEBAF and the LHC at CERN. Recognizing SRF as primarily a surface phenomenon enables the possibility of applying thin films to the interior surface of SRF cavities, opening a formidable tool chest of opportunities by combining and designing materials that offer greater benefit. Thus, while improvements in radio frequency cavity design and refinements in cavity processing techniques have improved accelerator performance and efficiency - 1.5 GHz bulk niobium SRF cavities have achieved accelerating gradients in excess of 35 MV/m - there exist fundamental material bounds in bulk superconductors limiting the maximally sustained accelerating field gradient (approximately 45 MV/m for Niobium) where inevitable thermodynamic breakdown occurs. With state of the art niobium based cavity design fast approaching these theoretical limits, novel material innovations must be sought in order to realize next generation SRF cavities. One proposed method to improve SRF performance is to utilize thin film superconducting-insulating-superconducting (SIS) multilayer structures to effectively magnetically screen a bulk superconducting layer such that it can operate at higher field gradients before suffering critically detrimental SRF losses. This dissertation focuses on the production and characterization of thin film superconductors for such SIS layers for radio-frequency applications.

  1. Contributions to the mini-workshop on beam-beam compensation in the Tevatron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiltsev, V.

    1998-02-01

    The purpose of the Workshop was to assess the current understanding of compensation of beam-beam effects in the Tevatron with the use of a low-energy, high-current electron beam and the relevant accelerator technology, along with other novel compensation techniques and previous attempts. About 30 scientists representing seven institutions from four countries (FNAL, SLAC, BNL, Novosibirsk, CERN, and Dubna) were in attendance. Twenty-one talks were presented. The event gave firm ground for wider collaboration on an experimental test of the compensation at the Tevatron collider. This report consists of vugraphs of talks given at the meeting.

  2. 2016 FACET-II Science Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Mark J.

    The second in a series of FACET-II Science Workshops was held at SLAC National Accelerator Laboratory on October 17-19, 2016 [1]. The workshop drew thirty-five participants from eighteen different institutions including CERN, DESY, Ecole Polytechnique, FNAL, JAI, LBNL, LLNL, Radiabeam, Radiasoft, SLAC, Stony Brook, Strathclyde, Tech-X, Tsinghua, UC Boulder, UCLA and UT Austin. The 2015 workshop [2, 3] helped prioritize research directions for FACET-II. The 2016 workshop was focused on understanding what improvements are needed at the facility to support the next generation of experiments. All presentations are linked to the workshop website as a permanent record.

  3. Race for the Higgs hots up as Tevatron seeks extension

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2009-12-01

    With researchers at CERN's Large Hadron Collider (LHC) having circulated protons for the first time since last year's accident, the US Department of Energy (DOE) is requesting $25m so that the Tevatron collider at the Fermi National Accelerator Laboratory in Illinois can run for an extra year until 2011. If the additional funding is granted, it would give physicists in the US an extra 12 months to close in on discovering the elusive Higgs boson. The DOE's request will now be reviewed before being part of President Barack Obama's 2011 budget request, which will be sent to Congress in February.

  4. First Accelerator Test of the Kinematic Lightweight Energy Meter (KLEM) Prototype

    NASA Technical Reports Server (NTRS)

    Bashindzhagyan, G.; Adams, J. H.; Bashindzhagyan, P.; Chilingarian, A.; Donnelly, J.; Drury, L.; Egorov, N.; Golubkov, S.; Grebenyuk, V.; Kalinin, A.; hide

    2002-01-01

    The essence of the KLEM (Kinematic Lightweight Energy Meter) instrument is to directly measure the elemental energy spectra of high-energy cosmic rays by determining the angular distribution of secondary particles produced in a target. The first test of the simple KLEM prototype has been performed at the CERN SPS test-beam with 180 GeV pions during 2001. The results of the first test analysis confirm that, using the KLEM method, the energy of 180 GeV pions can be measured with a relative error of about 67%, which is very close to the results of the simulation (65%).

  5. SU-C-BRD-03: Analysis of Accelerator Generated Text Logs for Preemptive Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, CM; Baydush, AH; Nguyen, C

    2014-06-15

    Purpose: To develop a model to analyze medical accelerator generated parameter and performance data that will provide an early warning of performance degradation and impending component failure. Methods: A robust 6 MV VMAT quality assurance treatment delivery was used to test the constancy of accelerator performance. The generated text log files were decoded and analyzed using statistical process control (SPC) methodology. The text file data is a single snapshot of energy-specific and overall system parameters. A total of 36 system parameters were monitored, including RF generation, electron gun control, energy control, beam uniformity control, DC voltage generation, and cooling systems. The parameters were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and the parameter/system specification. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on: the value of 1 standard deviation from the mean operating parameter of 483 TB systems, a small fraction (≤ 5%) of the operating range, or a fraction of the minor fault deviation. Results: There were 34 parameters in which synthetic errors were introduced. There were 2 parameters (radial position steering coil, and positive 24V DC) in which the errors did not exceed the limit of the I/MR chart. The I chart limit was exceeded for all of the remaining parameters (94.2%). The MR chart limit was exceeded in 29 of the 32 parameters (85.3%) in which the I chart limit was exceeded. Conclusion: Statistical process control I/MR evaluation of text log file parameters may be effective in providing an early warning of performance degradation or component failure for digital medical accelerator systems. Research is supported by Varian Medical Systems, Inc.
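
    The study above analyses logged accelerator parameters with Individual and Moving Range (I/MR) control charts. The sketch below computes the standard 3-sigma I-chart limits and the MR-chart upper limit for a single parameter series, using the usual d2 and D4 constants for subgroups of size two; the data are synthetic, and the hybrid specification-based limits mentioned in the abstract are not included.

      # Minimal I/MR (Individual / Moving Range) control-chart limits for one logged
      # parameter series. Data are synthetic; spec-based hybrid limits are omitted.
      import numpy as np

      D4 = 3.267           # standard MR-chart constant for subgroup size 2
      MR_TO_SIGMA = 1.128  # d2 constant: sigma ~ mean(MR) / 1.128

      def imr_limits(x: np.ndarray):
          """Return (I-chart limits, MR-chart upper limit) for a 1-D series."""
          mr = np.abs(np.diff(x))              # moving ranges of consecutive readings
          sigma = mr.mean() / MR_TO_SIGMA
          center = x.mean()
          i_limits = (center - 3 * sigma, center + 3 * sigma)
          mr_ucl = D4 * mr.mean()
          return i_limits, mr_ucl

      readings = np.array([152.1, 152.3, 152.0, 152.4, 152.2, 152.1, 153.9])  # e.g. V
      (i_lcl, i_ucl), mr_ucl = imr_limits(readings)
      print(f"I-chart limits: [{i_lcl:.2f}, {i_ucl:.2f}]")
      print(f"MR-chart UCL  : {mr_ucl:.2f}")
      print("out-of-control points:",
            np.where((readings < i_lcl) | (readings > i_ucl))[0])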

  6. Viewpoint: the End of the World at the Large Hadron Collider?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peskin, Michael E.; /SLAC

    New arguments based on astrophysical phenomena constrain the possibility that dangerous black holes will be produced at the CERN Large Hadron Collider. On 8 August, the Large Hadron Collider (LHC) at CERN injected its first beams, beginning an experimental program that will produce proton-proton collisions at an energy of 14 TeV. Particle physicists are waiting expectantly. The reason is that the Standard Model of strong, weak, and electromagnetic interactions, despite its many successes, is clearly incomplete. Theory says that the holes in the model should be filled by new physics in the energy region that will be studied by the LHC. Some candidate theories are simple quick fixes, but the most interesting ones involve new concepts of spacetime waiting to be discovered. Look up the LHC on Wikipedia, however, and you will find considerable space devoted to safety concerns. At the LHC, we will probe energies beyond those explored at any previous accelerator, and we hope to create particles that have never been observed. Couldn't we, then, create particles that would actually be dangerous, for example, ones that would eat normal matter and eventually turn the earth into a blob of unpleasantness? It is morbid fun to speculate about such things, and candidates for such dangerous particles have been suggested. These suggestions have been analyzed in an article in Reviews of Modern Physics by Jaffe, Busza, Wilczek, and Sandweiss and excluded on the basis of constraints from observation and from the known laws of physics. These conclusions have been upheld by subsequent studies conducted at CERN.

  7. Sound waves in hadronic matter

    NASA Astrophysics Data System (ADS)

    Wilk, Grzegorz; Włodarczyk, Zbigniew

    2018-01-01

    We argue that recent high energy CERN LHC experiments on transverse momenta distributions of produced particles provide us with new, so far unnoticed and not fully appreciated, information on the underlying production processes. To this end we concentrate on the small (but persistent) log-periodic oscillations decorating the observed pT spectra and visible in the measured ratios R = σdata(pT) / σfit(pT). Because such spectra are described by quasi-power-like formulas characterised by two parameters, the power index n and the scale parameter T (usually identified with the temperature), the observed log-periodic behaviour of the ratios R can originate either from suitable modifications of n or of T (or of both, but such a possibility is not discussed here). In the first case n becomes a complex number, and this can be related to scale invariance in the system; in the second, the scale parameter T itself exhibits log-periodic oscillations, which can be interpreted as the presence of some kind of sound waves forming in the collision system during the collision process, the wave number of which has a so-called self-similar solution of the second kind. Because the first case has already been widely discussed, we concentrate on the second one and on its possible experimental consequences.
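    To make the two ingredients above concrete, the sketch below evaluates a Tsallis-like quasi-power-law spectrum with power index n and scale parameter T, and a commonly used log-periodic form for the ratio R. The parametrization and all numerical values are illustrative assumptions, not fitted values from the paper.

```python
import numpy as np

def quasi_power_law(pt, C, n, T):
    """Tsallis-like quasi-power-law form often used to fit pT spectra."""
    return C * (1.0 + pt / (n * T)) ** (-n)

def log_periodic_ratio(pt, a, b, c, d):
    """A simple log-periodic modulation of the data/fit ratio R."""
    return a + b * np.cos(c * np.log(pt) + d)

# Illustrative numbers only (not fitted values from the paper):
pt = np.linspace(1.0, 100.0, 200)                  # GeV/c
spectrum = quasi_power_law(pt, C=1.0, n=7.0, T=0.15)
ratio = log_periodic_ratio(pt, a=1.0, b=0.05, c=2.0, d=0.0)
decorated = spectrum * ratio                       # spectrum with small oscillations
```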

  8. QALMA: A computational toolkit for the analysis of quality protocols for medical linear accelerators in radiation therapy

    NASA Astrophysics Data System (ADS)

    Rahman, Md Mushfiqur; Lei, Yu; Kalantzis, Georgios

    2018-01-01

    Quality Assurance (QA) for medical linear accelerator (linac) is one of the primary concerns in external beam radiation Therapy. Continued advancements in clinical accelerators and computer control technology make the QA procedures more complex and time consuming which often, adequate software accompanied with specific phantoms is required. To ameliorate that matter, we introduce QALMA (Quality Assurance for Linac with MATLAB), a MALAB toolkit which aims to simplify the quantitative analysis of QA for linac which includes Star-Shot analysis, Picket Fence test, Winston-Lutz test, Multileaf Collimator (MLC) log file analysis and verification of light & radiation field coincidence test.

  9. The dependence of PGA and PGV on distance and magnitude inferred from Northern California ShakeMap data

    USGS Publications Warehouse

    Boatwright, J.; Bundock, H.; Luetgert, J.; Seekins, L.; Gee, L.; Lombard, P.

    2003-01-01

    We analyze peak ground velocity (PGV) and peak ground acceleration (PGA) data from 95 moderate (3.5 ??? M 100 km, the peak motions attenuate more rapidly than a simple power law (that is, r-??) can fit. Instead, we use an attenuation function that combines a fixed power law (r-0.7) with a fitted exponential dependence on distance, which is estimated as exp(-0.0063r) and exp(-0.0073r) for PGV and PGA, respectively, for moderate earthquakes. We regress log(PGV) and log(PGA) as functions of distance and magnitude. We assume that the scaling of log(PGV) and log(PGA) with magnitude can differ for moderate and large earthquakes, but must be continuous. Because the frequencies that carry PGV and PGA can vary with earthquake size for large earthquakes, the regression for large earthquakes incorporates a magnitude dependence in the exponential attenuation function. We fix the scaling break between moderate and large earthquakes at M 5.5; log(PGV) and log(PGA) scale as 1.06M and 1.00M, respectively, for moderate earthquakes and 0.58M and 0.31M for large earthquakes.
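    A minimal sketch of the functional form quoted above is given below: a magnitude term that is continuous across the M 5.5 break, a fixed r^-0.7 geometric spreading term, and an exponential distance term. The intercept is not quoted in the abstract and is left free, and the magnitude dependence of the exponential term for large earthquakes is omitted; all example numbers are illustrative.

```python
import numpy as np

def log_peak_motion(M, r_km, slope_mod, slope_large, k, M_break=5.5, const=0.0):
    """Schematic form of the regression described in the abstract:
    log(peak) = const + magnitude term - 0.7*log10(r) - k*r.

    slope_mod / slope_large: magnitude scaling below / above M_break
    (e.g. 1.00 and 0.31 for PGA, 1.06 and 0.58 for PGV).
    k: exponential attenuation coefficient (e.g. 0.0073 for PGA).
    const: intercept, not quoted in the abstract, so left free.
    The magnitude term is written to be continuous at M_break.
    """
    M = np.asarray(M, dtype=float)
    mag_term = np.where(
        M <= M_break,
        slope_mod * M,
        slope_mod * M_break + slope_large * (M - M_break),
    )
    return const + mag_term - 0.7 * np.log10(r_km) - k * r_km

# Example: relative PGA attenuation of an M 5.0 event between 10 km and 100 km
delta = log_peak_motion(5.0, 10.0, 1.00, 0.31, 0.0073) - \
        log_peak_motion(5.0, 100.0, 1.00, 0.31, 0.0073)
```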

  10. Line-driven disc wind model for ultrafast outflows in active galactic nuclei - scaling with luminosity

    NASA Astrophysics Data System (ADS)

    Nomura, M.; Ohsuga, K.

    2017-03-01

    In order to reveal the origin of the ultrafast outflows (UFOs) that are frequently observed in active galactic nuclei (AGNs), we perform two-dimensional radiation hydrodynamics simulations of line-driven disc winds, which are accelerated by the radiation force due to the spectral lines. The line-driven winds are successfully launched for the range of M_BH = 10^6-10^9 M⊙ and ε = 0.1-0.5, and the resulting mass outflow rate (Ṁ_w), momentum flux (ṗ_w), and kinetic luminosity (Ė_w) lie in the region containing 90 per cent of the posterior probability distribution in the Ṁ_w-L_bol, ṗ_w-L_bol, and Ė_w-L_bol planes shown in Gofford et al., where M_BH is the black hole mass, ε is the Eddington ratio, and L_bol is the bolometric luminosity. The best-fitting relations in Gofford et al., d log Ṁ_w/d log L_bol ≈ 0.9, d log ṗ_w/d log L_bol ≈ 1.2, and d log Ė_w/d log L_bol ≈ 1.5, are roughly consistent with our results, d log Ṁ_w/d log L_bol ≈ 9/8, d log ṗ_w/d log L_bol ≈ 10/8, and d log Ė_w/d log L_bol ≈ 11/8. In addition, our model predicts that no UFO features are detected for AGNs with ε ≲ 0.01, since the winds do not appear. Also, only AGNs with M_BH ≲ 10^8 M⊙ exhibit UFOs when ε ∼ 0.025. These predictions nicely agree with the X-ray observations. These results support the line-driven disc wind being the origin of the UFOs.

  11. Analysis of CERN computing infrastructure and monitoring data

    NASA Astrophysics Data System (ADS)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of the LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments are collecting a large multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal of bringing data sources from different services and on different abstraction levels together and of implementing a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single-service boundaries and for the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting an efficient storage format for MapReduce and external access, and describes the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between the CPU/wall fraction, the latency/throughput constraints of network and disk, and the effective job throughput. In this contribution we first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.
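    As a toy illustration of the "cleaned and aggregated" repository idea (the actual system described above is built on the Hadoop ecosystem, not pandas), the sketch below normalizes a few heterogeneous CSV log extracts and writes a daily per-source summary to a columnar file. All column names and paths are hypothetical, and writing Parquet assumes a pyarrow or fastparquet engine is installed.

```python
import pandas as pd

def aggregate_logs(csv_paths, out_path="aggregated.parquet"):
    """Toy 'clean and aggregate' step: read heterogeneous CSV log extracts,
    normalize a few common columns, and store a daily summary in columnar form.
    Column names here are hypothetical placeholders.
    """
    frames = []
    for path in csv_paths:
        df = pd.read_csv(path)
        df.columns = [c.strip().lower() for c in df.columns]   # normalize headers
        df["timestamp"] = pd.to_datetime(df["timestamp"], errors="coerce")
        df = df.dropna(subset=["timestamp"])
        df["source"] = path                                     # keep provenance
        frames.append(df)
    merged = pd.concat(frames, ignore_index=True, sort=False)
    # Daily record counts per source, as an example of a pre-computed summary
    daily = (merged.set_index("timestamp")
                   .groupby("source")
                   .resample("1D")
                   .size()
                   .rename("n_records")
                   .reset_index())
    daily.to_parquet(out_path)   # requires a parquet engine such as pyarrow
    return daily
```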

  12. High fidelity 3-dimensional models of beam-electron cloud interactions in circular accelerators

    NASA Astrophysics Data System (ADS)

    Feiz Zarrin Ghalam, Ali

    Electron cloud is a low-density electron profile created inside the vacuum chamber of circular machines with positively charged beams. The electron cloud limits the peak current of the beam and degrades beam quality through luminosity degradation, emittance growth and head-tail or bunch-to-bunch instability. The adverse effects of the electron cloud on long-term beam dynamics become more important as the beams go to higher energies. This problem has become a major concern in the design of many future circular machines, such as the Large Hadron Collider (LHC) under construction at the European Organization for Nuclear Research (CERN). Due to the importance of the problem, several simulation models have been developed to model the long-term beam-electron cloud interaction. These models are based on the "single kick approximation", where the electron cloud is assumed to be concentrated in one thin slab around the ring. While this model is efficient in terms of computational cost, it does not reflect the real physical situation, as the forces from the electron cloud on the beam are non-linear, contrary to the model's assumption. To address the limitations of the existing codes, a new model is developed in this thesis to continuously model the beam-electron cloud interaction. The code is derived from a 3-D parallel Particle-In-Cell (PIC) model (QuickPIC) originally used for plasma wakefield acceleration research. To adapt the original model to the circular machine environment, betatron and synchrotron equations of motion have been added to the code, and the effects of chromaticity and the lattice structure have been included. QuickPIC is then benchmarked against one of the codes based on the single kick approximation (HEAD-TAIL) for the transverse spot size of the beam in the CERN LHC. The growth predicted by QuickPIC is less than the one predicted by HEAD-TAIL. The code is then used to investigate the effect of electron cloud image charges on long-term beam dynamics, particularly on the transverse tune shift of the beam at the CERN Super Proton Synchrotron (SPS) ring. The force from the electron cloud image charges on the beam cancels the force due to the cloud compression formed on the beam axis, and therefore the tune shift is mainly due to the uniform electron cloud density. (Abstract shortened by UMI.)

  13. Development of the cryogenic system of AEgIS at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Derking, J. H.; Bremer, J.; Burghart, G.

    2014-01-29

    The AEgIS (Antimatter Experiment: Gravity, Interferometry, Spectroscopy) experiment is located at the antiproton decelerator complex of CERN. The main goal of the experiment is to perform the first direct measurement of the Earth's gravitational acceleration on antihydrogen atoms within 1% precision. The antihydrogen is produced in a cylindrical Penning trap by combining antiprotons with positrons. To reach the precision of 1%, the antihydrogen has to be cooled to 100 mK to reduce its random velocity. A dilution refrigerator is selected to deliver the necessary cooling capacity of 100 μW at 50 mK. The AEgIS cryogenic system basically consists of cryostats for a 1-T and for a 5-T superconducting magnet, a central region cryostat, a dilution refrigerator cryostat and a measurement cryostat with a Moiré deflectometer to measure the gravitational acceleration. In autumn 2012, the 1-T cryostat, 5-T cryostat and central region cryostat were assembled and commissioned. The apparatus is cooled down in eight days using 2500 L of liquid helium and liquid nitrogen. During operation, the average consumption of liquid helium is 150 L/day and of liquid nitrogen 5 L/day. The temperature sensors at the Penning traps measured 12 K to 18 K, which is higher than expected. Simulations show that this is caused by a bad thermalization of the trap wiring. The implementation of the sub-kelvin region is foreseen for mid-2015. The antihydrogen will be cooled down to 100 mK in an ultra-cold trap consisting of multiple high-voltage electrodes made of sapphire with gold-plated electrode sectors.

  14. The radiation field measurement and analysis outside the shielding of a 10 MeV electron irradiation accelerator

    NASA Astrophysics Data System (ADS)

    Shang, Jing; Li, Juexin; Xu, Bing; Li, Yuxiong

    2011-10-01

    Electron accelerators are employed widely for diverse purposes in the irradiation-processing industry, from sterilizing medical products to treating gemstones. Because accelerators offer high efficiency and high power and require little preventive maintenance, they are becoming increasingly popular compared with the 60Co isotope approach. However, electron accelerators pose potential radiation hazards. To protect workers and the public from exposure to radiation, the radiation field around the electron accelerator must be assessed, especially that outside the shielding. Thus, we measured the radiation dose at different positions outside the shielding of a 10-MeV electron accelerator using a new data-acquisition unit named Mini-DDL (Mini-Digital Data Logging). The measurements accurately reflect the accelerator's radiation status. In this paper, we present our results and compare them with our theoretical calculations. We conclude that the measurements taken outside the irradiation hall are consistent with the findings from our calculations, except in the maze outside the door of the accelerator room. We discuss the reason for this discrepancy.

  15. Hadro-Production Measurements to Characterize the T2K Neutrino Flux with the NA61 Experiment at the CERN SPS

    NASA Astrophysics Data System (ADS)

    Bravar, Alessandro

    2010-03-01

    As the intensity of neutrino beams produced at accelerators increases, the systematic errors due to the poor characterization of the neutrino flux become a limiting factor for high-precision neutrino oscillation experiments like T2K. This limitation comes mainly from the poor knowledge of the production cross sections for the pions and kaons, at the relevant energies and over the relevant phase space, that yield these neutrino beams. Therefore new hadro-production measurements are mandatory. NA61/SHINE is a large-acceptance hadron spectrometer at the CERN SPS designed for the study of the hadronic final states produced in interactions of various beam particles (protons, π's, and heavy ions) with a variety of fixed targets at SPS energies. Ongoing measurements with the NA61 detector for characterizing the neutrino beam of the T2K experiment at J-PARC are introduced. These measurements are performed using a 30 GeV proton beam impinging on carbon targets of different lengths, including a replica of the T2K target. The performance of the NA61 detector and preliminary NA61 measurements from the 2007 run are presented.

  16. Test strategies for industrial testers for converter controls equipment

    NASA Astrophysics Data System (ADS)

    Oleniuk, P.; Di Cosmo, M.; Kasampalis, V.; Nisbet, D.; Todd, B.; Uznański, S.

    2017-04-01

    Power converters and their controls electronics are key elements for the operation of the CERN accelerator complex, having a direct impact on its availability. To prevent early-life failures and provide a means to verify the electronics, a set of industrial testers is used throughout the converter controls electronics' life cycle. The roles of the testers are to validate mass production during the manufacturing phase and to provide means to diagnose and repair failed modules that are brought back from operation. In the converter controls electronics section of the power converters group in the technology department of CERN (TE/EPC/CCE), two main test platforms have been adopted: a PXI platform for mixed analogue-digital functional tests and a JTAG Boundary-Scan platform for digital interconnection and functional tests. Depending on the functionality of the device under test, the appropriate test platform is chosen. This paper is a follow-up to results presented at the TWEPP 2015 conference, adding the boundary-scan test platform and the first results from exploitation of the test system. It reports on the test software, hardware design and test strategy applied to a number of devices, which has resulted in maximized test coverage and minimized test design effort.

  17. The SHiP project at CERN

    NASA Astrophysics Data System (ADS)

    De Lellis, G.; SHiP Collaboration

    2016-07-01

    The discovery of the Higgs boson has fully confirmed the Standard Model of particles and fields. Nevertheless, there are still fundamental phenomena, like the existence of dark matter and the baryon asymmetry, which deserve an explanation that could come from the discovery of new particles. Searches for new physics with accelerators are performed at the LHC, looking for high-mass particles coupled to matter with ordinary strength. A new experimental facility at CERN meant to search for very weakly coupled particles in the few-GeV mass domain has recently been proposed. The existence of such particles, foreseen in different theoretical models beyond the Standard Model, is largely unexplored. A beam dump facility using 400 GeV protons is a copious factory of charmed hadrons and could be used to probe the existence of such particles. The beam dump is also a copious source of neutrinos and in particular an ideal source of tau neutrinos, the least known particle in the Standard Model. Indeed, tau anti-neutrinos have not been directly observed so far. We report the physics potential of such an experiment. Resistive Plate Chambers could play a role in the SHiP detector.

  18. Upgrade of the beam extraction system of the GTS-LHC electron cyclotron resonance ion source at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toivanen, V., E-mail: ville.aleksi.toivanen@cern.ch; Bellodi, G.; Dimov, V.

    2016-02-15

    Linac3 is the first accelerator in the heavy ion injector chain of the Large Hadron Collider (LHC), providing multiply charged heavy ion beams for the CERN experimental program. The ion beams are produced with GTS-LHC, a 14.5 GHz electron cyclotron resonance ion source, operated in afterglow mode. Improvement of the GTS-LHC beam formation and beam transport along Linac3 is part of the upgrade program of the injector chain in preparation for the future high luminosity LHC. A mismatch between the ion beam properties in the ion source extraction region and the acceptance of the following Low Energy Beam Transport (LEBT) section has been identified as one of the factors limiting the Linac3 performance. The installation of a new focusing element, an einzel lens, into the GTS-LHC extraction region is foreseen as a part of the Linac3 upgrade, as well as a redesign of the first section of the LEBT. Details of the upgrade and results of a beam dynamics study of the extraction region and LEBT modifications will be presented.

  19. DOSE EFFECT OF THE 33S(n,α) 30SI REACTION IN BNCT USING THE NEW n_TOF-CERN DATA.

    PubMed

    Sabaté-Gilarte, M; Praena, J; Porras, I; Quesada, J M

    2017-09-23

    33S is a stable isotope of sulphur which is being studied as a potential cooperative target for Boron Neutron Capture Therapy (BNCT) in accelerator-based neutron sources because of its large (n,α) cross section in the epithermal neutron energy range. Previous measurements resolved the resonances with a discrepant description of the lowest-lying and strongest one (at 13.5 keV). However, the evaluations of the major databases do not include resonances, except EAF-2010 which shows smaller values in this range than the experimental data. Furthermore, the glaring lack of data below 10 keV down to thermal (25.3 meV) has motivated a new measurement at n_TOF at CERN in order to cover the whole energy range. The inclusion of this new 33S(n,α) cross section in Monte Carlo simulations provides a more accurate estimation of the deposited kerma rate in tissue due to the presence of 33S. The results of those simulations represent the goal of this work.

  20. Accelerator science and technology in Europe 2008-2017

    NASA Astrophysics Data System (ADS)

    Romaniuk, Ryszard S.

    2013-10-01

    European Framework Research Projects have recently added much meaning to the building of the ERA - the European Research Area. Within it, accelerator technology plays an essential role. Accelerator technology includes large infrastructure and intelligent, modern instrumentation embracing mechatronics, electronics, photonics and ICT. During the realization of the European research and infrastructure project FP6 CARE 2004-2008 (Coordinated Accelerator Research in Europe), concerning the development of large accelerator infrastructure in Europe, it was decided that a scientific editorial series of peer-reviewed monographs from this research area would be published in close relation with the projects. It was a completely new and quite brave idea to combine a strictly research-oriented publication series with a transient project lasting only four or five years. Until then, nobody had done anything like that. The idea turned out to be a real success. The publications, now known and valued in the accelerator world as the (CERN-WUT) Editorial Series on Accelerator Science and Technology, are successfully continued in what is already the third European project, EuCARD2, and have logistic guarantees, for the moment, until 2017, when the series will mature to its first decade. During the realization of the European projects EuCARD (European Coordination for Accelerator R&D, 2009-2013) and TIARA (Test Infrastructure of Accelerator Research Area in Europe), 18 volumes were published in this series. The ambitious plan for the coming years is to publish, hopefully, a few tens of new volumes. Accelerator science and technology is one of the key enablers of developments in particle physics, photon physics and also applications in medicine and industry. The paper presents a digest of the research results in the domain of accelerator science and technology in Europe, published in the monographs of the European Framework Projects (FP) on accelerator technology. The succession of the CARE, EuCARD and EuCARD2 projects is evidently creating a new quality in European accelerator research. It is consolidating the technical and research communities in a new way, completely different from the traditional ones, for example via periodic topical conferences.

  1. Does the Intel Xeon Phi processor fit HEP workloads?

    NASA Astrophysics Data System (ADS)

    Nowak, A.; Bitzes, G.; Dotti, A.; Lazzaro, A.; Jarp, S.; Szostek, P.; Valsan, L.; Botezatu, M.; Leduc, J.

    2014-06-01

    This paper summarizes the five years of CERN openlab's efforts focused on the Intel Xeon Phi co-processor, from the time of its inception to public release. We consider the architecture of the device vis-à-vis the characteristics of HEP software and identify key opportunities for HEP processing, as well as scaling limitations. We report on improvements and speedups linked to parallelization and vectorization on benchmarks involving software frameworks such as Geant4 and ROOT. Finally, we extrapolate current software and hardware trends and project them onto accelerators of the future, with the specifics of offline and online HEP processing in mind.

  2. Graphics Processing Units for HEP trigger systems

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Bauce, M.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Gianoli, A.; Lamanna, G.; Lonardo, A.; Messina, A.; Neri, I.; Paolucci, P. S.; Piandani, R.; Pontisso, L.; Rescigno, M.; Simula, F.; Sozzi, M.; Vicini, P.

    2016-07-01

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We discuss the use of online parallel computing on GPUs for a synchronous low-level trigger, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher-level trigger systems is also briefly considered.

  3. High Energy Colliding Beams; What Is Their Future?

    NASA Astrophysics Data System (ADS)

    Richter, Burton

    The success of the first few years of LHC operations at CERN, and the expectation of more to come as the LHC's performance improves, are already leading to discussions of what should be next for both proton-proton and electron-positron colliders. In this discussion I see too much theoretical desperation caused by the so-far-unsuccessful hunt for what is beyond the Standard Model, and too little of the necessary interaction of the accelerator, experimenter, and theory communities necessary for a scientific and engineering success. Here, I give my impressions of the problem, its possible solution, and what is needed to have both a scientifically productive and financially viable future.

  6. High Energy Electron and Gamma-Ray Detection with ATIC

    NASA Technical Reports Server (NTRS)

    Chang, J.; Schmidt, W. K. H.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The Advanced Thin Ionization Calorimeter (ATIC) is a balloon-borne ionization calorimeter well suited to record and identify high energy cosmic ray electrons and, at very high energies, gamma-ray photons as well. We have simulated the performance of the instrument and compared the simulations with actual high energy electron exposures at the CERN accelerator. Simulations and measurements do not agree exactly in detail, but overall the simulations have predicted the actual measured behavior quite well. ATIC had its first 16-day balloon flight over Antarctica at the turn of the year, and first results obtained using the analysis methods derived from simulations and calibrations will be reported.

  7. New directions in the CernVM file system

    NASA Astrophysics Data System (ADS)

    Blomer, Jakob; Buncic, Predrag; Ganis, Gerardo; Hardi, Nikola; Meusel, Rene; Popescu, Radu

    2017-10-01

    The CernVM File System today is commonly used to host and distribute application software stacks. In addition to this core task, recent developments expand the scope of the file system into two new areas. Firstly, CernVM-FS emerges as a good match for container engines to distribute the container image contents. Compared to native container image distribution (e.g. through the "Docker registry"), CernVM-FS massively reduces the network traffic for image distribution. This has been shown, for instance, by a prototype integration of CernVM-FS into Mesos developed by Mesosphere, Inc. We present a path for a smooth integration of CernVM-FS and Docker. Secondly, CernVM-FS recently raised new interest as an option for the distribution of experiment conditions data. Here, the focus is on improved versioning capabilities of CernVM-FS that allow linking the conditions data of a run period to the state of a CernVM-FS repository. Lastly, CernVM-FS has been extended to provide a name space for physics data for the LIGO and CMS collaborations. Searching through a data namespace is often done by a central, experiment-specific database service. A name space on CernVM-FS can particularly benefit from an existing, scalable infrastructure and from the POSIX file system interface.

  8. Predicting the Rate of River Bank Erosion Caused by Large Wood Log

    NASA Astrophysics Data System (ADS)

    Zhang, N.; Rutherfurd, I.; Ghisalberti, M.

    2016-12-01

    When a single tree falls into a river channel, flow is deflected and accelerated between the tree roots and the bank face, increasing shear stress and scouring the bank. The scallop-shaped erosion increases the diversity of the channel morphology, but also causes concern for adjacent landholders. Concern about increased bank erosion is one of the main reasons that large wood is still removed from channels in SE Australia. Further, the hydraulic effect of many logs in the channel can reduce overall bank erosion rates. Although both phenomena have been described before, this research develops a hydraulic model that estimates their magnitude, and tests and calibrates this model with flume and field measurements for logs with various configurations and sizes. Specifically, the model estimates the change in excess shear stress on the bank associated with the log. The model addresses the effect of the log angle, distance from the bank, log size and flow condition by solving mass continuity and energy conservation between the cross-sections of the approaching flow and the contracted flow. We then evaluate our model against flume experiments performed with semi-realistic log models representing logs of different sizes and decay stages by comparing the measured and simulated velocity increase in the gap between the log and the bank. The log angle, distance from the bank, and flow condition are systematically varied for each log model during the experiments. Finally, the calibrated model is compared with field data collected in anabranching channels of the Murray River in SE Australia, where there are abundant instream logs and regulated, consistently high flows for irrigation. Preliminary results suggest that a log can significantly increase the shear stress on the bank, especially when it is positioned perpendicular to the flow. The shear stress increases with the log angle in a rising curve (the log angle is the angle between the log trunk and the flow direction; 0° means the log is parallel to the flow with the canopy pointing downstream). However, the shear stress shows insignificant changes as the log is moved closer to the bank.
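    The continuity-based step described above can be illustrated with a very crude sketch: treat the log as a full-depth blockage, estimate the gap velocity from mass continuity, and assume the near-bank shear stress scales with the square of the local velocity. This is a simplified stand-in for the authors' model (which also uses energy conservation), and all geometry and numbers are invented for illustration.

```python
import math

def gap_velocity(v_approach, channel_width, log_length, log_angle_deg, gap_width):
    """Continuity estimate of the velocity in the gap between a log and the bank.

    The log is idealized as a full-depth blockage whose projected width across
    the channel is log_length * sin(log_angle); flow depth is assumed unchanged
    through the contraction, so Q = v * width * depth reduces to v * width.
    """
    blocked = log_length * math.sin(math.radians(log_angle_deg))
    open_width = max(channel_width - blocked, gap_width)
    return v_approach * channel_width / open_width

def excess_shear_ratio(v_approach, v_gap):
    """Ratio of near-bank shear stress with/without the log, assuming tau ~ v^2."""
    return (v_gap / v_approach) ** 2

# Illustrative numbers only: 20 m wide channel, 12 m log at 60 degrees to the flow
v_gap = gap_velocity(v_approach=0.8, channel_width=20.0,
                     log_length=12.0, log_angle_deg=60.0, gap_width=1.5)
print(excess_shear_ratio(0.8, v_gap))
```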

  9. SU-E-T-784: Using MLC Log Files for Daily IMRT Delivery Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Linden, P

    2015-06-15

    Purpose: To verify daily intensity modulated radiation therapy (IMRT) treatments using multi-leaf collimator (MLC) log files. Methods: The MLC log files from a NovalisTX Varian linear accelerator were used in this study. The MLC files were recorded daily for all patients undergoing IMRT or volumetric modulated arc therapy (VMAT). The first record of each patient was used as reference and all records for subsequent days were compared against the reference. An in-house MATLAB software code was used for the comparisons. Each MLC log file was converted to a fluence map (FM) and a gamma index (γ) analysis was used for the evaluation of each daily delivery for every patient. The tolerance for the gamma index was set to 2% dose difference and 2 mm distance to agreement, while points with signal of 10% or lower of the maximum value were excluded from the comparisons. Results: The γ between each of the reference FMs and the consecutive daily fraction FMs had an average value of 99.1% (ranging from 98.2 to 100.0%). The FM images were reconstructed at various resolutions in order to study the effect of the resolution on the γ and at the same time reduce the time for processing the images. We found that the comparison of images with the highest resolution (768×1024) yielded on average a lower γ (99.1%) than the ones with low resolution (192×256) (γ = 99.5%). Conclusion: We developed in-house software that allows us to monitor the quality of daily IMRT and VMAT treatment deliveries using information from the MLC log files of the linear accelerator. The information can be analyzed and evaluated as early as after the completion of each daily treatment. Such a tool can be valuable to assess the effect of MLC positioning on plan quality, especially in the context of adaptive radiotherapy.
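    For reference, a brute-force version of the 2%/2 mm global gamma comparison described above (with the 10% low-signal cut) might look like the sketch below. It is a generic implementation, not the in-house MATLAB code referenced in the abstract, and it assumes both fluence maps share the same grid.

```python
import numpy as np

def gamma_index(ref, eval_map, pixel_mm=1.0, dd=0.02, dta_mm=2.0, low_cut=0.10):
    """Brute-force global gamma (2%/2 mm by default) between two fluence maps.

    ref, eval_map: 2-D arrays on the same grid. Points below low_cut * max(ref)
    are excluded, mirroring the 10% threshold in the abstract.
    """
    norm = ref.max()
    mask = ref >= low_cut * norm
    win = int(np.ceil(3 * dta_mm / pixel_mm))           # search window in pixels
    ny, nx = ref.shape
    gamma = np.full(ref.shape, np.nan)

    for iy, ix in zip(*np.nonzero(mask)):
        y0, y1 = max(0, iy - win), min(ny, iy + win + 1)
        x0, x1 = max(0, ix - win), min(nx, ix + win + 1)
        yy, xx = np.mgrid[y0:y1, x0:x1]
        dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * pixel_mm ** 2
        dose2 = ((eval_map[y0:y1, x0:x1] - ref[iy, ix]) / (dd * norm)) ** 2
        gamma[iy, ix] = np.sqrt(np.min(dist2 / dta_mm ** 2 + dose2))

    passed = np.nanmean(gamma[mask] <= 1.0) * 100.0     # gamma passing rate in %
    return gamma, passed
```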

  10. Development work for a superconducting linear collider

    NASA Technical Reports Server (NTRS)

    Matheisen, Axel

    1995-01-01

    For future linear e+e- colliders in the TeV range several alternatives are under discussion. The TESLA approach is based on the advantages of superconductivity. High Q values of the accelerator structures give high efficiency for converting RF power into beam power. A low resonance frequency for the RF structures can be chosen to obtain a large number of electrons (positrons) per bunch. For a given luminosity the beam dimensions can be chosen conservatively, which leads to relaxed beam emittance and tolerances at the final focus. Each individual superconducting accelerator component (resonator cavity) of this linear collider has to deliver an energy gain of 25 MeV/m to the beam. Today superconducting resonators are in use at CEBAF/USA, DESY/Germany, Darmstadt/Germany, KEK/Japan and CERN/Geneva. They show acceleration gradients between 5 MV/m and 10 MV/m. Encouraging experiments at CEA Saclay and Cornell University showed acceleration gradients of 20 MV/m and 25 MV/m in single and multicell structures. In an activity centered at DESY in Hamburg/Germany, the TESLA collaboration is constructing a 500 MeV superconducting accelerator test facility (TTF) to demonstrate that a linear collider based on this technique can be built in a cost-effective manner and that the necessary acceleration gradients of more than 15 MeV/m can be reached reproducibly. The test facility built at DESY covers an area of 3000 m2 and is divided into 3 major activity areas: (1) the test linac, where the performance of the modular components can be demonstrated with an electron beam passing through the 40 m long acceleration section; (2) the test area, where all individual resonators are tested before installation into a module; (3) the preparation and assembly area, where the assembly of cavities and modules takes place. We report here on the design work aimed at reducing costs compared to existing superconducting accelerator structures, and on the facility set up to reach high acceleration gradients in a reproducible way.

  11. Programs of 1993 Winning Teams: Pioneering Partners.

    ERIC Educational Resources Information Center

    1993

    Pioneering Partners for Educational Technology was created to enhance learning in K-12 classrooms by accelerating the use of educational technology. This document outlines the projects of the 1993 winning teams. The Illinois programs are: "A Travel Log Via Computer"; "Weatherization Audit Training for Teachers and Students";…

  12. Run II of the LHC: The Accelerator Science

    NASA Astrophysics Data System (ADS)

    Redaelli, Stefano

    2015-04-01

    In 2015 the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) starts its Run II operation. After the successful Run I at 3.5 TeV and 4 TeV in the 2010-2013 period, a first long shutdown (LS1) was mainly dedicated to the consolidation of the LHC magnet interconnections, to allow the LHC to operate at its design beam energy of 7 TeV. Other key accelerator systems have also been improved to optimize the performance reach at higher beam energies. After a review of the LS1 activities, the status of the LHC start-up progress is reported, addressing in particular the status of the LHC hardware commissioning and of the training campaign of superconducting magnets that will determine the operation beam energy in 2015. Then, the plans for the Run II operation are reviewed in detail, covering choice of initial machine parameters and strategy to improve the Run II performance. Future prospects of the LHC and its upgrade plans are also presented.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lackner, Friedrich; Ferracin, Paolo; Todesco, Ezio

    The High Luminosity LHC upgrade target is to increase the integrated luminosity by a factor of 10, resulting in an integrated luminosity of 3000 fb-1. One major improvement foreseen is the reduction of the beam size at the collision points. This requires the development of 150 mm single aperture quadrupoles for the interaction regions. These quadrupoles are under development in a joint collaboration between CERN and the US-LHC Accelerator Research Program (LARP). The chosen approach for achieving a nominal quadrupole field gradient of 132.6 T/m is based on the Nb3Sn technology. The coils, with a length of 7281 mm, will be the longest Nb3Sn coils fabricated so far for accelerator magnets. The production of the long coils was launched in 2016, based on practice coils made from copper. This paper provides the status of the production of the first low-grade and full-performance coils and describes the production process and the applied quality control. Furthermore, an outlook on the prototype assembly is provided.

  14. A large hadron electron collider at CERN

    DOE PAGES

    Abelleira Fernandez, J. L.

    2015-04-06

    This document provides a brief overview of the recently published report on the design of the Large Hadron Electron Collider (LHeC), which comprises its physics programme, accelerator physics, technology and main detector concepts. The LHeC exploits and develops challenging, though principally existing, accelerator and detector technologies. This summary is complemented by brief illustrations of some of the highlights of the physics programme, which relies on a vastly extended kinematic range, luminosity and unprecedented precision in deep inelastic scattering. Illustrations are provided regarding high precision QCD, new physics (Higgs, SUSY) and electron-ion physics. The LHeC is designed to run synchronously with the LHC in the twenties and to achieve an integrated luminosity of O(100) fb-1. It will become the cleanest high resolution microscope of mankind and will substantially extend as well as complement the investigation of the physics of the TeV energy scale, which has been enabled by the LHC.

  15. Modern Elementary Particle Physics

    NASA Astrophysics Data System (ADS)

    Kane, Gordon

    2017-02-01

    1. Introduction; 2. Relativistic notation, Lagrangians, and interactions; 3. Gauge invariance; 4. Non-abelian gauge theories; 5. Dirac notation for spin; 6. The Standard Model Lagrangian; 7. The electroweak theory and quantum chromodynamics; 8. Masses and the Higgs mechanism; 9. Cross sections, decay widths, and lifetimes: W and Z decays; 10. Production and properties of W± and Z0; 11. Measurement of electroweak and QCD parameters: the muon lifetime; 12. Accelerators - present and future; 13. Experiments and detectors; 14. Low energy and non-accelerator experiments; 15. Observation of the Higgs boson at the CERN LHC: is it the Higgs boson?; 16. Colliders and tests of the Standard Model: particles are pointlike; 17. Quarks and gluons, confinement and jets; 18. Hadrons, heavy quarks, and strong isospin invariance; 19. Coupling strengths depend on momentum transfer and on virtual particles; 20. Quark (and lepton) mixing angles; 21. CP violation; 22. Overview of physics beyond the Standard Model; 23. Grand unification; 24. Neutrino masses; 25. Dark matter; 26. Supersymmetry.

  16. Numerical simulations of a proposed hollow electron beam collimator for the LHC upgrade at CERN.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Previtali, V.; Stancari, G.; Valishev, A.

    2013-07-12

    In the last years the LHC collimation system has been performing beyond expectations, providing the machine with a nearly perfect, efficient cleaning system [1]. Nonetheless, when trying to push the existing accelerators to - and over - their design limits, all the accelerator components are required to boost their performance. In particular, in view of the high luminosity frontier for the LHC, the increased intensity calls for a more efficient cleaning system. In this framework innovative collimation solutions are under evaluation [2]: one option is the use of a hollow electron lens for beam halo cleaning. This work intends to study the applicability of a hollow electron lens for LHC collimation by evaluating the case of the existing Tevatron e-lens applied to the nominal LHC 7 TeV beam. New e-lens operation modes are proposed here to enhance the standard electron-lens halo removal effect.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiCostanzo, D; Ayan, A; Woollard, J

    Purpose: To predict potential failures of hardware within the Varian TrueBeam linear accelerator in order to proactively replace parts and decrease machine downtime. Methods: Machine downtime is a problem for all radiation oncology departments and vendors. Most often it is the result of unexpected equipment failure, and it is increased by a lack of in-house clinical engineering support. Preventive maintenance attempts to reduce downtime, but it is often ineffective at preempting many failure modes, such as MLC motor failures, the need to tighten a gantry chain, or the replacement of a jaw motor. To attempt to alleviate downtime, software was developed in-house that determines the maximum value of each axis enumerated in the TrueBeam trajectory log files. After patient treatments, these data are stored in a SQL database. Microsoft Power BI is used to plot the average maximum error of each day for each machine as a function of time. The results are then correlated with actual faults that occurred at the machine with the help of Varian service engineers. Results: Over the course of six months, 76,312 trajectory logs have been written into the database and plotted in Power BI. Over the course of the analysis, MLC motors have been replaced on three machines due to the early warning provided by the trajectory log analysis. The service engineers have also been alerted to possible gantry issues on one occasion due to the aforementioned analysis. Conclusion: Analyzing the trajectory log data is a viable and effective early warning system for potential failures of the TrueBeam linear accelerator. With further analysis and tightening of the tolerance values used to determine a possible imminent failure, it should be possible to pinpoint future issues more thoroughly and for more axes of motion.
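    A minimal sketch of the storage-and-trending step described above is given below, using SQLite in place of the site's SQL database and Power BI. The schema, table, and value names are hypothetical, and the per-axis maximum deviations are assumed to have already been extracted from the decoded trajectory logs.

```python
import sqlite3
from datetime import date

# Hypothetical schema: one row per delivery, machine, and axis, holding the
# maximum |expected - actual| deviation read from a trajectory log.
SCHEMA = """CREATE TABLE IF NOT EXISTS axis_errors (
    machine TEXT, day TEXT, axis TEXT, max_error REAL
)"""

def store_max_errors(db_path, machine, day, max_errors):
    """max_errors: dict mapping axis name -> max absolute deviation for one delivery."""
    with sqlite3.connect(db_path) as con:
        con.execute(SCHEMA)
        con.executemany(
            "INSERT INTO axis_errors VALUES (?, ?, ?, ?)",
            [(machine, day, axis, err) for axis, err in max_errors.items()],
        )

def daily_average(db_path, machine, axis):
    """Average of the per-delivery maxima for each day - the quantity trended over time."""
    with sqlite3.connect(db_path) as con:
        return con.execute(
            "SELECT day, AVG(max_error) FROM axis_errors "
            "WHERE machine = ? AND axis = ? GROUP BY day ORDER BY day",
            (machine, axis),
        ).fetchall()

# Illustrative values only (units depend on the axis, e.g. mm or degrees)
store_max_errors("tb_logs.db", "TB1", date.today().isoformat(),
                 {"MLC_bankA": 0.03, "gantry": 0.05})
print(daily_average("tb_logs.db", "TB1", "gantry"))
```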

  18. Section Editors

    NASA Astrophysics Data System (ADS)

    Groep, D. L.; Bonacorsi, D.

    2014-06-01

    1. Data Acquisition, Trigger and Controls: Niko Neufeld (CERN) niko.neufeld@cern.ch; Tassos Belias (Demokritos) belias@inp.demokritos.gr; Andrew Norman (FNAL) anorman@fnal.gov; Vivian O'Dell (FNAL) odell@fnal.gov. 2. Event Processing, Simulation and Analysis: Rolf Seuster (TRIUMF) seuster@cern.ch; Florian Uhlig (GSI) f.uhlig@gsi.de; Lorenzo Moneta (CERN) Lorenzo.Moneta@cern.ch; Pete Elmer (Princeton) peter.elmer@cern.ch. 3. Distributed Processing and Data Handling: Nurcan Ozturk (U Texas Arlington) nurcan@uta.edu; Stefan Roiser (CERN) stefan.roiser@cern.ch; Robert Illingworth (FNAL); Davide Salomoni (INFN CNAF) Davide.Salomoni@cnaf.infn.it; Jeff Templon (Nikhef) templon@nikhef.nl. 4. Data Stores, Data Bases, and Storage Systems: David Lange (LLNL) lange6@llnl.gov; Wahid Bhimji (U Edinburgh) wbhimji@staffmail.ed.ac.uk; Dario Barberis (Genova) Dario.Barberis@cern.ch; Patrick Fuhrmann (DESY) patrick.fuhrmann@desy.de; Igor Mandrichenko (FNAL) ivm@fnal.gov; Mark van de Sanden (SURF SARA) sanden@sara.nl. 5. Software Engineering, Parallelism & Multi-Core: Solveig Albrand (LPSC/IN2P3) solveig.albrand@lpsc.in2p3.fr; Francesco Giacomini (INFN CNAF) francesco.giacomini@cnaf.infn.it; Liz Sexton (FNAL) sexton@fnal.gov; Benedikt Hegner (CERN) benedikt.hegner@cern.ch; Simon Patton (LBNL) SJPatton@lbl.gov; Jim Kowalkowski (FNAL) jbk@fnal.gov. 6. Facilities, Infrastructures, Networking and Collaborative Tools: Maria Girone (CERN) Maria.Girone@cern.ch; Ian Collier (STFC RAL) ian.collier@stfc.ac.uk; Burt Holzman (FNAL) burt@fnal.gov; Brian Bockelman (U Nebraska) bbockelm@cse.unl.edu; Alessandro de Salvo (Roma 1) Alessandro.DeSalvo@ROMA1.INFN.IT; Helge Meinhard (CERN) Helge.Meinhard@cern.ch; Ray Pasetes (FNAL) rayp@fnal.gov; Steven Goldfarb (U Michigan) Steven.Goldfarb@cern.ch.

  19. Preliminary consideration of a double, 480 GeV, fast cycling proton accelerator for production of neutrino beams at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piekarz, Henryk; Hays, Steven; /Fermilab

    We propose to build the DSF-MR (Double Super-Ferric Main Ring), a 480 GeV, fast-cycling (2 second repetition rate) two-beam proton accelerator in the Main Ring tunnel of Fermilab. This accelerator design is based on the super-ferric magnet technology developed for the VLHC, and extended recently to the proposed LER injector for the LHC and the fast cycling SF-SPS at CERN. The DSF-MR accelerator system will constitute the final stage of the proton source, enabling production of two neutrino beams separated by a 2 second time period. These beams will be sent alternately to two detectors located ~3000 km and ~7500 km away from Fermilab. It is expected that the combination of the results from these experiments will offer more than three orders of magnitude increased sensitivity for the detection and measurement of neutrino oscillations with respect to expectations in any current experiment, and thus may truly enable opening the window into the physics beyond the Standard Model. We examine potential sites for the long baseline neutrino detectors accepting beams from Fermilab. The current injection system, consisting of the 400 MeV Linac, the 8 GeV Booster and the Main Injector, can be used to accelerate protons to 45 GeV before transferring them to the DSF-MR. The implementation of the DSF-MR will allow for an 8-fold increase in beam power on the neutrino production target. In this note we outline the proposed new arrangement of the Fermilab accelerator complex. We also briefly describe the DSF-MR magnet design and its power supply, and discuss the necessary upgrade of the Tevatron RF system for use with the DSF-MR accelerator. Finally, we outline the required R&D, cost estimate and possible timeline for the implementation of the DSF-MR accelerator.

  20. Automating linear accelerator quality assurance.

    PubMed

    Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M

    2015-10-01

    The purpose of this study was twofold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac, including jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold-off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent, based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.
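    As a schematic of the log-file side of the analysis above, the sketch below compares expected and actual axis positions from a decoded trajectory log and reports each axis's maximum deviation as a percentage of its published tolerance, which is how the consortium's results are expressed. Axis names, tolerances, and values are made up for illustration.

```python
import numpy as np

def check_axes(expected, actual, tolerances):
    """Compare expected vs actual axis positions from a decoded trajectory log.

    expected/actual: dicts mapping axis name -> 1-D arrays of positions.
    tolerances: dict mapping axis name -> published tolerance for that axis.
    Returns, per axis, the maximum deviation and the same deviation expressed
    as a percentage of the tolerance, plus a flag if the tolerance is exceeded.
    """
    report = {}
    for axis, tol in tolerances.items():
        dev = np.abs(np.asarray(actual[axis]) - np.asarray(expected[axis]))
        report[axis] = {
            "max_dev": float(dev.max()),
            "pct_of_tolerance": float(100.0 * dev.max() / tol),
            "exceeds": bool(dev.max() > tol),
        }
    return report

# Illustrative call with made-up numbers (mm):
report = check_axes(
    expected={"MLC_A12": [10.0, 12.5, 15.0]},
    actual={"MLC_A12": [10.02, 12.46, 15.01]},
    tolerances={"MLC_A12": 1.0},
)
```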

  1. Analysis of Weibull Grading Test for Solid Tantalum Capacitors

    NASA Technical Reports Server (NTRS)

    Teverovsky, Alexander

    2010-01-01

    Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and non-adequate acceleration factors can result in significant, up to three orders of magnitude, errors in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends more accurate method of calculations. A physical model presenting failures of tantalum capacitors as time-dependent-dielectric-breakdown is used to determine voltage and temperature acceleration factors and select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that parameters of the model and acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.
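    The "general log-linear relationship for the characteristic life with two stress levels" mentioned above can be sketched as follows, combining an Arrhenius temperature term with an inverse power law in voltage. The coefficients, lives, and stress levels in the example are invented; the paper's actual model parameters are not reproduced here.

```python
import math

def life_acceleration_factor(a_temp, a_volt, T_use, V_use, T_test, V_test):
    """Acceleration factor from a general log-linear life-stress model:
        ln(eta) = a0 + a_temp * (1/T) + a_volt * ln(V)
    i.e. Arrhenius in temperature combined with an inverse power law in voltage.
    Temperatures in kelvin, voltages in volts; returns eta_use / eta_test.
    """
    return math.exp(a_temp * (1.0 / T_use - 1.0 / T_test)
                    + a_volt * (math.log(V_use) - math.log(V_test)))

def solve_coefficient(life_1, stress_1, life_2, stress_2):
    """Estimate one log-linear coefficient from characteristic lives measured
    at two levels of a single stress (the 'two stress levels' in the abstract)."""
    return (math.log(life_1) - math.log(life_2)) / (stress_1 - stress_2)

# Example: voltage exponent from HALT at two voltages (hours, volts); values are made up.
a_volt = solve_coefficient(1200.0, math.log(35.0), 150.0, math.log(50.0))
print(a_volt)   # negative, as expected for an inverse power law in voltage
```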

  2. Reliability of High-Voltage Tantalum Capacitors (Parts 3 and 4)

    NASA Technical Reports Server (NTRS)

    Teverovsky, Alexander

    2010-01-01

    Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and non-adequate acceleration factors can result in significant, up to three orders of magnitude, errors in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends more accurate method of calculations. A physical model presenting failures of tantalum capacitors as time-dependent-dielectric-breakdown is used to determine voltage and temperature acceleration factors and select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that parameters of the model and acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.

  3. Thin Film Approaches to the SRF Cavity Problem Fabrication and Characterization of Superconducting Thin Films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beringer, Douglas

    Superconducting Radio Frequency (SRF) cavities are responsible for the acceleration of charged particles to relativistic velocities in most modern linear accelerators, such as those employed at high-energy research facilities like Thomas Jefferson National Laboratory's CEBAF and the LHC at CERN. Recognizing SRF as primarily a surface phenomenon enables the possibility of applying thin films to the interior surface of SRF cavities, opening a formidable tool chest of opportunities by combining and designing materials that offer greater performance benefit. Thus, while improvements in radio frequency cavity design and refinements in cavity processing techniques have improved accelerator performance and efficiency – 1.5 GHz bulk niobium SRF cavities have achieved accelerating gradients in excess of 35 MV/m – there exist fundamental material bounds in bulk superconductors limiting the maximally sustained accelerating field gradient (≈ 45 MV/m for Nb) where inevitable thermodynamic breakdown occurs. With state-of-the-art Nb-based cavity designs fast approaching these theoretical limits, novel material innovations must be sought in order to realize next-generation SRF cavities. One proposed method to improve SRF performance is to utilize thin film superconducting-insulating-superconducting (SIS) multilayer structures to effectively magnetically screen a bulk superconducting layer such that it can operate at higher field gradients before suffering critically detrimental SRF losses. This dissertation focuses on the production and characterization of thin film superconductors for such SIS layers for radio frequency applications. Correlated studies on the structure, surface morphology and superconducting properties of epitaxial Nb and MgB2 thin films are presented.

  4. Indirect self-modulation instability measurement concept for the AWAKE proton beam

    NASA Astrophysics Data System (ADS)

    Turner, M.; Petrenko, A.; Biskup, B.; Burger, S.; Gschwendtner, E.; Lotov, K. V.; Mazzoni, S.; Vincke, H.

    2016-09-01

    AWAKE, the Advanced Proton-Driven Plasma Wakefield Acceleration Experiment, is a proof-of-principle R&D experiment at CERN using a 400 GeV/c proton beam from the CERN SPS (longitudinal beam size σz = 12 cm) which will be sent into a 10 m long plasma section with a nominal density of ≈ 7 × 10^14 atoms/cm^3 (plasma wavelength λp = 1.2 mm). In this paper we show that by measuring the time integrated transverse profile of the proton bunch at two locations downstream of the AWAKE plasma, information about the occurrence of the self-modulation instability (SMI) can be inferred. In particular we show that measuring defocused protons with an angle of 1 mrad corresponds to having electric fields in the order of GV/m and fully developed self-modulation of the proton bunch. Additionally, by measuring the defocused beam edge of the self-modulated bunch, information about the growth rate of the instability can be extracted. If hosing instability occurs, it could be detected by measuring a non-uniform defocused beam shape with changing radius. Using a 1 mm thick Chromox scintillation screen for imaging of the self-modulated proton bunch, an edge resolution of 0.6 mm and hence an SMI saturation point resolution of 1.2 m can be achieved.

  5. Data Acquisition Software for Experiments at the MAMI-C Tagged Photon Facility

    NASA Astrophysics Data System (ADS)

    Oussena, Baya; Annand, John

    2013-10-01

    Tagged-photon experiments at Mainz use the electron beam of the MAMI (Mainzer MIcrotron) accelerator in combination with the Glasgow Tagged Photon Spectrometer. The AcquDAQ DAQ system is implemented in C++ and makes use of CERN ROOT software libraries and tools. Electronic hardware is characterized in C++ classes based on a general-purpose class TDAQmodule, and implementation in an object-oriented framework makes the system very flexible. The DAQ system provides slow control and event-by-event readout of the Photon Tagger, the Crystal Ball 4π electromagnetic calorimeter, the central MWPC tracker and plastic-scintillator particle-ID systems, and the TAPS forward-angle calorimeter. A variety of front-end controllers running Linux are supported, reading data from VMEbus, FASTBUS and CAMAC systems. More specialist hardware, based on optical communication systems and developed for the COMPASS experiment at CERN, is also supported. AcquDAQ also provides an interface to configure and control the Mainz programmable trigger system, which uses FPGA-based hardware developed at GSI. Currently the DAQ system runs at data rates of up to 3 MB/s and, with upgrades to both hardware and software later this year, we anticipate a doubling of that rate. This work was supported in part by the U.S. DOE Grant No. DE-FG02-99ER41110.

  6. Measurement of shower development and its Molière radius with a four-plane LumiCal test set-up

    NASA Astrophysics Data System (ADS)

    Abramowicz, H.; Abusleme, A.; Afanaciev, K.; Benhammou, Y.; Bortko, L.; Borysov, O.; Borysova, M.; Bozovic-Jelisavcic, I.; Chelkov, G.; Daniluk, W.; Dannheim, D.; Elsener, K.; Firlej, M.; Firu, E.; Fiutowski, T.; Ghenescu, V.; Gostkin, M.; Hempel, M.; Henschel, H.; Idzik, M.; Ignatenko, A.; Ishikawa, A.; Kananov, S.; Karacheban, O.; Klempt, W.; Kotov, S.; Kotula, J.; Kozhevnikov, D.; Kruchonok, V.; Krupa, B.; Kulis, Sz.; Lange, W.; Leonard, J.; Lesiak, T.; Levy, A.; Levy, I.; Lohmann, W.; Lukic, S.; Moron, J.; Moszczynski, A.; Neagu, A. T.; Nuiry, F.-X.; Pandurovic, M.; Pawlik, B.; Preda, T.; Rosenblat, O.; Sailer, A.; Schumm, B.; Schuwalow, S.; Smiljanic, I.; Smolyanskiy, P.; Swientek, K.; Terlecki, P.; Uggerhoj, U. I.; Wistisen, T. N.; Wojton, T.; Yamamoto, H.; Zawiejski, L.; Zgura, I. S.; Zhemchugov, A.

    2018-02-01

    A prototype of a luminometer, designed for a future e^+e^- collider detector, and consisting at present of a four-plane module, was tested in the CERN PS accelerator T9 beam. The objective of this beam test was to demonstrate a multi-plane tungsten/silicon operation, to study the development of the electromagnetic shower and to compare it with MC simulations. The Molière radius has been determined to be 24.0 ± 0.6 (stat.) ± 1.5 (syst.) mm using a parametrization of the shower shape. Very good agreement was found between data and a detailed Geant4 simulation.

  7. Comprehensive study of beam focusing by crystal devices

    NASA Astrophysics Data System (ADS)

    Scandale, W.; Arduini, G.; Cerutti, F.; Garattini, M.; Gilardoni, S.; Masi, A.; Mirarchi, D.; Montesano, S.; Petrucci, S.; Redaelli, S.; Rossi, R.; Breton, D.; Burmistrov, L.; Dubos, S.; Maalmi, J.; Natochii, A.; Puill, V.; Stocchi, A.; Sukhonos, D.; Bagli, E.; Bandiera, L.; Guidi, V.; Mazzolari, A.; Romagnoni, M.; Murtas, F.; Addesa, F.; Cavoto, G.; Iacoangeli, F.; Galluccio, F.; Afonin, A. G.; Bulgakov, M. K.; Chesnokov, Yu. A.; Durum, A. A.; Maisheev, V. A.; Sandomirskiy, Yu. E.; Yanovich, A. A.; Kolomiets, A. A.; Kovalenko, A. D.; Taratin, A. M.; Smirnov, G. I.; Denisov, A. S.; Gavrikov, Yu. A.; Ivanov, Yu. M.; Lapina, L. P.; Malyarenko, L. G.; Skorobogatov, V. V.; Auzinger, G.; James, T.; Hall, G.; Pesaresi, M.; Raymond, M.

    2018-01-01

    This paper is devoted to an experimental study of focusing and defocusing positively charged particle beams with the help of specially bent single crystals. Four crystals have been fabricated for this purpose. The studies have been performed at the CERN SPS in 400 GeV/c proton and 180 GeV/c pion beams. The results of measurements of beam envelopes are presented. The rms size of the horizontal profile at the focus was 5-8 times smaller than at the exit of the crystals. The measured focal lengths were 4-21 m. The results of measurements are in good agreement with calculations. Possible applications of focusing crystals in present and future high energy accelerators are discussed.

  8. Big Science and the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Giudice, Gian Francesco

    2012-03-01

    The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.

  9. Bushmeat supply and consumption in a tropical logging concession in northern Congo.

    PubMed

    Poulsen, J R; Clark, C J; Mavah, G; Elkan, P W

    2009-12-01

    Unsustainable hunting of wildlife for food empties tropical forests of many species critical to forest maintenance and livelihoods of forest people. Extractive industries, including logging, can accelerate exploitation of wildlife by opening forests to hunters and creating markets for bushmeat. We monitored human demographics, bushmeat supply in markets, and household bushmeat consumption in five logging towns in the northern Republic of Congo. Over 6 years we recorded 29,570 animals in town markets and collected 48,920 household meal records. Development of industrial logging operations led to a 69% increase in the population of logging towns and a 64% increase in bushmeat supply. The immigration of workers, jobseekers, and their families altered hunting patterns and was associated with increased use of wire snares and increased diversity in the species hunted and consumed. Immigrants hunted 72% of all bushmeat, which suggests the short-term benefits of hunting accrue disproportionately to "outsiders" to the detriment of indigenous peoples who have prior, legitimate claims to wildlife resources. Our results suggest that the greatest threat of logging to biodiversity may be the permanent urbanization of frontier forests. Although enforcement of hunting laws and promotion of alternative sources of protein may help curb the pressure on wildlife, the best strategy for biodiversity conservation may be to keep saw mills and the towns that develop around them out of forests.

  10. CERN and high energy physics, the grand picture

    ScienceCinema

    Heuer, Rolf-Dieter

    2018-05-24

    The lecture will touch on several topics, to illustrate the role of CERN in the present and future of high-energy physics: how does CERN work? What is the role of the scientific community, of bodies like Council and SPC, and of international cooperation, in the definition of CERN's scientific programme? What are the plans for the future of the LHC and of the non-LHC physics programme? What is the role of R&D and technology transfer at CERN?

  11. Aberration of a negative ion beam caused by space charge effect.

    PubMed

    Miyamoto, K; Wada, S; Hatayama, A

    2010-02-01

    Aberrations are inevitable when the charged particle beams are extracted, accelerated, transmitted, and focused with electrostatic and magnetic fields. In this study, we investigate the aberration of a negative ion accelerator for a neutral beam injector theoretically, especially the spherical aberration caused by the negative ion beam expansion due to the space charge effect. The negative ion current density profiles with the spherical aberration are compared with those without the spherical aberration. It is found that the negative ion current density profiles in a log scale are tailed due to the spherical aberration.

  12. Dissemination of CERN's Technology Transfer: Added Value from Regional Transfer Agents

    ERIC Educational Resources Information Center

    Hofer, Franz

    2005-01-01

    Technologies developed at CERN, the European Organization for Nuclear Research, are disseminated via a network of external technology transfer officers. Each of CERN's 20 member states has appointed at least one technology transfer officer to help establish links with CERN. This network has been in place since 2001 and early experiences indicate…

  13. Simulation of orientational coherent effects via Geant4

    NASA Astrophysics Data System (ADS)

    Bagli, E.; Asai, M.; Brandt, D.; Dotti, A.; Guidi, V.; Verderi, M.; Wright, D.

    2017-10-01

    Beam manipulation of high- and very-high-energy particle beams is a hot topic in accelerator physics. Coherent effects of ultra-relativistic particles in bent crystals allow the steering of particle trajectories thanks to the strong electrical field generated between atomic planes. Recently, a collimation experiment with bent crystals was carried out at the CERN-LHC, paving the way to the usage of such technology in current and future accelerators. Geant4 is a widely used object-oriented tool-kit for the Monte Carlo simulation of the interaction of particles with matter in high-energy physics. Moreover, its areas of application include also nuclear and accelerator physics, as well as studies in medical and space science. We present the first Geant4 extension for the simulation of orientational effects in straight and bent crystals for high energy charged particles. The model allows the manipulation of particle trajectories by means of straight and bent crystals and the scaling of the cross sections of hadronic and electromagnetic processes for channeled particles. Based on such a model, an extension of the Geant4 toolkit has been developed. The code and the model have been validated by comparison with published experimental data regarding the deflection efficiency via channeling and the variation of the rate of inelastic nuclear interactions.
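
    For orientation (an illustrative estimate, not taken from the paper), the Lindhard critical angle for planar channeling, θc ≈ sqrt(2·U0/(p·v)), can be evaluated for a 400 GeV/c proton, assuming a planar potential well depth of roughly 20 eV for silicon:

        import math

        def critical_angle_rad(well_depth_eV, momentum_GeV):
            """Lindhard planar channeling critical angle for an ultra-relativistic
            particle: theta_c ~ sqrt(2*U0/(p*v)), with p*v ~ p*c for v ~ c."""
            pv_eV = momentum_GeV * 1e9   # p*c expressed in eV
            return math.sqrt(2.0 * well_depth_eV / pv_eV)

        # Assumed well depth of ~20 eV (Si planar potential) and a 400 GeV/c beam.
        theta_c = critical_angle_rad(20.0, 400.0)
        print(f"theta_c ~ {theta_c * 1e6:.1f} microrad")   # ~10 microrad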

  14. Design of a high power TM01 mode launcher optimized for manufacturing by milling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dal Forno, Massimo

    2016-12-15

    Recent research on high-gradient rf acceleration found that hard metals, such as hard copper and hard copper-silver, have lower breakdown rate than soft metals. Traditional high-gradient accelerating structures are manufactured with parts joined by high-temperature brazing. The high temperature used in brazing makes the metal soft; therefore, this process cannot be used to manufacture structures out of hard metal alloys. In order to build the structure with hard metals, the components must be designed for joining without high-temperature brazing. One method is to build the accelerating structures out of two halves, and join them by using a low-temperature technique, at the symmetry plane along the beam axis. The structure has input and output rf power couplers. We use a TM01 mode launcher as a rf power coupler, which was introduced during the Next Linear Collider (NLC) work. Part of the mode launcher will be built in each half of the structure. This paper presents a novel geometry of a mode launcher, optimized for manufacturing by milling. The coupler was designed for the CERN CLIC working frequency f = 11.9942 GHz; the same geometry can be scaled to any other frequency.
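
    The remark that the same geometry can be scaled to any other frequency follows from the fact that the dimensions of a vacuum RF structure scale inversely with frequency; the sketch below is a minimal illustration with hypothetical dimension names and values, not the actual coupler drawing:

        def scale_geometry(dimensions_mm, f_design_GHz, f_target_GHz):
            """Scale all linear dimensions of an RF structure from a design
            frequency to a target frequency (dimensions scale as 1/f)."""
            factor = f_design_GHz / f_target_GHz
            return {name: value * factor for name, value in dimensions_mm.items()}

        # Hypothetical dimensions of a mode-launcher-like part, in mm.
        dims = {"waveguide_width": 19.05, "coupler_radius": 11.0, "taper_length": 40.0}

        # Rescale from the quoted CLIC frequency to another example frequency.
        print(scale_geometry(dims, f_design_GHz=11.9942, f_target_GHz=5.712))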

  15. The Ultimate Monte Carlo: Studying Cross-Sections With Cosmic Rays

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.

    2007-01-01

    The high-energy physics community has been discussing for years the need to bring together the three principal disciplines that study hadron cross-section physics - ground-based accelerators, cosmic-ray experiments in space, and air shower research. Only recently have NASA investigators begun discussing the use of space-borne cosmic-ray payloads to bridge the gap between accelerator physics and air shower work using cosmic-ray measurements. The common tool used in these three realms of high-energy hadron physics is the Monte Carlo (MC). Yet the obvious has not been considered - using a single MC for simulating the entire relativistic energy range (GeV to EeV). The task is daunting due to large uncertainties in accelerator, space, and atmospheric cascade measurements. These include inclusive versus exclusive cross-section measurements, primary composition, interaction dynamics, and possible new physics beyond the standard model. However, the discussion of a common tool or ultimate MC might be the very thing that could begin to unify these independent groups into a common purpose. The Offline ALICE concept of a Virtual MC at CERN s Large Hadron Collider (LHC) will be discussed as a rudimentary beginning of this idea, and as a possible forum for carrying it forward in the future as LHC data emerges.

  16. SU-E-T-144: Effective Analysis of VMAT QA Generated Trajectory Log Files for Medical Accelerator Predictive Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, CM; Baydush, AH; Nguyen, C

    Purpose: To determine the effectiveness of SPC analysis for a model predictive maintenance process that uses accelerator generated parameter and performance data contained in trajectory log files. Methods: Each trajectory file is decoded and a total of 131 axes positions are recorded (collimator jaw position, gantry angle, each MLC, etc.). This raw data is processed and either axis positions are extracted at critical points during the delivery or positional change over time is used to determine axis velocity. The focus of our analysis is the accuracy, reproducibility and fidelity of each axis. A reference positional trace of the gantry and each MLC is used as a motion baseline for cross correlation (CC) analysis. A total of 494 parameters (482 MLC related) were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and parameter/system specifications. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on: TG-142 and published analysis of VMAT delivery accuracy. Results: All errors introduced were detected. Synthetic positional errors of 2mm for collimator jaw and MLC carriage exceeded the chart limits. Gantry speed and each MLC speed are analyzed at two different points in the delivery. Simulated Gantry speed error (0.2 deg/sec) and MLC speed error (0.1 cm/sec) exceeded the speed chart limits. Gantry position error of 0.2 deg was detected by the CC maximum value charts. The MLC position error of 0.1 cm was detected by the CC maximum value location charts for every MLC. Conclusion: SPC I/MR evaluation of trajectory log file parameters may be effective in providing an early warning of performance degradation or component failure for medical accelerator systems.
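
    The Individual/Moving Range (I/MR) charts mentioned above follow standard SPC practice; the sketch below shows how plain 3σ-equivalent I/MR limits are computed from a parameter series (the hybrid, specification-based limits of the abstract are not reproduced, and the example values are hypothetical):

        def imr_limits(values):
            """Compute Individual (I) and Moving Range (MR) control-chart limits
            using the standard constants for a moving range of size 2."""
            mr = [abs(b - a) for a, b in zip(values, values[1:])]
            mr_bar = sum(mr) / len(mr)
            x_bar = sum(values) / len(values)
            i_limits = (x_bar - 2.66 * mr_bar, x_bar + 2.66 * mr_bar)   # 3-sigma equivalent
            mr_limits = (0.0, 3.267 * mr_bar)                            # D3 = 0, D4 = 3.267
            return i_limits, mr_limits

        # Hypothetical per-delivery gantry-speed values (deg/s) extracted from log files.
        gantry_speed = [4.79, 4.81, 4.80, 4.78, 4.82, 4.80, 4.79, 4.81]
        print(imr_limits(gantry_speed))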

  17. SU-G-BRB-02: An Open-Source Software Analysis Library for Linear Accelerator Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerns, J; Yaldo, D

    Purpose: Routine linac quality assurance (QA) tests have become complex enough to require automation of most test analyses. A new data analysis software library was built that allows physicists to automate routine linear accelerator quality assurance tests. The package is open source, code tested, and benchmarked. Methods: Images and data were generated on a TrueBeam linac for the following routine QA tests: VMAT, starshot, CBCT, machine logs, Winston Lutz, and picket fence. The analysis library was built using the general programming language Python. Each test was analyzed with the library algorithms and compared to manual measurements taken at the time of acquisition. Results: VMAT QA results agreed within 0.1% between the library and manual measurements. Machine logs (dynalogs & trajectory logs) were successfully parsed; mechanical axis positions were verified for accuracy and MLC fluence agreed well with EPID measurements. CBCT QA measurements were within 10 HU and 0.2mm where applicable. Winston Lutz isocenter size measurements were within 0.2mm of TrueBeam’s Machine Performance Check. Starshot analysis was within 0.2mm of the Winston Lutz results for the same conditions. Picket fence images with and without a known error showed that the library was capable of detecting MLC offsets within 0.02mm. Conclusion: A new routine QA software library has been benchmarked and is available for use by the community. The library is open-source and extensible for use in larger systems.
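
    As an illustration of the kind of analysis such a library automates (a generic sketch with hypothetical data, not the library's actual API), picket-fence-style MLC offsets can be estimated by locating the measured picket positions in an image profile and comparing them with the planned positions:

        import numpy as np
        from scipy.signal import find_peaks

        def picket_offsets(profile, pixel_mm, planned_mm):
            """Locate picket peaks in a 1D image profile and return their
            offsets (mm) from the planned picket positions."""
            peaks, _ = find_peaks(profile, prominence=0.2 * profile.max())
            measured_mm = peaks * pixel_mm
            return [m - p for m, p in zip(sorted(measured_mm), sorted(planned_mm))]

        # Hypothetical example: five pickets planned every 20 mm, imaged at 0.5 mm/pixel.
        x = np.arange(0, 240)
        profile = sum(np.exp(-0.5 * ((x - c / 0.5) / 2.0) ** 2) for c in [20, 40, 60, 80, 100])
        print(picket_offsets(profile, 0.5, [20, 40, 60, 80, 100]))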

  18. Intermediate-level disinfection with accelerated hydrogen peroxide prevents accumulation of bacteria in Versajet™ tubing during repeated daily debridement using simulated-use testing with an inoculated pork hock.

    PubMed

    Gawaziuk, J P; Alfa, M J; Olson, N; Logsetty, S

    2014-05-01

    This study assesses the feasibility of using the Versajet™ system (VJS) on an inoculated pork hock (PH) skin surface sequentially for 8 days with daily cleaning and intermediate-level disinfection (ILD). Daily, PHs were inoculated with bacteria suspended in artificial test soil (ATS). An ILD protocol with accelerated hydrogen peroxide (AHP, Oxivir TB®) was employed to clean and disinfect the VJS between debridements. PH skin contains 6.1-6.8×10⁶ cfu/cm² bacteria. Bacterial counts in the handpiece and discharge hoses immediately after debridement of the PHs, and before cleaning, increased throughout the study period (5.19-6.43 log₁₀ cfu/mL). Cleaning with the ILD protocol reduced bacterial counts on the VJS by 6-log. Protein, a surrogate marker of organic contamination, was also reduced post-cleaning and ILD. Compared to a maximum post-debridement level of protein (57.9 μg/mL) obtained before ILD, VJS protein levels dropped to 9.8 (handpiece) and 13.8 μg/mL (discharge hose). Disinfection of the handpiece and discharge hose after debridement with AHP resulted in a 6-log reduction in bacterial count and a 4.2-fold reduction in protein. An ILD protocol with an AHP may be a feasible method for serial skin surface debridements with the VJS for up to eight days. Copyright © 2013 Elsevier Ltd and ISBI. All rights reserved.
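
    For reference (an illustrative calculation with assumed counts, not data from the study), a "6-log reduction" is simply the base-10 logarithm of the ratio of counts before and after disinfection:

        import math

        def log_reduction(cfu_before, cfu_after):
            """Base-10 log reduction in colony-forming units."""
            return math.log10(cfu_before / cfu_after)

        # Hypothetical counts: ~10^6.4 cfu/mL before cleaning, ~2.5 cfu/mL after.
        print(f"{log_reduction(10 ** 6.4, 2.5):.1f}-log reduction")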

  19. Scaling the CERN OpenStack cloud

    NASA Astrophysics Data System (ADS)

    Bell, T.; Bompastor, B.; Bukowiec, S.; Castro Leon, J.; Denis, M. K.; van Eldik, J.; Fermin Lobo, M.; Fernandez Alvarez, L.; Fernandez Rodriguez, D.; Marino, A.; Moreira, B.; Noel, B.; Oulevey, T.; Takase, W.; Wiebalck, A.; Zilli, S.

    2015-12-01

    CERN has been running a production OpenStack cloud since July 2013 to support physics computing and infrastructure services for the site. In the past year, CERN Cloud Infrastructure has seen a constant increase in nodes, virtual machines, users and projects. This paper will present what has been done in order to make the CERN cloud infrastructure scale out.

  20. An investigation into the effectiveness of smartphone experiments on students’ conceptual knowledge about acceleration

    NASA Astrophysics Data System (ADS)

    Mazzella, Alessandra; Testa, Italo

    2016-09-01

    This study is a first attempt to investigate the effectiveness of smartphone-based activities on students’ conceptual understanding of acceleration. 143 secondary school students (15-16 years old) were involved in two types of activities: smartphone- and non-smartphone activities. The latter consisted of data logging and ‘cookbook’ activities. For the sake of comparison, all activities featured the same phenomena, i.e., the motion on an inclined plane and pendulum oscillations. A pre-post design was adopted, using open questionnaires as probes. Results show only weak statistical differences between the smartphone and non-smartphone groups. Students who followed smartphone activities were more able to design an experiment to measure acceleration and to correctly describe acceleration in a free fall motion. However, students of both groups had many difficulties in drawing the acceleration vector along the trajectory of the studied motion. Results suggest that smartphone-based activities may be effective substitutes of traditional experimental settings and represent a valuable aid for teachers who want to implement laboratory activities at secondary school level. However, to achieve a deeper conceptual understanding of acceleration, some issues need to be addressed: what is the reference system of the built-in smartphone sensor; relationships between smartphone acceleration graphs and experimental setup; vector representation of the measured acceleration.

  1. The LHC timeline: a personal recollection (1980-2012)

    NASA Astrophysics Data System (ADS)

    Maiani, Luciano; Bonolis, Luisa

    2017-12-01

    The objective of this interview is to study the history of the Large Hadron Collider in the LEP tunnel at CERN, from first ideas to the discovery of the Brout-Englert-Higgs boson, seen from the point of view of a member of CERN scientific committees, of the CERN Council and a former Director General of CERN in the years of machine construction.

  2. Recovery of forest residues in the Southern United States

    Treesearch

    Bryce J. Stokes; Donald L. Sirois

    1989-01-01

    In the mid 1970's, the accelerated price increases for petroleum products forced rapid exploration into and adoption of alternative energy sources. A viable option for the forest industry was the recovery of woody biomass from unmerchantable trees and logging residues. Several studies estimated that an abundance of such forest materials existed in the southeastern...

  3. Emitting electron spectra and acceleration processes in the jet of PKS 0447-439

    NASA Astrophysics Data System (ADS)

    Zhou, Yao; Yan, Dahai; Dai, Benzhong; Zhang, Li

    2014-02-01

    We investigate the electron energy distributions (EEDs) and the corresponding acceleration processes in the jet of PKS 0447-439, and estimate its redshift through modeling its observed spectral energy distribution (SED) in the frame of a one-zone synchrotron self-Compton (SSC) model. Three EEDs formed in different acceleration scenarios are assumed: the power-law with exponential cut-off (PLC) EED (shock-acceleration scenario or the case of the EED approaching equilibrium in the stochastic-acceleration scenario), the log-parabolic (LP) EED (stochastic-acceleration scenario and the acceleration dominating), and the broken power-law (BPL) EED (no acceleration scenario). The corresponding fluxes of both synchrotron and SSC are then calculated. The model is applied to PKS 0447-439, and modeled SEDs are compared to the observed SED of this object by using the Markov Chain Monte Carlo method. The results show that the PLC model fails to fit the observed SED well, while the LP and BPL models give comparably good fits for the observed SED. The results indicate that it is possible that a stochastic acceleration process acts in the emitting region of PKS 0447-439 and the EED is far from equilibrium (acceleration dominating) or no acceleration process works (in the emitting region). The redshift of PKS 0447-439 is also estimated in our fitting: z = 0.16 ± 0.05 for the LP case and z = 0.17 ± 0.04 for the BPL case.
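
    For concreteness (a schematic sketch with normalisations and parameter values chosen purely for illustration, not the paper's fitted values), the three electron energy distributions compared above can be written as:

        import numpy as np

        def plc(gamma, norm, p, gamma_cut):
            """Power law with exponential cut-off."""
            return norm * gamma ** (-p) * np.exp(-gamma / gamma_cut)

        def log_parabolic(gamma, norm, alpha, beta, gamma_0):
            """Log-parabolic EED: curvature beta around the reference energy gamma_0."""
            x = gamma / gamma_0
            return norm * x ** (-(alpha + beta * np.log10(x)))

        def broken_power_law(gamma, norm, p1, p2, gamma_break):
            """Broken power law with index p1 below and p2 above the break."""
            return np.where(gamma < gamma_break,
                            norm * gamma ** (-p1),
                            norm * gamma_break ** (p2 - p1) * gamma ** (-p2))

        gamma = np.logspace(2, 6, 5)   # illustrative Lorentz-factor grid
        print(plc(gamma, 1.0, 2.2, 1e5))
        print(log_parabolic(gamma, 1.0, 2.0, 0.5, 1e4))
        print(broken_power_law(gamma, 1.0, 2.0, 3.5, 1e4))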

  4. Cryogenic test facility instrumentation with fiber optic and fiber optic sensors for testing superconducting accelerator magnets

    NASA Astrophysics Data System (ADS)

    Chiuchiolo, A.; Bajas, H.; Bajko, M.; Castaldo, B.; Consales, M.; Cusano, A.; Giordano, M.; Giloux, C.; Perez, J. C.; Sansone, L.; Viret, P.

    2017-12-01

    The magnets for the next steps in accelerator physics, such as the High Luminosity upgrade of the LHC (HL-LHC) and the Future Circular Collider (FCC), require the development of new technologies for manufacturing and monitoring. To meet the HL-LHC new requirements, a large upgrade of the CERN SM18 cryogenic test facilities is ongoing with the implementation of new cryostats and cryogenic instrumentation. The paper deals with the advances in the development and the calibration of fiber optic sensors in the range 300 - 4 K using a dedicated closed-cycle refrigerator system composed of a pulse tube and a cryogen-free cryostat. The calibrated fiber optic sensors (FOS) have been installed in three vertical cryostats used for testing superconducting magnets down to 1.9 K or 4.2 K and in the variable temperature test bench (100 - 4.2 K). Some examples of FOS measurements of cryostat temperature evolution are presented as well as measurements of strain performed on a subscale of High Temperature Superconducting magnet during its powering tests.

  5. Numerical investigation of potential stratification caused by a cryogenic helium spill inside a tunnel

    NASA Astrophysics Data System (ADS)

    Sinclair, Cameron; Malecha, Ziemowit; Jedrusyna, Artur

    2018-04-01

    The sudden release of cryogenic fluid into an accelerator tunnel can pose a significant health and safety risk. For this reason, it is important to evaluate the consequences of such a spill. Previous publications concentrated on either Oxygen Deficiency Hazard or the evaluation of mathematical models using experimental data. No studies to date have focussed on the influence of cryogen inlet conditions on flow development. In this paper, the stratification behaviour of low-temperature helium released into an air-filled accelerator tunnel is investigated for varying helium inlet diameters. A numerical model was constructed using the OpenFOAM Toolbox of a generalised 3D geometry, with similar hydraulic characteristics to the CERN and SLAC tunnels. This model has been validated against published experimental and numerical data. A dimensionless parameter, based on Bakke number, was then determined for the onset of stratification, taking into account the helium inlet diameter; a dimensionless parameter for the degree of stratification was also employed. The simulated flow behaviour is described in terms of these dimensionless parameters, as well as the temperature and oxygen concentration at various heights throughout the tunnel.

  6. The MARS15-based FermiCORD code system for calculation of the accelerator-induced residual dose

    NASA Astrophysics Data System (ADS)

    Grebe, A.; Leveling, A.; Lu, T.; Mokhov, N.; Pronskikh, V.

    2018-01-01

    The FermiCORD code system, a set of codes based on MARS15 that calculates the accelerator-induced residual doses at experimental facilities of arbitrary configurations, has been developed. FermiCORD is written in C++ as an add-on to Fortran-based MARS15. The FermiCORD algorithm consists of two stages: 1) simulation of residual doses on contact with the surfaces surrounding the studied location and of radionuclide inventories in the structures surrounding those locations using MARS15, and 2) simulation of the emission of the nuclear decay γ-quanta by the residuals in the activated structures and scoring the prompt doses of these γ-quanta at arbitrary distances from those structures. The FermiCORD code system has been benchmarked against similar algorithms based on other code systems and against experimental data from the CERF facility at CERN, and FermiCORD showed reasonable agreement with these. The code system has been applied for calculation of the residual dose of the target station for the Mu2e experiment and the results have been compared to approximate dosimetric approaches.

  7. Proof-of-principle demonstration of a virtual flow meter-based transducer for gaseous helium monitoring in particle accelerator cryogenics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arpaia, P.; Technology Department, European Organization for Nuclear Research; Blanco, E.

    2015-07-15

    A transducer based on a virtual flow meter is proposed for monitoring helium distribution and consumption in cryogenic systems for particle accelerators. The virtual flow meter allows technical and economical constraints, preventing installation of physical instruments in all the needed measurement points, to be overcome. Virtual flow meter performance for the alternative models of Samson [ http://www.samson.de (2015)] and Sereg-Schlumberger [ http://www.slb.com/ (2015)] is compared with the standard IEC 60534-2-1 [Industrial-process control valves—Part 2-1: Flow capacity—sizing equations for fluid flow under installed conditions (2011), https://webstore.iec.ch/publication/2461], for a large temperature range, for both gaseous and liquid helium phases, and for different pressure drops. Then, the calibration function of the transducer is derived. Finally, the experimental validation for the helium gaseous state on the test station for superconducting magnets in the laboratory SM18 [Pirotte et al., AIP Conf. Proc. 1573, 187 (2014)] at CERN is reported.

  8. A beam radiation monitor based on CVD diamonds for SuperB

    NASA Astrophysics Data System (ADS)

    Cardarelli, R.; Di Ciaccio, A.

    2013-08-01

    Chemical Vapor Deposition (CVD) diamond particle detectors are in use in the CERN experiments at LHC and at particle accelerator laboratories in Europe, USA and Japan mainly as beam monitors. Nowadays it is considered a proven technology with a very fast signal read-out and a very high radiation tolerance suitable for measurements in high radiation environment zones, i.e., near the accelerator beam pipes. The specific properties of CVD diamonds make them a prime candidate for measuring single particles as well as high-intensity particle cascades, for timing measurements on the sub-nanosecond scale and for beam protection systems in hostile environments. A single-crystalline CVD (scCVD) diamond sensor, read out with a new generation of fast and high transition frequency SiGe bipolar transistor amplifiers, has been tested for an application as radiation monitor to safeguard the silicon vertex tracker in the SuperB detector from excessive radiation damage, cumulative dose and instantaneous dose rates. Test results with 5.5 MeV alpha particles from a 241Am radioactive source and from electrons from a 90Sr radioactive source are presented in this paper.

  9. Accelerating Radioactive Ion Beams With REX-ISOLDE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ames, F.; Emhofer, S.; Habs, D.

    2003-08-26

    The post accelerator REX-ISOLDE is installed at the ISOLDE facility at CERN, where a broad variety of radioactive ions can be addressed. Since the end of 2001 beams at the final energy of 2.2 MeV/u are available. REX-ISOLDE uses a unique system of beam bunching and charge breeding. First a Penning trap accumulates and bunches the ions, which are delivered as a quasi-continuous beam from the ISOLDE target-ion-source, and then an electron beam ion source (EBIS) charge-breeds them to a mass-to-charge ratio below 4.5. This enables a very compact design for the following LINAC, consisting of a 4-rod RFQ, an IH structure and three 7-gap-resonators. The latter allow a variation of the final energy between 0.8 and 2.2 MeV/u. Although the machine is still in the commissioning phase, first physics experiments have been done with neutron rich Na and Mg isotopes and 9Li. A total efficiency of several percent has already been obtained.

  10. Electric field stimulated growth of Zn whiskers

    NASA Astrophysics Data System (ADS)

    Niraula, D.; McCulloch, J.; Warrell, G. R.; Irving, R.; Karpov, V. G.; Shvydka, Diana

    2016-07-01

    We have investigated the impact of strong (~10⁴ V/cm) electric fields on the development of Zn whiskers. The original samples, with considerable whisker infestation, were cut from Zn-coated steel floors and then exposed to electric field stresses for 10-20 hours at room temperature. We used various electric field sources, from charges accumulated in samples irradiated by: (1) the electron beam of a scanning electron microscope (SEM), (2) the electron beam of a medical linear accelerator, and (3) the ion beam of a linear accelerator; we also used (4) the electric field produced by a Van de Graaff generator. In all cases, the exposed samples exhibited a considerable (tens of percent) increase in whisker concentration compared to the control sample. The acceleration factor, defined as the ratio of the measured whisker growth rate over that in zero field, was estimated to approach several hundred. The statistics of lengths of e-beam induced whiskers was found to follow the log-normal distribution known previously for metal whiskers. The observed accelerated whisker growth is attributed to electrostatic effects. These results offer promise for establishing whisker-related accelerated life testing protocols.

  11. Metabolic rate and life stage of the mites Tetranychus cinnabarinus boisd. (Prostigmata) and Phytoseiulus persimilis A-H. (Mesostigmata).

    PubMed

    Thurling, D J

    1980-09-01

    Respiration rates (oxygen uptake per individual) of the herbivorous mite Tetranychus cinnabarinus Boisd. and of the predatory mite Phytoseiulus persimilis A-H. were measured at 25°C for all life stages, including eggs, using a Cartesian Diver micro-respirometer. Metabolic rates (oxygen uptake per unit weight) ranged between 0.27 and 2.32 nl O₂ μg⁻¹ live weight h⁻¹ in T. cinnabarinus, and between 0.99 and 3.69 nl O₂ μg⁻¹ live weight h⁻¹ in P. persimilis. The difference in metabolic rate ranges is attributed to different modes of life. The metabolic rates of both species are higher than those of comparable mite species, which is attributed to their small size, rapid development and limited sclerotization. Respiration-body weight regression gave the single equation log₁₀R = -0.091 + 1.213 log₁₀W for all post-embryonic stages of T. cinnabarinus but two equations for P. persimilis of log₁₀R = 0.394 + 1.116 log₁₀W for gravid females and log₁₀R = 0.880 + 0.348 log₁₀W for all non-reproducing post-embryonic stages. The single respiration-body weight relationship for T. cinnabarinus reflects the continuous growth pattern of this species, and the two relationships for P. persimilis reflect the accelerated growth following fertilization. The significance of these results for invertebrate population metabolism studies is discussed.
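
    As an illustration of how the regressions above are used (a worked example with an assumed body weight, not data from the study), the respiration rate predicted by the T. cinnabarinus equation for a given live weight is:

        import math

        def respiration_t_cinnabarinus(weight_ug):
            """Evaluate log10 R = -0.091 + 1.213 log10 W (W in ug live weight)."""
            return 10 ** (-0.091 + 1.213 * math.log10(weight_ug))

        # Assumed live weight of 10 ug, chosen purely for illustration.
        print(f"Predicted respiration rate: {respiration_t_cinnabarinus(10.0):.1f} nl O2 per hour")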

  12. Exergy Analysis of the Cryogenic Helium Distribution System for the Large Hadron Collider (lhc)

    NASA Astrophysics Data System (ADS)

    Claudet, S.; Lebrun, Ph.; Tavian, L.; Wagner, U.

    2010-04-01

    The Large Hadron Collider (LHC) at CERN features the world's largest helium cryogenic system, spreading over the 26.7 km circumference of the superconducting accelerator. With a total equivalent capacity of 145 kW at 4.5 K including 18 kW at 1.8 K, the LHC refrigerators produce an unprecedented exergetic load, which must be distributed efficiently to the magnets in the tunnel over the 3.3 km length of each of the eight independent sectors of the machine. We recall the main features of the LHC cryogenic helium distribution system at different temperature levels and present its exergy analysis, thus making it possible to assess second-principle efficiency and to identify the main remaining sources of irreversibility.
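
    To indicate the scale of the "exergetic load" mentioned above (an indicative calculation assuming a 300 K environment and treating, for simplicity, the full quoted capacity as a 4.5 K load; these are not figures from the paper), the minimum ideal (Carnot) power needed to remove a heat load Q at temperature T is Q·(T0/T − 1):

        def carnot_power_kW(heat_load_kW, cold_temp_K, ambient_K=300.0):
            """Minimum (reversible) input power to remove a heat load at cold_temp_K."""
            return heat_load_kW * (ambient_K / cold_temp_K - 1.0)

        # Indicative number: 145 kW treated as a 4.5 K load, 300 K ambient assumed.
        ideal_kW = carnot_power_kW(145.0, 4.5)
        print(f"Ideal exergetic load: roughly {ideal_kW / 1000:.1f} MW")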

  13. Final Report: High Energy Physics Program (HEP), Physics Department, Princeton University

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Callan, Curtis G.; Gubser, Steven S.; Marlow, Daniel R.

    The activities of the Princeton Elementary particles group funded through Department of Energy Grant# DEFG02-91 ER40671 during the period October 1, 1991 through January 31, 2013 are summarized. These activities include experiments performed at Brookhaven National Lab; the CERN Lab in Geneva, Switzerland; Fermilab; KEK in Tsukuba City, Japan; the Stanford Linear Accelerator Center; as well as extensive experimental and theoretical studies conducted on the campus of Princeton University. Funded senior personnel include: Curtis Callan, Stephen Gubser, Valerie Halyo, Daniel Marlow, Kirk McDonald, Peter Meyers, James Olsen, Pierre Piroué, Eric Prebys, A.J. Stewart Smith, Frank Shoemaker (deceased), Paul Steinhardt, David Stickland, Christopher Tully, and Liantao Wang.

  14. Measurements of total production cross sections for π⁺+C, π⁺+Al, K⁺+C, and K⁺+Al at 60 GeV/c and π⁺+C and π⁺+Al at 31 GeV/c

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aduszkiewicz, A.; et al.

    This paper presents several measurements of total production cross sections and total inelastic cross sections for the following reactions: π⁺+C, π⁺+Al, K⁺+C, K⁺+Al at 60 GeV/c, and π⁺+C and π⁺+Al at 31 GeV/c. The measurements were made using the NA61/SHINE spectrometer at the CERN SPS. Comparisons with previous measurements are given and good agreement is seen. These interaction cross-section measurements are a key ingredient for neutrino flux prediction from the reinteractions of secondary hadrons in current and future accelerator-based long-baseline neutrino experiments.

  15. First Test Results of the 150 mm Aperture IR Quadrupole Models for the High Luminosity LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrosio, G.; Chlachidze, G.; Wanderer, P.

    2016-10-06

    The High Luminosity upgrade of the LHC at CERN will use large aperture (150 mm) quadrupole magnets to focus the beams at the interaction points. The high field in the coils requires Nb3Sn superconductor technology, which has been brought to maturity by the LHC Accelerator Research Program (LARP) over the last 10 years. The key design targets for the new IR quadrupoles were established in 2012, and fabrication of model magnets started in 2014. This paper discusses the results from the first single short coil test and from the first short quadrupole model test. Remaining challenges and plans to address them are also presented and discussed.

  16. Simulations of the failure scenarios of the crab cavities for the nominal scheme of the LHC

    NASA Astrophysics Data System (ADS)

    Yee, B.; Calaga, R.; Zimmermann, F.; Lopez, R.

    2012-02-01

    The crab cavity (CC) is a possible solution to the luminosity reduction caused by the crossing angle of the two colliding beams. The CC is a superconducting radio frequency (RF) cavity that applies a transverse kick to a bunch of particles, rotating the bunch so that the collision is head-on and the luminosity is improved. For this reason, the Accelerators and Beam Physics group of the CERN Beams Department (BE-ABP) has studied the implementation of the CC scheme at the LHC. It is essential to study the failure scenarios and the damage they could produce in the lattice devices. We have performed simulations of these failures for the nominal scheme.

  17. Update on CERN Search based on SharePoint 2013

    NASA Astrophysics Data System (ADS)

    Alvarez, E.; Fernandez, S.; Lossent, A.; Posada, I.; Silva, B.; Wagner, A.

    2017-10-01

    CERN’s enterprise Search solution “CERN Search” provides a central search solution for users and CERN service providers. A total of about 20 million public and protected documents from a wide range of document collections is indexed, including Indico, TWiki, Drupal, SharePoint, JACOW, E-group archives, EDMS, and CERN Web pages. In spring 2015, CERN Search was migrated to a new infrastructure based on SharePoint 2013. In the context of this upgrade, the document pre-processing and indexing process was redesigned and generalised. The new data feeding framework allows to profit from new functionality and it facilitates the long term maintenance of the system.

  18. SU-F-T-462: Lessons Learned From a Machine Incident Reporting System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutlief, S; Hoisak, J

    Purpose: Linear accelerators must operate with minimal downtime. Machine incident logs are a crucial tool to meet this requirement. They provide a history of service and demonstrate whether a fix is working. This study investigates the information content of a large department linear accelerator incident log. Methods: Our department uses an electronic reporting system to provide immediate information to both key department staff and the field service department. This study examines reports for five linac logs during 2015. The report attributes for analysis include frequency, level of documentation, who solved the problem, and type of fix used. Results: Of the reports, 36% were documented as resolved. In another 25% the resolution allowed treatment to proceed although the reported problem recurred within days. In 5% only intermediate troubleshooting was documented. The remainder lacked documentation. In 60% of the reports, radiation therapists resolved the problem, often by clearing the appropriate faults or reinitializing a software or hardware service. 22% were resolved by physics and 10% by field service engineers. The remaining 8% were resolved by IT, Facilities, or resolved spontaneously. Typical fixes, in order of scope, included clearing the fault and moving on, closing and re-opening the patient session or software, cycling power to a sub-unit, recalibrating a device (e.g., optical surface imaging), and calling in Field Service (usually resolving the problem through maintenance or component replacement). Conclusion: The reports with undocumented resolution represent a missed opportunity for learning. Frequency of who resolves a problem scales with the proximity of the person’s role (therapist, physicist, or service engineer), which is inversely related to the permanence of the resolution. Review of lessons learned from machine incident logs can form the basis for guidance to radiation therapists and medical physicists to minimize equipment downtime and ensure safe operation.

  19. Effect of royal jelly ingestion for six months on healthy volunteers.

    PubMed

    Morita, Hiroyuki; Ikeda, Takahide; Kajita, Kazuo; Fujioka, Kei; Mori, Ichiro; Okada, Hideyuki; Uno, Yoshihiro; Ishizuka, Tatsuo

    2012-09-21

    Royal jelly is a widely ingested supplement for health, but its effects on humans are not well known. The objective was to evaluate the effects of long-term royal jelly ingestion on humans. We conducted a randomized placebo-controlled, double-blind trial. A total of 61 healthy volunteers aged 42-83 years were enrolled and were randomly divided into a royal jelly group (n = 31) and a control group (n = 30). Three thousand mg of royal jelly (RJ) or a placebo in 100 ml liquid/day were ingested for 6 months. The primary outcomes were changes in anthropometric measurements and biochemical indexes from baseline to 6 months after intervention. Thirty subjects in the RJ group and 26 in the control group were included in the analysis of endpoints. In an adjusted mean change of the variables from the baseline, significant differences between the two groups could be found in red blood cell counts (+0.16×10⁶/μL for the RJ group vs. -0.01×10⁶/μL for the control group, P = 0.0134), hematocrit (+0.9% vs. -0.8%, P = 0.0251), log (fasting plasma glucose) (+0.01 ± 0.01 log mg/dL vs. +0.05 ± 0.01 log mg/dL, P = 0.0297), log (insulinogenic index) (+0.25 vs. -0.13, P = 0.0319), log dehydroepiandrosterone sulfate (DHEA-S) (+0.08 log μg/dL vs. +0.20 log μg/dL, P = 0.0483), log testosterone (T) (+0.12 ± 0.04 log ng/mL vs. -0.02 ± 0.05 log ng/mL, P = 0.0416), log T/DHEA-S ratio (+0.05 ± 0.05 vs. -0.23 ± 0.59, P = 0.0015), and in one of the SF-36 subscale scores, mental health (MH) (+4 vs. -7, P = 0.0276). Six-month ingestion of RJ in humans improved erythropoiesis, glucose tolerance and mental health. Acceleration of conversion from DHEA-S to T by RJ may have been observed among these favorable effects.

  20. Big Bang Day: The Making of CERN (Episode 1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2009-10-06

    A two-part history of the CERN project. Quentin Cooper explores the fifty-year history of CERN, the European particle physics laboratory in Switzerland. The institution was created to bring scientists together after WW2 .......

  1. Big Bang Day: The Making of CERN (Episode 1)

    ScienceCinema

    None

    2017-12-09

    A two-part history of the CERN project. Quentin Cooper explores the fifty-year history of CERN, the European particle physics laboratory in Switzerland. The institution was created to bring scientists together after WW2 .......

  2. Simulation of the cabling process for Rutherford cables: An advanced finite element model

    NASA Astrophysics Data System (ADS)

    Cabanes, J.; Garlasche, M.; Bordini, B.; Dallocchio, A.

    2016-12-01

    In all existing large particle accelerators (Tevatron, HERA, RHIC, LHC) the main superconducting magnets are based on Rutherford cables, which are characterized by having: strands fully transposed with respect to the magnetic field, a significant compaction that assures a large engineering critical current density and a geometry that allows efficient winding of the coils. The Nb3Sn magnets developed in the framework of the HL-LHC project for improving the luminosity of the Large Hadron Collider (LHC) are also based on Rutherford cables. Due to the characteristics of Nb3Sn wires, the cabling process has become a crucial step in the magnet manufacturing. During cabling the wires experience large plastic deformations that strongly modify the geometrical dimensions of the sub-elements constituting the superconducting strand. These deformations are particularly severe on the cable edges and can result in a significant reduction of the cable critical current as well as of the Residual Resistivity Ratio (RRR) of the stabilizing copper. In order to understand the main parameters that rule the cabling process and their impact on the cable performance, CERN has developed a 3D Finite Element (FE) model based on the LS-Dyna® software that simulates the whole cabling process. In the paper the model is presented together with a comparison between experimental and numerical results for a copper cable produced at CERN.

  3. Fabrication and Analysis of 150-mm-Aperture Nb3Sn MQXF Coils

    DOE PAGES

    Holik, E. F.; Ambrosio, G.; Anerella, M.; ...

    2016-01-12

    The U.S. LHC Accelerator Research Program (LARP) and CERN are combining efforts for the HiLumi-LHC upgrade to design and fabricate 150-mm-aperture, interaction region quadrupoles with a nominal gradient of 130 T/m using Nb3Sn. To successfully produce the necessary long MQXF triplets, the HiLumi-LHC collaboration is systematically reducing risk and design modification by heavily relying upon the experience gained from the successful 120-mm-aperture LARP HQ program. First generation MQXF short (MQXFS) coils were predominately a scaling up of the HQ quadrupole design allowing comparable cable expansion during Nb3Sn formation heat treatment and increased insulation fraction for electrical robustness. A total of 13 first generation MQXFS coils were fabricated between LARP and CERN. Systematic differences in coil size, coil alignment symmetry, and coil length contraction during heat treatment are observed and likely due to slight variances in tooling and insulation/cable systems. Analysis of coil cross sections indicate that field-shaping wedges and adjacent coil turns are systematically displaced from the nominal location and the cable is expanding less than nominally designed. Lastly, a second generation MQXF coil design seeks to correct the expansion and displacement discrepancies by increasing insulation and adding adjustable shims at the coil pole and midplanes to correct allowed magnetic field harmonics.

  4. Strangeness Production in the ALICE Experiment at the LHC

    NASA Astrophysics Data System (ADS)

    Johnson, Harold; Fenner, Kiara; Harton, Austin; Garcia-Solis, Edmundo; Soltz, Ron

    2015-04-01

    The study of strange particle production is an important tool in understanding the properties of a hot and dense medium, the quark-gluon plasma, created in heavy-ion collisions at ultra-relativistic energies. This quark-gluon plasma (QGP) is believed to have been present just after the big bang. The standard model of physics contains six types of quarks. Strange quarks are not among the valence quarks found in protons and neutrons. Strange quark production is sensitive to the extremely high temperatures of the QGP. CERN's Large Hadron Collider accelerates particles to nearly the speed of light before colliding them to create this QGP state. In the results of high-energy particle collisions, hadrons are formed out of quarks and gluons when cooling from extremely high temperatures. Jets are a highly collimated cone of particles coming from the hadronization of a single quark or gluon. Understanding jet interactions may give us clues about the QGP. Using FastJet (a popular jet finder algorithm), we extracted strangeness, or strange particle characteristics of jets contained within proton-proton collisions during our research at CERN. We have identified jets with and without strange particles in proton-proton collisions and we will present a comparison of pT spectra in both cases. This material is based upon work supported by the National Science Foundation under grants PHY-1305280 and PHY-1407051.

  5. Protection of the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Schmidt, R.; Assmann, R.; Carlier, E.; Dehning, B.; Denz, R.; Goddard, B.; Holzer, E. B.; Kain, V.; Puccio, B.; Todd, B.; Uythoven, J.; Wenninger, J.; Zerlauth, M.

    2006-11-01

    The Large Hadron Collider (LHC) at CERN will collide two counter-rotating proton beams, each with an energy of 7 TeV. The energy stored in the superconducting magnet system will exceed 10 GJ, and each beam has a stored energy of 362 MJ which could cause major damage to accelerator equipment in the case of uncontrolled beam loss. Safe operation of the LHC will therefore rely on a complex system for equipment protection. The systems for protection of the superconducting magnets in case of quench must be fully operational before powering the magnets. For safe injection of the 450 GeV beam into the LHC, beam absorbers must be in their correct positions and specific procedures must be applied. Requirements for safe operation throughout the cycle necessitate early detection of failures within the equipment, and active monitoring of the beam with fast and reliable beam instrumentation, mainly beam loss monitors (BLM). When operating with circulating beams, the time constant for beam loss after a failure extends from milliseconds to a few minutes; failures must be detected sufficiently early and transmitted to the beam interlock system that triggers a beam dump. It is essential that the beams are properly extracted on to the dump blocks at the end of a fill and in case of emergency, since the beam dump blocks are the only elements of the LHC that can withstand the impact of the full beam.
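
    The quoted 362 MJ per beam can be reproduced from the nominal LHC beam parameters (2808 bunches of 1.15×10¹¹ protons at 7 TeV); the following back-of-the-envelope check is an illustration, not part of the paper:

        N_BUNCHES = 2808              # nominal number of bunches per beam
        PROTONS_PER_BUNCH = 1.15e11   # nominal bunch population
        ENERGY_TeV = 7.0              # proton energy
        EV_TO_J = 1.602e-19           # one electron-volt in joules

        protons = N_BUNCHES * PROTONS_PER_BUNCH
        stored_energy_J = protons * ENERGY_TeV * 1e12 * EV_TO_J
        print(f"Stored energy per beam: {stored_energy_J / 1e6:.0f} MJ")   # ~362 MJ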

  6. CERN welcomes new members

    NASA Astrophysics Data System (ADS)

    2017-08-01

    Lithuania is on course to become an associate member of CERN, pending final approval by the Lithuanian parliament. Associate membership will allow representatives of the Baltic nation to take part in meetings of the CERN Council, which oversees the Geneva-based physics lab.

  7. Catching errors with patient-specific pretreatment machine log file analysis.

    PubMed

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file QA analyses were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that the machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
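
    The core of such a comparison is to align planned and delivered leaf positions and flag deviations above a tolerance; the sketch below is a generic illustration under assumed data structures, not the in-house "Dynalog QA" code or Varian's actual log format:

        def compare_leaf_positions(planned, delivered, tolerance_mm=1.0):
            """Return (max deviation, list of (sample, leaf) pairs exceeding tolerance).

            planned/delivered: lists of per-sample leaf-position lists, in mm."""
            worst = 0.0
            failures = []
            for s, (p_row, d_row) in enumerate(zip(planned, delivered)):
                for leaf, (p, d) in enumerate(zip(p_row, d_row)):
                    dev = abs(p - d)
                    worst = max(worst, dev)
                    if dev > tolerance_mm:
                        failures.append((s, leaf))
            return worst, failures

        # Hypothetical two-snapshot comparison for three leaves.
        planned   = [[10.0, 12.0, 15.0], [11.0, 13.0, 16.0]]
        delivered = [[10.1, 12.0, 14.9], [11.0, 14.4, 16.1]]
        print(compare_leaf_positions(planned, delivered))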

  8. Statistical analysis of variability properties of the Kepler blazar W2R 1926+42

    NASA Astrophysics Data System (ADS)

    Li, Yutong; Hu, Shaoming; Wiita, Paul J.; Gupta, Alok C.

    2018-04-01

    We analyzed Kepler light curves of the blazar W2R 1926+42 that provided nearly continuous coverage from quarter 11 through quarter 17 (589 days between 2011 and 2013) and examined some of their flux variability properties. We investigate the possibility that the light curve is dominated by a large number of individual flares and adopt exponential rise and decay models to investigate the symmetry properties of flares. We found that those variations of W2R 1926+42 are predominantly asymmetric with weak tendencies toward positive asymmetry (rapid rise and slow decay). The durations (D) and the amplitudes (F0) of flares can be fit with log-normal distributions. The energy (E) of each flare is also estimated for the first time. There are positive correlations between log D and log E with a slope of 1.36, and between log F0 and log E with a slope of 1.12. Lomb-Scargle periodograms are used to estimate the power spectral density (PSD) shape. It is well described by a power law with an index ranging between -1.1 and -1.5. The sizes of the emission regions, R, are estimated to be in the range of 1.1 × 10¹⁵ cm to 6.6 × 10¹⁶ cm. The flare asymmetry is difficult to explain by a light travel time effect but may be caused by differences between the timescales for acceleration and dissipation of high-energy particles in the relativistic jet. A jet-in-jet model also could produce the observed log-normal distributions.
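
    The adopted flare shape, an exponential rise followed by an exponential decay, can be written compactly; the sketch below uses illustrative parameter values and one common asymmetry convention (the difference of the two timescales over their sum), which is not necessarily the paper's exact definition:

        import math

        def flare_flux(t, t_peak, f0, t_rise, t_decay):
            """Exponential-rise / exponential-decay flare profile."""
            if t <= t_peak:
                return f0 * math.exp((t - t_peak) / t_rise)
            return f0 * math.exp(-(t - t_peak) / t_decay)

        def asymmetry(t_rise, t_decay):
            """One common convention: positive for rapid rise and slow decay."""
            return (t_decay - t_rise) / (t_decay + t_rise)

        # Illustrative flare: peak flux 1.0 at t = 5 d, 0.5 d rise, 2 d decay.
        print([round(flare_flux(t, 5.0, 1.0, 0.5, 2.0), 3) for t in range(0, 11)])
        print(f"asymmetry = {asymmetry(0.5, 2.0):+.2f}")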

  9. Solutions for acceleration measurement in vehicle crash tests

    NASA Astrophysics Data System (ADS)

    Dima, D. S.; Covaciu, D.

    2017-10-01

    Crash tests are useful for validating computer simulations of road traffic accidents. One of the most important parameters measured is the acceleration. The evolution of acceleration versus time during a crash test forms a crash pulse. The correctness of the crash pulse determination depends on the data acquisition system used. Recommendations regarding the instrumentation for impact tests are given in standards, which are focused on the use of accelerometers as impact sensors. The goal of this paper is to present the device and software developed by the authors for data acquisition and processing. The system includes two accelerometers with different input ranges, a processing unit based on a 32-bit microcontroller and a data logging unit with SD card. Data collected on the card, as text files, is processed with dedicated software running on personal computers. The processing is based on diagrams and includes the digital filters recommended in standards.
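
    As an example of the kind of post-processing applied to logged crash data (a generic sketch using a Butterworth low-pass filter as a stand-in; the standards referred to above prescribe specific channel-frequency-class filters that are not reproduced here, and the signal is synthetic):

        import numpy as np
        from scipy.signal import butter, filtfilt

        def lowpass_acceleration(accel_g, sample_rate_hz, cutoff_hz=300.0, order=4):
            """Zero-phase low-pass filter for a raw acceleration trace (in g)."""
            b, a = butter(order, cutoff_hz / (0.5 * sample_rate_hz))
            return filtfilt(b, a, accel_g)

        # Hypothetical 10 kHz recording: a 50 ms half-sine crash pulse plus sensor noise.
        fs = 10_000.0
        t = np.arange(0, 0.2, 1 / fs)
        pulse = 30.0 * np.sin(np.pi * np.clip(t - 0.05, 0, 0.05) / 0.05)
        raw = pulse + np.random.normal(0, 2.0, t.size)
        print(lowpass_acceleration(raw, fs)[:5])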

  10. Design and performance of a high resolution, low latency stripline beam position monitor system

    NASA Astrophysics Data System (ADS)

    Apsimon, R. J.; Bett, D. R.; Blaskovic Kraljevic, N.; Burrows, P. N.; Christian, G. B.; Clarke, C. I.; Constance, B. D.; Dabiri Khah, H.; Davis, M. R.; Perry, C.; Resta López, J.; Swinson, C. J.

    2015-03-01

    A high-resolution, low-latency beam position monitor (BPM) system has been developed for use in particle accelerators and beam lines that operate with trains of particle bunches with bunch separations as low as several tens of nanoseconds, such as future linear electron-positron colliders and free-electron lasers. The system was tested with electron beams in the extraction line of the Accelerator Test Facility at the High Energy Accelerator Research Organization (KEK) in Japan. It consists of three stripline BPMs instrumented with analogue signal-processing electronics and a custom digitizer for logging the data. The design of the analogue processor units is presented in detail, along with measurements of the system performance. The processor latency is 15.6 ±0.1 ns . A single-pass beam position resolution of 291 ±10 nm has been achieved, using a beam with a bunch charge of approximately 1 nC.
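
    Stripline BPMs commonly derive the beam position from the normalised difference of the signals on opposing electrodes; the sketch below illustrates this generic difference-over-sum estimate with an assumed scale factor, and is not a description of the analogue processor described in the paper:

        def bpm_position_mm(v_top, v_bottom, k_mm=12.0):
            """Difference-over-sum position estimate for one plane of a stripline BPM.

            k_mm is an assumed, monitor-specific geometry scale factor."""
            return k_mm * (v_top - v_bottom) / (v_top + v_bottom)

        # Hypothetical digitised electrode amplitudes (arbitrary units).
        print(f"y = {bpm_position_mm(1.03, 0.97):+.3f} mm")   # small positive offset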

  11. EFQPSK Versus CERN: A Comparative Study

    NASA Technical Reports Server (NTRS)

    Borah, Deva K.; Horan, Stephen

    2001-01-01

    This report presents a comparative study on Enhanced Feher's Quadrature Phase Shift Keying (EFQPSK) and Constrained Envelope Root Nyquist (CERN) techniques. These two techniques have been developed in recent times to provide high spectral and power efficiencies in a nonlinear amplifier environment. The purpose of this study is to gain insights into these techniques and to provide system planners and designers with an appropriate set of guidelines for using them. The comparative study presented in this report relies on effective simulation models and procedures. Therefore, a significant part of this report is devoted to understanding the mathematical and simulation models of the techniques and their set-up procedures. In particular, mathematical models of EFQPSK and CERN, effects of the sampling rate in discrete-time signal representation, and modeling of nonlinear amplifiers and predistorters have been considered in detail. The results of this study show that both EFQPSK and CERN signals provide spectrally efficient communications compared to filtered conventional linear modulation techniques when a nonlinear power amplifier is used. However, there are important differences. The spectral efficiency of CERN signals, with a small amount of input backoff, is significantly better than that of EFQPSK signals if the nonlinear amplifier is an ideal clipper. However, to achieve such spectral efficiencies with a practical nonlinear amplifier, CERN processing requires a predistorter which effectively translates the amplifier's characteristics close to those of an ideal clipper. Thus, the spectral performance of CERN signals strongly depends on the predistorter. EFQPSK signals, on the other hand, do not need such predistorters since their spectra are almost unaffected by the nonlinear amplifier. This report also discusses several receiver structures for EFQPSK signals. It is observed that optimal receiver structures can be realized for both coded and uncoded EFQPSK signals without much increase in computational complexity. When a nonlinear amplifier is used, the bit error rate (BER) performance of CERN signals with a matched-filter receiver is found to be more than one decibel (dB) worse than the bit error performance of EFQPSK signals. Although channel coding is found to provide BER performance improvement for both EFQPSK and CERN signals, the performance of EFQPSK signals remains better than that of CERN. Optimal receiver structures for CERN signals with nonlinear equalization are left as possible future work. Based on the numerical results, it is concluded that, in nonlinear channels, CERN processing leads toward better bandwidth efficiency with a compromise in power efficiency. Hence, for bandwidth-efficient communication needs, CERN is a good solution provided effective adaptive predistorters can be realized. On the other hand, EFQPSK signals provide a good power-efficient solution with a compromise in bandwidth efficiency.
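
    A hedged sketch of the "ideal clipper" nonlinearity referred to above: the complex envelope is hard-limited in amplitude while the phase is preserved, and spectra before and after the nonlinearity are compared to see spectral regrowth. The QPSK-like test signal, the crude pulse shaping and the clipping level are illustrative assumptions, not the simulation set-up of the report.

      # Hedged sketch of an ideal clipper (soft limiter) acting on a complex
      # baseband signal, with a crude measure of out-of-band spectral regrowth.
      # Signal construction and clipping level are illustrative only.
      import numpy as np

      def ideal_clipper(x, a_max):
          """Clip the envelope of a complex baseband signal at a_max, keeping the phase."""
          amp = np.abs(x)
          scale = np.where(amp > a_max, a_max / np.maximum(amp, 1e-12), 1.0)
          return x * scale

      rng = np.random.default_rng(2)
      symbols = (rng.choice([-1.0, 1.0], 4096) + 1j * rng.choice([-1.0, 1.0], 4096)) / np.sqrt(2)
      signal = np.convolve(np.repeat(symbols, 8), np.ones(8) / 8.0, mode="same")  # crude pulse shaping

      clipped = ideal_clipper(signal, a_max=0.8 * np.abs(signal).max())

      spec_in = np.abs(np.fft.fftshift(np.fft.fft(signal))) ** 2
      spec_out = np.abs(np.fft.fftshift(np.fft.fft(clipped))) ** 2
      n = len(signal)
      band = slice(n // 4, 3 * n // 4)                 # crude "in-band" region around DC
      oob = lambda s: 1.0 - s[band].sum() / s.sum()    # fraction of power outside that region
      print(f"out-of-band power fraction: {oob(spec_in):.3e} -> {oob(spec_out):.3e}")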

  12. Released advance reproduction of white and red fir. . . growth, damage, mortality

    Treesearch

    Donald T. Gordon

    1973-01-01

    Advance reproduction of white fir and red fir released by cutting overmature over-story was studied at the Swain Mountain Experimental Forest in northern California, at 6,300 feet elevation. Seedling and sapling height growth before logging was only 0.1-0.2 foot per year. Five years after cutting, seedling and sapling height growth had accelerated to about 0.5 to 0.8...

  13. Mesocarnivores as focal species for the restoration of post-logging second growth in the northern redwoods

    Treesearch

    Keith M. Slauson

    2012-01-01

    The management of second growth forests to accelerate the restoration of late-successional and old growth characteristics will be one of the greatest challenges for conservation in the redwood region over the next century. In the redwood region, the largest complex of protected areas exists in the north; however, >50 percent of these forest reserves are composed...

  14. New Insights into the Consequences of Post-Windthrow Salvage Logging Revealed by Functional Structure of Saproxylic Beetles Assemblages

    PubMed Central

    Thorn, Simon; Bässler, Claus; Gottschalk, Thomas; Hothorn, Torsten; Bussler, Heinz; Raffa, Kenneth; Müller, Jörg

    2014-01-01

    Windstorms, bark beetle outbreaks and fires are important natural disturbances in coniferous forests worldwide. Wind-thrown trees promote biodiversity and restoration within production forests, but also cause large economic losses due to bark beetle infestation and accelerated fungal decomposition. Such damaged trees are often removed by salvage logging, which leads to decreased biodiversity and thus increasingly evokes discussions between economists and ecologists about appropriate strategies. To reveal the reasons behind species loss after salvage logging, we used a functional approach based on four habitat-related ecological traits and focused on saproxylic beetles. We predicted that salvage logging would decrease functional diversity (measured as effect sizes of mean pairwise distances using null models) as well as mean values of beetle body size, wood diameter niche and canopy cover niche, but would increase decay stage niche. As expected, salvage logging caused a decrease in species richness, but led to an increase in functional diversity by altering the species composition from habitat-filtered assemblages toward random assemblages. Even though salvage logging removes tree trunks, the most negative effects were found for small and heliophilous species and for species specialized on wood of small diameter. Our results suggested that salvage logging disrupts the natural assembly process on windthrown trees and that negative ecological impacts are caused more by microclimate alteration of the dead-wood objects than by loss of resource amount. These insights underline the power of functional approaches to detect ecosystem responses to anthropogenic disturbance and form a basis for management decisions in conservation. To mitigate negative effects on saproxylic beetle diversity after windthrows, we recommend preserving single windthrown trees or at least their tops with exposed branches during salvage logging. Such an extension of the green-tree retention approach to windthrown trees will preserve natural succession and associated communities of disturbed spruce forests. PMID:25050914

  15. New insights into the consequences of post-windthrow salvage logging revealed by functional structure of saproxylic beetles assemblages.

    PubMed

    Thorn, Simon; Bässler, Claus; Gottschalk, Thomas; Hothorn, Torsten; Bussler, Heinz; Raffa, Kenneth; Müller, Jörg

    2014-01-01

    Windstorms, bark beetle outbreaks and fires are important natural disturbances in coniferous forests worldwide. Wind-thrown trees promote biodiversity and restoration within production forests, but also cause large economic losses due to bark beetle infestation and accelerated fungal decomposition. Such damaged trees are often removed by salvage logging, which leads to decreased biodiversity and thus increasingly evokes discussions between economists and ecologists about appropriate strategies. To reveal the reasons behind species loss after salvage logging, we used a functional approach based on four habitat-related ecological traits and focused on saproxylic beetles. We predicted that salvage logging would decrease functional diversity (measured as effect sizes of mean pairwise distances using null models) as well as mean values of beetle body size, wood diameter niche and canopy cover niche, but would increase decay stage niche. As expected, salvage logging caused a decrease in species richness, but led to an increase in functional diversity by altering the species composition from habitat-filtered assemblages toward random assemblages. Even though salvage logging removes tree trunks, the most negative effects were found for small and heliophilous species and for species specialized on wood of small diameter. Our results suggested that salvage logging disrupts the natural assembly process on windthrown trees and that negative ecological impacts are caused more by microclimate alteration of the dead-wood objects than by loss of resource amount. These insights underline the power of functional approaches to detect ecosystem responses to anthropogenic disturbance and form a basis for management decisions in conservation. To mitigate negative effects on saproxylic beetle diversity after windthrows, we recommend preserving single windthrown trees or at least their tops with exposed branches during salvage logging. Such an extension of the green-tree retention approach to windthrown trees will preserve natural succession and associated communities of disturbed spruce forests.

  16. Prompt radiation, shielding and induced radioactivity in a high-power 160 MeV proton linac

    NASA Astrophysics Data System (ADS)

    Magistris, Matteo; Silari, Marco

    2006-06-01

    CERN is designing a 160 MeV proton linear accelerator, both for a future intensity upgrade of the LHC and as a possible first stage of a 2.2 GeV superconducting proton linac. A first estimate of the required shielding was obtained by means of a simple analytical model. The source terms and the attenuation lengths used in the present study were calculated with the Monte Carlo cascade code FLUKA. Detailed FLUKA simulations were performed to investigate the contribution of neutron skyshine and backscattering to the expected dose rate in the areas around the linac tunnel. An estimate of the induced radioactivity in the magnets, vacuum chamber, the cooling system and the concrete shield was performed. A preliminary thermal study of the beam dump is also discussed.
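
    A hedged sketch of the simple analytical shielding estimate of the kind this record mentions: a point-source, line-of-sight model in which the dose rate attenuates exponentially in the shield and falls with the square of the distance. The source term, attenuation length and beam loss rate below are placeholder values, not the FLUKA-derived parameters of the study.

      # Hedged sketch of a line-of-sight shielding estimate: H = H0 * exp(-d/lambda) / r^2.
      # H0, LAMBDA and beam_loss are placeholders, not the values used in the paper.
      import math

      def dose_rate_per_proton(h0_sv_m2, r_m, d_m, atten_length_m):
          """Dose rate term per primary proton behind a shield of thickness d_m at distance r_m."""
          return h0_sv_m2 * math.exp(-d_m / atten_length_m) / r_m**2

      H0 = 2.0e-15        # Sv*m^2 per proton lost (placeholder source term)
      LAMBDA = 0.5        # m, attenuation length in concrete (placeholder)
      beam_loss = 1.0e9   # protons per second lost at the point considered (placeholder)

      for thickness in (1.0, 2.0, 3.0, 4.0):
          h = dose_rate_per_proton(H0, r_m=3.0, d_m=thickness, atten_length_m=LAMBDA) * beam_loss * 3600.0
          print(f"{thickness:.0f} m concrete: {h:.3e} Sv/h")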

  17. Discrete anti-gravity

    NASA Astrophysics Data System (ADS)

    Noyes, H. Pierre; Starson, Scott

    1991-03-01

    Discrete physics, because it replaces time evolution generated by the energy operator with a global bit-string generator (program universe) and replaces fields with the relativistic Wheeler-Feynman action at a distance, allows the consistent formulation of the concept of signed gravitational charge for massive particles. The resulting prediction made by this version of the theory is that free anti-particles near the surface of the earth will fall up with the same acceleration that the corresponding particles fall down. So far as we can see, no current experimental information is in conflict with this prediction of our theory. The experimentum crucis will be one of the anti-proton or anti-hydrogen experiments at CERN. Our prediction should be much easier to test than the small effects which those experiments are currently designed to detect or bound.

  18. Liulin-type spectrometry-dosimetry instruments.

    PubMed

    Dachev, Ts; Dimitrov, Pl; Tomov, B; Matviichuk, Yu; Spurny, F; Ploc, O; Brabcova, K; Jadrnickova, I

    2011-03-01

    The main purpose of Liulin-type spectrometry-dosimetry instruments (LSDIs) is cosmic radiation monitoring at workplaces. Functionally, an LSDI is a low-mass, low-power-consumption or battery-operated dosemeter. LSDIs were calibrated in a wide range of radiation fields, including radiation sources, proton and heavy-ion accelerators and the CERN-EC high-energy reference field. Since 2000, LSDIs have been used in the scientific programmes of four manned space flights on the American Laboratory and ESA Columbus modules and on the Russian segment of the International Space Station, one Moon spacecraft and three spacecraft around the Earth, one rocket, two balloons and many aircraft flights. In addition to their relatively low price, LSDIs have proved their ability to qualify the radiation field on the ground and on the above-mentioned carriers.

  19. Controlling front-end electronics boards using commercial solutions

    NASA Astrophysics Data System (ADS)

    Beneyton, R.; Gaspar, C.; Jost, B.; Schmeling, S.

    2002-04-01

    LHCb is a dedicated B-physics experiment under construction at CERN's Large Hadron Collider (LHC) accelerator. This paper will describe the novel approach LHCb is taking toward controlling and monitoring of electronics boards. Instead of using the bus in a crate to exercise control over the boards, we use credit-card sized personal computers (CCPCs) connected via Ethernet to cheap control PCs. The CCPCs will provide simple parallel, I2C, and JTAG buses to the electronics board. Each board will be equipped with a CCPC and, hence, will be completely independently controlled. The advantages of this scheme versus the traditional bus-based scheme will be described. Also, the integration of the controls of the electronics boards into a commercial supervisory control and data acquisition (SCADA) system will be shown.
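
    A hedged sketch of the kind of register access a CCPC-style controller performs over I2C, written with the third-party Python smbus2 package. The bus number, board address and register map are hypothetical placeholders, not LHCb's actual control software or register layout.

      # Hedged sketch of board control over I2C in the spirit of the CCPC approach.
      # Requires the smbus2 package and I2C hardware; all addresses are hypothetical.
      from smbus2 import SMBus

      I2C_BUS = 1            # hypothetical I2C bus exposed by the controller
      BOARD_ADDR = 0x42      # hypothetical board address on that bus
      REG_STATUS = 0x00      # hypothetical status register
      REG_CONTROL = 0x01     # hypothetical control register

      def configure_board(enable_clock: bool = True) -> int:
          """Write a control bit to the board and read back its status register."""
          with SMBus(I2C_BUS) as bus:
              bus.write_byte_data(BOARD_ADDR, REG_CONTROL, 0x01 if enable_clock else 0x00)
              return bus.read_byte_data(BOARD_ADDR, REG_STATUS)

      if __name__ == "__main__":
          print(f"status register = 0x{configure_board():02x}")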

  20. Learning with the ATLAS Experiment at CERN

    ERIC Educational Resources Information Center

    Barnett, R. M.; Johansson, K. E.; Kourkoumelis, C.; Long, L.; Pequenao, J.; Reimers, C.; Watkins, P.

    2012-01-01

    With the start of the LHC, the new particle collider at CERN, the ATLAS experiment is also providing high-energy particle collisions for educational purposes. Several education projects--education scenarios--have been developed and tested on students and teachers in several European countries within the Learning with ATLAS@CERN project. These…

  1. First experience with the new .cern Top Level Domain

    NASA Astrophysics Data System (ADS)

    Alvarez, E.; Malo de Molina, M.; Salwerowicz, M.; Silva De Sousa, B.; Smith, T.; Wagner, A.

    2017-10-01

    In October 2015, CERN’s core website was moved to a new address, http://home.cern, marking the launch of the brand new top-level domain .cern. In combination with a formal governance and registration policy, the IT infrastructure needed to be extended to accommodate the hosting of web sites in this new top-level domain. We will present the technical implementation in the framework of the CERN Web Services that allows us to provide virtual hosting and a reverse-proxy solution, and that also includes the provisioning of SSL server certificates for secure communications.

  2. Moment rate scaling for earthquakes 3.3 ≤ M ≤ 5.3 with implications for stress drop

    NASA Astrophysics Data System (ADS)

    Archuleta, Ralph J.; Ji, Chen

    2016-12-01

    We have determined a scalable apparent moment rate function (aMRF) that correctly predicts the peak ground acceleration (PGA), peak ground velocity (PGV), local magnitude, and the ratio of PGA/PGV for earthquakes 3.3 ≤ M ≤ 5.3. Using the NGA-West2 database for 3.0 ≤ M ≤ 7.7, we find a break in scaling of LogPGA and LogPGV versus M around M 5.3, with nearly linear scaling for LogPGA and LogPGV for 3.3 ≤ M ≤ 5.3. Temporal parameters tp and td—related to rise time and total duration—control the aMRF. Both scale with seismic moment. The Fourier amplitude spectrum of the aMRF has two corners between which the spectrum decays as f^-1. Significant attenuation along the raypath results in a Brune-like spectrum with one corner fc. Assuming that fc ≅ 1/td, the aMRF predicts non-self-similar scaling M0 ∝ fc^-3.3 and weak stress drop scaling Δσ ∝ M0^0.091. This aMRF can explain why stress drop is different from the stress parameter used to predict high-frequency ground motion.

  3. The comparison of proportional hazards and accelerated failure time models in analyzing the first birth interval survival data

    NASA Astrophysics Data System (ADS)

    Faruk, Alfensi

    2018-03-01

    Survival analysis is a branch of statistics focussed on the analysis of time-to-event data. In multivariate survival analysis, the proportional hazards (PH) model is the most popular model for analyzing the effects of several covariates on the survival time. However, the PH assumption of a time-constant hazard ratio is not always satisfied by the data. The violation of the PH assumption leads to misinterpretation of the estimation results and decreases the power of the related statistical tests. The accelerated failure time (AFT) models, on the other hand, do not rely on the PH assumption and can therefore be used as an alternative to the PH model when that assumption is violated. The objective of this research was to compare the performance of the PH model and the AFT models in analyzing the significant factors affecting the first birth interval (FBI) data in Indonesia. In this work, the discussion was limited to three AFT models, based on the Weibull, exponential, and log-normal distributions. The analysis using a graphical approach and a statistical test showed that non-proportional hazards exist in the FBI data set. Based on the Akaike information criterion (AIC), the log-normal AFT model was the most appropriate model among the considered models. Results of the best-fitted model (the log-normal AFT model) showed that covariates such as women’s educational level, husband’s educational level, contraceptive knowledge, access to mass media, wealth index, and employment status were among the factors affecting the FBI in Indonesia.
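
    A hedged sketch of the model comparison this record describes: fitting a Cox PH model and two AFT models and ranking them by AIC, using the lifelines package. The bundled Rossi recidivism dataset merely stands in for the first-birth-interval data, and the manually computed AIC uses the partial likelihood in the Cox case; this is not the authors' analysis.

      # Hedged sketch: compare Cox PH, Weibull AFT and log-normal AFT models by AIC
      # with lifelines. The Rossi dataset is a stand-in, not the FBI data of the paper.
      from lifelines import CoxPHFitter, WeibullAFTFitter, LogNormalAFTFitter
      from lifelines.datasets import load_rossi

      df = load_rossi()                 # columns: 'week' (duration), 'arrest' (event)

      def aic(model):
          # AIC = 2k - 2*log-likelihood (partial likelihood in the Cox case)
          return 2 * len(model.params_) - 2 * model.log_likelihood_

      results = {}
      for name, model in [("Cox PH", CoxPHFitter()),
                          ("Weibull AFT", WeibullAFTFitter()),
                          ("Log-normal AFT", LogNormalAFTFitter())]:
          model.fit(df, duration_col="week", event_col="arrest")
          results[name] = aic(model)

      for name, value in sorted(results.items(), key=lambda kv: kv[1]):
          print(f"{name:>15s}: AIC = {value:.1f}")   # lower AIC = preferred model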

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rafat, M; Bazalova, M; Palma, B

    Purpose: To characterize the effect of very rapid dose delivery as compared to conventional therapeutic irradiation times on clonogenic cell survival. Methods: We used a Varian Trilogy linear accelerator to deliver doses up to 10 Gy using a 6 MV SRS photon beam. We irradiated four cancer cell lines in times ranging from 30 sec to 30 min. We also used a Varian TrueBeam linear accelerator to deliver 9 MeV electrons at 10 Gy in 10 s to 30 min to determine the effect of irradiation time on cell survival. We then evaluated the effect of using 60 and 120 MeV electrons on cell survival using the Next Linear Collider Test Accelerator (NLCTA) beam line at the SLAC National Accelerator Laboratory. During irradiation, adherent cells were maintained at 37 °C with 20% O2/5% CO2. Clonogenic assays were completed following irradiation to determine changes in cell survival due to dose delivery time and beam quality, and the survival data were fitted with the linear-quadratic model. Results: Cell lines varied in radiosensitivity, ranging from two to four logs of cell kill at 10 Gy for both conventional and very rapid irradiation. Delivering radiation in shorter times decreased survival in all cell lines. Log differences in cell kill ranged from 0.2 to 0.7 at 10 Gy for the short compared to the long irradiation time. Cell kill differences between short and long irradiations were more pronounced as doses increased for all cell lines. Conclusion: Our findings suggest that shortening delivery of therapeutic radiation doses to less than 1 minute may improve tumor cell kill. This study demonstrates the potential advantage of technologies under development to deliver stereotactic ablative radiation doses very rapidly. Bill Loo and Peter Maxim have received honoraria from Varian and research support from Varian and RaySearch.
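
    A hedged sketch of fitting the linear-quadratic survival model mentioned above, S(D) = exp(-αD - βD²), to clonogenic data with scipy. The dose and surviving-fraction values are made up for illustration; they are not the measurements reported in this record.

      # Hedged sketch: fit S(D) = exp(-alpha*D - beta*D^2) to made-up clonogenic
      # surviving fractions and report the implied logs of cell kill at 10 Gy.
      import numpy as np
      from scipy.optimize import curve_fit

      def lq_model(dose, alpha, beta):
          return np.exp(-alpha * dose - beta * dose**2)

      dose = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])                      # Gy
      surviving_fraction = np.array([1.0, 0.55, 0.20, 0.05, 1.2e-2, 2.5e-3])  # invented values

      (alpha, beta), _ = curve_fit(lq_model, dose, surviving_fraction, p0=(0.2, 0.02))
      print(f"alpha = {alpha:.3f} 1/Gy, beta = {beta:.4f} 1/Gy^2")
      print(f"logs of cell kill at 10 Gy: {-np.log10(lq_model(10.0, alpha, beta)):.2f}")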

  5. Hangout with CERN: a direct conversation with the public

    NASA Astrophysics Data System (ADS)

    Rao, Achintya; Goldfarb, Steven; Kahle, Kate

    2016-04-01

    Hangout with CERN refers to a weekly, half-hour-long, topical webcast hosted at CERN. The aim of the programme is threefold: (i) to provide a virtual tour of various locations and facilities at CERN, (ii) to discuss the latest scientific results from the laboratory, and, most importantly, (iii) to engage in conversation with the public and answer their questions. For each "episode", scientists gather around webcam-enabled computers at CERN and partner institutes/universities, connecting to one another using the Google+ social network's "Hangouts" tool. The show is structured as a conversation mediated by a host, usually a scientist, and viewers can ask questions to the experts in real time through a Twitter hashtag or YouTube comments. The history of Hangout with CERN can be traced back to ICHEP 2012, where several physicists crowded in front of a laptop connected to Google+, using a "Hangout On Air" webcast to explain to the world the importance of the discovery of the Higgs-like boson, announced just two days before at the same conference. Hangout with CERN has also drawn inspiration from two existing outreach endeavours: (i) ATLAS Virtual Visits, which connected remote visitors with scientists in the ATLAS Control Room via video conference, and (ii) the Large Hangout Collider, in which CMS scientists gave underground tours via Hangouts to groups of schools and members of the public around the world. In this paper, we discuss the role of Hangout with CERN as a bi-directional outreach medium and an opportunity to train scientists in effective communication.

  6. First operational experience with the HIE-Isolde helium cryogenic system including several RF cryo-modules

    NASA Astrophysics Data System (ADS)

    Guillotin, N.; Dupont, T.; Gayet, Ph; Pirotte, O.

    2017-12-01

    The High Intensity and Energy ISOLDE (HIE-ISOLDE) upgrade project at CERN includes the deployment of new superconducting accelerating structures operated at 4.5 K (ultimately six cryo-modules installed in series), and the refurbishment of the helium cryo-plant previously used to cool the ALEPH magnet during the operation of the LEP accelerator from 1989 to 2000. The helium refrigerator is connected to a new cryogenic distribution line, supplying a 2000-liter storage dewar and six interconnecting valve boxes (i.e. jumper boxes), one for each cryo-module. After a first operation period with one cryo-module during six months in 2015, a second cryo-module was installed and operated during 2016. The operation of the cryo-plant with these two cryo-modules has required significant technical enhancements and tuning of the compressor station, the cold box and the cryogenic distribution system in order to reach nominal and stable operational conditions. The present paper describes the commissioning results and the lessons learnt during the operation campaign of 2016, together with the preliminary experience acquired during the 2017 operation phase with a third cryo-module.

  7. INSTRUMENTS AND METHODS OF INVESTIGATION: Giant pulses of thermal neutrons in large accelerator beam dumps. Possibilities for experiments

    NASA Astrophysics Data System (ADS)

    Stavissky, Yurii Ya

    2006-12-01

    A short review is presented of the development in Russia of intense pulsed neutron sources for physical research — the pulsating fast reactors IBR-1, IBR-30, IBR-2 (Joint Institute for Nuclear Research, Dubna), and the neutron-radiation complex of the Moscow meson factory — the 'Troitsk Trinity' (RAS Institute for Nuclear Research, Troitsk, Moscow region). The possibility of generating giant neutron pulses in beam dumps of superhigh-energy accelerators is discussed. In particular, the possibility of producing giant pulsed thermal neutron fluxes in modified beam dumps of the Large Hadron Collider (LHC) under construction at CERN is considered. It is shown that in the case of one-turn extraction of 7-TeV protons accumulated in the LHC main rings onto heavy targets with water or zirconium-hydride moderators placed in the front part of the LHC graphite beam-dump blocks, every 10 hours relatively short (from ~100 µs) thermal neutron pulses with a peak flux density of up to ~10^20 neutrons cm^-2 s^-1 may be produced. The possibility of applying such neutron pulses in physical research is discussed.

  8. Approaching the CDF Top Quark Mass Legacy Measurement in the Lepton+Jets channel with the Matrix Element Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tosciri, Cecilia

    2016-01-01

    The discovery of the bottom quark in 1977 at the Tevatron Collider triggered the search for its partner in the third fermion isospin doublet, the top quark, which was discovered 18 years later in 1995 by the CDF and D0 experiments during Tevatron Run I. By 1990, intensive efforts by many groups at several accelerators had lifted the lower mass limit to over 90 GeV/c^2, such that since then the Tevatron was the only accelerator with high enough energy to possibly discover this amazingly massive quark. After its discovery, the determination of top quark properties has been one of the main goals of the Fermilab Tevatron Collider, and more recently also of the Large Hadron Collider (LHC) at CERN. Since the mass value plays an important role in a large number of theoretical calculations on fundamental processes, improving the accuracy of its measurement has at all times been a goal of utmost importance. The present thesis describes in detail the contributions made by the candidate to the extensive preparation work needed to make the new analysis possible, during her 8-month-long stay at Fermilab.

  9. Possibilities For The LAGUNA Projects At The Fréjus Site

    NASA Astrophysics Data System (ADS)

    Mosca, Luigi

    2010-11-01

    The present laboratory (LSM) at the Fréjus site and the project for a first extension of it, mainly aimed at the next generation of dark matter and double beta decay experiments, are briefly reviewed. Then the main characteristics of the LAGUNA cooperation and Design Study network are summarized. Seven underground sites in Europe are considered in LAGUNA and are under study as candidates for the installation of Megaton-scale detectors using three different techniques: a liquid Argon TPC (GLACIER), a liquid scintillator detector (LENA) and a water Cherenkov detector (MEMPHYS), all mainly aimed at the investigation of proton decay and of the properties of neutrinos from SuperNovae and other astrophysical sources as well as from accelerators (Super-beams and/or Beta-beams from CERN). One of the seven sites is located at Fréjus, near the present LSM laboratory, and the results of its feasibility study are presented and discussed. Then the physics potential of a MEMPHYS detector installed at this site is emphasized, both for non-accelerator and for neutrino-beam-based configurations. The MEMPHYNO prototype with its R&D programme is presented. Finally a possible schedule is sketched.

  10. M. Hildred Blewett and the Blewett Scholarship

    NASA Astrophysics Data System (ADS)

    Whitten, Barbara

    2011-03-01

    M. Hildred Blewett became a physicist at a time when few women were physicists. After beginning her career at General Electric, she became a respected accelerator physicist, working at Brookhaven, Argonne, and eventually CERN. Blewett was married for a time to John Blewett, another accelerator physicist, but the couple divorced without children and she never remarried. She felt that her career in physics was hampered by her gender, and when she died in 2004 at the age of 93, she left the bulk of her estate to the American Physical Society, to found a Scholarship for women in physics. Since 2005 the Blewett Scholarship has been awarded to women in physics who are returning to physics after a career break, usually for family reasons. Family/career conflicts are one of the most important reasons why young women in early careers leave physics---a loss for them as well as the physics community, which has invested time and money in their training. The Blewett Scholarship is one way for the physics community, under the leadership of CSWP, to help these young women resume their careers. I will discuss the life and work of Hildred Blewett, the Blewett Scholarship, and its benefits to the physics community.

  11. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-15

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network Constituents, Fundamental Forces and Symmetries of the Universe. The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva.

  12. PREFACE: International Conference on Computing in High Energy and Nuclear Physics (CHEP 2010)

    NASA Astrophysics Data System (ADS)

    Lin, Simon C.; Shen, Stella; Neufeld, Niko; Gutsche, Oliver; Cattaneo, Marco; Fisk, Ian; Panzer-Steindel, Bernd; Di Meglio, Alberto; Lokajicek, Milos

    2011-12-01

    The International Conference on Computing in High Energy and Nuclear Physics (CHEP) was held at Academia Sinica in Taipei from 18-22 October 2010. CHEP is a major series of international conferences for physicists and computing professionals from the worldwide High Energy and Nuclear Physics community, Computer Science, and Information Technology. The CHEP conference provides an international forum to exchange information on computing progress and needs for the community, and to review recent, ongoing and future activities. CHEP conferences are held at roughly 18-month intervals, alternating between Europe, Asia, America and other parts of the world. Recent CHEP conferences have been held in Prague, Czech Republic (2009); Victoria, Canada (2007); Mumbai, India (2006); Interlaken, Switzerland (2004); San Diego, California (2003); Beijing, China (2001); and Padova, Italy (2000). CHEP 2010 was organized by the Academia Sinica Grid Computing Centre. There was an International Advisory Committee (IAC) setting the overall themes of the conference, a Programme Committee (PC) responsible for the content, as well as a Conference Secretariat responsible for the conference infrastructure. There were over 500 attendees with a program that included plenary sessions of invited speakers, a number of parallel sessions comprising around 260 oral and 200 poster presentations, and industrial exhibitions. We thank all the presenters for the excellent scientific content of their contributions to the conference. Conference tracks covered topics on Online Computing; Event Processing; Software Engineering, Data Stores, and Databases; Distributed Processing and Analysis; Computing Fabrics and Networking Technologies; Grid and Cloud Middleware; and Collaborative Tools. The conference included excursions to various attractions in Northern Taiwan, including Sanhsia Tsu Shih Temple, Yingko, Chiufen Village, the Northeast Coast National Scenic Area, Keelung, Yehliu Geopark, and Wulai Aboriginal Village, as well as two banquets held at the Grand Hotel and Grand Formosa Regent in Taipei. The next CHEP conference will be held in New York, the United States, on 21-25 May 2012. We would like to thank the National Science Council of Taiwan, the EU ACEOLE project, commercial sponsors, and the International Advisory Committee and the Programme Committee members for all their support and help. Special thanks to the Programme Committee members for their careful choice of conference contributions and enormous effort in reviewing and editing about 340 post-conference proceedings papers.
Simon C Lin, CHEP 2010 Conference Chair and Proceedings Editor, Taipei, Taiwan, November 2011
Track Editors / Programme Committee
  Chair: Simon C Lin, Academia Sinica, Taiwan
  Online Computing Track: Y H Chang, National Central University, Taiwan; Harry Cheung, Fermilab, USA; Niko Neufeld, CERN, Switzerland
  Event Processing Track: Fabio Cossutti, INFN Trieste, Italy; Oliver Gutsche, Fermilab, USA; Ryosuke Itoh, KEK, Japan
  Software Engineering, Data Stores, and Databases Track: Marco Cattaneo, CERN, Switzerland; Gang Chen, Chinese Academy of Sciences, China; Stefan Roiser, CERN, Switzerland
  Distributed Processing and Analysis Track: Kai-Feng Chen, National Taiwan University, Taiwan; Ulrik Egede, Imperial College London, UK; Ian Fisk, Fermilab, USA; Fons Rademakers, CERN, Switzerland; Torre Wenaus, BNL, USA
  Computing Fabrics and Networking Technologies Track: Harvey Newman, Caltech, USA; Bernd Panzer-Steindel, CERN, Switzerland; Antonio Wong, BNL, USA; Ian Fisk, Fermilab, USA; Niko Neufeld, CERN, Switzerland
  Grid and Cloud Middleware Track: Alberto Di Meglio, CERN, Switzerland; Markus Schulz, CERN, Switzerland
  Collaborative Tools Track: Joao Correia Fernandes, CERN, Switzerland; Philippe Galvez, Caltech, USA; Milos Lokajicek, FZU Prague, Czech Republic
International Advisory Committee
  Chair: Simon C. Lin, Academia Sinica, Taiwan
  Members: Mohammad Al-Turany, FAIR, Germany; Sunanda Banerjee, Fermilab, USA; Dario Barberis, CERN & Genoa University/INFN, Switzerland; Lothar Bauerdick, Fermilab, USA; Ian Bird, CERN, Switzerland; Amber Boehnlein, US Department of Energy, USA; Kors Bos, CERN, Switzerland; Federico Carminati, CERN, Switzerland; Philippe Charpentier, CERN, Switzerland; Gang Chen, Institute of High Energy Physics, China; Peter Clarke, University of Edinburgh, UK; Michael Ernst, Brookhaven National Laboratory, USA; David Foster, CERN, Switzerland; Merino Gonzalo, CIEMAT, Spain; John Gordon, STFC-RAL, UK; Volker Guelzow, Deutsches Elektronen-Synchrotron DESY, Hamburg, Germany; John Harvey, CERN, Switzerland; Frederic Hemmer, CERN, Switzerland; Hafeez Hoorani, NCP, Pakistan; Viatcheslav Ilyin, Moscow State University, Russia; Matthias Kasemann, DESY, Germany; Nobuhiko Katayama, KEK, Japan; Milos Lokajícek, FZU Prague, Czech Republic; David Malon, ANL, USA; Pere Mato Vila, CERN, Switzerland; Mirco Mazzucato, INFN CNAF, Italy; Richard Mount, SLAC, USA; Harvey Newman, Caltech, USA; Mitsuaki Nozaki, KEK, Japan; Farid Ould-Saada, University of Oslo, Norway; Ruth Pordes, Fermilab, USA; Hiroshi Sakamoto, The University of Tokyo, Japan; Alberto Santoro, UERJ, Brazil; Jim Shank, Boston University, USA; Alan Silverman, CERN, Switzerland; Randy Sobie, University of Victoria, Canada; Dongchul Son, Kyungpook National University, South Korea; Reda Tafirout, TRIUMF, Canada; Victoria White, Fermilab, USA; Guy Wormser, LAL, France; Frank Wuerthwein, UCSD, USA; Charles Young, SLAC, USA

  13. CERN@school: bringing CERN into the classroom

    NASA Astrophysics Data System (ADS)

    Whyntie, T.; Cook, J.; Coupe, A.; Fickling, R. L.; Parker, B.; Shearer, N.

    2016-04-01

    CERN@school brings technology from CERN into the classroom to aid with the teaching of particle physics. It also aims to inspire the next generation of physicists and engineers by giving participants the opportunity to be part of a national collaboration of students, teachers and academics, analysing data obtained from detectors based on the ground and in space to make new, curiosity-driven discoveries at school. CERN@school is based around the Timepix hybrid silicon pixel detector developed by the Medipix 2 Collaboration, which features a 300 μm thick silicon sensor bump-bonded to a Timepix readout ASIC. This defines a 256-by-256 grid of pixels with a pitch of 55 μm, the data from which can be used to visualise ionising radiation in a very accessible way. Broadly speaking, CERN@school consists of a web portal that allows access to data collected by the Langton Ultimate Cosmic ray Intensity Detector (LUCID) experiment in space and the student-operated Timepix detectors on the ground; a number of Timepix detector kits for ground-based experiments, to be made available to schools for both teaching and research purposes; and educational resources for teachers to use with LUCID data and the detector kits in the classroom. By providing access to cutting-edge research equipment and raw data from ground- and space-based experiments, CERN@school hopes to provide the foundation for a programme that meets many of the aims and objectives of CERN and the project's supporting academic and industrial partners. The work presented here provides an update on the status of the programme as supported by the UK Science and Technology Facilities Council (STFC) and the Royal Commission for the Exhibition of 1851. This includes recent results from work with the GridPP Collaboration on using grid resources with schools to run GEANT4 simulations of CERN@school experiments.

  14. News Conference: Serbia hosts teachers' seminar Resources: Teachers TV website closes for business Festival: Science takes to the stage in Denmark Research: How noise affects learning in secondary schools CERN: CERN visit inspires new teaching ideas Education: PLS aims to improve perception of science for school students Conference: Scientix conference discusses challenges in science education

    NASA Astrophysics Data System (ADS)

    2011-07-01

    Conference: Serbia hosts teachers' seminar Resources: Teachers TV website closes for business Festival: Science takes to the stage in Denmark Research: How noise affects learning in secondary schools CERN: CERN visit inspires new teaching ideas Education: PLS aims to improve perception of science for school students Conference: Scientix conference discusses challenges in science education

  15. News Particle Physics: ATLAS unveils mural at CERN Prize: Corti Trust invites essay entries Astrophysics: CERN holds cosmic-ray conference Researchers in Residence: Lord Winston returns to school Music: ATLAS scientists record physics music Conference: Champagne flows at Reims event Competition: Students triumph at physics olympiad Teaching: Physics proves popular in Japanese schools Forthcoming Events

    NASA Astrophysics Data System (ADS)

    2011-01-01

    Particle Physics: ATLAS unveils mural at CERN Prize: Corti Trust invites essay entries Astrophysics: CERN holds cosmic-ray conference Researchers in Residence: Lord Winston returns to school Music: ATLAS scientists record physics music Conference: Champagne flows at Reims event Competition: Students triumph at physics olympiad Teaching: Physics proves popular in Japanese schools Forthcoming Events

  16. Signature CERN-URSS

    ScienceCinema

    None

    2017-12-09

    The Director-General, W. Jentschke, welcomes the assembly and the guests for the signing of the protocol between CERN and the USSR, which is an important event. It was in 1955 that 55 Soviet visitors visited CERN for the first time. The first DG of CERN, F. Bloch, and Mr Amaldi are also present. While W. Jentschke's speech in English is translated into Russian, Mr Morozov's speech in Russian is translated into English.

  17. Nuclear Science Division annual report, October 1, 1984-September 30, 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahoney, J.

    1986-09-01

    This report summarizes the activities of the Nuclear Science Division during the period October 1, 1984 to September 30, 1985. As in previous years, experimental research has for the most part been carried out using three local accelerators, the Bevalac, the SuperHILAC and the 88-Inch Cyclotron. However, during this time, preparations began for a new generation of relativistic heavy-ion experiments at CERN. The Nuclear Science Division is involved in three major experiments at CERN and several smaller ones. The report is divided into 5 sections. Part I describes the research programs and operations, and Part II contains condensations of experimental papers arranged roughly according to program and in order of increasing energy, without any further subdivisions. Part III contains condensations of theoretical papers, again ordered according to program but in order of decreasing energy. Improvements and innovations in instrumentation and in experimental or analytical techniques are presented in Part IV. Part V consists of appendices, the first listing publications by author for this period, in which the LBL report number only is given for papers that have not yet appeared in journals; the second contains abstracts of PhD theses awarded during this period; and the third gives the titles and speakers of the NSD Monday seminars, the Bevatron Research Meetings and the theory seminars that were given during the report period. The last appendix is an author index for this report.

  18. A browser-based event display for the CMS Experiment at the LHC using WebGL

    NASA Astrophysics Data System (ADS)

    McCauley, T.

    2017-10-01

    Modern web browsers are powerful and sophisticated applications that support an ever-wider range of uses. One such use is rendering high-quality, GPU-accelerated, interactive 2D and 3D graphics in an HTML canvas. This can be done via WebGL, a JavaScript API based on OpenGL ES. Applications delivered via the browser have several distinct benefits for the developer and user. For example, they can be implemented using well-known and well-developed technologies, while distribution and use via a browser allows for rapid prototyping and deployment and ease of installation. In addition, delivery of applications via the browser allows for easy use on mobile, touch-enabled devices such as phones and tablets. iSpy WebGL is an application for visualization of events detected and reconstructed by the CMS Experiment at the Large Hadron Collider at CERN. The first event display developed for an LHC experiment to use WebGL, iSpy WebGL is a client-side application written in JavaScript, HTML, and CSS that uses three.js, a JavaScript library built on the WebGL API. iSpy WebGL is used for monitoring of CMS detector performance, for production of images and animations of CMS collision events for the public, as a virtual reality application using Google Cardboard, and as a tool available for public education and outreach such as in the CERN Open Data Portal and the CMS masterclasses. We describe here its design, development, and usage as well as future plans.

  19. Indico 1.0

    NASA Astrophysics Data System (ADS)

    Gonzalez Lopez, J. B.; Avilés, A.; Baron, T.; Ferreira, P.; Kolobara, B.; Pugh, M. A.; Resco, A.; Trzaskoma, J. P.

    2014-06-01

    Indico has evolved into the main event organization software, room booking tool and collaboration hub for CERN. The growth in its usage has only accelerated during the past 9 years, and today Indico holds more than 215,000 events and 1,100,000 files. The growth was also substantial in terms of functionalities and improvements. In the last year alone, Indico has matured considerably in 3 key areas: enhanced usability, optimized performance and additional features, especially those related to meeting collaboration. Along the course of 2012, much activity has centred around consolidating all this effort and investment into "version 1.0", recently released in 2013. Version 1.0 brings along new features, such as Microsoft Exchange calendar synchronization for participants, and many new and cleaner interfaces (badge and poster generation, lists of contributions, abstracts, and so forth). But most importantly, it brings a message: Indico is now stable, consolidated and mature after more than 10 years of non-stop development. This message is addressed not only to CERN users but also to the many organisations, in or outside HEP, which have already installed the software, and to others who might soon join this community. In this document, we describe the current state of the art of Indico, and how it was built. This does not mean that the Indico software is complete, far from it! We have plenty of new ideas and projects that we are working on and which we have shared during CHEP 2013.

  20. VALORATE: fast and accurate log-rank test in balanced and unbalanced comparisons of survival curves and cancer genomics.

    PubMed

    Treviño, Victor; Tamez-Pena, Jose

    2017-06-15

    The association of genomic alterations with outcomes in cancer is affected by a problem of unbalanced groups generated by the low frequency of alterations. To address this, an R package (VALORATE) that estimates the null distribution and the P-value of the log-rank statistic, based on a recent reformulation, is presented. For a given number of alterations that defines the size of the survival groups, the log-rank density is estimated by a weighted sum of conditional distributions depending on a co-occurrence term of mutations and events. The estimations are accelerated, without loss of accuracy, by sampling across co-occurrences, allowing the analysis of large genomic datasets in a few minutes. In conclusion, the proposed VALORATE R package is a valuable tool for survival analysis. The R package is available on CRAN at https://cran.r-project.org and at http://bioinformatica.mty.itesm.mx/valorateR. Contact: vtrevino@itesm.mx. Supplementary data are available at Bioinformatics online.
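
    A hedged sketch of the kind of unbalanced comparison this record addresses, using the conventional asymptotic log-rank test from the lifelines package. This illustrates the setting only; it is not VALORATE's conditional-distribution estimate of the null, which is available in the R package cited above. Group sizes and survival times are invented.

      # Hedged sketch: conventional asymptotic log-rank test for a small, unbalanced
      # "mutated vs non-mutated" comparison (toy data), using lifelines.
      import numpy as np
      from lifelines.statistics import logrank_test

      rng = np.random.default_rng(3)
      n_mut, n_wt = 8, 192                               # unbalanced group sizes (assumed)
      t_mut = rng.exponential(12.0, n_mut)               # months, toy survival times
      t_wt = rng.exponential(24.0, n_wt)
      e_mut = rng.random(n_mut) < 0.8                    # event indicators (True = event observed)
      e_wt = rng.random(n_wt) < 0.8

      result = logrank_test(t_mut, t_wt, event_observed_A=e_mut, event_observed_B=e_wt)
      print(f"chi2 = {result.test_statistic:.2f}, asymptotic p = {result.p_value:.3g}")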

  1. Treatment of an old-growth stand and its effects on birds, ants, and large woody debris: A case study. Forest Service general technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bull, E.L.; Torgersen, T.R.; Blumton, A.K.

    1995-09-01

    An old-structure stand with large amounts of tree mortality was treated to accelerate regeneration and reduce fuel loads but still maintain its function as old growth for selected bird species. The small-diameter (less than 15 inches in diameter at breast height (d.b.h.)) dead trees were removed, as was some of the down wood less than 15 inches in diameter at the large end. All live trees of any size and all dead trees equal to or greater than 15 inches d.b.h. were retained. Vaux's swifts (Chaetura vauxi) and pileated woodpeckers (Dryocopus pileatus) continued to use the stand after harvest for nesting and roosting. Brown-headed cowbirds (Molothrus ater) were more than twice as common in the treated stand as in an adjacent unlogged, control stand. In a comparison before and after harvest in the treated stand, the number of logs increased, the number of logs with ants increased, but the percentage of logs with ants decreased.

  2. Hands on CERN: A Well-Used Physics Education Project

    ERIC Educational Resources Information Center

    Johansson, K. E.

    2006-01-01

    The "Hands on CERN" education project makes it possible for students and teachers to get close to the forefront of scientific research. The project confronts the students with contemporary physics at its most fundamental level with the help of particle collisions from the DELPHI particle physics experiment at CERN. It now exists in 14 languages…

  3. Radio frequency multicusp ion source development (invited)

    NASA Astrophysics Data System (ADS)

    Leung, K. N.

    1996-03-01

    The radio-frequency (rf) driven multicusp source was originally developed for use in the Superconducting Super Collider injector. It has been demonstrated that the source can meet the H- beam current and emittance requirements for this application. By employing a porcelain-coated antenna, a clean plasma discharge with very long-life operation can be achieved. Today, the rf source is used to generate both positive and negative hydrogen ion beams and has been tested in various particle accelerator laboratories throughout the world. Applications of this ion source have been extended to other fields such as ion beam lithography, oil-well logging, ion implantation, accelerator mass spectrometry and medical therapy machines. This paper summarizes the latest rf ion source technology and development at the Lawrence Berkeley National Laboratory.

  4. Measurements on the gas desorption yield of the oxygen-free copper irradiated with low-energy Xe10+ and O+

    NASA Astrophysics Data System (ADS)

    Dong, Z. Q.; Li, P.; Yang, J. C.; Yuan, Y. J.; Xie, W. J.; Zheng, W. H.; Liu, X. J.; Chang, J. J.; Luo, C.; Meng, J.; Wang, J. C.; Wang, Y. M.; Yin, Y.; Chai, Z.

    2017-10-01

    Heavy ion beam lost on the accelerator vacuum wall releases quantities of gas molecules and makes the vacuum system deteriorate seriously. This phenomenon, called the dynamic vacuum effect, has been observed at CERN, GSI and BNL, and leads to a decrease of beam lifetime when the beam intensity is increased. Heavy-ion-induced gas desorption, which results in the dynamic vacuum effect, has become one of the most important problems for future accelerators proposed to operate with intermediate charge state beams. In order to investigate the mechanism of this effect and find a solution for the IMP future project High Intensity heavy-ion Accelerator Facility (HIAF), which is designed to extract 1 × 10^11 uranium particles with intermediate charge state per cycle, two dedicated experiment setups have been installed at the beam line of the CSR and at the 320 kV HV platform, respectively. Recently, an experiment was performed at the 320 kV HV platform to study effective gas desorption from an oxygen-free copper target irradiated with continuous Xe10+ and O+ beams in the low-energy regime. The gas desorption yield in this energy regime was calculated and the link between gas desorption and electronic energy loss in the Cu target was established. These results will be used to support simulations of the dynamic vacuum effect and the optimization of the efficiency of collimators to be installed in the HIAF main synchrotron BRing, and will also provide guidance for future gas desorption measurements in the high-energy regime.

  5. A Multi-TeV Linear Collider Based on CLIC Technology : CLIC Conceptual Design Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aicheler, M; Burrows, P.; Draper, M.

    This report describes the accelerator studies for a future multi-TeV e+e- collider based on the Compact Linear Collider (CLIC) technology. The CLIC concept as described in the report is based on high-gradient normal-conducting accelerating structures where the RF power for the acceleration of the colliding beams is extracted from a high-current Drive Beam that runs parallel with the main linac. The focus of CLIC R&D over the last years has been on addressing a set of key feasibility issues that are essential for proving the fundamental validity of the CLIC concept. The status of these feasibility studies is described and summarized. The report also includes a technical description of the accelerator components and R&D to develop the most important parts and methods, as well as a description of the civil engineering and technical services associated with the installation. Several larger system tests have been performed to validate the two-beam scheme, and of particular importance are the results from the CLIC test facility at CERN (CTF3). Both the machine and detector/physics studies for CLIC have primarily focused on the 3 TeV implementation of CLIC as a benchmark for the CLIC feasibility. This report also includes specific studies for an initial 500 GeV machine, and some discussion of possible intermediate energy stages. The performance and operation issues related to operation at reduced energy compared to the nominal, and considerations of a staged construction program, are included in the final part of the report. The CLIC accelerator study is organized as an international collaboration with 43 partners in 22 countries. An associated report describes the physics potential and experiments at CLIC, and a shorter report in preparation will focus on the CLIC implementation strategy, together with a plan for the CLIC R&D studies 2012–2016. Critical and important implementation issues such as cost, power and schedule will be addressed there.

  6. Logs of Paleoseismic Excavations Across the Central Range Fault, Trinidad

    USGS Publications Warehouse

    Crosby, Christopher J.; Prentice, Carol S.; Weber, John; Ragona, Daniel

    2009-01-01

    This publication makes available maps and trench logs associated with studies of the Central Range Fault, part of the South American-Caribbean plate boundary in Trinidad. Our studies were conducted in 2001 and 2002. We mapped geomorphic features indicative of active faulting along the right-lateral, Central Range Fault, part of the South American-Caribbean plate boundary in Trinidad. We excavated trenches at two sites, the Samlalsingh and Tabaquite sites. At the Samlalsingh site, sediments deposited after the most recent fault movement bury the fault, and the exact location of the fault was unknown until we exposed it in our excavations. At this site, we excavated a total of eleven trenches, six of which exposed the fault. The trenches exposed fluvial sediments deposited over a strath terrace developed on Miocene bedrock units. We cleaned the walls of the excavations, gridded the walls with either 1 m X 1 m or 1 m X 0.5 m nail and string grid, and logged the walls in detail at a scale of 1:20. Additionally, we described the different sedimentary units in the field, incorporating these descriptions into our trench logs. We mapped the locations of the trenches using a tape and compass. Our field logs were scanned, and unit contacts were traced in Adobe Illustrator. The final drafted logs of all the trenches are presented here, along with photographs showing important relations among faults and Holocene sedimentary deposits. Logs of south walls were reversed in Illustrator, so that all logs are drafted with the view direction to the north. We collected samples of various materials exposed in the trench walls, including charcoal samples for radiocarbon dating from both faulted and unfaulted deposits. The locations of all samples collected are shown on the logs. The ages of seventeen of the charcoal samples submitted for radiocarbon analysis at the University of Arizona Accelerator Mass Spectrometry Laboratory in Tucson, Ariz., are given in Table 1. Samples found in Table 1 are shown in red on the trench logs. All radiocarbon ages are calibrated and given with 2 standard deviation age ranges. Our studies suggest that the Central Range Fault is a Holocene fault capable of producing damaging earthquakes in Trinidad

  7. Subtidal hydrodynamics in a tropical lagoon: A dimensionless numbers approach

    NASA Astrophysics Data System (ADS)

    Tenorio-Fernandez, L.; Valle-Levinson, A.; Gomez-Valdes, J.

    2018-01-01

    Observations in a tropical lagoon of the Yucatan peninsula motivated a non-dimensional number analysis to examine the relative influence of tidal stress, density gradients and wind stress on subtidal hydrodynamics. A two-month observation period in Chelem Lagoon covered the transition from the dry to the wet season. Chelem Lagoon is influenced by groundwater inputs and exhibits a main sub-basin (central sub-basin), a west sub-basin and an east sub-basin. Subtidal hydrodynamics were associated with horizontal density gradients that were modified seasonally by evaporation, precipitation, and groundwater discharge. A tidal Froude number (Fr0), a Wedderburn number (W), and a stress ratio (S0) were used to diagnose the relative importance of dominant subtidal driving forces. The Froude number (Fr0) compares tidal forcing and baroclinic forcing through the ratio of tidal stress to longitudinal baroclinic pressure gradient. The Wedderburn number (W) relates wind stress to baroclinicity. The stress ratio (S0) sizes tidal stress against wind stress. S0 is a new diagnostic tool for systems influenced by tides and winds, and represents the main contribution of this research. Results show that spring-tide subtidal flows in the tropical lagoon had log(Fr0) ≫ 0 and log(S0) > 0, i.e., were driven mainly by tidal stresses (advective accelerations). Neap tides showed log(Fr0) ≪ 0 and log(S0) < 0, i.e., flows driven by baroclinicity, especially at the lagoon heads of the east and west sub-basins. However, when the wind stress intensified over the lagoon, the relative importance of baroclinicity decreased and the wind stress controlled the dynamics (log(W) ≫ 0). Each sub-basin exhibited a different subtidal response, according to the dimensionless numbers. The response depended on the fortnightly tidal cycle, the location and magnitude of groundwater input, and the direction and magnitude of the wind stress.
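
    A hedged sketch of how the three dimensionless numbers described above might be turned into a simple regime classification, following only the sign conventions stated in this record (log of a ratio greater than zero means the numerator forcing dominates). The example values and the ordering of the checks are invented for illustration; the formulas for computing Fr0, W and S0 from field data are not reproduced here.

      # Hedged sketch: classify the subtidal regime from Fr0, W and S0 using the
      # sign conventions given in the abstract. Example values are invented.
      import math

      def classify(fr0: float, w: float, s0: float) -> str:
          """Return the dominant subtidal forcing implied by the three numbers."""
          if math.log10(w) > 0:
              return "wind-stress driven"
          if math.log10(fr0) > 0 and math.log10(s0) > 0:
              return "tidal-stress (advective) driven"
          if math.log10(fr0) < 0 and math.log10(s0) < 0:
              return "baroclinicity driven"
          return "mixed forcing"

      print(classify(fr0=5.0, w=0.1, s0=3.0))    # spring-tide-like example
      print(classify(fr0=0.05, w=0.2, s0=0.3))   # neap-tide-like example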

  8. Land use not litter quality is a stronger driver of decomposition in hyperdiverse tropical forest.

    PubMed

    Both, Sabine; Elias, Dafydd M O; Kritzler, Ully H; Ostle, Nick J; Johnson, David

    2017-11-01

    In hyperdiverse tropical forests, the key drivers of litter decomposition are poorly understood despite its crucial role in facilitating nutrient availability for plants and microbes. Selective logging is a pressing land use with potential for considerable impacts on plant-soil interactions, litter decomposition, and nutrient cycling. Here, in Borneo's tropical rainforests, we test the hypothesis that decomposition is driven by litter quality and that there is a significant "home-field advantage," that is positive interaction between local litter quality and land use. We determined mass loss of leaf litter, collected from selectively logged and old-growth forest, in a fully factorial experimental design, using meshes that either allowed or precluded access by mesofauna. We measured leaf litter chemical composition before and after the experiment. Key soil chemical and biological properties and microclimatic conditions were measured as land-use descriptors. We found that despite substantial differences in litter quality, the main driver of decomposition was land-use type. Whilst inclusion of mesofauna accelerated decomposition, their effect was independent of land use and litter quality. Decomposition of all litters was slower in selectively logged forest than in old-growth forest. However, there was significantly greater loss of nutrients from litter, especially phosphorus, in selectively logged forest. The analyses of several covariates detected minor microclimatic differences between land-use types but no alterations in soil chemical properties or free-living microbial composition. These results demonstrate that selective logging can significantly reduce litter decomposition in tropical rainforest with no evidence of a home-field advantage. We show that loss of key limiting nutrients from litter (P & N) is greater in selectively logged forest. Overall, the findings hint at subtle differences in microclimate overriding litter quality that result in reduced decomposition rates in selectively logged forests and potentially affect biogeochemical nutrient cycling in the long term.

  9. Discrete anti-gravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noyes, H.P.; Starson, S.

    1991-03-01

    Discrete physics, because it replaces time evolution generated by the energy operator with a global bit-string generator (program universe) and replaces "fields" with the relativistic Wheeler-Feynman "action at a distance," allows the consistent formulation of the concept of signed gravitational charge for massive particles. The resulting prediction made by this version of the theory is that free anti-particles near the surface of the earth will "fall" up with the same acceleration that the corresponding particles fall down. So far as we can see, no current experimental information is in conflict with this prediction of our theory. The experimentum crucis will be one of the anti-proton or anti-hydrogen experiments at CERN. Our prediction should be much easier to test than the small effects which those experiments are currently designed to detect or bound. 23 refs.

  10. An OS9-UNIX data acquisition system with ECL readout

    NASA Astrophysics Data System (ADS)

    Ziem, P.; Beschorner, C.; Bohne, W.; Drescher, B.; Friese, T.; Kiehne, T.; Kluge, Ch.

    1996-02-01

    A new data acquisition system has been developed at the Hahn-Meitner-Institut to handle almost 550 parameters of nuclear physics experiments. The system combines a UNIX host running a portable data buffer router and a VME front-end based on the OS9 real time operating system. Different kinds of pulse analyzers are located in several CAMAC crates which are controlled by the VME system via a VICbus connection. Data readout is performed by means of an ECL daisy chain. Besides controlling CAMAC, the main purpose of the VME front-end is event data formatting and histogramming. Using TCP/IP services, the UNIX host receives formatted data packages for data storage and display. During a beam time at the antiproton accelerator LEAR/CERN, the PS208 experiment accumulated about 100 Gbyte of event data [2].

  11. An OS9-UNIX data acquisition system with ECL readout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziem, P.; Beschorner, C.; Bohne, W.

    1996-02-01

    A new data acquisition system has been developed at the Hahn-Meitner-Institut to handle almost 550 parameters of nuclear physics experiments. The system combines a UNIX host running a portable data buffer router and a VME front-end based on the OS9 real time operating system. Different kinds of pulse analyzers are located in several CAMAC crates which are controlled by the VME system via a VICbus connection. Data readout is performed by means of an ECL daisy chain. Besides controlling CAMAC, the main purpose of the VME front-end is event data formatting and histogramming. Using TCP/IP services, the UNIX host receives formatted data packages for data storage and display. During a beam time at the antiproton accelerator LEAR/CERN, the PS208 experiment accumulated about 100 Gbyte of event data.

  12. Pursuing the Secrets of Matter, Space and Time at the Energy Frontier

    NASA Astrophysics Data System (ADS)

    Grannis, Paul

    2003-04-01

    Particle physicists have made good progress in characterizing the fundamental forces of Nature and the elementary constituents of matter, and these phenomena shaped the universe in its earliest moments. However, what we know now is likely quite incomplete, and new ingredients are expected to surface in accelerator experiments over the coming twenty years. The new results are expected to give us insights into the nature of physics at much higher energies, and thus at earlier epochs in the universe, than are probed directly and may reveal new complexity in the nature of space and time. We will discuss the nature of the new results to be expected at the expanding energy frontier from experimental programs at the Fermilab Tevatron, the CERN Large Hadron Collider, and a TeV scale electron-positron linear collider.

  13. RF plasma modeling of the Linac4 H- ion source

    NASA Astrophysics Data System (ADS)

    Mattei, S.; Ohta, M.; Hatayama, A.; Lettry, J.; Kawamura, Y.; Yasumoto, M.; Schmitzer, C.

    2013-02-01

    This study focuses on the modelling of the ICP RF plasma in the Linac4 H- ion source currently being constructed at CERN. A self-consistent model of the plasma dynamics with the RF electromagnetic field has been developed by a PIC-MCC method. In this paper, the model is applied to the analysis of a low density plasma discharge initiation, with particular interest in the effect of the external magnetic field on plasma properties such as wall losses, electron density and electron energy. The employment of a multi-cusp magnetic field effectively limits the wall losses, particularly in the radial direction. Preliminary results, however, indicate that such a configuration reduces the heating efficiency. The effect is possibly due to trapping of electrons in the multi-cusp magnetic field, which prevents their continuous acceleration in the azimuthal direction.

  14. The LHC magnet system and its status of development

    NASA Technical Reports Server (NTRS)

    Bona, Maurizio; Perin, Romeo; Vlogaert, Jos

    1995-01-01

    CERN is preparing for the construction of a new high energy accelerator/collider, the Large Hadron Collider (LHC). This new facility will mainly consist of two superconducting magnetic beam channels, 27 km long, to be installed in the existing LEP tunnel. The magnetic system comprises about 1200 twin-aperture dipoles, 13.145 m long, with an operational field of 8.65 T, about 600 quadrupoles, 3 m long, and a very large number of other superconducting magnetic components. A general description of the system is given together with the main features of the design of the regular lattice magnets. The paper also describes the present state of the magnet R & D program. Results from short model work, as well as from full-scale prototypes, will be presented, including the recently tested 10 m long full-scale prototype dipole manufactured in industry.

  15. Design of robust microlinacs for wide replacement of radioisotope sources

    NASA Astrophysics Data System (ADS)

    Smirnov, A. V.; Agustsson, R. A.; Boucher, S.; Harrison, M.; Junge, K.; Savin, E.; Smirnov, A. Yu

    2017-12-01

    To improve public security and prevent the diversion of radioactive material for Radiation Dispersion Devices, development of an inexpensive, portable, easy-to-manufacture linac system is very important. The bremsstrahlung X-rays produced by a relativistic electron beam on a high-Z converter can mimic the X-rays radiated from various radioactive sources. Here we consider the development of two designs: one matching an Ir-192 source used in radiography, with ∼1-1.3 MeV electrons, and another matching a Cs-137 source, using 3.5-4 MeV electrons, that can be considered for borehole logging. Both designs use a standing-wave, high group velocity, cm-wave accelerating structure. The logging tool conceptual design is based on the KlyLac concept, combining a klystron and a linac operating in self-oscillating mode and sharing the same vacuum envelope and electron beam.

  16. 25th Birthday Cern- Amphi

    ScienceCinema

    None

    2017-12-09

    Ceremony for the 25th anniversary of CERN, with two speakers: Prof. Weisskopf speaks about the significance and role of CERN, and Prof. Casimir(?) gives a talk on the relations between pure science, applied science and "big science" (light science).

  17. A Varian DynaLog file-based procedure for patient dose-volume histogram-based IMRT QA.

    PubMed

    Calvo-Ortega, Juan F; Teke, Tony; Moragues, Sandra; Pozo, Miquel; Casals-Farran, Joan

    2014-03-06

    In the present study, we describe a method based on the analysis of the dynamic MLC log files (DynaLog) generated by the controller of a Varian linear accelerator in order to perform patient-specific IMRT QA. The DynaLog files of a Varian Millennium MLC, recorded during an IMRT treatment, can be processed using a MATLAB-based code in order to generate the actual fluence for each beam and so recalculate the actual patient dose distribution using the Eclipse treatment planning system. The accuracy of the DynaLog-based dose reconstruction procedure was assessed by introducing ten intended errors to perturb the fluence of the beams of a reference plan, such that ten erroneous plans were generated. In-phantom measurements with an ionization chamber (ion chamber) and planar dose measurements using an EPID system were performed to investigate the correlation between the measured dose changes and the expected ones detected by the reconstructed plans for the ten intended erroneous cases. Moreover, the method was applied to 20 cases of clinical plans for different locations (prostate, lung, breast, and head and neck). A dose-volume histogram (DVH) metric was used to evaluate the impact of the delivery errors in terms of dose to the patient. The ionometric measurements revealed a significant positive correlation (R² = 0.9993) between the variations of the dose induced in the erroneous plans with respect to the reference plan and the corresponding changes indicated by the DynaLog-based reconstructed plans. The EPID measurements showed that the accuracy of the DynaLog-based method to reconstruct the beam fluence was comparable with the dosimetric resolution of the portal dosimetry used in this work (3%/3 mm). The DynaLog-based reconstruction method described in this study is therefore a suitable tool for patient-specific IMRT QA, allowing the result to be evaluated in terms of the DVH metric on the planning CT image (patient DVH-based IMRT QA).
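    As a rough illustration of the reconstruction step described above, the sketch below accumulates an "actual" fluence map from logged leaf-pair positions weighted by the fraction of monitor units delivered in each log sample. The array layout, units and sampling convention are hypothetical placeholders rather than the actual Varian DynaLog file format, and the clinical steps (import into Eclipse, DVH comparison) are not reproduced.

        import numpy as np

        def actual_fluence(left_leaf_pos, right_leaf_pos, mu_fraction, x_grid):
            """Accumulate a per-leaf-pair fluence map from MLC log samples.

            left_leaf_pos, right_leaf_pos: (n_samples, n_pairs) leaf positions [mm]
            mu_fraction: (n_samples,) cumulative fraction of MU delivered at each sample
            x_grid: (n_x,) positions along the leaf-travel direction [mm]
            Returns an (n_pairs, n_x) relative fluence map.
            """
            n_samples, n_pairs = left_leaf_pos.shape
            fluence = np.zeros((n_pairs, x_grid.size))
            mu_increment = np.diff(mu_fraction, prepend=0.0)  # MU fraction per sample
            for s in range(n_samples):
                for p in range(n_pairs):
                    open_mask = (x_grid > left_leaf_pos[s, p]) & (x_grid < right_leaf_pos[s, p])
                    fluence[p, open_mask] += mu_increment[s]
            return fluence

        # Hypothetical example: 3 log samples, 2 leaf pairs, 1 mm grid
        x = np.arange(-50.0, 50.0, 1.0)
        left = np.array([[-20.0, -20.0], [-10.0, -15.0], [0.0, -10.0]])
        right = np.array([[20.0, 20.0], [25.0, 20.0], [30.0, 20.0]])
        mu = np.array([0.3, 0.7, 1.0])
        print(actual_fluence(left, right, mu, x).shape)  # (2, 100)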

  18. Global EOS: exploring the 300-ms-latency region

    NASA Astrophysics Data System (ADS)

    Mascetti, L.; Jericho, D.; Hsu, C.-Y.

    2017-10-01

    EOS, the CERN open-source distributed disk storage system, provides the high-performance storage solution for HEP analysis and the back-end for various workflows. Recently EOS became the back-end of CERNBox, the cloud synchronisation service for CERN users. EOS can be used to take advantage of wide-area distributed installations: for the last few years CERN EOS has used a common deployment across two computer centres (Geneva-Meyrin and Budapest-Wigner) about 1,000 km apart (∼20 ms latency) with about 200 PB of disk (JBOD). In late 2015, the CERN-IT Storage group and AARNET (Australia) set up a challenging R&D project: a single EOS instance between CERN and AARNET with more than 300 ms latency (16,500 km apart). This paper reports on the successful deployment and operation of a distributed storage system between Europe (Geneva, Budapest), Australia (Melbourne) and later Asia (ASGC Taipei), allowing different types of data placement and data access across these four sites.

  19. HIGH ENERGY PHYSICS: Bulgarians Sue CERN for Leniency.

    PubMed

    Koenig, R

    2000-10-13

    In cash-strapped Bulgaria, scientists are wondering whether a ticket for a front-row seat in high-energy physics is worth the price: Membership dues in CERN, the European particle physics lab, nearly equal the country's entire budget for competitive research grants. Faced with that grim statistic and a plea for leniency from Bulgaria's government, CERN's governing council is considering slashing the country's membership dues for the next 2 years.

  20. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    McAllister, Liam

    2018-05-14

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. As its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  1. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-22

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. As its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  2. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-06-28

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. As its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  3. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-23

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. As its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  4. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2017-12-09

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. As its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  5. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    McAllister, Liam

    2018-05-24

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. As its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  6. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    Sen, Ashoke

    2018-04-27

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. As its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  7. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-23

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. As its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  8. Service management at CERN with Service-Now

    NASA Astrophysics Data System (ADS)

    Toteva, Z.; Alvarez Alonso, R.; Alvarez Granda, E.; Cheimariou, M.-E.; Fedorko, I.; Hefferman, J.; Lemaitre, S.; Clavo, D. Martin; Martinez Pedreira, P.; Pera Mira, O.

    2012-12-01

    The Information Technology (IT) and the General Services (GS) departments at CERN have decided to combine their extensive experience in support for IT and non-IT services towards a common goal - to bring the services closer to the end user based on Information Technology Infrastructure Library (ITIL) best practice. The collaborative efforts have so far produced definitions for the incident and the request fulfilment processes, which are based on a unique two-dimensional service catalogue that combines both the user and the support team views of all services. After an extensive evaluation of the available industrial solutions, Service-Now was selected as the tool to implement the CERN service-management processes. The initial release of the tool provided an attractive web portal for the users and successfully implemented two basic ITIL processes: the incident management and the request fulfilment processes. It also integrated with the CERN personnel databases and the LHC GRID ticketing system. Subsequent releases continued to integrate with other third-party tools, such as the facility management systems of CERN, as well as to implement new processes such as change management. Independently of those new development activities, it was decided to simplify the request fulfilment process in order to achieve easier acceptance by the CERN user community. We believe that, due to the high modularity of the Service-Now tool, the parallel design of ITIL processes (e.g., event management) and non-ITIL processes (e.g., computer centre hardware management) will be easily achieved. This presentation will describe the experience that we have acquired and the techniques that were followed to achieve the CERN customization of the Service-Now tool.

  9. Vidyo@CERN: A Service Update

    NASA Astrophysics Data System (ADS)

    Fernandes, J.; Baron, T.

    2015-12-01

    We will present an overview of the current real-time video service offering for the LHC; in particular, the operation of the CERN Vidyo service will be described in terms of consolidated performance and scale. The service is an increasingly critical part of the daily activity of the LHC collaborations, recently topping 50 million minutes of communication in one year, with peaks of up to 852 simultaneous connections. We will elaborate on the improvement of some key front-end features, such as the integration with CERN Indico or the enhancements of the Unified Client, and also on new ones, released or in the pipeline, such as a new WebRTC client and CERN SSO/Federated SSO integration. An overview of future infrastructure improvements, such as virtualization techniques for Vidyo routers and geo-location mechanisms for load balancing and optimum user distribution across the service infrastructure, will also be discussed. The work done by CERN to improve the monitoring of its Vidyo network will also be presented and demoed. As a last point, we will touch on the roadmap and strategy established by CERN and Vidyo, with the clear objective of optimizing the service both on the end client and on the back-end infrastructure to make it truly universal, to serve Global Science. To achieve those actions, the introduction of the multi-tenant concept to serve different communities is needed. This is one of the consequences of CERN's decision to offer the Vidyo service, currently operated for the LHC, to other sciences, institutions and virtual organizations beyond HEP that might express interest in it.

  10. Development of Head Injury Assessment Reference Values Based on NASA Injury Modeling

    NASA Technical Reports Server (NTRS)

    Somers, Jeffrey T.; Melvin, John W.; Tabiei, Ala; Lawrence, Charles; Ploutz-Snyder, Robert; Granderson, Bradley; Feiveson, Alan; Gernhardt, Michael; Patalak, John

    2011-01-01

    NASA is developing a new capsule-based, crewed vehicle that will land in the ocean, and the space agency desires to reduce the risk of injury from impact during these landings. Because landing impact occurs for each flight and the crew might need to perform egress tasks, current injury assessment reference values (IARV) were deemed insufficient. Because NASCAR occupant restraint systems are more effective than the systems used to determine the current IARVs and are similar to NASA's proposed restraint system, an analysis of NASCAR impacts was performed to develop new IARVs that may be more relevant to NASA's context of vehicle landing operations. Head IARVs associated with race car impacts were investigated by completing a detailed analysis of all of the 2002-2008 NASCAR impact data. Specific inclusion and exclusion criteria were used to select 4071 impacts from the 4015 recorder files provided (each file could contain multiple impact events). Of the 4071 accepted impacts, 274 were selected for numerical simulation using a custom NASCAR restraint system and Humanetics Hybrid-III 50th percentile numerical dummy model in LS-DYNA. Injury had occurred in 32 of the 274 selected impacts, and 27 of those injuries involved the head. A majority of the head injuries were mild concussions with or without brief loss of consciousness. The 242 non-injury impacts were randomly selected and representative of the range of crash dynamics present in the total set of 4071 impacts. Head dynamics data (head translational acceleration, translational change in velocity, rotational acceleration, rotational velocity, HIC-15, HIC-36, and the Head 3ms clip) were filtered according to SAE J211 specifications and then transformed to a log scale. The probability of head injury was estimated using a separate logistic regression analysis for each log-transformed predictor candidate. Using the log transformation constrains the estimated probability of injury to become negligible as IARVs approach zero. For the parameters head translational acceleration, head translational velocity change, head rotational acceleration, HIC-15, and HIC-36, conservative values (in the lower 95% confidence interval) that gave rise to a 5% risk of any injury occurring were estimated as 40.0 G, 7.9 m/s, 2200 rad/s^2, 98.4, and 77.4 respectively. Because NASA is interested in the consequence of any particular injury on the ability of the crew to perform egress tasks, the head injuries that occurred in the NASCAR dataset were classified according to a NASA-developed scale (Classes I - III) for operationally relevant injuries, which classifies injuries on the basis of their operational significance. Additional analysis of the data was performed to determine the probability of each injury class occurring, and this was estimated using an ordered probit model. For head translational acceleration, head translational velocity change, head rotational acceleration, head rotational velocity, HIC-36, and head 3ms clip, conservative values of IARVs that produced a 5% risk of Class II injury were estimated as 50.7 G, 9.5 m/s, 2863 rad/s^2, 11.0 rad/s, 30.3, and 46.4 G respectively. The results indicate that head IARVs developed from the NASCAR dataset may be useful to protect crews during landing impact.
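    The core statistical step described above, a logistic regression on a log-transformed predictor so that the estimated injury probability goes to zero as the metric approaches zero, can be sketched as follows. The function returns the metric value at which the fitted risk equals 5%; the confidence-interval treatment and the ordered probit model for injury classes used in the study are not reproduced here, and the example data are synthetic.

        import numpy as np
        import statsmodels.api as sm

        def risk_threshold(values, injured, risk=0.05):
            """Fit P(injury) = logistic(b0 + b1*log(value)) and return the metric
            value at which the fitted injury probability equals `risk`."""
            x = sm.add_constant(np.log(values))   # log transform: risk -> 0 as value -> 0
            fit = sm.Logit(injured, x).fit(disp=0)
            b0, b1 = fit.params
            logit_risk = np.log(risk / (1.0 - risk))
            return float(np.exp((logit_risk - b0) / b1))

        # Synthetic head-acceleration data [G], with injury more likely at high values
        rng = np.random.default_rng(1)
        acc = rng.uniform(10.0, 120.0, size=300)
        p_true = 1.0 / (1.0 + np.exp(-(-8.0 + 1.8 * np.log(acc))))
        injured = (rng.random(300) < p_true).astype(int)
        print(round(risk_threshold(acc, injured), 1))  # acceleration at 5% fitted risk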

  11. RELATIVE DISTRIBUTIONS OF FLUENCES OF 3He AND 4He IN SOLAR ENERGETIC PARTICLES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petrosian, Vahe; Jiang Yanwei; Liu Siming

    2009-08-10

    Solar energetic particles show a rich variety of spectra and relative abundances of many ionic species and their isotopes. A long-standing puzzle has been the extreme enrichments of 3He ions. The most extreme enrichments are observed in low-fluence, so-called impulsive, events which are believed to be produced at the flare site in the solar corona with little scattering and acceleration during transport to the Earth. In such events, 3He ions show characteristic concave curved spectra in a log-log plot. In two earlier papers by Liu et al., we showed how such extreme enrichments and spectra can result in the model developed by Petrosian and Liu, where ions are accelerated stochastically by plasma waves or turbulence. In this paper, we address the relative distributions of the fluences of 3He and 4He ions presented by Ho et al., which show that while the distribution of 4He fluence (which we believe is a good measure of the flare strength), like many other extensive characteristics of solar flares, is fairly broad, the 3He fluence is limited to a narrow range. These characteristics introduce a strong anticorrelation between the ratio of the fluences and the 4He fluence. One of the predictions of our model presented in the 2006 paper was the presence of a steep variation of the fluence ratio with the level of turbulence or the rate of acceleration. We show here that this feature of the model can reproduce the observed distribution of the fluences with very few free parameters. The primary reason for the success of the model on both fronts is that the fully ionized 3He ion, with its unique charge-to-mass ratio, can resonantly interact with plasma modes not accessible to 4He and be accelerated more readily than 4He. Essentially, in most flares all background 3He ions are accelerated to the few-MeV/nucleon range, while this happens for 4He ions only in very strong events.

  12. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents three tools developed recently for contamination analysis: (1) an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; (2) a Java RGA extractor, which can load multiple SRS .ana files and extract pressure vs. time data; and (3) a C++ contamination simulation code, a 3D particle-tracing code for modeling the transport of dust particulates and molecules. The simulation code uses residence time to determine whether molecules stick; particulates can be sampled from IEST-STD-1246 and accelerated by aerodynamic forces.
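    The residence-time sticking rule mentioned for the simulation code can be illustrated with a minimal sketch: the mean surface residence time follows an Arrhenius-type law, and desorption within a time step is treated as a Poisson process. The activation energy, pre-exponential factor and time step below are generic placeholder values, not parameters taken from the tool itself.

        import math
        import random

        K_B = 8.617e-5  # Boltzmann constant [eV/K]

        def residence_time(activation_energy_ev, surface_temp_k, tau0_s=1e-13):
            """Mean surface residence time from an Arrhenius (Frenkel) law:
            tau = tau0 * exp(Ea / (kB * T)); tau0 and Ea are assumed values."""
            return tau0_s * math.exp(activation_energy_ev / (K_B * surface_temp_k))

        def sticks(activation_energy_ev, surface_temp_k, timestep_s, rng=random):
            """Treat desorption as a Poisson process: the molecule leaves the surface
            during a time step with probability 1 - exp(-dt/tau), else it stays."""
            tau = residence_time(activation_energy_ev, surface_temp_k)
            p_desorb = 1.0 - math.exp(-timestep_s / tau)
            return rng.random() >= p_desorb

        # Example: a contaminant with Ea = 0.9 eV on a 300 K surface, 1 s time step
        print(sticks(0.9, 300.0, 1.0))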

  13. Public Lecture

    ScienceCinema

    None

    2017-12-09

    An outreach activity is being organized by the Turkish community at CERN, on 5 June 2010 in the CERN Main Auditorium. The activity consists of several talks that will take 1.5 h in total. The main goal of the activity is to describe the CERN-based activities and experiments as well as to stimulate the public's attention to science-related topics. We believe the wide communication of the event has certain advantages, especially for Turkey's ongoing membership process.

  14. Prospects for K+ → π+νν̄ observation at CERN in NA62

    NASA Astrophysics Data System (ADS)

    Hahn, F.; NA62 Collaboration; Aglieri Rinella, G.; Aliberti, R.; Ambrosino, F.; Angelucci, B.; Antonelli, A.; Anzivino, G.; Arcidiacono, R.; Azhinenko, I.; Balev, S.; Bendotti, J.; Biagioni, A.; Biino, C.; Bizzeti, A.; Blazek, T.; Blik, A.; Bloch-Devaux, B.; Bolotov, V.; Bonaiuto, V.; Bragadireanu, M.; Britton, D.; Britvich, G.; Brook, N.; Bucci, F.; Butin, F.; Capitolo, E.; Capoccia, C.; Capussela, T.; Carassiti, V.; Cartiglia, N.; Cassese, A.; Catinaccio, A.; Cecchetti, A.; Ceccucci, A.; Cenci, P.; Cerny, V.; Cerri, C.; Chikilev, O.; Ciaranfi, R.; Collazuol, G.; Cooke, P.; Cooper, P.; Corradi, G.; Cortina Gil, E.; Costantini, F.; Cotta Ramusino, A.; Coward, D.; D'Agostini, G.; Dainton, J.; Dalpiaz, P.; Danielsson, H.; Degrange, J.; De Simone, N.; Di Filippo, D.; Di Lella, L.; Dixon, N.; Doble, N.; Duk, V.; Elsha, V.; Engelfried, J.; Enik, T.; Falaleev, V.; Fantechi, R.; Federici, L.; Fiorini, M.; Fry, J.; Fucci, A.; Fulton, L.; Gallorini, S.; Gatignon, L.; Gianoli, A.; Giudici, S.; Glonti, L.; Goncalves Martins, A.; Gonnella, F.; Goudzovski, E.; Guida, R.; Gushchin, E.; Hahn, F.; Hallgren, B.; Heath, H.; Herman, F.; Hutchcroft, D.; Iacopini, E.; Jamet, O.; Jarron, P.; Kampf, K.; Kaplon, J.; Karjavin, V.; Kekelidze, V.; Kholodenko, S.; Khoriauli, G.; Khudyakov, A.; Kiryushin, Yu; Kleinknecht, K.; Kluge, A.; Koval, M.; Kozhuharov, V.; Krivda, M.; Kudenko, Y.; Kunze, J.; Lamanna, G.; Lazzeroni, C.; Leitner, R.; Lenci, R.; Lenti, M.; Leonardi, E.; Lichard, P.; Lietava, R.; Litov, L.; Lomidze, D.; Lonardo, A.; Lurkin, N.; Madigozhin, D.; Maire, G.; Makarov, A.; Mannelli, I.; Mannocchi, G.; Mapelli, A.; Marchetto, F.; Massarotti, P.; Massri, K.; Matak, P.; Mazza, G.; Menichetti, E.; Mirra, M.; Misheva, M.; Molokanova, N.; Morant, J.; Morel, M.; Moulson, M.; Movchan, S.; Munday, D.; Napolitano, M.; Newson, F.; Norton, A.; Noy, M.; Nuessle, G.; Obraztsov, V.; Padolski, S.; Page, R.; Palladino, V.; Pardons, A.; Pedreschi, E.; Pepe, M.; Perez Gomez, F.; Perrin-Terrin, M.; Petrov, P.; Petrucci, F.; Piandani, R.; Piccini, M.; Pietreanu, D.; Pinzino, J.; Pivanti, M.; Polenkevich, I.; Popov, I.; Potrebenikov, Yu; Protopopescu, D.; Raffaelli, F.; Raggi, M.; Riedler, P.; Romano, A.; Rubin, P.; Ruggiero, G.; Russo, V.; Ryjov, V.; Salamon, A.; Salina, G.; Samsonov, V.; Santovetti, E.; Saracino, G.; Sargeni, F.; Schifano, S.; Semenov, V.; Sergi, A.; Serra, M.; Shkarovskiy, S.; Sotnikov, A.; Sougonyaev, V.; Sozzi, M.; Spadaro, T.; Spinella, F.; Staley, R.; Statera, M.; Sutcliffe, P.; Szilasi, N.; Tagnani, D.; Valdata-Nappi, M.; Valente, P.; Vasile, M.; Vassilieva, V.; Velghe, B.; Veltri, M.; Venditti, S.; Vormstein, M.; Wahl, H.; Wanke, R.; Wertelaers, P.; Winhart, A.; Winston, R.; Wrona, B.; Yushchenko, O.; Zamkovsky, M.; Zinchenko, A.

    2015-07-01

    The rare K → πνν̄ decays are excellent processes to probe the Standard Model and to search indirectly for new physics, complementary to the direct LHC searches. The NA62 experiment at the CERN SPS aims to collect and analyse O(10^13) kaon decays before the CERN Long Shutdown 2 (in 2018). This will allow the branching ratio to be measured to a level of 10% accuracy. The experimental apparatus was commissioned during a first run in autumn 2014.

  15. "Infinitos"

    NASA Astrophysics Data System (ADS)

    1994-04-01

    On Friday, 22 April 1994, a new science exhibition "Infinitos", arranged jointly by Lisboa'94, CERN and ESO, will open at the Museu de Electricidade on the waterfront of Lisbon, the capital of Portugal. In a series of spectacular displays, it illustrates man's current understanding of how the Universe works - from the tiniest structures of matter to the most far flung galaxies. On this day, it will be inaugurated by the President of Lisboa'94, Prof. Vitor Constancio, the Portuguese Science Minister, Prof. L. Valente de Oliveira, Prof. C. Llewellyn Smith, Director General of CERN [2] and Dr. P. Creola, President of ESO Council. This exhibition is part of a rich cultural programme taking place at Lisbon during 1994 in the frame of "Lisboa 94 - European City of Culture", after which it will travel to major cities around Europe. The frontiers of our knowledge push into inner space - the structure of the smallest components of matter - and into outer space - the dramatic phenomena of distant galaxies. Two of Europe's leading science organisations are playing a crucial role in this great human adventure. The European Laboratory for Particle Physics, CERN, operates the mighty accelerators and colliding beam machines to penetrate deep into matter and recreate the conditions which prevailed in the Universe a tiny fraction of a second after the Big Bang. The European Southern Observatory, ESO, operates the largest optical observatory in the world with a range of advanced telescopes searching the sky to study the evolution and content of our Universe. The "Infinitos" exhibition uses many modern exhibition techniques, including sophisticated audio-visual presentations and interactive video programmes. Visitors enter through a gallery of portraits of the most celebrated scientists from the 16th to 20th centuries and an exhibition of art inspired by scientific research. After passing a cosmic ray detector showing the streams of particles which pour down constantly from outer space, visitors continue into a central area where they are confronted with the essential questions of astro- and particle physics, for instance "What is the Universe made of?", "How was the Universe created?", "What is in the sky?", "What is Dark Matter?", "Where does the stuff in our bodies come from?", and "Are we alone in the Universe?" A central theme of this display is "What we don't know". In the second part of the exhibition visitors are shown the instruments and techniques used in today's big science research which will help to provide the answers. There are special displays on Europe's future large research projects such as the Large Hadron Collider (LHC) at CERN, which will bring protons into head-on collision at higher energies (14 TeV) than ever before to allow scientists to penetrate still further into the structure of matter and recreate the conditions prevailing in the Universe just 10^-12 seconds after the "Big Bang", when the temperature was 10^16 degrees. Another highlight is a large interactive model of ESO's Very Large Telescope (VLT), the world's most ambitious optical telescope project, now under construction. The telescope's unequalled potential for exciting astronomical observations at the outer reaches of the Universe is clearly explained. Special emphasis is given to the contribution of Portuguese research institutes to the work of CERN and ESO, and particle physicists and astronomers from Portugal will be present at the exhibition to talk to visitors about their work.
This exhibition will remain open until 12 June 1994 and will be a major attraction, also to the many tourists visiting this year's European City of Culture. 1. This is a joint Press Release of Lisboa'94, CERN and ESO. 2. CERN, the European Laboratory for Particle Physics, has its headquarters in Geneva. At present, its Member States are Austria, Belgium, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, Netherlands, Norway, Poland, Portugal, the Slovak Republic, Spain, Sweden, Switzerland and the United Kingdom. Israel, the Russian Federation, Turkey, Yugoslavia (status suspended after UN embargo, June 1992), the European Commission and Unesco have observer status.

  16. The Forward Endcap of the Electromagnetic Calorimeter for the PANDA Detector at FAIR

    NASA Astrophysics Data System (ADS)

    Albrecht, Malte; PANDA Collaboration

    2015-02-01

    The versatile 4π-detector PANDA will be built at the Facility for Antiproton and Ion Research (FAIR), an accelerator complex currently under construction near Darmstadt, Germany. A cooled antiproton beam in a momentum range of 1.5-15 GeV/c will be provided by the High Energy Storage Ring (HESR). All measurements at PANDA rely on an excellent performance of the detector with respect to tracking, particle identification and energy measurement. The electromagnetic calorimeter (EMC) of the PANDA detector will be equipped with 15744 PbWO4 crystals (PWO-II), which will be operated at a temperature of -25 °C in order to increase the light output. The design of the forward endcap of the EMC has been finalized. The crystals will be read out with Large Area Avalanche Photo Diodes (LAAPDs) in the outer regions and with Vacuum Photo Tetrodes (VPTTs) in the innermost part. Production of photosensor units utilizing charge-integrating preamplifiers has begun. A prototype comprising 216 PbWO4 crystals has been built and tested at various accelerators (CERN SPS, ELSA/Bonn, MAMI/Mainz), where the crystals have been exposed to electron and photon beams from 25 MeV up to 15 GeV. The results of these test measurements regarding the energy and position resolution are presented.

  17. Quench simulations for superconducting elements in the LHC accelerator

    NASA Astrophysics Data System (ADS)

    Sonnemann, F.; Schmidt, R.

    2000-08-01

    The design of the protection system for the superconducting elements in an accelerator such as the Large Hadron Collider (LHC), now under construction at CERN, requires a detailed understanding of the thermo-hydraulic and electrodynamic processes during a quench. A numerical program (SPQR, Simulation Program for Quench Research) has been developed to evaluate temperature and voltage distributions during a quench as a function of space and time. The quench process is simulated by approximating the heat balance equation with the finite difference method in the presence of variable cooling and powering conditions. The simulation predicts quench propagation along a superconducting cable, forced quenching with heaters, the impact of eddy currents induced by a magnetic field change, and heat transfer through an insulation layer into helium, an adjacent conductor or other material. The simulation studies allowed a better understanding of experimental quench data and were used for determining the adequate dimensioning and protection of the highly stabilised superconducting cables for connecting magnets (busbars), optimising the quench heater strip layout for the main magnets, and studying quench-back by induced eddy currents in the superconductor. After the introduction of the theoretical approach, some applications of the simulation model for the LHC dipole and corrector magnets are presented and the outcome of the studies is compared with experimental data.
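    The heat-balance idea behind such simulations can be illustrated with a minimal explicit finite-difference model of a one-dimensional cable: conduction along the cable, Joule heating wherever the local temperature exceeds the current-sharing temperature, and heat transfer towards a helium bath. All material numbers below are constant-property placeholders chosen only for a stable sketch; SPQR itself uses temperature-dependent properties, heater and eddy-current terms, current decay and realistic cooling conditions.

        import numpy as np

        def simulate_quench(n=400, dx=0.01, dt=1e-4, steps=2000,
                            k=300.0, c_vol=2.0e3, rho_n=5e-10, j=2e8,
                            t_bath=1.9, t_cs=9.0, h_vol=1.0e5):
            """Explicit 1D heat balance along a cable (illustrative values only):
               c_vol*dT/dt = k*d2T/dx2 + rho_n*j^2*[T > t_cs] - h_vol*(T - t_bath)"""
            T = np.full(n, t_bath)
            T[n // 2 - 2: n // 2 + 2] = 20.0            # small initial normal zone
            for _ in range(steps):
                lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx**2
                lap[0] = lap[-1] = 0.0                   # crude insulated ends
                joule = rho_n * j**2 * (T > t_cs)        # Joule heating where normal
                cooling = h_vol * (T - t_bath)           # volumetric heat removal
                T = T + dt * (k * lap + joule - cooling) / c_vol
            return T

        T_final = simulate_quench()
        print("normal-zone length [m]:", (T_final > 9.0).sum() * 0.01)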

  18. Current Lead Design for the Accelerator Project for Upgrade of LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, Jeffrey S.; Cheban, Sergey; Feher, Sandor

    2010-01-01

    The Accelerator Project for Upgrade of LHC (APUL) is a U.S. project participating in and contributing to CERN's Large Hadron Collider (LHC) upgrade program. In collaboration with Brookhaven National Laboratory, Fermilab is developing sub-systems for an upgrade of the LHC final focus magnet systems. A concept of main and auxiliary helium flow was developed that allows the superconductor to remain cold while the lead body warms up to prevent upper section frosting. The auxiliary flow will subsequently cool the thermal shields of the feed box and the transmission line cryostats. A thermal analysis of the current lead central heat exchange section was performed using analytic and FEA techniques. A method of remote soldering was developed that allows the current leads to be field replaceable. The remote solder joint was designed to be made without flux or additional solder, and able to be remade up to ten full cycles. A method of upper section attachment was developed that allows high pressure sealing of the helium volume. Test fixtures for both remote soldering and upper section attachment for the 13 kA lead were produced. The cooling concept, thermal analyses, and test results from both remote soldering and upper section attachment fixtures are presented.

  19. FIELD CALIBRATION OF A TLD ALBEDO DOSEMETER IN THE HIGH-ENERGY NEUTRON FIELD OF CERF.

    PubMed

    Haninger, T; Kleinau, P; Haninger, S

    2017-04-28

    The new albedo dosemeter type AWST-TL-GD 04 has been calibrated in the CERF neutron field (CERN-EU high-energy Reference Field). This type of albedo dosemeter is based on thermoluminescent detectors (TLDs) and has been used by the individual monitoring service of the Helmholtz Zentrum München (AWST) since 2015 for monitoring persons who are occupationally exposed to photon and neutron radiation. The motivation for this experiment was to obtain a field-specific neutron correction factor Nn for workplaces at high-energy particle accelerators. Nn is a dimensionless factor relative to a basic detector calibration with 137Cs and is used to calculate the personal neutron dose in terms of Hp(10) from the neutron albedo signal. The results show that the sensitivity of the albedo dosemeter for this specific neutron field is not significantly lower than for fast neutrons from a radionuclide source such as 252Cf. The neutron correction factor varies between 0.73 and 1.16 with a midrange value of 0.94. The albedo dosemeter is therefore appropriate for monitoring persons who are exposed at high-energy particle accelerators. © The Author 2016. Published by Oxford University Press. All rights reserved.

  20. Conductor Specification and Validation for High-Luminosity LHC Quadrupole Magnets

    DOE PAGES

    Cooley, L. D.; Ghosh, A. K.; Dietderich, D. R.; ...

    2017-06-01

    The High Luminosity Upgrade of the Large Hadron Collider (HL-LHC) at CERN will replace the main ring inner triplet quadrupoles, identified by the acronym MQXF, adjacent to the main ring intersection regions. For the past decade, the U.S. LHC Accelerator R&D Program, LARP, has been evaluating conductors for the MQXFA prototypes, which are the outer magnets of the triplet. Recently, the requirements for MQXF magnets and cables have been published in P. Ferracin et al., IEEE Trans. Appl. Supercond., vol. 26, no. 4, 2016, Art. no. 4000207, along with the final specification for Ti-alloyed Nb3Sn conductor determined jointly by CERN and LARP. This paper describes the rationale behind the 0.85 mm diameter strand's chief parameters, which are 108 or more sub-elements, a copper fraction not less than 52.4%, strand critical current at 4.22 K not less than 631 A at 12 T and 331 A at 15 T, and residual resistance ratio of not less than 150. This paper also compares the performance for ~100 km production lots of the five most recent LARP conductors to the first 163 km of strand made according to the HL-LHC specification. Two factors emerge as significant for optimizing performance and minimizing risk: a modest increase of the sub-element diameter from 50 to 55 μm, and a Nb:Sn molar ratio of 3.6 instead of 3.4. Furthermore, the statistics acquired so far give confidence that the present conductor can balance competing demands in production for the HL-LHC project.

  1. Conductor Specification and Validation for High-Luminosity LHC Quadrupole Magnets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooley, L. D.; Ghosh, A. K.; Dietderich, D. R.

    The High Luminosity Upgrade of the Large Hadron Collider (HL-LHC) at CERN will replace the main ring inner triplet quadrupoles, identified by the acronym MQXF, adjacent to the main ring intersection regions. For the past decade, the U.S. LHC Accelerator R&D Program, LARP, has been evaluating conductors for the MQXFA prototypes, which are the outer magnets of the triplet. Recently, the requirements for MQXF magnets and cables have been published in P. Ferracin et al., IEEE Trans. Appl. Supercond., vol. 26, no. 4, 2016, Art. no. 4000207, along with the final specification for Ti-alloyed Nb3Sn conductor determined jointly by CERN and LARP. This paper describes the rationale behind the 0.85 mm diameter strand's chief parameters, which are 108 or more sub-elements, a copper fraction not less than 52.4%, strand critical current at 4.22 K not less than 631 A at 12 T and 331 A at 15 T, and residual resistance ratio of not less than 150. This paper also compares the performance for ~100 km production lots of the five most recent LARP conductors to the first 163 km of strand made according to the HL-LHC specification. Two factors emerge as significant for optimizing performance and minimizing risk: a modest increase of the sub-element diameter from 50 to 55 μm, and a Nb:Sn molar ratio of 3.6 instead of 3.4. Furthermore, the statistics acquired so far give confidence that the present conductor can balance competing demands in production for the HL-LHC project.
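    The acceptance thresholds quoted in the specification above translate directly into a simple validation routine; the sketch below merely restates those numbers for a single strand measurement, with hypothetical argument names.

        def strand_meets_hllhc_spec(sub_elements, cu_fraction, ic_12t_a, ic_15t_a, rrr):
            """Check one strand against the HL-LHC thresholds quoted in the abstract:
            >= 108 sub-elements, Cu fraction >= 52.4%, Ic(4.22 K) >= 631 A at 12 T
            and >= 331 A at 15 T, and RRR >= 150."""
            return (sub_elements >= 108
                    and cu_fraction >= 0.524
                    and ic_12t_a >= 631.0
                    and ic_15t_a >= 331.0
                    and rrr >= 150.0)

        print(strand_meets_hllhc_spec(108, 0.53, 640.0, 345.0, 160.0))  # True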

  2. Leak checker data logging system

    DOEpatents

    Gannon, J.C.; Payne, J.J.

    1996-09-03

    A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers, allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between tests, permitting repetitive testing of vacuum integrity in reduced time. 18 figs.

  3. Leak checker data logging system

    DOEpatents

    Gannon, Jeffrey C.; Payne, John J.

    1996-01-01

    A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers, allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between tests, permitting repetitive testing of vacuum integrity in reduced time.
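    One simple way to post-process such multi-spectrometer logs, estimating when the helium signal first rises above each detector's own baseline and taking the earliest arrival as the section nearest the leak, is sketched below. The thresholding rule and the neglect of section-dependent pumping speeds are simplifying assumptions of the sketch, not the patented method.

        import numpy as np

        def arrival_times(traces, times, n_sigma=5.0, baseline_samples=50):
            """For each mass-spectrometer trace, return the time at which the helium
            partial-pressure signal first exceeds its own baseline by n_sigma.
            `traces` maps spectrometer names to 1D arrays sampled on `times`."""
            result = {}
            for name, y in traces.items():
                base = y[:baseline_samples]
                threshold = base.mean() + n_sigma * base.std()
                above = np.nonzero(y > threshold)[0]
                result[name] = times[above[0]] if above.size else None
            return result

        def nearest_section(arrivals):
            """Crude localization: the spectrometer that sees helium first is taken
            as the one closest to the leak (ignores unequal pumping speeds)."""
            seen = {name: t for name, t in arrivals.items() if t is not None}
            return min(seen, key=seen.get) if seen else None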

  4. Comparison of hypertabastic survival model with other unimodal hazard rate functions using a goodness-of-fit test.

    PubMed

    Tahir, M Ramzan; Tran, Quang X; Nikulin, Mikhail S

    2017-05-30

    We studied the problem of testing a hypothesized distribution in survival regression models when the data are right censored and survival times are influenced by covariates. A modified chi-squared type test, known as the Nikulin-Rao-Robson statistic, is applied for the comparison of accelerated failure time models. This statistic is used to test the goodness-of-fit for the hypertabastic survival model and four other unimodal hazard rate functions. The results of the simulation study showed that the hypertabastic distribution can be used as an alternative to the log-logistic and log-normal distributions. In statistical modeling, because of its flexible shape of hazard functions, this distribution can also be used as a competitor of the Birnbaum-Saunders and inverse Gaussian distributions. The results for the real data application are shown. Copyright © 2017 John Wiley & Sons, Ltd.
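    A minimal version of the model-comparison step, fitting the log-logistic (SciPy's fisk) and log-normal distributions to right-censored survival times by maximising a censored likelihood, is sketched below. The Nikulin-Rao-Robson chi-squared statistic, the covariate (accelerated failure time) structure and the hypertabastic model itself are not implemented here, and the data are synthetic.

        import numpy as np
        from scipy import optimize, stats

        def censored_loglik(dist, shape, scale, t, event):
            """Right-censored log-likelihood: density for events, survival for censored."""
            logpdf = dist.logpdf(t, shape, loc=0.0, scale=scale)
            logsf = dist.logsf(t, shape, loc=0.0, scale=scale)
            return np.sum(np.where(event == 1, logpdf, logsf))

        def fit_censored(dist, t, event, x0):
            """Maximise the censored log-likelihood over log-transformed parameters."""
            neg = lambda p: -censored_loglik(dist, np.exp(p[0]), np.exp(p[1]), t, event)
            res = optimize.minimize(neg, np.log(x0), method="Nelder-Mead")
            return np.exp(res.x), -res.fun

        # Synthetic right-censored sample drawn from a log-logistic distribution
        rng = np.random.default_rng(0)
        t_true = stats.fisk(c=2.0, scale=5.0).rvs(300, random_state=rng)
        censor = rng.uniform(2.0, 15.0, size=300)
        t = np.minimum(t_true, censor)
        event = (t_true <= censor).astype(int)

        for name, dist, x0 in [("log-logistic", stats.fisk, [2.0, 5.0]),
                               ("log-normal", stats.lognorm, [1.0, 5.0])]:
            params, loglik = fit_censored(dist, t, event, x0)
            print(f"{name}: params={np.round(params, 3)}, logLik={loglik:.1f}")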

  5. Advanced Thin Ionization Calorimeter (ATIC) Balloon Experiment

    NASA Technical Reports Server (NTRS)

    Wefel, John P.; Guzik, T. Gregory

    2001-01-01

    During grant NAG5-5064, Louisiana State University (LSU) led the ATIC team in the development, construction, testing, accelerator validation, pre-deployment integration and flight operations of the Advanced Thin Ionization Calorimeter (ATIC) Balloon Experiment. This involved interfacing among the ATIC collaborators (UMD, NRL/MSFC, SU, MSU, WI, SNU) to develop a new balloon payload based upon a fully active calorimeter, a carbon target, a scintillator strip hodoscope and a pixelated silicon solid state detector for a detailed investigation of the very high energy cosmic rays to energies beyond 10^14 eV/nucleus. It is in this very high energy region that theory predicts changes in composition and energy spectra related to the Supernova Remnant Acceleration model for cosmic rays below the "knee" in the all-particle spectrum. This report provides a documentation list, details the anticipated ATIC science return, describes the particle detection principles on which the experiment is based, summarizes the simulation results for the system, describes the validation work at the CERN SPS accelerator and details the balloon flight configuration. The ATIC experiment had a very successful LDB flight from McMurdo, Antarctica in 12/00 - 1/01. The instrument performed well for the entire 15 days. Preliminary data analysis shows acceptable charge resolution and an all-particle power law energy deposition distribution not inconsistent with previous measurements. Detailed analysis is underway and will result in new data on the cosmic ray charge and energy spectra in the GeV - TeV energy range. ATIC is currently being refurbished in anticipation of another LDB flight in the 2002-03 period.

  6. Meeting Jentschke

    ScienceCinema

    None

    2018-05-18

    After an introduction about the latest research and news at CERN, the DG W. Jentschke speaks about the future management of CERN with two new general managers, who will be in charge for the next five years: Dr. J.B. Adams, who will focus on the administration of CERN and also the construction of buildings and equipment, and Dr. L. Van Hove, who will be responsible for research activities. The DG speaks about expected changes, shared services, different divisions and their leaders, etc.

  7. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    Sen, Ashoke

    2017-12-18

    Part 7. The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  8. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-02-09

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  9. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-01-22

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauer, Gerry; et al.

    The DAQ system of the CMS experiment at CERN collects data from more than 600 custom detector Front-End Drivers (FEDs). During 2013 and 2014 the CMS DAQ system will undergo a major upgrade to address the obsolescence of current hardware and the requirements posed by the upgrade of the LHC accelerator and various detector components. For loss-less data collection from the FEDs a new FPGA-based card implementing the TCP/IP protocol suite over 10 Gbps Ethernet has been developed. To limit the complexity of the TCP hardware implementation, the DAQ group developed a simplified and unidirectional, but RFC 793 compliant, version of the TCP protocol. This allows a PC with the standard Linux TCP/IP stack to be used as a receiver. We present the challenges and protocol modifications made to TCP in order to simplify its FPGA implementation. We also describe the interaction between the simplified TCP and the Linux TCP/IP stack, including performance measurements.
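    Because the simplified protocol stays RFC 793 compliant on the wire, an ordinary socket program on the receiving PC is enough to sink the unidirectional stream. The sketch below is only a generic Linux-side receiver that counts the bytes delivered by one sender; the port number is arbitrary and this is not the CMS DAQ readout application.

        import socket

        LISTEN_PORT = 10000   # arbitrary example port, not the CMS DAQ configuration

        def receive_stream(port: int = LISTEN_PORT) -> int:
            """Accept one unidirectional TCP connection and count the bytes received."""
            total = 0
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
                srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
                srv.bind(("", port))
                srv.listen(1)
                conn, _ = srv.accept()
                with conn:
                    while True:
                        chunk = conn.recv(1 << 20)   # 1 MiB reads; the sender never expects replies
                        if not chunk:                # empty read: the sender closed the connection
                            break
                        total += len(chunk)
            return total

        if __name__ == "__main__":
            print("received", receive_stream(), "bytes")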

  11. SPS Beam Steering for LHC Extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gianfelice-Wendt, Eliana; Bartosik, Hannes; Cornelis, Karel

    2014-07-01

    The CERN Super Proton Synchrotron accelerates beams for the Large Hadron Collider to 450 GeV. In addition it produces beams for fixed-target facilities, which adds complexity to SPS operation. During the 2012-2013 run, drifts of the extracted beam trajectories were observed and lengthy optimizations in the transfer lines were performed to reduce particle losses in the LHC. The observed trajectory drifts are consistent with the measured SPS orbit drifts at extraction. While extensive studies are going on to understand, and possibly suppress, the source of such SPS orbit drifts, the feasibility of an automatic beam steering towards a "golden" orbit at the extraction septa, by means of the interlocked correctors, is also being investigated. The challenges and constraints related to the implementation of such a correction in the SPS are described. Simulation results are presented and a possible operational steering strategy is proposed.
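    In its simplest textbook form, the automatic steering investigated here reduces to a linear least-squares problem: given the deviation of the measured orbit from the "golden" orbit and a response matrix relating corrector kicks to orbit shifts, the pseudo-inverse yields the kick pattern that minimizes the residual. The sketch below shows only that generic calculation with invented numbers; it is not the SPS implementation and ignores the interlock constraints mentioned above.

        import numpy as np

        rng = np.random.default_rng(1)
        n_bpms, n_correctors = 12, 4
        R = rng.normal(size=(n_bpms, n_correctors))      # hypothetical orbit response matrix [mm/mrad]

        golden = np.zeros(n_bpms)                        # target ("golden") orbit at the BPMs [mm]
        measured = golden + rng.normal(scale=0.5, size=n_bpms)   # drifted orbit [mm]

        # Least-squares corrector kicks that steer the measured orbit back to the golden one.
        kicks = np.linalg.pinv(R) @ (golden - measured)  # [mrad]
        residual = measured + R @ kicks - golden

        print("rms before: %.3f mm, after: %.3f mm" % (measured.std(), residual.std()))
        print("corrector kicks [mrad]:", np.round(kicks, 3))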

  12. X-ray energy selected imaging with Medipix II

    NASA Astrophysics Data System (ADS)

    Ludwig, J.; Zwerger, A.; Benz, K.-W.; Fiederle, M.; Braml, H.; Fauler, A.; Konrath, J.-P.

    2004-09-01

    Two different X-ray tube accelerating voltages (60 and 70 kV) are used for the diagnosis of front teeth and molars. Different energy ranges are necessary, as a function of tooth thickness, to obtain similar contrast for imaging. This technique drives up the cost of the X-ray tube and allows for just two optimized settings. Energy-range selection in the detection of the penetrating X-rays would overcome these severe setbacks. The single photon counting chip MEDIPIX2 (http://www.cern.ch/medipix) exhibits exactly this feature. First simulations and measurements have been carried out using a dental X-ray source. As a demonstrator, a real tooth with different cavities and filling materials has been used. Simulations generally showed larger improvements than the measurements with regard to SNR and contrast: a beneficial factor of 4% with respect to SNR and 25% for contrast, whereas measurements showed factors of 2.5 and up to 10%, respectively.

  13. Medical beam monitor—Pre-clinical evaluation and future applications

    NASA Astrophysics Data System (ADS)

    Frais-Kölbl, Helmut; Griesmayer, Erich; Schreiner, Thomas; Georg, Dietmar; Pernegger, Heinz

    2007-10-01

    Future medical ion beam applications for cancer therapy which are based on scanning technology will require advanced beam diagnostics equipment. For a precise analysis of beam parameters we want to resolve time structures in the range of microseconds to nanoseconds. A prototype of an advanced beam monitor was developed by the University of Applied Sciences Wiener Neustadt and its research subsidiary Fotec in co-operation with CERN RD42, Ohio State University and the Jožef Stefan Institute in Ljubljana. The detector is based on polycrystalline Chemical Vapor Deposition (pCVD) diamond substrates and is equipped with readout electronics up to 2 GHz analog bandwidth. In this paper we present the design of the pCVD-detector system and results of tests performed in various particle accelerator based facilities. Measurements performed in clinical high energy photon beams agreed within 1.2% with results obtained by standard ionization chambers.

  14. Performance of a fast digital integrator in on-field magnetic measurements for particle accelerators

    NASA Astrophysics Data System (ADS)

    Arpaia, P.; Bottura, L.; Fiscarelli, L.; Walckiers, L.

    2012-02-01

    The fast digital integrator has been conceived to face the most demanding magnet test requirements, with a resolution of 10 ppm, a signal-to-noise ratio of 105 dB at 20 kHz, a time resolution of 50 ns, an offset of 10 ppm, and on-line processing. In this paper, the on-field achievements of the fast digital integrator are assessed by a specific measurement campaign at the European Organization for Nuclear Research (CERN). At first, the architecture and the metrological specifications of the instrument are reported. Then, the recent on-field achievements of (i) ±10 ppm of uncertainty in the measurement of the main field for superconducting magnet characterization, (ii) ±0.02% of field uncertainty in the quality assessment of small-aperture permanent magnets, and (iii) ±0.15% of drift in an excitation current measurement of 600 s under cryogenic conditions, are presented and discussed.
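    For context, the core operation of any digital integrator for coil-based magnetic measurements is the numerical integration of the induced coil voltage, since the flux change is ΔΦ = -∫ V dt. The sketch below reproduces only that single step on synthetic samples using scipy's cumulative trapezoidal rule; the resolution, offset and triggering features that characterize the instrument described above are not modelled, and all numbers are invented.

        import numpy as np
        from scipy.integrate import cumulative_trapezoid

        fs = 500_000.0                                 # hypothetical sampling rate [Hz]
        t = np.arange(0, 0.02, 1.0 / fs)               # 20 ms of samples
        flux = 1e-3 * np.sin(2 * np.pi * 50 * t)       # synthetic flux linkage [Wb]
        voltage = -np.gradient(flux, t)                # induced coil voltage, V = -dPhi/dt

        # Recover the flux change by numerically integrating the voltage.
        flux_rec = -cumulative_trapezoid(voltage, t, initial=0.0) + flux[0]

        print("max reconstruction error: %.2e Wb" % np.max(np.abs(flux_rec - flux)))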

  15. Two-Layer 16 Tesla Cosθ Dipole Design for the FCC

    DOE PAGES

    Holik, Eddie Frank; Ambrosio, Giorgio; Apollinari, G.

    2018-02-13

    The Future Circular Collider or FCC is a study aimed at exploring the possibility of reaching 100 TeV total collision energy, which would require 16 tesla dipoles. Upon the conclusion of the High Luminosity Upgrade, the US LHC Accelerator Upgrade Project in collaboration with CERN will have extensive Nb3Sn magnet fabrication experience. This experience includes robust Nb3Sn conductor and insulation scheming, 2-layer cos2θ coil fabrication, and bladder-and-key structure and assembly. By making improvements and modifications to existing technology, the feasibility of a two-layer 16 tesla dipole is investigated. Preliminary designs indicate that fields up to 16.6 tesla are feasible with conductor grading while satisfying the HE-LHC and FCC specifications. Key challenges include accommodating high-aspect-ratio conductor, narrow wedge design, Nb3Sn conductor grading, and especially quench protection of a 16 tesla device.

  16. Two-Layer 16 T Cos θ Dipole Design for the FCC

    DOE PAGES

    Holik, Eddie Frank; Ambrosio, Giorgio; Apollinari, Giorgio

    2018-02-22

    Here, the Future Circular Collider or FCC is a study aimed at exploring the possibility of reaching 100 TeV total collision energy, which would require 16 tesla dipoles. Upon the conclusion of the High Luminosity Upgrade, the US LHC Accelerator Upgrade Project in collaboration with CERN will have extensive Nb3Sn magnet fabrication experience. This experience includes robust Nb3Sn conductor and insulation scheming, 2-layer cos2θ coil fabrication, and bladder-and-key structure and assembly. By making improvements and modifications to existing technology, the feasibility of a two-layer 16 tesla dipole is investigated. Preliminary designs indicate that fields up to 16.6 tesla are feasible with conductor grading while satisfying the HE-LHC and FCC specifications. Key challenges include accommodating high-aspect-ratio conductor, narrow wedge design, Nb3Sn conductor grading, and especially quench protection of a 16 tesla device.

  17. Search for Hidden Particles: a new experiment proposal

    NASA Astrophysics Data System (ADS)

    De Lellis, G.

    2015-08-01

    Searches for new physics with accelerators are being performed at the LHC, looking for very massive particles coupled to matter with ordinary strength. We propose a new experiment meant to search for very weakly coupled particles in the few GeV mass domain. The existence of such particles, foreseen in different models beyond the Standard Model, is largely unexplored from the experimental point of view. A beam dump facility built at CERN in the North Area, using 400 GeV protons, is a copious factory of charmed hadrons and could be used to probe the existence of such particles. The beam dump is also an ideal source of tau neutrinos, the least known particle in the Standard Model. In particular, tau anti-neutrinos have not been observed so far. We therefore propose an experiment to search for hidden particles and study tau neutrino physics at the same time.

  18. Search for Hidden Particles (SHiP): a new experiment proposal

    NASA Astrophysics Data System (ADS)

    De Lellis, G.

    2015-06-01

    Searches for new physics with accelerators are being performed at the LHC, looking for very massive particles coupled to matter with ordinary strength. We propose a new experimental facility meant to search for very weakly coupled particles in the few GeV mass domain. The existence of such particles, foreseen in different theoretical models beyond the Standard Model, is largely unexplored from the experimental point of view. A beam dump facility built at CERN in the North Area, using 400 GeV protons, is a copious factory of charmed hadrons and could be used to probe the existence of such particles. The beam dump is also an ideal source of tau neutrinos, the least known particle in the Standard Model. In particular, tau anti-neutrinos have not been observed so far. We therefore propose an experiment to search for hidden particles and study tau neutrino physics at the same time.

  19. Two-Layer 16 T Cos θ Dipole Design for the FCC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holik, Eddie Frank; Ambrosio, Giorgio; Apollinari, Giorgio

    Here, the Future Circular Collider or FCC is a study aimed at exploring the possibility of reaching 100 TeV total collision energy, which would require 16 tesla dipoles. Upon the conclusion of the High Luminosity Upgrade, the US LHC Accelerator Upgrade Project in collaboration with CERN will have extensive Nb3Sn magnet fabrication experience. This experience includes robust Nb3Sn conductor and insulation scheming, 2-layer cos2θ coil fabrication, and bladder-and-key structure and assembly. By making improvements and modifications to existing technology, the feasibility of a two-layer 16 tesla dipole is investigated. Preliminary designs indicate that fields up to 16.6 tesla are feasible with conductor grading while satisfying the HE-LHC and FCC specifications. Key challenges include accommodating high-aspect-ratio conductor, narrow wedge design, Nb3Sn conductor grading, and especially quench protection of a 16 tesla device.

  20. Short and Long Baseline Neutrino Experiments

    NASA Astrophysics Data System (ADS)

    Autiero, Dario

    2005-04-01

    These two lectures discuss the past and current neutrino oscillation experiments performed with man-made neutrino sources such as accelerators and nuclear reactors. The search for neutrino oscillations is a remarkable effort which has been pursued over three decades. It is therefore interesting to discuss the short and long baseline neutrino experiments in their historical context and to see how this line of research evolved up to the present generation of experiments, looking at what was learnt from past experiments and how this experience is used in the current ones. The first lecture focuses on the past generation of short baseline experiments (NOMAD and CHORUS) performed at CERN and ends with LSND and MiniBooNE. The second lecture discusses how, after the CHOOZ and the atmospheric neutrino results, the line of long baseline experiments developed, and presents in detail the K2K and MINOS experiments and the CNGS program.

  1. Status of diamond particle detectors

    NASA Astrophysics Data System (ADS)

    Krammer, M.; Adam, W.; Bauer, C.; Berdermann, E.; Bogani, F.; Borchi, E.; Bruzzi, M.; Colledani, C.; Conway, J.; Dabrowski, W.; Delpierre, P.; Deneuville, A.; Dulinski, W.; van Eijk, B.; Fallou, A.; Fish, D.; Foulon, F.; Friedl, M.; Gan, K. K.; Gheeraert, E.; Grigoriev, E.; Hallewell, G.; Hall-Wilton, R.; Han, S.; Hartjes, F.; Hrubec, J.; Husson, D.; Kagan, H.; Kania, D.; Kaplon, J.; Kass, R.; Knöpfle, K. T.; Manfredi, P. F.; Meier, D.; Mishina, M.; LeNormand, F.; Pan, L. S.; Pernegger, H.; Pernicka, M.; Re, V.; Riester, G. L.; Roe, S.; Roff, D.; Rudge, A.; Schnetzer, S.; Sciortino, S.; Speziali, V.; Stelzer, H.; Stone, R.; Tapper, R. J.; Tesarek, R.; Thomson, G. B.; Trawick, M.; Trischuk, W.; Turchetta, R.; Walsh, A. M.; Wedenig, R.; Weilhammer, P.; Ziock, H.; Zoeller, M.

    1998-11-01

    To continue the exciting research in the field of particle physics, new accelerators and experiments are under construction. In some of these experiments, e.g. ATLAS and CMS at the Large Hadron Collider at CERN or HERA-B at DESY, the detectors have to withstand an extreme environment. The detectors must be radiation hard, provide a very fast signal, and be as thin as possible. The properties of CVD diamond make it possible to fulfill these requirements and make it an ideal material for the detectors close to the interaction region of these experiments, i.e. the vertex detectors or the inner trackers. The RD42 collaboration is developing diamond detectors for these applications. The program of RD42 includes the improvement of the charge collection properties of CVD diamond, the study of radiation hardness and the development of low-noise, radiation-hard readout electronics. An overview of the progress achieved during the last years will be given.

  2. Test beam studies of possibilities to separate particles with gamma factors above 10^3 with a straw-based Transition Radiation Detector

    NASA Astrophysics Data System (ADS)

    Belyaev, N.; Cherry, M. L.; Doronin, S. A.; Filippov, K.; Fusco, P.; Konovalov, S.; Krasnopevtsev, D.; Kramarenko, V.; Loparco, F.; Mazziotta, M. N.; Ponomarenko, D.; Pyatiizbyantseva, D.; Radomskii, R.; Rembser, C.; Romaniouk, A.; Savchenko, A.; Shulga, E.; Smirnov, S.; Smirnov, Yu; Sosnovtsev, V.; Spinelli, P.; Teterin, P.; Tikhomirov, V.; Vorobev, K.; Zhukov, K.

    2017-12-01

    Measurements of hadron production in the TeV energy range are one of the tasks of future studies at the Large Hadron Collider (LHC). The main goal of these experiments is a study of the fundamental QCD processes in this energy range, which is very important not only for probing the Standard Model but also for ultrahigh-energy cosmic particle physics. One of the key elements of these measurements is hadron identification. The only detector technology with the potential ability to separate hadrons in this energy range is Transition Radiation Detector (TRD) technology. A TRD prototype based on straw proportional chambers combined with a specially assembled radiator has been tested at the CERN SPS accelerator beam. The test beam results and a comparison with detailed Monte Carlo simulations are presented here.

  3. SHiP: a new facility to search for heavy neutrinos and study ντ properties

    NASA Astrophysics Data System (ADS)

    De Serio, M.; SHiP Collaboration

    2016-05-01

    SHiP (Search for Hidden Particles) is a newly designed fixed-target facility, proposed at the CERN SPS accelerator, with the aim of complementing searches for New Physics at the LHC by searching for light long-lived exotic particles with masses below a few GeV/c^2. The sensitivity to Heavy Neutrinos will allow, for the first time, probing a region of the parameter space where Baryogenesis and active neutrino masses and oscillations could also be explained. A dedicated detector, based on OPERA-like bricks, will provide the first observation of the tau anti-neutrino. Moreover, ντ and ν¯τ cross-sections will be measured with statistics 1000 times larger than currently available and will allow extracting the F4 and F5 structure functions, never measured so far. Charm physics studies will be performed with significantly improved accuracy with respect to past experiments.

  4. The influence of train leakage currents on the LEP dipole field

    NASA Astrophysics Data System (ADS)

    Bravin, E.; Brun, G.; Dehning, B.; Drees, A.; Galbraith, P.; Geitz, M.; Henrichsen, K.; Koratzinos, M.; Mugnai, G.; Tonutti, M.

    The determination of the mass and the width of the Z boson at CERN's LEP accelerator, an e+e- storage ring with a circumference of approximately 27 km, imposes heavy demands on the knowledge of the LEP counter-rotating electron and positron beam energies. The precision required is of the order of 1 MeV or ≈ 20 ppm. Due to its size, the LEP collider is influenced by various macroscopic and regional factors such as the position of the moon or seasonal changes of the rainfall in the area, as reported earlier. A new and not less surprising effect on the LEP energy was observed in 1995: railroad trains in the Geneva region perturb the dipole field. A parasitic flow of electricity, originating from the trains, travels along the LEP vacuum chamber, affecting the LEP dipole field. An account of the phenomenon with its explanation substantiated by dedicated measurements is presented.

  5. The Sabah Biodiversity Experiment: a long-term test of the role of tree diversity in restoring tropical forest structure and functioning

    PubMed Central

    Hector, Andy; Philipson, Christopher; Saner, Philippe; Chamagne, Juliette; Dzulkifli, Dzaeman; O'Brien, Michael; Snaddon, Jake L.; Ulok, Philip; Weilenmann, Maja; Reynolds, Glen; Godfray, H. Charles J.

    2011-01-01

    Relatively little is known about the relationship between biodiversity and ecosystem functioning in forests, especially in the tropics. We describe the Sabah Biodiversity Experiment: a large-scale, long-term field study on the island of Borneo. The project aims at understanding the relationship between tree species diversity and the functioning of lowland dipterocarp rainforest during restoration following selective logging. The experiment is planned to run for several decades (from seed to adult tree), so here we focus on introducing the project and its experimental design and on assessing initial conditions and the potential for restoration of the structure and functioning of the study system, the Malua Forest Reserve. We estimate residual impacts 22 years after selective logging by comparison with an appropriate neighbouring area of primary forest of similar conditions in Danum Valley. There was no difference in the alpha or beta species diversity of transect plots in the two forest types, probably owing to the selective nature of the logging and potential effects of competitive release. However, despite equal total stem density, forest structure differed as expected, with a deficit of large trees and a surfeit of saplings in selectively logged areas. These impacts on structure have the potential to influence ecosystem functioning. In particular, above-ground biomass and carbon pools in selectively logged areas were only 60 per cent of those in the primary forest even after 22 years of recovery. Our results establish the initial conditions for the Sabah Biodiversity Experiment and confirm the potential to accelerate restoration by using enrichment planting of dipterocarps to overcome recruitment limitation. What role dipterocarp diversity plays in restoration will only become clear with long-term results. PMID:22006970

  6. Test-bench system for a borehole azimuthal acoustic reflection imaging logging tool

    NASA Astrophysics Data System (ADS)

    Liu, Xianping; Ju, Xiaodong; Qiao, Wenxiao; Lu, Junqiang; Men, Baiyong; Liu, Dong

    2016-06-01

    The borehole azimuthal acoustic reflection imaging logging tool (BAAR) is a new generation of imaging logging tool which is able to investigate strata in a relatively large volume around the borehole. The BAAR is designed on the principle of modularization and has a very complex structure, so a dedicated test-bench system is needed to debug each of its modules. With the help of the test-bench system introduced in this paper, testing and calibration of the BAAR can be easily achieved. The test-bench system is designed on the client/server model. The hardware system mainly consists of a host computer, an embedded controlling board, a bus interface board, a data acquisition board and a telemetry communication board. The host computer serves as the human-machine interface and processes the uploaded data. The software running on the host computer is developed in VC++. The embedded controlling board uses an ARM7 (Advanced RISC Machines 7) core as the microcontroller and communicates with the host computer via Ethernet. The software for the embedded controlling board is developed on the uClinux operating system. The bus interface board, data acquisition board and telemetry communication board are based on a field-programmable gate array (FPGA) and provide test interfaces for the logging tool. To examine the feasibility of the test-bench system, it was set up to perform a test on the BAAR. By analyzing the test results, an unqualified channel in the electronic receiving cabin was discovered. The test-bench system can be used to quickly determine the working condition of the BAAR sub-modules, and it is of great significance in improving production efficiency and accelerating industrial production of the logging tool.

  7. The Sabah Biodiversity Experiment: a long-term test of the role of tree diversity in restoring tropical forest structure and functioning.

    PubMed

    Hector, Andy; Philipson, Christopher; Saner, Philippe; Chamagne, Juliette; Dzulkifli, Dzaeman; O'Brien, Michael; Snaddon, Jake L; Ulok, Philip; Weilenmann, Maja; Reynolds, Glen; Godfray, H Charles J

    2011-11-27

    Relatively little is known about the relationship between biodiversity and ecosystem functioning in forests, especially in the tropics. We describe the Sabah Biodiversity Experiment: a large-scale, long-term field study on the island of Borneo. The project aims at understanding the relationship between tree species diversity and the functioning of lowland dipterocarp rainforest during restoration following selective logging. The experiment is planned to run for several decades (from seed to adult tree), so here we focus on introducing the project and its experimental design and on assessing initial conditions and the potential for restoration of the structure and functioning of the study system, the Malua Forest Reserve. We estimate residual impacts 22 years after selective logging by comparison with an appropriate neighbouring area of primary forest of similar conditions in Danum Valley. There was no difference in the alpha or beta species diversity of transect plots in the two forest types, probably owing to the selective nature of the logging and potential effects of competitive release. However, despite equal total stem density, forest structure differed as expected, with a deficit of large trees and a surfeit of saplings in selectively logged areas. These impacts on structure have the potential to influence ecosystem functioning. In particular, above-ground biomass and carbon pools in selectively logged areas were only 60 per cent of those in the primary forest even after 22 years of recovery. Our results establish the initial conditions for the Sabah Biodiversity Experiment and confirm the potential to accelerate restoration by using enrichment planting of dipterocarps to overcome recruitment limitation. What role dipterocarp diversity plays in restoration will only become clear with long-term results.

  8. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file

    NASA Astrophysics Data System (ADS)

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-01

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal imaging device (EPID) images and a log file, and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using purpose-developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm, respectively, in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases. This work was partly presented at the 58th Annual Meeting of the American Association of Physicists in Medicine.
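    As a rough illustration of the quantities quoted above, the sketch below computes an intra-fractional spread (within each fraction) and inter-fractional offsets (between fractions) from marker positions in one direction, assuming those positions have already been extracted from the cine EPID images or the log file. The positions are invented and the study's exact definitions of the two variations may differ.

        import numpy as np

        # Hypothetical fiducial-marker positions (mm) in one direction (e.g. SI);
        # positions[f] holds the samples recorded during fraction f.
        rng = np.random.default_rng(7)
        positions = [rng.normal(loc=offset, scale=0.8, size=200)
                     for offset in (0.0, 0.5, -0.3)]       # three fractions

        intra = [p.std() for p in positions]               # spread within each fraction
        means = np.array([p.mean() for p in positions])
        inter = means - means.mean()                       # offset of each fraction from the course mean

        print("intra-fractional SD per fraction [mm]:", np.round(intra, 2))
        print("inter-fractional offsets [mm]:", np.round(inter, 2))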

  9. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file.

    PubMed

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-21

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal imaging device (EPID) images and a log file, and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using purpose-developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm, respectively, in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva.

  11. SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeMarco, J; McCloskey, S; Low, D

    Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor units or control points. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions, and perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and providing a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record-and-verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site, and a total of 1267 treatment fields were evaluated, including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4 arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value of the maximum RMS MLC error was 0.067 ± 0.001 mm and 0.066 ± 0.002 mm for leaf banks A and B, respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course, including MLC leaf positions and table positions at the time of image acquisition and during treatment.
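    The trajectory log itself is a proprietary binary format and the analysis above was done in MATLAB; the sketch below therefore assumes the expected and actual MLC leaf positions have already been decoded into arrays, and only shows the maximum per-leaf RMS error comparison reported in the results. Array shapes and the noise level are invented.

        import numpy as np

        def max_rms_leaf_error(expected: np.ndarray, actual: np.ndarray) -> float:
            """Largest per-leaf RMS position error over all control points.

            expected, actual: arrays of shape (n_control_points, n_leaves) in mm,
            assumed already decoded from the trajectory log / DICOM-RT plan.
            """
            err = actual - expected
            rms_per_leaf = np.sqrt(np.mean(err ** 2, axis=0))
            return float(rms_per_leaf.max())

        # Hypothetical example: 120 control points, 60 leaves of one bank,
        # with 0.05 mm of random delivery noise.
        rng = np.random.default_rng(42)
        expected = rng.uniform(-50, 50, size=(120, 60))
        actual = expected + rng.normal(scale=0.05, size=expected.shape)
        print("max RMS leaf error: %.3f mm" % max_rms_leaf_error(expected, actual))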

  12. The DAQ system for the AEḡIS experiment

    NASA Astrophysics Data System (ADS)

    Prelz, F.; Aghion, S.; Amsler, C.; Ariga, T.; Bonomi, G.; Brusa, R. S.; Caccia, M.; Caravita, R.; Castelli, F.; Cerchiari, G.; Comparat, D.; Consolati, G.; Demetrio, A.; Di Noto, L.; Doser, M.; Ereditato, A.; Evans, C.; Ferragut, R.; Fesel, J.; Fontana, A.; Gerber, S.; Giammarchi, M.; Gligorova, A.; Guatieri, F.; Haider, S.; Hinterberger, A.; Holmestad, H.; Kellerbauer, A.; Krasnický, D.; Lagomarsino, V.; Lansonneur, P.; Lebrun, P.; Malbrunot, C.; Mariazzi, S.; Matveev, V.; Mazzotta, Z.; Müller, S. R.; Nebbia, G.; Nedelec, P.; Oberthaler, M.; Pacifico, N.; Pagano, D.; Penasa, L.; Petracek, V.; Prevedelli, M.; Ravelli, L.; Rienaecker, B.; Robert, J.; Røhne, O. M.; Rotondi, A.; Sacerdoti, M.; Sandaker, H.; Santoro, R.; Scampoli, P.; Simon, M.; Smestad, L.; Sorrentino, F.; Testera, G.; Tietje, I. C.; Widmann, E.; Yzombard, P.; Zimmer, C.; Zmeskal, J.; Zurlo, N.

    2017-10-01

    In the sociology of small- to mid-sized (O(100) collaborators) experiments, the issue of data collection and storage is sometimes felt to be a residual problem for which well-established solutions are known. Still, the DAQ system can be one of the few forces that drive towards the integration of otherwise loosely coupled detector systems. As such, it may be hard to complete with off-the-shelf components only. LabVIEW and ROOT are the (only) two software systems that were assumed to be familiar enough to all collaborators of the AEḡIS (AD6) experiment at CERN: starting from the GXML representation of LabVIEW data types, a semantically equivalent representation as ROOT TTrees was developed for permanent storage and analysis. All data in the experiment are cast into this common format and can be produced and consumed on both systems and transferred over TCP and/or multicast over UDP for immediate sharing over the experiment LAN. We describe the setup that has so far been able to cater to all run data logging and long-term monitoring needs of the AEḡIS experiment.
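    As a generic illustration of the "multicast over UDP for immediate sharing" transport mentioned above, the sketch below sends one small serialized record to a multicast group. The group address and port are arbitrary examples, and JSON stands in for the GXML/TTree serialization actually used by the experiment.

        import json
        import socket

        MCAST_GRP, MCAST_PORT = "239.1.2.3", 5007   # arbitrary example group and port

        def publish(record: dict) -> None:
            """Send one JSON-serialized record to the multicast group."""
            payload = json.dumps(record).encode()
            with socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP) as sock:
                sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)   # stay on the LAN
                sock.sendto(payload, (MCAST_GRP, MCAST_PORT))

        publish({"run": 1234, "detector": "example", "value": 42.0})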

  13. ALICE Expert System

    NASA Astrophysics Data System (ADS)

    Ionita, C.; Carena, F.

    2014-06-01

    The ALICE experiment at CERN employs a number of human operators (shifters), who have to make sure that the experiment is always in a state compatible with taking Physics data. Given the complexity of the system and the myriad of errors that can arise, this is not always a trivial task. The aim of this paper is to describe an expert system that is capable of assisting human shifters in the ALICE control room. The system diagnoses potential issues and attempts to make smart recommendations for troubleshooting. At its core, a Prolog engine infers whether a Physics or a technical run can be started based on the current state of the underlying sub-systems. A separate C++ component queries certain SMI objects and stores their state as facts in a Prolog knowledge base. By mining the data stored in different system logs, the expert system can also diagnose errors arising during a run. Currently the system is used by the on-call experts for faster response times, but we expect it to be adopted as a standard tool by regular shifters during the next data taking period.

  14. Orthos, an alarm system for the ALICE DAQ operations

    NASA Astrophysics Data System (ADS)

    Chapeland, Sylvain; Carena, Franco; Carena, Wisla; Chibante Barroso, Vasco; Costa, Filippo; Denes, Ervin; Divia, Roberto; Fuchs, Ulrich; Grigore, Alexandru; Simonetti, Giuseppe; Soos, Csaba; Telesca, Adriana; Vande Vyvre, Pierre; von Haller, Barthelemy

    2012-12-01

    ALICE (A Large Ion Collider Experiment) is the heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). The DAQ (Data Acquisition System) facilities handle the data flow from the detector electronics up to the mass storage. The DAQ system is based on a large farm of commodity hardware consisting of more than 600 devices (Linux PCs, storage, network switches), and controls hundreds of distributed hardware and software components interacting together. This paper presents Orthos, the alarm system used to detect, log, report, and follow up abnormal situations on the DAQ machines at the experimental area. The main objective of this package is to integrate alarm detection and notification mechanisms with a full-featured issue tracker, in order to prioritize, assign, and fix system failures optimally. This tool relies on a database repository with a logic engine, SQL interfaces to inject or query metrics, and dynamic web pages for user interaction. We describe the system architecture, the technologies used for the implementation, and the integration with existing monitoring tools.

  15. A crisis in the making: responses of Amazonian forests to land use and climate change.

    PubMed

    Laurance, W F

    1998-10-01

    At least three global-change phenomena are having major impacts on Amazonian forests: (1) accelerating deforestation and logging; (2) rapidly changing patterns of forest loss; and (3) interactions between human land-use and climatic variability. Additional alterations caused by climatic change, rising concentrations of atmospheric carbon dioxide, mining, overhunting and other large-scale phenomena could also have important effects on the Amazon ecosystem. Consequently, decisions regarding Amazon forest use in the next decade are crucial to its future existence.

  16. CERN: A European laboratory for a global project

    NASA Astrophysics Data System (ADS)

    Voss, Rüdiger

    2015-06-01

    In the most important paradigm shift in its membership rules in 60 years, CERN in 2010 introduced a policy of "Geographical Enlargement" which for the first time opened the door to membership of non-European States in the Organization. This short article briefly reviews the history of CERN's membership rules, discusses the rationale behind the new policy and its relationship with the emerging global roadmap of particle physics, and gives a short overview of the status of the enlargement process.

  17. Review of CERN Data Centre Infrastructure

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Bell, T.; van Eldik, J.; McCance, G.; Panzer-Steindel, B.; Coelho dos Santos, M.; Traylen and, S.; Schwickerath, U.

    2012-12-01

    The CERN Data Centre is reviewing strategies for optimizing the use of the existing infrastructure and expanding to a new data centre by studying how other large sites are being operated. Over the past six months, CERN has been investigating modern and widely-used tools and procedures used for virtualisation, clouds and fabric management in order to reduce operational effort, increase agility and support unattended remote data centres. This paper gives the details on the project's motivations, current status and areas for future investigation.

  18. PARTICLE PHYSICS: CERN Gives Higgs Hunters Extra Month to Collect Data.

    PubMed

    Morton, O

    2000-09-22

    After 11 years of banging electrons and positrons together at higher energies than any other machine in the world, CERN, the European laboratory for particle physics, had decided to shut down the Large Electron-Positron collider (LEP) and install a new machine, the Large Hadron Collider (LHC), in its 27-kilometer tunnel. In 2005, the LHC will start bashing protons together at even higher energies. But tantalizing hints of a long-sought fundamental particle have forced CERN managers to grant LEP a month's reprieve.

  19. Analysis of the Non-LTE Lithium Abundance for a Large Sample of F-, G-, and K-Giants and Supergiants

    NASA Astrophysics Data System (ADS)

    Lyubimkov, L. S.; Petrov, D. V.

    2017-09-01

    A five-dimensional interpolation method and a corresponding computer program are developed for using published calculations to determine the non-LTE correction ΔNLTE to the lithium abundance logɛ(Li) derived from the Li I 6707.8 Å line. The ΔNLTE value is determined from the following five parameters: the effective temperature Teff, the surface gravity log g, the metallicity index [Fe/H], the microturbulent velocity Vt, and the LTE Li abundance logɛ(Li). The program is used to calculate values of ΔNLTE and the non-LTE Li abundance for 91 single bright giants from the list of Lebre et al. By combining these results with data for 55 stars from the previous paper, we obtain the non-LTE values of logɛ(Li) for 146 FGK giants and supergiants. We confirm that, because of the absence of the Li line in the spectra of most of these stars, it is only possible to estimate an upper bound for their Li abundance. A large spread is confirmed in logɛ(Li) for stars with masses M ≤ 6M⦿. A comparison of these results with stellar model calculations confirms the unique sensitivity of the lithium abundance to the initial rotation velocity V0. We discuss the giants and supergiants with lithium abundances logɛ(Li) = 1.4 ± 0.3, which could have had a rotational velocity V0 = 0 km/s and have already undergone deep convective mixing. Li-rich giants with lithium abundances logɛ(Li) ≥ 2, up to nearly the initial value of logɛ(Li) = 3.2 ± 0.1, are examined. It is shown that the fraction of Li-rich giants with V0 ≈ 0 - 50 km/s is consistent with current evolutionary models. The other stars of this type, as well as all of the "super Li-rich" giants, for which the standard theory is untenable, can be explained by invoking the hypothesis of recent lithium synthesis in the star or an alternative hypothesis according to which a giant planet is engulfed by the star.
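    The five-parameter interpolation described above can be pictured as the evaluation of a regular grid of tabulated ΔNLTE values at an arbitrary point (Teff, log g, [Fe/H], Vt, LTE logɛ(Li)). The sketch below does this with scipy's RegularGridInterpolator on an invented toy grid; the real grids and corrections are those of the published non-LTE calculations, not these numbers.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Toy axes of the 5-D grid (the real grid comes from published non-LTE calculations).
        teff = np.linspace(4000, 6500, 6)     # K
        logg = np.linspace(0.5, 4.5, 5)
        feh = np.linspace(-1.0, 0.5, 4)       # [Fe/H]
        vt = np.linspace(1.0, 4.0, 4)         # km/s
        li_lte = np.linspace(0.0, 3.0, 7)     # LTE log eps(Li)

        grids = np.meshgrid(teff, logg, feh, vt, li_lte, indexing="ij")
        # Invented smooth dependence standing in for the tabulated Delta_NLTE values.
        delta_nlte = 0.1 + 5e-5 * (grids[0] - 5000) - 0.02 * grids[1] + 0.03 * grids[4]

        interp = RegularGridInterpolator((teff, logg, feh, vt, li_lte), delta_nlte)

        point = np.array([[4850.0, 2.3, -0.2, 1.8, 1.4]])   # (Teff, log g, [Fe/H], Vt, LTE log eps(Li))
        correction = interp(point)[0]
        print("Delta_NLTE = %+.3f dex -> non-LTE log eps(Li) = %.3f"
              % (correction, point[0, -1] + correction))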

  20. Comparison of parametric and bootstrap method in bioequivalence test.

    PubMed

    Ahn, Byung-Jin; Yim, Dong-Seok

    2009-10-01

    The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods, we performed repeated estimation on bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver. 3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets were slightly different from the nonparametric 90% CIs obtained from BE tests on resampled datasets. Histograms and density curves of formulation effects obtained from resampled datasets were similar to those of a normal distribution. However, in 2 of 3 resampled log(AUC) datasets, the estimates of formulation effects did not follow the Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log(AUC) datasets. Currently, the 80~125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when the data do not follow this assumption.
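    For illustration only, the sketch below computes both a parametric 90% CI (paired t-interval on the mean log difference) and a nonparametric BCa bootstrap 90% CI for the formulation effect on log(AUC), then back-transforms both to the ratio scale. The data are invented; this is not the SAS workflow used in the study, and scipy's bootstrap routine stands in for the resampling procedure described above.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n = 24
        log_auc_ref = rng.normal(4.0, 0.25, size=n)                   # invented log(AUC), reference
        log_auc_test = log_auc_ref + rng.normal(0.02, 0.10, size=n)   # invented log(AUC), test
        diff = log_auc_test - log_auc_ref        # per-subject formulation effect on the log scale

        # Parametric 90% CI: paired t-interval on the mean log difference.
        se = diff.std(ddof=1) / np.sqrt(n)
        tcrit = stats.t.ppf(0.95, df=n - 1)
        ci_param = (diff.mean() - tcrit * se, diff.mean() + tcrit * se)

        # Nonparametric 90% BCa bootstrap CI of the same mean.
        res = stats.bootstrap((diff,), np.mean, confidence_level=0.90,
                              n_resamples=2000, method="BCa", random_state=rng)
        ci_boot = (res.confidence_interval.low, res.confidence_interval.high)

        print("parametric 90%% CI of the ratio:    %.3f - %.3f" % tuple(np.exp(ci_param)))
        print("BCa bootstrap 90%% CI of the ratio: %.3f - %.3f" % tuple(np.exp(ci_boot)))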

  1. Comparison of Parametric and Bootstrap Method in Bioequivalence Test

    PubMed Central

    Ahn, Byung-Jin

    2009-01-01

    The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods, we performed repeated estimation on bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver. 3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets were slightly different from the nonparametric 90% CIs obtained from BE tests on resampled datasets. Histograms and density curves of formulation effects obtained from resampled datasets were similar to those of a normal distribution. However, in 2 of 3 resampled log(AUC) datasets, the estimates of formulation effects did not follow the Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log(AUC) datasets. Currently, the 80~125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when the data do not follow this assumption. PMID:19915699

  2. Réunion publique HR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-04-30

    Dear Colleagues, I should like to remind you that a public meeting organised by the HR Department will be held today: Friday 30 April 2010 at 9:30 a.m. in the Main Auditorium (coffee from 9:00 a.m.). During this meeting, general information will be given about: the CERN Admin e-guide, which is a new guide to the Organization's administrative procedures, drawn up to facilitate the retrieval of practical information and to offer a user-friendly format; the CERN Health Insurance System (presentation by Philippe Charpentier, President of the CHIS Board); and the Pension Fund (presentation by Theodore Economou, Administrator of the CERN Pension Fund). A simultaneous transmission of this meeting will be broadcast in the BE Auditorium at Prévessin and will also be available at the following address: http://webcast.cern.ch. I look forward to your participation! Best regards, Anne-Sylvie Catherin, Head, Human Resources Department

  3. Réunion publique HR

    ScienceCinema

    None

    2017-12-09

    Dear Colleagues, I should like to remind you that a public meeting organised by the HR Department will be held today: Friday 30 April 2010 at 9:30 a.m. in the Main Auditorium (coffee from 9:00 a.m.). During this meeting, general information will be given about: the CERN Admin e-guide, which is a new guide to the Organization's administrative procedures, drawn up to facilitate the retrieval of practical information and to offer a user-friendly format; the CERN Health Insurance System (presentation by Philippe Charpentier, President of the CHIS Board); and the Pension Fund (presentation by Theodore Economou, Administrator of the CERN Pension Fund). A simultaneous transmission of this meeting will be broadcast in the BE Auditorium at Prévessin and will also be available at the following address: http://webcast.cern.ch. I look forward to your participation! Best regards, Anne-Sylvie Catherin, Head, Human Resources Department

  4. Degradation of Leakage Currents and Reliability Prediction for Tantalum Capacitors

    NASA Technical Reports Server (NTRS)

    Teverovsky, Alexander

    2016-01-01

    Two types of failures in solid tantalum capacitors, catastrophic and parametric, and their mechanisms are described. Analysis of voltage and temperature reliability acceleration factors reported in literature shows a wide spread of results and requires more investigation. In this work, leakage currents in two types of chip tantalum capacitors were monitored during highly accelerated life testing (HALT) at different temperatures and voltages. Distributions of degradation rates were approximated using a general log-linear Weibull model and yielded voltage acceleration constants B = 9.8 +/- 0.5 and 5.5. The activation energies were Ea = 1.65 eV and 1.42 eV. The model allows for conservative estimations of times to failure and was validated by long-term life test data. Parametric degradation and failures are reversible and can be annealed at high temperatures. The process is attributed to migration of charged oxygen vacancies that reduce the barrier height at the MnO2/Ta2O5 interface and increase injection of electrons from the MnO2 cathode. Analysis showed that the activation energy of the vacancies' migration is 1.1 eV.

  5. A Brief Hydrodynamic Investigation of a 1/24-Scale Model of the DR-77 Seaplane

    NASA Technical Reports Server (NTRS)

    Fisher, Lloyd J.; Hoffman, Edward L.

    1953-01-01

    A limited investigation of a 1/24-scale dynamically similar model of the Navy Bureau of Aeronautics DR-77 design was conducted in Langley tank no. 2 to determine the calm-water take-off and rough-water landing characteristics of the design, with particular regard to the take-off resistance and the landing accelerations. During the take-off tests, resistance, trim, and rise were measured and photographs were taken to study spray. During the landing tests, motion-picture records and normal-acceleration records were obtained. A ratio of gross load to maximum resistance of 3.2 was obtained with a 30 deg. dead-rise hydro-ski installation. The maximum normal accelerations obtained with a 30 deg. dead-rise hydro-ski installation were of the order of 8g to 10g in waves 8 feet high (full scale). A yawing instability that occurred just prior to hydro-ski emergence was improved by adding an afterbody extension, but adding the extension reduced the ratio of gross load to maximum resistance to 2.9.

  6. CERN launches high-school internship programme

    NASA Astrophysics Data System (ADS)

    Johnston, Hamish

    2017-07-01

    The CERN particle-physics lab has hosted 22 high-school students from Hungary in a pilot programme designed to show teenagers how science, technology, engineering and mathematics is used at the particle-physics lab.

  7. Review of hydrodynamic tunneling issues in high power particle accelerators

    NASA Astrophysics Data System (ADS)

    Tahir, N. A.; Burkart, F.; Schmidt, R.; Shutov, A.; Piriz, A. R.

    2018-07-01

    The full impact of one Large Hadron Collider (LHC) 7 TeV proton beam on solid targets made of different materials, including copper and carbon, was simulated using an energy deposition code, FLUKA, and a two-dimensional hydrodynamic code, BIG2, run iteratively. These studies showed that the penetration depth of the entire beam, comprising 2808 proton bunches, significantly increases due to a phenomenon named hydrodynamic tunneling of the protons and the shower. For example, the static range of a single 7 TeV proton and its shower is about 1 m in solid copper, but the full LHC beam will penetrate up to about 35 m into the target when hydrodynamic effects are included. Due to the potential implications of this result for machine protection, it was decided to verify the hydrodynamic tunneling effect experimentally. For this purpose, experiments were carried out at the CERN HiRadMat (High Radiation to Materials) facility in which extended solid copper cylindrical targets were irradiated with the 440 GeV proton beam generated by the Super Proton Synchrotron (SPS). Simulations of beam-target heating with the same beam parameters that were used in the experiments were also performed. These experiments not only confirmed the existence of hydrodynamic tunneling, but the measurements also showed very good agreement with the simulation results. This provided confidence in the work on LHC-related beam-matter heating simulations. Currently, a design study is being carried out by the international community (with CERN taking the leading role) for a post-LHC collider named the Future Circular Collider (FCC), which will accelerate two counter-rotating proton beams up to a particle energy of 50 TeV. Simulations of the full impact of one FCC beam, comprising 10,600 proton bunches, on a solid copper target have also been done. These simulations have shown that although the static range of a single 50 TeV proton and its shower in solid copper is around 1.8 m, the entire beam will penetrate up to about 350 m into the target. Feasibility studies of developing a water beam dump for the FCC have also been carried out. A review of this work and its implications for the machine protection system is presented in this paper.

  8. Commissioning of a CERN Production and Analysis Facility Based on xrootd

    NASA Astrophysics Data System (ADS)

    Campana, Simone; van der Ster, Daniel C.; Di Girolamo, Alessandro; Peters, Andreas J.; Duellmann, Dirk; Coelho Dos Santos, Miguel; Iven, Jan; Bell, Tim

    2011-12-01

    The CERN facility hosts the Tier-0 of the four LHC experiments, but as part of WLCG it also offers a platform for production activities and user analysis. The CERN CASTOR storage technology has been extensively tested and utilized for LHC data recording and for exporting data to external sites according to the experiments' computing models. On the other hand, to accommodate Grid data processing activities and, more importantly, chaotic user analysis, it was realized that additional functionality was needed, including a different throttling mechanism for file access. This paper will describe the xrootd-based CERN production and analysis facility for the ATLAS experiment and in particular the experiment use case and data access scenario, the xrootd redirector setup on top of the CASTOR storage system, the commissioning of the system and real-life experience with data processing and data analysis.

  9. CERN alerter—RSS based system for information broadcast to all CERN offices

    NASA Astrophysics Data System (ADS)

    Otto, R.

    2008-07-01

    Nearly every large organization uses a tool to broadcast messages and information across the internal campus (messages like alerts announcing interruption in services or just information about upcoming events). These tools typically allow administrators (operators) to send 'targeted' messages which are sent only to specific groups of users or computers, e.g. only those located in a specified building or connected to a particular computing service. CERN has a long history of such tools: CERNVMS's SPM "MESSAGE" command, Zephyr [2] and, most recently, the NICE Alerter based on the NNTP protocol. The NICE Alerter, used on all Windows-based computers, had to be phased out as a consequence of phasing out NNTP at CERN. The new solution to broadcast information messages on the CERN campus continues to provide the service based on cross-platform technologies, hence minimizing custom developments and relying on commercial software as much as possible. The new system, called CERN Alerter, is based on RSS (Really Simple Syndication) [9] for the transport protocol and uses Microsoft SharePoint as the backend for the database and posting interface. The Windows-based client relies on Internet Explorer 7.0 with custom code to trigger the window pop-ups and the notifications for new events. Linux and Mac OS X clients can also rely on any RSS reader to subscribe to targeted notifications. The paper covers the architecture and implementation aspects of the new system.
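
    Because the transport is plain RSS, any client that can poll and parse a feed can subscribe to these notifications. The fragment below is a minimal, hypothetical polling client written with the Python standard library only; the feed URL and polling interval are placeholders and do not correspond to the actual CERN Alerter endpoints.

      import time
      import urllib.request
      import xml.etree.ElementTree as ET

      FEED_URL = "https://alerter.example.org/feeds/building-40.rss"   # placeholder URL
      POLL_SECONDS = 300

      def fetch_items(url):
          """Download an RSS 2.0 feed and yield (guid, title, description) tuples."""
          with urllib.request.urlopen(url) as resp:
              root = ET.fromstring(resp.read())
              for item in root.iterfind("./channel/item"):
                  guid = item.findtext("guid") or item.findtext("link")
                  yield guid, item.findtext("title", ""), item.findtext("description", "")

      def poll_forever():
          seen = set()
          while True:
              for guid, title, desc in fetch_items(FEED_URL):
                  if guid not in seen:
                      seen.add(guid)
                      # a real client would raise a pop-up window here
                      print(f"ALERT: {title}\n{desc}\n")
              time.sleep(POLL_SECONDS)

      if __name__ == "__main__":
          poll_forever()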

  10. OBITUARY: Maurice Jacob (1933-2007)

    NASA Astrophysics Data System (ADS)

    Quercigh, Emanuele; Šándor, Ladislav

    2008-04-01

    Maurice Jacob passed away on 2 May 2007. With his death, we have lost one of the founding fathers of the ultra-relativistic heavy ion programme. His interest in high-energy nuclear physics started in 1981 when alpha-alpha collisions could first be studied in the CERN ISR. An enthusiastic supporter of ion beam experiments at CERN, Maurice was at the origin of the 1982 Quark Matter meeting in Bielefeld [1] which brought together more than 100 participants from both sides of the Atlantic, showing a good enthusiastic constituency for such research. There were twice as many the following year at Brookhaven. Finally in the mid-eighties, a heavy ion programme was approved both at CERN and at Brookhaven involving as many nuclear as particle physicists. It was the start of a fruitful interdisciplinary collaboration which is nowadays continuing both at RHIC and at the LHC. Maurice followed actively the development of this field, reporting at a number of conferences and meetings (Les Arcs, Bielefeld, Beijing, Brookhaven, Lenox, Singapore, Taormina, ...). This activity culminated in 2000, when Maurice, together with Ulrich Heinz, summarized the main results of the CERN SPS heavy-ion experiments and the evidence obtained for a new state of matter [2]. Maurice was a brilliant theoretical physicist. His many contributions have been summarized in a recent article in the CERN Courier by two leading CERN theorists, John Ellis and Andre Martin [3]. The following is an excerpt from their article: `He began his research career at Saclay and, while still a PhD student, he continued brilliantly during a stay at Brookhaven. It was there in 1959 that Maurice, together with Giancarlo Wick, developed the helicity amplitude formalism that is the basis of many modern theoretical calculations. Maurice obtained his PhD in 1961 and, after a stay at Caltech, returned to Saclay. A second American foray was to SLAC, where he and Sam Berman made the crucial observation that the point-like structures (partons) seen in deep-inelastic scattering implied the existence of high-transverse-momentum processes in proton-proton collisions, as the ISR at CERN subsequently discovered. In 1967 Maurice joined CERN, where he remained, apart from influential visits to Yale, Fermilab and elsewhere, until his retirement in 1998. He became one of the most respected international experts on the phenomenology of strong interactions, including diffraction, scaling, high-transverse-momentum processes and the formation of quark-gluon plasma. In particular, he pioneered the studies of inclusive hadron-production processes, including scaling and its violations. Also, working with Ron Horgan, he made detailed predictions for the production of jets at CERN's proton-antiproton collider. The UA2 and UA1 experiments subsequently discovered these. He was also interested in electron-positron colliders, making pioneering calculations, together with Tai Wu, of radiation in high-energy collisions. Maurice was one of the scientific pillars of CERN, working closely with experimental colleagues in predicting and interpreting results from successive CERN colliders. He was indefatigable in organizing regular meetings on ISR physics, bringing together theorists and experimentalists to debate the meaning of new results and propose new measurements. He was one of the strongest advocates of Carlo Rubbia's proposal for a proton-antiproton collider at CERN, and was influential in preparing and advertising its physics. In 1978 he organized the Les Houches workshop that brought the LEP project to the attention of the wider European particle physics community. He also organized the ECFA workshop at Lausanne in 1984 that made the first exploration of the possible physics of the LHC. It is a tragedy that Maurice has not lived to enjoy data from the LHC.' References [1] Maurice Jacob and Helmut Satz (eds) 1982 Proc. Workshop on Quark Matter Formation and Heavy Ion Collisions, Bielefeld, 10-14 May 1982 (Singapore: World Scientific Publishing) [2] Heinz Ulrich W and Jacob Maurice 2000 Evidence for a new state of matter: An assessment of the results from the CERN lead beam program. Preprint nucl-th/0002042 [3] Ellis J and Martin A 2007 CERN Courier 47 issue 6

  11. High altitude gust acceleration environment as experienced by a supersonic airplane

    NASA Technical Reports Server (NTRS)

    Ehernberger, L. J.; Love, B. J.

    1975-01-01

    High altitude turbulence experienced at supersonic speeds is described in terms of gust accelerations measured on the YF-12A airplane. The data were obtained during 90 flights at altitudes above 12.2 kilometers (40,000 feet). Subjective turbulence intensity ratings were obtained from air crew members. The air crew often rated given gust accelerations as being more intense during high altitude supersonic flight than during low altitude subsonic flight. The portion of flight distance in turbulence ranged from 6 percent to 8 percent at altitudes between 12.2 kilometers and 16.8 kilometers (40,000 feet and 55,000 feet) to less than 1 percent at altitudes above 18.3 kilometers (60,000 feet). The amount of turbulence varied with season, increasing by a factor of 3 or more from summer to winter. Given values of gust acceleration were less frequent, on the basis of distance traveled, for supersonic flight of the YF-12A airplane at altitudes above 12.2 kilometers (40,000 feet) than for subsonic flight of a jet passenger airplane at altitudes below 12.2 kilometers (40,000 feet). The median thickness of high altitude turbulence patches was less than 400 meters (1300 feet); the median length was less than 16 kilometers (10 miles). The distribution of the patch dimensions tended to be log normal.

  12. CERN automatic audio-conference service

    NASA Astrophysics Data System (ADS)

    Sierra Moral, Rodrigo

    2010-04-01

    Scientists from all over the world need to collaborate with CERN on a daily basis. They must be able to communicate effectively on their joint projects at any time; as a result, telephone conferences have become indispensable and widely used. Managed by six operators, the service already handles more than 20,000 hours and 5,700 audio-conferences per year at CERN. However, the traditional telephone-based audio-conference system needed to be modernized in three ways: firstly, to provide the participants with more autonomy in the organization of their conferences; secondly, to eliminate the constraints of manual intervention by operators; and thirdly, to integrate the audio-conferences into a collaborative working framework. The large number, and hence cost, of the conferences prohibited externalization, and so the CERN telecommunications team drew up a specification to implement a new system. It was decided to use a new commercial collaborative audio-conference solution based on the SIP protocol. The system was tested as the first European pilot and several improvements (such as billing, security, redundancy...) were implemented based on CERN's recommendations. The new automatic conference system has been operational since the second half of 2006. It is very popular with users and the number of conferences has doubled in the past two years.

  13. Memorial W.Gentner

    ScienceCinema

    None

    2018-05-25

    The DG H. Schopper gives an introduction for the commemoration and ceremony of the life and work of Professor Wolfgang Gentner. W. Gentner, a German physicist who was born in 1906 in Frankfurt and died in September 1980 in Heidelberg, was director of CERN from 1955 to 1960, president of the Scientific Policy Committee from 1968 to 1971 and president of the Council of CERN from 1972 to 1974. He was one of the founders of CERN, and four people who knew him well pay tribute to him, among them one of his students, as well as J.B. Adams and O. Sheffard.

  15. OPERA - First Beam Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakamura, M.

    2008-02-21

    OPERA is a long-baseline neutrino oscillation experiment designed to detect tau-neutrino appearance and to prove that the origin of the atmospheric muon neutrino deficit observed by Kamiokande is neutrino oscillation. A hybrid emulsion detector, whose weight is about 1.3 kton, has been installed in the Gran Sasso laboratory. A new muon neutrino beam line, CNGS, has been constructed at CERN to send neutrinos to Gran Sasso, 730 km away from CERN. In 2006, the first neutrinos were sent from CERN to LNGS and were successfully detected by the OPERA detector as planned.

  16. Application of accelerated failure time models for breast cancer patients' survival in Kurdistan Province of Iran.

    PubMed

    Karimi, Asrin; Delpisheh, Ali; Sayehmiri, Kourosh

    2016-01-01

    Breast cancer is the most common cancer and the second most common cause of cancer-induced mortality in Iranian women. There has been rapid development in hazard models and survival analysis in the last decade. The aim of this study was to evaluate the prognostic factors of overall survival (OS) in breast cancer patients using accelerated failure time (AFT) models. This was a retrospective analytic cohort study. A total of 313 women with a pathologically proven diagnosis of breast cancer who had been treated during a 7-year period (from January 2006 until March 2014) in Sanandaj City, Kurdistan Province of Iran, were recruited. Performance among the AFT models was assessed using goodness-of-fit methods. Discrimination among the exponential, Weibull, generalized gamma, log-logistic, and log-normal distributions was done using the Akaike information criterion and maximum likelihood. The 5-year OS was 75% (95% CI = 74.57-75.43). The main results in terms of survival were found for the different categories of the clinical stage covariate, tumor metastasis, and relapse of cancer. Survival times in breast cancer patients without tumor metastasis and without relapse were 4-fold and 2-fold longer, respectively, than in patients with metastasis and relapse. One of the most important undermining prognostic factors in breast cancer is metastasis; hence, knowledge of the mechanisms of metastasis is necessary to prevent its occurrence, to treat metastatic breast cancer, and ultimately to extend the lifetime of patients.
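
    The model comparison described above (AFT fits discriminated by maximum likelihood and the Akaike information criterion) can be illustrated with a small, self-contained sketch. The code below fits log-normal and Weibull AFT models to synthetic right-censored data with SciPy and compares them by AIC; the data, covariate and starting values are invented and are not the Kurdistan cohort.

      import numpy as np
      from scipy import optimize, stats

      rng = np.random.default_rng(0)

      # Synthetic right-censored survival data (illustrative only, not the study's cohort).
      n = 300
      x = rng.binomial(1, 0.4, n)                                    # e.g. metastasis yes/no
      t_true = np.exp(3.0 - 1.0 * x + 0.7 * rng.standard_normal(n))  # log-normal AFT truth
      c = rng.uniform(5, 80, n)                                      # censoring times
      time = np.minimum(t_true, c)
      event = (t_true <= c).astype(float)                            # 1 = event, 0 = censored

      def neg_loglik(params, dist):
          """Negative log-likelihood of an AFT model: log T = b0 + b1*x + sigma*eps."""
          b0, b1, log_sigma = params
          sigma = np.exp(log_sigma)
          z = (np.log(time) - (b0 + b1 * x)) / sigma
          if dist == "lognormal":                  # eps ~ N(0, 1)
              logf = stats.norm.logpdf(z) - np.log(sigma * time)
              logS = stats.norm.logsf(z)
          else:                                    # "weibull": eps ~ standard Gumbel (minimum)
              logf = z - np.exp(z) - np.log(sigma * time)
              logS = -np.exp(z)
          return -np.sum(event * logf + (1 - event) * logS)

      for dist in ("lognormal", "weibull"):
          res = optimize.minimize(neg_loglik, x0=[3.0, 0.0, 0.0], args=(dist,),
                                  method="Nelder-Mead")
          aic = 2 * len(res.x) + 2 * res.fun       # AIC = 2k - 2 log L
          print(f"{dist:10s}  AIC = {aic:8.1f}  time ratio exp(b1) = {np.exp(res.x[1]):.2f}")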

  17. Membership Finland

    ScienceCinema

    None

    2018-05-18

    The DG C. Rubbia and the vice-president of the CERN Council give a warm welcome to Finland, the 15th Member State of CERN as of 1 January 1991, in the presence of the Secretary-General and the ambassador.

  18. Visit CD

    ScienceCinema

    None

    2017-12-09

    The DG H. Schopper welcomes the ambassadors of the Member States and the representatives of the countries with which CERN maintains close relations, and gives a presentation on the activities at CERN.

  19. Terbium Radionuclides for Theranostics Applications: A Focus On MEDICIS-PROMED

    NASA Astrophysics Data System (ADS)

    Cavaier, R. Formento; Haddad, F.; Sounalet, T.; Stora, T.; Zahi, I.

    A new facility, named CERN-MEDICIS, is under construction at CERN to produce radionuclides for medical applications. In parallel, MEDICIS-PROMED, a Marie Sklodowska-Curie innovative training network of the European Commission's Horizon 2020 programme, is being coordinated by CERN to train young scientists in the production and use of innovative radionuclides and to develop a network of experts within Europe. One programme within MEDICIS-PROMED is to determine the feasibility of producing innovative radioisotopes for theranostics using a commercial middle-sized high-current cyclotron and the mass separation technology developed at CERN-MEDICIS. This will allow the production of high specific activity radioisotopes not achievable with the common post-processing by chemical separation. Radioisotopes of scandium, copper, arsenic and terbium have been identified. Preliminary studies of activation yield and of irradiation parameter optimization for the production of Tb-149 will be described.

  20. Cryogenic Control System Migration and Developments towards the UNICOS CERN Standard at INFN

    NASA Astrophysics Data System (ADS)

    Modanese, Paolo; Calore, Andrea; Contran, Tiziano; Friso, Alessandro; Pengo, Marco; Canella, Stefania; Burioli, Sergio; Gallese, Benedetto; Inglese, Vitaliano; Pezzetti, Marco; Pengo, Ruggero

    The cryogenic control systems at the Laboratori Nazionali di Legnaro (LNL) are undergoing an important and radical modernization, allowing all the plants' control and supervision systems to be renewed in a homogeneous way towards the CERN UNICOS standard. Before the UNICOS migration project started there were as many as 7 different types of PLC and 7 different types of SCADA, each one requiring its own particular programming language. Under these conditions, even a simple modification or integration of the program or of the supervision layer required the intervention of a system integrator company specialized in that specific control system. Furthermore, it implied that the operators had to be trained on the different types of control systems. The CERN UNICOS framework, developed for the LHC [1], has been chosen for its reliability and because it is planned to run and be maintained for decades to come. The complete migration is part of an agreement between CERN and INFN.

  1. Multilevel mixed effects parametric survival models using adaptive Gauss-Hermite quadrature with application to recurrent events and individual participant data meta-analysis.

    PubMed

    Crowther, Michael J; Look, Maxime P; Riley, Richard D

    2014-09-28

    Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log-logistic, log-normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
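
    The key numerical device here is Gauss-Hermite quadrature for integrating the normally distributed random effects out of each cluster's likelihood. The sketch below is a minimal, non-adaptive illustration for a single cluster of an exponential proportional hazards model with a random intercept; the toy data and parameter values are invented, and the adaptive variant would additionally recentre and rescale the nodes at each cluster's mode.

      import numpy as np

      # Gauss-Hermite nodes/weights for the physicists' weight exp(-x^2).
      nodes, weights = np.polynomial.hermite.hermgauss(15)

      # One cluster of an exponential PH model with a normal random intercept b ~ N(0, sigma^2):
      # hazard for subject j is lambda_j = exp(beta0 + b).  Toy data and parameters below.
      t = np.array([2.3, 0.7, 5.1, 1.9])      # follow-up times
      d = np.array([1,   1,   0,   1  ])      # event indicators (0 = censored)
      beta0, sigma = -1.0, 0.5                # assumed fixed intercept and random-effect SD

      def cluster_lik(b):
          """Conditional likelihood of the cluster given the random intercept b."""
          lam = np.exp(beta0 + b)
          return np.exp(np.sum(d * np.log(lam) - lam * t))

      # int cluster_lik(b) N(b; 0, sigma^2) db ~= (1/sqrt(pi)) sum_i w_i cluster_lik(sqrt(2)*sigma*x_i)
      marginal = sum(w * cluster_lik(np.sqrt(2.0) * sigma * x)
                     for w, x in zip(weights, nodes)) / np.sqrt(np.pi)
      print(f"marginal cluster likelihood ~= {marginal:.6f}")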

  2. Web-of-Objects (WoO)-Based Context Aware Emergency Fire Management Systems for the Internet of Things

    PubMed Central

    Shamszaman, Zia Ush; Ara, Safina Showkat; Chong, Ilyoung; Jeong, Youn Kwae

    2014-01-01

    Recent advancements in the Internet of Things (IoT) and the Web of Things (WoT) accompany a smart life where real world objects, including sensing devices, are interconnected with each other. The Web representation of smart objects empowers innovative applications and services for various domains. To accelerate this approach, Web of Objects (WoO) focuses on the implementation aspects of bringing the assorted real world objects to the Web applications. In this paper, we propose an emergency fire management system in the WoO infrastructure. Consequently, we integrate the formation and management of Virtual Objects (ViO) which are derived from real world physical objects and are virtually connected with each other into the semantic ontology model. The charm of using the semantic ontology is that it allows information reusability, extensibility and interoperability, which enable ViOs to uphold orchestration, federation, collaboration and harmonization. Our system is context aware, as it receives contextual environmental information from distributed sensors and detects emergency situations. To handle a fire emergency, we present a decision support tool for the emergency fire management team. The previous fire incident log is the basis of the decision support system. A log repository collects all the emergency fire incident logs from ViOs and stores them in a repository. PMID:24531299

  4. PREFACE: Lectures from the CERN Winter School on Strings, Supergravity and Gauge Theories, CERN, 9-13 February 2009

    NASA Astrophysics Data System (ADS)

    Uranga, A. M.

    2009-11-01

    This special section is devoted to the proceedings of the conference `Winter School on Strings, Supergravity and Gauge Theories', which took place at CERN, the European Centre for Nuclear Research, in Geneva, Switzerland 9-13 February 2009. This event is part of a yearly series of scientific schools, which represents a well established tradition. Previous events have been held at SISSA, in Trieste, Italy, in February 2005 and at CERN in January 2006, January 2007 and January 2008, and were funded by the European Mobility Research and Training Network `Constituents, Fundamental Forces and Symmetries of the Universe'. The next event will take place again at CERN, in January 2010. The school was primarily meant for young doctoral students and postdoctoral researchers working in the area of string theory. It consisted of several general lectures of four hours each, whose notes are published in this special section, and six working group discussion sessions, focused on specific topics of the network research program. It was well attended by over 200 participants. The topics of the lectures were chosen to provide an introduction to some of the areas of recent progress, and to the open problems, in string theory. One of the most active areas in string theory in recent years has been the AdS/CFT or gauge/gravity correspondence, which proposes the complete equivalence of string theory on (asymptotically) anti de Sitter spacetimes with certain quantum (gauge) field theories. The duality has recently been applied to understanding the hydrodynamical properties of a hot plasma in gauge theories (like the quark-gluon plasma created in heavy ion collisions at the RHIC experiment at Brookhaven, and soon at the LHC at CERN) in terms of a dual gravitational AdS theory in the presence of a black hole. These developments were reviewed in the lecture notes by M Rangamani. In addition, the AdS/CFT duality has been proposed as a tool to study interesting physical properties in other physical systems described by quantum field theory, for instance in the context of a condensed matter system. The lectures by S Hartnoll provided an introduction to this recent development with an emphasis on the dual holographic description of superconductivity. Finally, ideas inspired by the AdS/CFT correspondence are yielding deep insights into fundamental questions of quantum gravity, like the entropy of black holes and its interpretation in terms of microstates. The lectures by S Mathur reviewed the black hole entropy and information paradox, and the proposal for its resolution in terms of `fuzzball' microstates. Further sets of lectures, not included in this special section, by F Zwirner and V Mukhanov, covered phenomenological aspects of high energy physics beyond the Standard Model and of cosmology. The coming experimental data in these two fields are expected to foster new developments in connecting string theory to the real world. The conference was financially supported by CERN and partially by the Arnold Sommerfeld Center for Theoretical Physics of the Ludwig Maximilians University of Munich. It is a great pleasure for us to warmly thank the Theory Unit of CERN for its very kind hospitality and for the high quality of the assistance and the infrastructures that it has provided. A M Uranga CERN, Switzerland Guest Editor

  5. Helix Nebula and CERN: A Symbiotic approach to exploiting commercial clouds

    NASA Astrophysics Data System (ADS)

    Barreiro Megino, Fernando H.; Jones, Robert; Kucharczyk, Katarzyna; Medrano Llamas, Ramón; van der Ster, Daniel

    2014-06-01

    The recent paradigm shift toward cloud computing in IT, and general interest in "Big Data" in particular, have demonstrated that the computing requirements of HEP are no longer globally unique. Indeed, the CERN IT department and the LHC experiments have already made significant R&D investments in delivering and exploiting cloud computing resources. While a number of technical evaluations of interesting commercial offerings from global IT enterprises have been performed by various physics labs, further technical, security, sociological, and legal issues need to be addressed before their large-scale adoption by the research community can be envisaged. Helix Nebula - the Science Cloud is an initiative that explores these questions by joining the forces of three European research institutes (CERN, ESA and EMBL) with leading European commercial IT enterprises. The goals of Helix Nebula are to establish a cloud platform federating multiple commercial cloud providers, along with new business models, which can sustain the cloud marketplace for years to come. This contribution will summarize the participation of CERN in Helix Nebula. We will explain CERN's flagship use case and the model used to integrate several cloud providers with an LHC experiment's workload management system. During the first proof of concept, this project contributed over 40,000 CPU-days of Monte Carlo production throughput to the ATLAS experiment with marginal manpower required. CERN's experience, together with that of ESA and EMBL, is providing great insight into the cloud computing industry and has highlighted several challenges that are being tackled in order to ease the export of scientific workloads to cloud environments.

  6. The response of a bonner sphere spectrometer to charged hadrons.

    PubMed

    Agosteo, S; Dimovasili, E; Fassò, A; Silari, M

    2004-01-01

    Bonner sphere spectrometers (BSSs) have been employed in neutron spectrometry and dosimetry for many years. Recent developments have seen the addition to a conventional BSS of one or more detectors (moderator plus thermal neutron counter) specifically designed to improve the overall response of the spectrometer to neutrons above 10 MeV. These additional detectors employ a shell of material with a high mass number (such as lead) within the polyethylene moderator, in order to slow down high-energy neutrons via (n,xn) reactions. A BSS can be used to measure neutron spectra both outside accelerator shielding and from an unshielded target. Measurements were recently performed at CERN of the neutron yield and spectral fluence at various angles from unshielded, semi-thick copper, silver and lead targets, bombarded by a mixed proton/pion beam with a momentum of 40 GeV/c. These experiments have provided evidence that under certain circumstances the use of lead-enriched moderators may present a problem: these detectors were found to have a significant response to the charged hadron component accompanying the neutrons emitted from the target. Conventional polyethylene moderators show a similar, but less pronounced, behaviour. These secondary hadrons interact with the moderator and generate neutrons, which are in turn detected by the counter. To investigate this effect and determine a correction factor to be applied to the unfolding procedure, a series of Monte Carlo simulations were performed with the FLUKA code. These simulations aimed at determining the response of the BSS to charged hadrons under the specific experimental conditions. Following these results, a complete response matrix of the extended BSS to charged pions and protons was calculated with FLUKA. An experimental verification was carried out with a 120 GeV/c hadron beam at the CERF facility at CERN.
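
    Schematically, a Bonner sphere measurement folds the spectral fluence with each detector's response function, c_i = sum_j R_ij * Phi_j, and the charged-hadron effect described above adds an extra term to the same counts. The toy sketch below illustrates the folding only; the response matrices and spectra are invented placeholders, not the FLUKA-calculated ones.

      import numpy as np

      rng = np.random.default_rng(3)

      # Toy folding of a neutron spectrum (plus a charged-hadron contamination term)
      # with Bonner-sphere response matrices: counts_i = sum_j R[i, j] * fluence[j].
      # All matrices and spectra below are invented placeholders.
      n_spheres, n_groups = 8, 20
      R_neutron = np.abs(rng.normal(1.0, 0.3, (n_spheres, n_groups)))
      R_hadron  = 0.1 * np.abs(rng.normal(1.0, 0.3, (n_spheres, n_groups)))
      phi_neutron = np.abs(rng.normal(1.0, 0.5, n_groups))
      phi_hadron  = 0.2 * np.abs(rng.normal(1.0, 0.5, n_groups))

      counts = R_neutron @ phi_neutron + R_hadron @ phi_hadron   # what the counters actually see
      correction = R_hadron @ phi_hadron                          # term to subtract before unfolding
      print("counts per sphere:    ", np.round(counts, 2))
      print("hadron-induced counts:", np.round(correction, 2))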

  7. Formation of a uniform ion beam using octupole magnets for BioLEIR facility at CERN

    NASA Astrophysics Data System (ADS)

    Amin, T.; Barlow, R.; Ghithan, S.; Roy, G.; Schuh, S.

    2018-04-01

    The possibility to transform the Low Energy Ion Ring (LEIR) accelerator at CERN into a multidisciplinary biomedical research facility (BioLEIR) was investigated following a request from the biomedical community. BioLEIR aims to provide a unique facility with a range of fully stripped ion beams (e.g. He, Li, Be, B, C, N, O) and energies suitable for multidisciplinary, clinically oriented biomedical research. Two horizontal and one vertical beam transport lines have been designed for transporting the extracted beam from LEIR to three experimental end-stations. The vertical beamline was designed for a maximum energy of 75 MeV/u, while the two horizontal beamlines shall deliver up to a maximum energy of 440 MeV/u. A pencil beam of 4.3 mm FWHM (full width at half maximum) as well as a homogeneous broad beam of 40 × 40 mm2, with a beam homogeneity better than ±4%, are available at the first horizontal (H1) irradiation point, while only a pencil beam is available at the second horizontal (H2) and vertical (V) irradiation points. The H1 irradiation point shall be used to conduct systematic studies of the radiation effect of different ion species on cell lines. The H1 beamline was designed to utilize two octupole magnets which transform the Gaussian beam distribution at the target location into an approximately uniformly distributed rectangular beam. In this paper, we report on the multi-particle tracking calculations performed using the MAD-X software suite for the H1 beam optics to arrive at a homogeneous broad beam on target using nonlinear focusing techniques, and on those performed to create a Gaussian pencil beam on target by adjusting quadrupole strengths and positions.
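
    The flattening mechanism mentioned above (an octupole's cubic kick folding the Gaussian tails back onto the core after a drift) can be illustrated with a one-dimensional toy tracking sketch. This is not the BioLEIR optics or a MAD-X calculation; the beam sizes, drift length and octupole strengths are invented, and in practice the kick strength would be scanned to find the flattest profile.

      import numpy as np

      rng = np.random.default_rng(1)

      # Toy 1-D illustration of octupole-based beam flattening: a thin octupole applies a
      # cubic angular kick, and for a suitable kick strength the downstream drift folds the
      # Gaussian tails back onto the core, giving a flatter profile at the target.
      n = 200_000
      sigma_x, sigma_xp = 2.0e-3, 0.1e-3          # assumed rms size (m) and divergence (rad)
      x  = rng.normal(0.0, sigma_x,  n)
      xp = rng.normal(0.0, sigma_xp, n)
      L = 5.0                                      # assumed drift length to the target (m)

      def profile_at_target(k3):
          """Thin octupole kick x' -> x' - k3*x**3, followed by a drift of length L."""
          return x + L * (xp - k3 * x**3)

      for k3 in (0.0, 2e3, 5e3):                   # purely illustrative strengths (m^-3)
          xf = profile_at_target(k3)
          # Negative excess kurtosis signals a flatter-than-Gaussian (more uniform) profile.
          kurt = np.mean(xf**4) / np.mean(xf**2) ** 2 - 3.0
          core = np.abs(xf) < 2 * sigma_x
          print(f"k3 = {k3:6.0f} m^-3  rms = {xf.std()*1e3:5.2f} mm  "
                f"excess kurtosis = {kurt:+.2f}  fraction within 2*sigma_x = {core.mean():.2f}")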

  8. Reaching record-low β* at the CERN Large Hadron Collider using a novel scheme of collimator settings and optics

    NASA Astrophysics Data System (ADS)

    Bruce, R.; Bracco, C.; De Maria, R.; Giovannozzi, M.; Mereghetti, A.; Mirarchi, D.; Redaelli, S.; Quaranta, E.; Salvachua, B.

    2017-03-01

    The Large Hadron Collider (LHC) at CERN is built to collide intense proton beams with an unprecedented energy of 7 TeV. The design stored energy per beam of 362 MJ makes the LHC beams highly destructive, so that any beam loss risks causing quenches of superconducting magnets or damage to accelerator components. Collimators are installed to protect the machine and they define a minimum normalized aperture, below which no other element is allowed. This imposes a limit on the achievable luminosity, since when squeezing β* (the β-function at the collision point) to smaller values for increased luminosity, the β-function in the final focusing system increases. This leads to a smaller normalized aperture that risks going below the allowed collimation aperture. In the first run of the LHC, this was the main limitation on β*, which was constrained to values above the design specification. In this article, we show through theoretical and experimental studies how tighter collimator openings and a new optics with specific phase-advance constraints allow a β* as small as 40 cm, a factor of 2 smaller than the β*=80 cm used in 2015 and significantly below the design value β*=55 cm, in spite of a lower beam energy. The proposed configuration with β*=40 cm has been successfully put into operation and has been used throughout 2016 as the LHC baseline. The decrease in β* compared to 2015 has been an essential contribution to reaching and surpassing, in 2016, the LHC design luminosity for the first time, and to accumulating a record-high integrated luminosity of around 40 fb-1 in one year, in spite of using fewer bunches than in the design.
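
    The aperture argument above follows from the drift-space relation β(s) = β* + s²/β*: the smaller β* is at the collision point, the larger the β-function, and hence the beam size σ = sqrt(β ε_N / γ_r), becomes in the final-focus magnets. The sketch below evaluates this scaling for the three β* values mentioned in the abstract; the normalized emittance and the distance to the first quadrupole are assumed, illustrative numbers rather than the paper's values.

      import numpy as np

      # How squeezing beta* inflates the beta function (and beam size) in the final focus.
      # Assumed, illustrative inputs:
      E_BEAM_GeV = 6500.0
      GAMMA_R    = E_BEAM_GeV / 0.938272            # relativistic gamma of the protons
      EPS_N      = 3.5e-6                           # assumed normalized emittance (m rad)
      L_STAR     = 23.0                             # assumed IP-to-first-quadrupole distance (m)

      for beta_star in (0.80, 0.55, 0.40):          # 2015, design and 2016 beta* values (m)
          beta_ff = beta_star + L_STAR**2 / beta_star        # beta at the final-focus entrance
          sigma   = np.sqrt(beta_ff * EPS_N / GAMMA_R)       # rms beam size there
          print(f"beta* = {beta_star*100:3.0f} cm -> beta_ff ~ {beta_ff:6.0f} m, "
                f"sigma ~ {sigma*1e3:.2f} mm")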

  9. Simulations and measurements of beam loss patterns at the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Bruce, R.; Assmann, R. W.; Boccone, V.; Bracco, C.; Brugger, M.; Cauchi, M.; Cerutti, F.; Deboy, D.; Ferrari, A.; Lari, L.; Marsili, A.; Mereghetti, A.; Mirarchi, D.; Quaranta, E.; Redaelli, S.; Robert-Demolaize, G.; Rossi, A.; Salvachua, B.; Skordis, E.; Tambasco, C.; Valentino, G.; Weiler, T.; Vlachoudis, V.; Wollmann, D.

    2014-08-01

    The CERN Large Hadron Collider (LHC) is designed to collide proton beams of unprecedented energy, in order to extend the frontiers of high-energy particle physics. During the first very successful running period in 2010-2013, the LHC was routinely storing protons at 3.5-4 TeV with a total beam energy of up to 146 MJ, and even higher stored energies are foreseen in the future. This puts extraordinary demands on the control of beam losses. An uncontrolled loss of even a tiny fraction of the beam could cause a superconducting magnet to undergo a transition into a normal-conducting state, or in the worst case cause material damage. Hence a multistage collimation system has been installed in order to safely intercept high-amplitude beam protons before they are lost elsewhere. To guarantee adequate protection from the collimators, a detailed theoretical understanding is needed. This article presents results of numerical simulations of the distribution of beam losses around the LHC that have leaked out of the collimation system. The studies include tracking of protons through the fields of more than 5000 magnets in the 27 km LHC ring over hundreds of revolutions, and Monte Carlo simulations of particle-matter interactions both in collimators and machine elements being hit by escaping particles. The simulation results agree typically within a factor 2 with measurements of beam loss distributions from the previous LHC run. Considering the complex simulation, which must account for a very large number of unknown imperfections, and in view of the total losses around the ring spanning over 7 orders of magnitude, we consider this an excellent agreement. Our results give confidence in the simulation tools, which are used also for the design of future accelerators.

  10. Development and Use of Mark Sense Record Cards for Recording Medical Data on Pilots Subjected to Acceleration Stress

    NASA Technical Reports Server (NTRS)

    Smedal, Harald A.; Havill, C. Dewey

    1962-01-01

    A time-honored system of recording medical histories and the data obtained on physical and laboratory examination has been that of writing the information on record sheets that go into a folder for each patient. In order to have information which would be more readily retrieved, a program was initiated in 1952 by the U.S. Naval School of Aviation Medicine in connection with their "Care of the Flyer" study to place this information on machine record cards. In 1958, a machine record card method was developed for recording medical data in connection with the astronaut selection program. Machine record cards were also developed by the Aero Medical Laboratory, Wright-Patterson AFB, Ohio, and the Aviation Medical Acceleration Laboratory, Naval Air Development Center, Johnsville, Pennsylvania, for use in connection with a variety of tests including acceleration stress [1]. Therefore, a variety of systems resulted in which data of a medical nature could easily be recalled. During the NASA Ames Research Center centrifuge studies, the pilot subjects were interviewed after each centrifuge run, or series of runs, and subjective information was recorded in a log book by the usual history-taking methods referred to above. After the methods were reviewed, it was recognized that a card system would be very useful in recording data from our pilots after they had been exposed to acceleration stress. Since the acceleration stress cards already developed did not meet our requirements, it was decided a different card was needed.

  11. Offering Global Collaboration Services beyond CERN and HEP

    NASA Astrophysics Data System (ADS)

    Fernandes, J.; Ferreira, P.; Baron, T.

    2015-12-01

    The CERN IT department has built over the years a performant and integrated ecosystem of collaboration tools, from videoconference and webcast services to event management software. These services have been designed and evolved in very close collaboration with the various communities surrounding the laboratory and have been massively adopted by CERN users. To cope with this very heavy usage, global infrastructures have been deployed which take full advantage of CERN's international and global nature. While these services and tools are instrumental in enabling the worldwide collaboration which generates major HEP breakthroughs, they would certainly also benefit other sectors of science in which globalization has already taken place. Some of these services are driven by commercial software (Vidyo or Wowza for example), some others have been developed internally and have already been made available to the world as Open Source Software in line with CERN's spirit and mission. Indico, for example, is now installed in 100+ institutes worldwide. But providing the software is often not enough, and institutes, collaborations and project teams do not always possess the expertise, or the human or material resources, that are needed to set up and maintain such services. Regional and national institutions have to answer needs which are increasingly global and often contradict their operational capabilities or organizational mandate, and so are looking at existing worldwide service offers such as CERN's. We believe that the accumulated experience obtained through the operation of a large-scale worldwide collaboration service, combined with CERN's global network and its recently deployed Agile Infrastructure, would allow the Organization to set up and operate collaborative services, such as Indico and Vidyo, at a much larger scale and on behalf of worldwide research and education institutions, and thus answer these pressing demands while optimizing resources at a global level. Such services would be built over a robust and massively scalable Indico server, to which the concept of communities would be added, and which would then serve as a hub for accessing other collaboration services such as Vidyo, on the same simple and successful model currently in place for CERN users. This talk will describe this vision, its benefits and the steps that have already been taken to make it come to life.

  12. 2 MeV linear accelerator for industrial applications

    NASA Astrophysics Data System (ADS)

    Smith, Richard R.; Farrell, Sherman R.

    1997-02-01

    RPC Industries has developed a high average power scanned electron beam linac system for medium energy industrial processing, such as in-line sterilization. The parameters are: electron energy 2 MeV; average beam current 5.0 mA; and scanned width 0.5 meters. The control system features data logging and a Man-Machine Interface system. The accelerator is vertically mounted, the system height above the floor is 3.4 m, and the footprint is 0.9 × 1.2 m2. The typical processing cell inside dimensions are 3.0 m by 3.5 m by 4.2 m high, with concrete side walls 0.5 m thick above ground level. The equal exit depth dose is 0.73 g/cm2. Additional topics that will be reported are: throughput, measurements of dose vs. depth, dose uniformity across the web, and beam power by calorimeter and magnetic deflection of the beam.

  13. The sources of inspiration in research on position-sensitive detectors

    NASA Astrophysics Data System (ADS)

    Charpak, G.

    1988-12-01

    The high-energy experimental physicist is constantly confronted with the problem of identifying and localizing particles, charged or neutral. The community of high-energy physicists has thus produced a variety of original methods which have found, or are beginning to find, applications in many fields that are remote from this discipline. New hadron accelerators which are foreseen for the year 2000 raise formidable problems. To take an extreme case, beams crossing at 5 ns intervals are being considered, with several interactions per crossing and with collision multiplicities close to 100. Should a high-energy experimental physicist who is interested in research on particle detectors limit his horizon to these questions? Even if most of his effort is legitimately concentrated on solving the specific problems encountered with the projected accelerators, it would be a mistake for him to limit his activity to reaching only this goal. In many fields there is considerable demand for improvement in the methods of radiation imaging. I will list some of them, and illustrate my point, namely that contributing to this field is both fruitful and cross-fertilizing, with examples from the activity of our own group at CERN. I apologize for not doing justice to the many other efforts made in the same direction by other groups or laboratories, but the proceedings of this conference will already be illuminating in this respect.

  14. Design considerations of a power supply system for fast cycling superconducting accelerator magnets of 2 Tesla b-field generated by a conductor of 100 kA current

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hays, Steve; Piekarz, Henryk; Pfeffer, Howie

    2007-06-01

    Recently proposed fast cycling accelerators for proton drivers (SF-SPS at CERN; SF-MR and SF-Booster at FNAL) for neutrino sources require the development of new magnet technology. In support of this magnet development, a power supply system will need to be developed that can support the high current and high rate of power swing required by the fast cycling (1 s rise and fall in the SF-MR, 5 Hz in the Booster). This paper will outline a design concept for a +/- 2000 V, 100,000 A fast ramping power supply system. This power supply design is in support of a 6.44 km magnet system with a 0.020 H load and a 330 m, 5 Hz, 0.00534 H superconducting load. The design description will include the layout and plan for extending the present FNAL Main Injector style ramping power supply to the higher currents needed for this operation. This will also include the design for a harmonic filter and power factor corrector that will be needed to control the large power swings caused by the fast cycle time. A conceptual design for the current regulation system and its control will also be outlined. The power circuit design will include the bridge, filter and transformer plan based on existing designs.
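
    The headline +/- 2000 V at 100 kA rating is consistent with the inductive load and ramp time quoted above, since an inductor needs V = L dI/dt. A quick check, assuming a linear one-second current ramp and neglecting resistive and filter drops:

      # Quick consistency check of the quoted rating: an inductive load needs
      # V = L * dI/dt during the ramp (resistive and filter drops neglected).
      L_STRING = 0.020       # H, inductance of the 6.44 km magnet string quoted above
      I_PEAK   = 100_000.0   # A, peak conductor current
      T_RAMP   = 1.0         # s, rise time of the SF-MR cycle quoted above

      v_ramp = L_STRING * I_PEAK / T_RAMP
      print(f"required ramp voltage ~ {v_ramp:.0f} V")   # ~2000 V, matching the +/- 2000 V design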

  15. Electromagnetic nonlinearities in a Roebel-cable-based accelerator magnet prototype: variational approach

    NASA Astrophysics Data System (ADS)

    Ruuskanen, J.; Stenvall, A.; Lahtinen, V.; Pardo, E.

    2017-02-01

    Superconducting magnets are the most expensive series of components produced for the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN). When developing such magnets beyond state-of-the-art technology, one possible option is to use high-temperature superconductors (HTS), which are capable of tolerating much higher magnetic fields than low-temperature superconductors (LTS) while simultaneously carrying high current densities. Significant cost reductions, due to a decreased need to construct prototypes, can be achieved by careful modelling of the magnets. Simulations are used, e.g., for designing magnets that fulfil the field quality requirements in the beampipe, and for ensuring adequate protection by studying the losses occurring during charging and discharging. We model the hysteresis losses and the magnetic field nonlinearity in the beampipe as a function of the magnet's current. These simulations rely on the minimum magnetic energy variation principle, with optimization algorithms provided by the open-source optimization library Interior Point Optimizer. We utilize this methodology to investigate a research and development accelerator magnet prototype made of REBCO Roebel cable. The applicability of this approach, when the magnetic field dependence of the superconductor's critical current density is considered, is discussed. We also scrutinize the influence of the necessary modelling decisions one needs to make with this approach. The results show that different decisions can lead to notably different results, and experiments are required to study the electromagnetic behaviour of such magnets further.
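
    The field dependence of the critical current density mentioned above is often captured by a simple parametrization such as the Kim-like form Jc(B) = Jc0 / (1 + |B|/B0). Whether the paper uses this particular form is not stated here, and the constants below are invented, so the fragment is only an illustration of the kind of material input such models take.

      import numpy as np

      # A Kim-like parametrization of the field-dependent critical current density,
      # Jc(B) = Jc0 / (1 + |B| / B0), often used when modelling HTS coils.
      # The constants below are invented for illustration; the paper's model may differ.
      JC0 = 2.5e10      # A/m^2 at zero field (assumed)
      B0  = 1.0         # T, characteristic field (assumed)

      def jc(b_tesla):
          return JC0 / (1.0 + np.abs(b_tesla) / B0)

      for b in (0.0, 0.5, 2.0, 8.0):
          print(f"B = {b:4.1f} T  ->  Jc = {jc(b):.2e} A/m^2")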

  16. From Particle Physics to Medical Applications

    NASA Astrophysics Data System (ADS)

    Dosanjh, Manjit

    2017-06-01

    CERN is the world's largest particle physics research laboratory. Since it was established in 1954, it has made an outstanding contribution to our understanding of the fundamental particles and their interactions, and also to the technologies needed to analyse their properties and behaviour. The experimental challenges have pushed the performance of particle accelerators and detectors to the limits of our technical capabilities, and these groundbreaking technologies can also have a significant impact in applications beyond particle physics. In particular, the detectors developed for particle physics have led to improved techniques for medical imaging, while accelerator technologies lie at the heart of the irradiation methods that are widely used for treating cancer. Indeed, many important diagnostic and therapeutic techniques used by healthcare professionals are based either on basic physics principles or the technologies developed to carry out physics research. Ever since the discovery of x-rays by Roentgen in 1895, physics has been instrumental in the development of technologies in the biomedical domain, including the use of ionizing radiation for medical imaging and therapy. Some key examples that are explored in detail in this book include scanners based on positron emission tomography, as well as radiation therapy for cancer treatment. Even the collaborative model of particle physics is proving to be effective in catalysing multidisciplinary research for medical applications, ensuring that pioneering physics research is exploited for the benefit of all.

  17. Statistical analysis of variability properties of the Kepler blazar W2R 1926+42

    NASA Astrophysics Data System (ADS)

    Li, Yutong; Hu, Shaoming; Wiita, Paul J.; Gupta, Alok C.

    2018-07-01

    We analysed Kepler light curves of the blazar W2R 1926+42 that provided nearly continuous coverage from quarter 11 to quarter 17 (589 d between 2011 and 2013) and examined some of their flux variability properties. We investigate the possibility that the light curve is dominated by a large number of individual flares and adopt exponential rise and decay models to investigate the symmetry properties of the flares. We found that the variations of W2R 1926+42 are predominantly asymmetric, with weak tendencies towards positive asymmetry (rapid rise and slow decay). The durations (D) and the amplitudes (F0) of the flares can be fit with lognormal distributions. The energy (E) of each flare is also estimated for the first time. There are positive correlations between log D and log E, with a slope of 1.36, and between log F0 and log E, with a slope of 1.12. Lomb-Scargle periodograms are used to estimate the power spectral density shape. It is well described by a power law with an index ranging between -1.1 and -1.5. The sizes of the emission regions, R, are estimated to be in the range 1.1 × 10^15 to 6.6 × 10^16 cm. The flare asymmetry is difficult to explain by a light travel time effect but may be caused by differences between the time-scales for acceleration and dissipation of high-energy particles in the relativistic jet. A jet-in-jet model could also produce the observed lognormal distributions.
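
    The power spectral density slope quoted above comes from Lomb-Scargle periodograms fitted with a power law. The sketch below runs the same kind of procedure on a synthetic, unevenly sampled red-noise light curve with scipy.signal.lombscargle and a log-log straight-line fit; the simulated data are not the Kepler light curve, and a pure random walk should give an index close to -2 rather than the -1.1 to -1.5 measured for the blazar.

      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(2)

      # Synthetic, unevenly sampled red-noise light curve (a Brownian-motion-like series),
      # used only to illustrate the periodogram + log-log power-law fit procedure.
      n = 4000
      t = np.sort(rng.uniform(0.0, 589.0, n))                  # days, mimicking the ~589 d baseline
      flux = np.cumsum(rng.standard_normal(n) * np.sqrt(np.diff(t, prepend=0.0)))
      flux -= flux.mean()

      freqs_cpd = np.logspace(np.log10(1 / 589.0), np.log10(2.0), 300)   # cycles per day
      power = lombscargle(t, flux, 2 * np.pi * freqs_cpd)                # expects angular frequencies

      slope, intercept = np.polyfit(np.log10(freqs_cpd), np.log10(power), 1)
      print(f"fitted PSD power-law index ~ {slope:.2f}")        # roughly -2 for this red-noise input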

  18. Ortho effects in quantitative structure-activity relationships for acetylcholinesterase inhibition by aryl carbamates.

    PubMed

    Lin, Gialih; Liu, Yu-Chen; Lin, Yan-Fu; Wu, Yon-Gi

    2004-10-01

    Ortho-substituted phenyl-N-butyl carbamates (1-9) are characterized as "pseudo-pseudo-substrate" inhibitors of acetylcholinesterase. Since the inhibitors protonate in pH 7.0 buffer solution, the virtual inhibition constants (K'i) of the protonated inhibitors are calculated from the equation -log K'i = -log Ki - log Kb. The logarithms of the inhibition constant (Ki), the carbamylation constant (kc), and the bimolecular inhibition constant (ki) for the enzyme inhibitions by carbamates 1-9 are multiply linearly correlated with the Hammett para-substituent constant (sigma_p), the Taft-Kutter-Hansch ortho steric constant (E_S), and the Swain-Lupton ortho polar constant (F). Values of rho, delta, and f for the -log Ki, log kc, and log ki correlations are -0.6, -0.16, 0.7; 0.11, 0.03, -0.3; and -0.5, -0.12, 0.4, respectively. The Ki step further divides into two steps: 1) the pre-equilibrium protonation of the inhibitors (the Kb step) and 2) the formation of a negatively charged enzyme-inhibitor Michaelis-Menten complex, i.e. virtual inhibition (the K'i step). The Ki step has little ortho steric enhancement effect; moreover, the kc step is insensitive to the ortho steric effect. The f value of 0.7 for the Ki step indicates that ortho electron-withdrawing substituents of the inhibitors accelerate the inhibition reactions through the ortho polar effect; however, the f value of -0.3 for the kc step implies that ortho electron-withdrawing substituents of the inhibitors lessen the inhibition reactions through the ortho polar effect.
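
    The correlations reported above are multiple linear regressions of the form -log Ki = rho*sigma_p + delta*E_S + f*F + constant (and likewise for log kc and log ki). The sketch below shows how such coefficients would be extracted by ordinary least squares; the substituent constants and responses are placeholders for nine hypothetical compounds, not the paper's measured data.

      import numpy as np

      # Multiple linear regression  -log Ki = rho*sigma_p + delta*E_S + f*F + c
      # for nine compounds.  The design matrix and responses below are placeholders.
      sigma_p   = np.array([0.00, 0.23, 0.45, -0.17, 0.06, 0.54, 0.78, -0.27, 0.12])
      E_S       = np.array([0.00, -0.55, -0.97, -1.24, -0.46, -2.78, -0.51, -0.55, -1.01])
      F         = np.array([0.00, 0.42, 0.33, 0.01, 0.29, 0.38, 0.65, 0.26, 0.22])
      neg_logKi = np.array([3.1, 3.5, 3.2, 2.9, 3.3, 3.0, 3.9, 2.8, 3.2])   # placeholder responses

      X = np.column_stack([sigma_p, E_S, F, np.ones_like(sigma_p)])
      coef, *_ = np.linalg.lstsq(X, neg_logKi, rcond=None)
      rho, delta, f, const = coef
      print(f"rho = {rho:+.2f}, delta = {delta:+.2f}, f = {f:+.2f}, const = {const:+.2f}")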

  19. The ATLAS Experiment at the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    ATLAS Collaboration; Aad, G.; Abat, E.; Abdallah, J.; Abdelalim, A. A.; Abdesselam, A.; Abdinov, O.; Abi, B. A.; Abolins, M.; Abramowicz, H.; Acerbi, E.; Acharya, B. S.; Achenbach, R.; Ackers, M.; Adams, D. L.; Adamyan, F.; Addy, T. N.; Aderholz, M.; Adorisio, C.; Adragna, P.; Aharrouche, M.; Ahlen, S. P.; Ahles, F.; Ahmad, A.; Ahmed, H.; Aielli, G.; Åkesson, P. F.; Åkesson, T. P. A.; Akimov, A. V.; Alam, S. M.; Albert, J.; Albrand, S.; Aleksa, M.; Aleksandrov, I. N.; Aleppo, M.; Alessandria, F.; Alexa, C.; Alexander, G.; Alexopoulos, T.; Alimonti, G.; Aliyev, M.; Allport, P. P.; Allwood-Spiers, S. E.; Aloisio, A.; Alonso, J.; Alves, R.; Alviggi, M. G.; Amako, K.; Amaral, P.; Amaral, S. P.; Ambrosini, G.; Ambrosio, G.; Amelung, C.; Ammosov, V. V.; Amorim, A.; Amram, N.; Anastopoulos, C.; Anderson, B.; Anderson, K. J.; Anderssen, E. C.; Andreazza, A.; Andrei, V.; Andricek, L.; Andrieux, M.-L.; Anduaga, X. S.; Anghinolfi, F.; Antonaki, A.; Antonelli, M.; Antonelli, S.; Apsimon, R.; Arabidze, G.; Aracena, I.; Arai, Y.; Arce, A. T. H.; Archambault, J. P.; Arguin, J.-F.; Arik, E.; Arik, M.; Arms, K. E.; Armstrong, S. R.; Arnaud, M.; Arnault, C.; Artamonov, A.; Asai, S.; Ask, S.; Åsman, B.; Asner, D.; Asquith, L.; Assamagan, K.; Astbury, A.; Athar, B.; Atkinson, T.; Aubert, B.; Auerbach, B.; Auge, E.; Augsten, K.; Aulchenko, V. M.; Austin, N.; Avolio, G.; Avramidou, R.; Axen, A.; Ay, C.; Azuelos, G.; Baccaglioni, G.; Bacci, C.; Bachacou, H.; Bachas, K.; Bachy, G.; Badescu, E.; Bagnaia, P.; Bailey, D. C.; Baines, J. T.; Baker, O. K.; Ballester, F.; Baltasar Dos Santos Pedrosa, F.; Banas, E.; Banfi, D.; Bangert, A.; Bansal, V.; Baranov, S. P.; Baranov, S.; Barashkou, A.; Barberio, E. L.; Barberis, D.; Barbier, G.; Barclay, P.; Bardin, D. Y.; Bargassa, P.; Barillari, T.; Barisonzi, M.; Barnett, B. M.; Barnett, R. M.; Baron, S.; Baroncelli, A.; Barone, M.; Barr, A. J.; Barreiro, F.; Barreiro Guimarães da Costa, J.; Barrillon, P.; Barriuso Poy, A.; Barros, N.; Bartheld, V.; Bartko, H.; Bartoldus, R.; Basiladze, S.; Bastos, J.; Batchelor, L. E.; Bates, R. L.; Batley, J. R.; Batraneanu, S.; Battistin, M.; Battistoni, G.; Batusov, V.; Bauer, F.; Bauss, B.; Baynham, D. E.; Bazalova, M.; Bazan, A.; Beauchemin, P. H.; Beaugiraud, B.; Beccherle, R. B.; Beck, G. A.; Beck, H. P.; Becks, K. H.; Bedajanek, I.; Beddall, A. J.; Beddall, A.; Bednár, P.; Bednyakov, V. A.; Bee, C.; Behar Harpaz, S.; Belanger, G. A. N.; Belanger-Champagne, C.; Belhorma, B.; Bell, P. J.; Bell, W. H.; Bella, G.; Bellachia, F.; Bellagamba, L.; Bellina, F.; Bellomo, G.; Bellomo, M.; Beltramello, O.; Belymam, A.; Ben Ami, S.; Ben Moshe, M.; Benary, O.; Benchekroun, D.; Benchouk, C.; Bendel, M.; Benedict, B. H.; Benekos, N.; Benes, J.; Benhammou, Y.; Benincasa, G. P.; Benjamin, D. P.; Bensinger, J. R.; Benslama, K.; Bentvelsen, S.; Beretta, M.; Berge, D.; Bergeaas, E.; Berger, N.; Berghaus, F.; Berglund, S.; Bergsma, F.; Beringer, J.; Bernabéu, J.; Bernardet, K.; Berriaud, C.; Berry, T.; Bertelsen, H.; Bertin, A.; Bertinelli, F.; Bertolucci, S.; Besson, N.; Beteille, A.; Bethke, S.; Bialas, W.; Bianchi, R. M.; Bianco, M.; Biebel, O.; Bieri, M.; Biglietti, M.; Bilokon, H.; Binder, M.; Binet, S.; Bingefors, N.; Bingul, A.; Bini, C.; Biscarat, C.; Bischof, R.; Bischofberger, M.; Bitadze, A.; Bizzell, J. P.; Black, K. M.; Blair, R. E.; Blaising, J. J.; Blanch, O.; Blanchot, G.; Blocker, C.; Blocki, J.; Blondel, A.; Blum, W.; Blumenschein, U.; Boaretto, C.; Bobbink, G. J.; Bocci, A.; Bocian, D.; Bock, R.; Boehm, M.; Boek, J.; Bogaerts, J. 
A.; Bogouch, A.; Bohm, C.; Bohm, J.; Boisvert, V.; Bold, T.; Boldea, V.; Bondarenko, V. G.; Bonino, R.; Bonis, J.; Bonivento, W.; Bonneau, P.; Boonekamp, M.; Boorman, G.; Boosten, M.; Booth, C. N.; Booth, P. S. L.; Booth, P.; Booth, J. R. A.; Borer, K.; Borisov, A.; Borjanovic, I.; Bos, K.; Boscherini, D.; Bosi, F.; Bosman, M.; Bosteels, M.; Botchev, B.; Boterenbrood, H.; Botterill, D.; Boudreau, J.; Bouhova-Thacker, E. V.; Boulahouache, C.; Bourdarios, C.; Boutemeur, M.; Bouzakis, K.; Boyd, G. R.; Boyd, J.; Boyer, B. H.; Boyko, I. R.; Bozhko, N. I.; Braccini, S.; Braem, A.; Branchini, P.; Brandenburg, G. W.; Brandt, A.; Brandt, O.; Bratzler, U.; Braun, H. M.; Bravo, S.; Brawn, I. P.; Brelier, B.; Bremer, J.; Brenner, R.; Bressler, S.; Breton, D.; Brett, N. D.; Breugnon, P.; Bright-Thomas, P. G.; Brochu, F. M.; Brock, I.; Brock, R.; Brodbeck, T. J.; Brodet, E.; Broggi, F.; Broklova, Z.; Bromberg, C.; Brooijmans, G.; Brouwer, G.; Broz, J.; Brubaker, E.; Bruckman de Renstrom, P. A.; Bruncko, D.; Bruni, A.; Bruni, G.; Bruschi, M.; Buanes, T.; Buchanan, N. J.; Buchholz, P.; Budagov, I. A.; Büscher, V.; Bugge, L.; Buira-Clark, D.; Buis, E. J.; Bujor, F.; Buran, T.; Burckhart, H.; Burckhart-Chromek, D.; Burdin, S.; Burns, R.; Busato, E.; Buskop, J. J. F.; Buszello, K. P.; Butin, F.; Butler, J. M.; Buttar, C. M.; Butterworth, J.; Butterworth, J. M.; Byatt, T.; Cabrera Urbán, S.; Cabruja Casas, E.; Caccia, M.; Caforio, D.; Cakir, O.; Calafiura, P.; Calderini, G.; Calderón Terol, D.; Callahan, J.; Caloba, L. P.; Caloi, R.; Calvet, D.; Camard, A.; Camarena, F.; Camarri, P.; Cambiaghi, M.; Cameron, D.; Cammin, J.; Campabadal Segura, F.; Campana, S.; Canale, V.; Cantero, J.; Capeans Garrido, M. D. M.; Caprini, I.; Caprini, M.; Caprio, M.; Caracinha, D.; Caramarcu, C.; Carcagno, Y.; Cardarelli, R.; Cardeira, C.; Cardiel Sas, L.; Cardini, A.; Carli, T.; Carlino, G.; Carminati, L.; Caron, B.; Caron, S.; Carpentieri, C.; Carr, F. S.; Carter, A. A.; Carter, J. R.; Carvalho, J.; Casadei, D.; Casado, M. P.; Cascella, M.; Caso, C.; Castelo, J.; Castillo Gimenez, V.; Castro, N.; Castrovillari, F.; Cataldi, G.; Cataneo, F.; Catinaccio, A.; Catmore, J. R.; Cattai, A.; Caughron, S.; Cauz, D.; Cavallari, A.; Cavalleri, P.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Ceradini, F.; Cerna, C.; Cernoch, C.; Cerqueira, A. S.; Cerri, A.; Cerutti, F.; Cervetto, M.; Cetin, S. A.; Cevenini, F.; Chalifour, M.; Chamizo llatas, M.; Chan, A.; Chapman, J. W.; Charlton, D. G.; Charron, S.; Chekulaev, S. V.; Chelkov, G. A.; Chen, H.; Chen, L.; Chen, T.; Chen, X.; Cheng, S.; Cheng, T. L.; Cheplakov, A.; Chepurnov, V. F.; Cherkaoui El Moursli, R.; Chesneanu, D.; Cheu, E.; Chevalier, L.; Chevalley, J. L.; Chevallier, F.; Chiarella, V.; Chiefari, G.; Chikovani, L.; Chilingarov, A.; Chiodini, G.; Chouridou, S.; Chren, D.; Christiansen, T.; Christidi, I. A.; Christov, A.; Chu, M. L.; Chudoba, J.; Chuguev, A. G.; Ciapetti, G.; Cicalini, E.; Ciftci, A. K.; Cindro, V.; Ciobotaru, M. D.; Ciocio, A.; Cirilli, M.; Citterio, M.; Ciubancan, M.; Civera, J. V.; Clark, A.; Cleland, W.; Clemens, J. C.; Clement, B. C.; Clément, C.; Clements, D.; Clifft, R. W.; Cobal, M.; Coccaro, A.; Cochran, J.; Coco, R.; Coe, P.; Coelli, S.; Cogneras, E.; Cojocaru, C. D.; Colas, J.; Colijn, A. P.; Collard, C.; Collins-Tooth, C.; Collot, J.; Coluccia, R.; Comune, G.; Conde Muiño, P.; Coniavitis, E.; Consonni, M.; Constantinescu, S.; Conta, C.; Conventi, F. A.; Cook, J.; Cooke, M.; Cooper-Smith, N. 
J.; Cornelissen, T.; Corradi, M.; Correard, S.; Corso-Radu, A.; Coss, J.; Costa, G.; Costa, M. J.; Costanzo, D.; Costin, T.; Coura Torres, R.; Courneyea, L.; Couyoumtzelis, C.; Cowan, G.; Cox, B. E.; Cox, J.; Cragg, D. A.; Cranmer, K.; Cranshaw, J.; Cristinziani, M.; Crosetti, G.; Cuenca Almenar, C.; Cuneo, S.; Cunha, A.; Curatolo, M.; Curtis, C. J.; Cwetanski, P.; Czyczula, Z.; D'Auria, S.; D'Onofrio, M.; Da Rocha Gesualdi Mello, A.; Da Silva, P. V. M.; Da Silva, R.; Dabrowski, W.; Dael, A.; Dahlhoff, A.; Dai, T.; Dallapiccola, C.; Dallison, S. J.; Dalmau, J.; Daly, C. H.; Dam, M.; Damazio, D.; Dameri, M.; Danielsen, K. M.; Danielsson, H. O.; Dankers, R.; Dannheim, D.; Darbo, G.; Dargent, P.; Daum, C.; Dauvergne, J. P.; David, M.; Davidek, T.; Davidson, N.; Davidson, R.; Dawson, I.; Dawson, J. W.; Daya, R. K.; De, K.; de Asmundis, R.; de Boer, R.; DeCastro, S.; DeGroot, N.; de Jong, P.; de La Broise, X.; DeLa Cruz-Burelo, E.; DeLa Taille, C.; DeLotto, B.; DeOliveira Branco, M.; DePedis, D.; de Saintignon, P.; DeSalvo, A.; DeSanctis, U.; DeSanto, A.; DeVivie DeRegie, J. B.; DeZorzi, G.; Dean, S.; Dedes, G.; Dedovich, D. V.; Defay, P. O.; Degele, R.; Dehchar, M.; Deile, M.; DelPapa, C.; DelPeso, J.; DelPrete, T.; Delagnes, E.; Delebecque, P.; Dell'Acqua, A.; Della Pietra, M.; della Volpe, D.; Delmastro, M.; Delpierre, P.; Delruelle, N.; Delsart, P. A.; Deluca Silberberg, C.; Demers, S.; Demichev, M.; Demierre, P.; Demirköz, B.; Deng, W.; Denisov, S. P.; Dennis, C.; Densham, C. J.; Dentan, M.; Derkaoui, J. E.; Derue, F.; Dervan, P.; Desch, K. K.; Dewhurst, A.; Di Ciaccio, A.; Di Ciaccio, L.; Di Domenico, A.; Di Girolamo, A.; Di Girolamo, B.; Di Luise, S.; Di Mattia, A.; Di Simone, A.; Diaz Gomez, M. M.; Diehl, E. B.; Dietl, H.; Dietrich, J.; Dietsche, W.; Diglio, S.; Dima, M.; Dindar, K.; Dinkespiler, B.; Dionisi, C.; Dipanjan, R.; Dita, P.; Dita, S.; Dittus, F.; Dixon, S. D.; Djama, F.; Djilkibaev, R.; Djobava, T.; do Vale, M. A. B.; Dobbs, M.; Dobinson, R.; Dobos, D.; Dobson, E.; Dobson, M.; Dodd, J.; Dogan, O. B.; Doherty, T.; Doi, Y.; Dolejsi, J.; Dolenc, I.; Dolezal, Z.; Dolgoshein, B. A.; Domingo, E.; Donega, M.; Dopke, J.; Dorfan, D. E.; Dorholt, O.; Doria, A.; Dos Anjos, A.; Dosil, M.; Dotti, A.; Dova, M. T.; Dowell, J. D.; Doyle, A. T.; Drake, G.; Drakoulakos, D.; Drasal, Z.; Drees, J.; Dressnandt, N.; Drevermann, H.; Driouichi, C.; Dris, M.; Drohan, J. G.; Dubbert, J.; Dubbs, T.; Duchovni, E.; Duckeck, G.; Dudarev, A.; Dührssen, M.; Dür, H.; Duerdoth, I. P.; Duffin, S.; Duflot, L.; Dufour, M.-A.; Dumont Dayot, N.; Duran Yildiz, H.; Durand, D.; Dushkin, A.; Duxfield, R.; Dwuznik, M.; Dydak, F.; Dzahini, D.; Díez Cornell, S.; Düren, M.; Ebenstein, W. L.; Eckert, S.; Eckweiler, S.; Eerola, P.; Efthymiopoulos, I.; Egede, U.; Egorov, K.; Ehrenfeld, W.; Eifert, T.; Eigen, G.; Einsweiler, K.; Eisenhandler, E.; Ekelof, T.; Eklund, L. M.; El Kacimi, M.; Ellert, M.; Elles, S.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Ely, R.; Emeliyanov, D.; Engelmann, R.; Engström, M.; Ennes, P.; Epp, B.; Eppig, A.; Epshteyn, V. S.; Ereditato, A.; Eremin, V.; Eriksson, D.; Ermoline, I.; Ernwein, J.; Errede, D.; Errede, S.; Escalier, M.; Escobar, C.; Espinal Curull, X.; Esposito, B.; Esteves, F.; Etienne, F.; Etienvre, A. I.; Etzion, E.; Evans, H.; Evdokimov, V. N.; Evtoukhovitch, P.; Eyring, A.; Fabbri, L.; Fabjan, C. W.; Fabre, C.; Faccioli, P.; Facius, K.; Fadeyev, V.; Fakhrutdinov, R. M.; Falciano, S.; Falleau, I.; Falou, A. 
C.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farrell, J.; Farthouat, P.; Fasching, D.; Fassi, F.; Fassnacht, P.; Fassouliotis, D.; Fawzi, F.; Fayard, L.; Fayette, F.; Febbraro, R.; Fedin, O. L.; Fedorko, I.; Feld, L.; Feldman, G.; Feligioni, L.; Feng, C.; Feng, E. J.; Fent, J.; Fenyuk, A. B.; Ferencei, J.; Ferguson, D.; Ferland, J.; Fernando, W.; Ferrag, S.; Ferrari, A.; Ferrari, P.; Ferrari, R.; Ferrer, A.; Ferrer, M. L.; Ferrere, D.; Ferretti, C.; Ferro, F.; Fiascaris, M.; Fichet, S.; Fiedler, F.; Filimonov, V.; Filipčič, A.; Filippas, A.; Filthaut, F.; Fincke-Keeler, M.; Finocchiaro, G.; Fiorini, L.; Firan, A.; Fischer, P.; Fisher, M. J.; Fisher, S. M.; Flaminio, V.; Flammer, J.; Flechl, M.; Fleck, I.; Flegel, W.; Fleischmann, P.; Fleischmann, S.; Fleta Corral, C. M.; Fleuret, F.; Flick, T.; Flix, J.; Flores Castillo, L. R.; Flowerdew, M. J.; Föhlisch, F.; Fokitis, M.; Fonseca Martin, T. M.; Fopma, J.; Forbush, D. A.; Formica, A.; Foster, J. M.; Fournier, D.; Foussat, A.; Fowler, A. J.; Fox, H.; Francavilla, P.; Francis, D.; Franz, S.; Fraser, J. T.; Fraternali, M.; Fratianni, S.; Freestone, J.; French, R. S.; Fritsch, K.; Froidevaux, D.; Frost, J. A.; Fukunaga, C.; Fulachier, J.; Fullana Torregrosa, E.; Fuster, J.; Gabaldon, C.; Gadomski, S.; Gagliardi, G.; Gagnon, P.; Gallas, E. J.; Gallas, M. V.; Gallop, B. J.; Gan, K. K.; Gannaway, F. C.; Gao, Y. S.; Gapienko, V. A.; Gaponenko, A.; Garciá, C.; Garcia-Sciveres, M.; Garcìa Navarro, J. E.; Garde, V.; Gardner, R. W.; Garelli, N.; Garitaonandia, H.; Garonne, V. G.; Garvey, J.; Gatti, C.; Gaudio, G.; Gaumer, O.; Gautard, V.; Gauzzi, P.; Gavrilenko, I. L.; Gay, C.; Gayde, J.-C.; Gazis, E. N.; Gazo, E.; Gee, C. N. P.; Geich-Gimbel, C.; Gellerstedt, K.; Gemme, C.; Genest, M. H.; Gentile, S.; George, M. A.; George, S.; Gerlach, P.; Gernizky, Y.; Geweniger, C.; Ghazlane, H.; Ghete, V. M.; Ghez, P.; Ghodbane, N.; Giacobbe, B.; Giagu, S.; Giakoumopoulou, V.; Giangiobbe, V.; Gianotti, F.; Gibbard, B.; Gibson, A.; Gibson, M. D.; Gibson, S. M.; Gieraltowski, G. F.; Gil Botella, I.; Gilbert, L. M.; Gilchriese, M.; Gildemeister, O.; Gilewsky, V.; Gillman, A. R.; Gingrich, D. M.; Ginzburg, J.; Giokaris, N.; Giordani, M. P.; Girard, C. G.; Giraud, P. F.; Girtler, P.; Giugni, D.; Giusti, P.; Gjelsten, B. K.; Glasman, C.; Glazov, A.; Glitza, K. W.; Glonti, G. L.; Gnanvo, K. G.; Godlewski, J.; Göpfert, T.; Gössling, C.; Göttfert, T.; Goldfarb, S.; Goldin, D.; Goldschmidt, N.; Golling, T.; Gollub, N. P.; Golonka, P. J.; Golovnia, S. N.; Gomes, A.; Gomes, J.; Gonçalo, R.; Gongadze, A.; Gonidec, A.; Gonzalez, S.; González de la Hoz, S.; González Millán, V.; Gonzalez Silva, M. L.; Gonzalez-Pineiro, B.; González-Sevilla, S.; Goodrick, M. J.; Goodson, J. J.; Goossens, L.; Gorbounov, P. A.; Gordeev, A.; Gordon, H.; Gorelov, I.; Gorfine, G.; Gorini, B.; Gorini, E.; Gorišek, A.; Gornicki, E.; Gorokhov, S. A.; Gorski, B. T.; Goryachev, S. V.; Goryachev, V. N.; Gosselink, M.; Gostkin, M. I.; Gouanère, M.; Gough Eschrich, I.; Goujdami, D.; Goulette, M.; Gousakov, I.; Gouveia, J.; Gowdy, S.; Goy, C.; Grabowska-Bold, I.; Grabski, V.; Grafström, P.; Grah, C.; Grahn, K.-J.; Grancagnolo, F.; Grancagnolo, S.; Grassmann, H.; Gratchev, V.; Gray, H. M.; Graziani, E.; Green, B.; Greenall, A.; Greenfield, D.; Greenwood, D.; Gregor, I. M.; Grewal, A.; Griesmayer, E.; Grigalashvili, N.; Grigson, C.; Grillo, A. A.; Grimaldi, F.; Grimm, K.; Gris, P. L. Y.; Grishkevich, Y.; Groenstege, H.; Groer, L. S.; Grognuz, J.; Groh, M.; Gross, E.; Grosse-Knetter, J.; Grothe, M. E. 
M.; Grudzinski, J.; Gruse, C.; Gruwe, M.; Grybel, K.; Grybos, P.; Gschwendtner, E. M.; Guarino, V. J.; Guicheney, C. J.; Guilhem, G.; Guillemin, T.; Gunther, J.; Guo, B.; Gupta, A.; Gurriana, L.; Gushchin, V. N.; Gutierrez, P.; Guy, L.; Guyot, C.; Gwenlan, C.; Gwilliam, C. B.; Haas, A.; Haas, S.; Haber, C.; Haboubi, G.; Hackenburg, R.; Hadash, E.; Hadavand, H. K.; Haeberli, C.; Härtel, R.; Haggerty, R.; Hahn, F.; Haider, S.; Hajduk, Z.; Hakimi, M.; Hakobyan, H.; Hakobyan, H.; Haller, J.; Hallewell, G. D.; Hallgren, B.; Hamacher, K.; Hamilton, A.; Han, H.; Han, L.; Hanagaki, K.; Hance, M.; Hanke, P.; Hansen, C. J.; Hansen, F. H.; Hansen, J. R.; Hansen, J. B.; Hansen, J. D.; Hansen, P. H.; Hansl-Kozanecka, T.; Hanson, G.; Hansson, P.; Hara, K.; Harder, S.; Harel, A.; Harenberg, T.; Harper, R.; Hart, J. C.; Hart, R. G. G.; Hartjes, F.; Hartman, N.; Haruyama, T.; Harvey, A.; Hasegawa, Y.; Hashemi, K.; Hassani, S.; Hatch, M.; Hatley, R. W.; Haubold, T. G.; Hauff, D.; Haug, F.; Haug, S.; Hauschild, M.; Hauser, R.; Hauviller, C.; Havranek, M.; Hawes, B. M.; Hawkings, R. J.; Hawkins, D.; Hayler, T.; Hayward, H. S.; Haywood, S. J.; Hazen, E.; He, M.; He, Y. P.; Head, S. J.; Hedberg, V.; Heelan, L.; Heinemann, F. E. W.; Heldmann, M.; Hellman, S.; Helsens, C.; Henderson, R. C. W.; Hendriks, P. J.; Henriques Correia, A. M.; Henrot-Versille, S.; Henry-Couannier, F.; Henß, T.; Herten, G.; Hertenberger, R.; Hervas, L.; Hess, M.; Hessey, N. P.; Hicheur, A.; Hidvegi, A.; Higón-Rodriguez, E.; Hill, D.; Hill, J.; Hill, J. C.; Hill, N.; Hillier, S. J.; Hinchliffe, I.; Hindson, D.; Hinkelbein, C.; Hodges, T. A.; Hodgkinson, M. C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M. R.; Hoffman, J.; Hoffmann, A. E.; Hoffmann, D.; Hoffmann, H. F.; Holder, M.; Hollins, T. I.; Hollyman, G.; Holmes, A.; Holmgren, S. O.; Holt, R.; Holtom, E.; Holy, T.; Homer, R. J.; Homma, Y.; Homola, P.; Honerbach, W.; Honma, A.; Hooton, I.; Horazdovsky, T.; Horn, C.; Horvat, S.; Hostachy, J.-Y.; Hott, T.; Hou, S.; Houlden, M. A.; Hoummada, A.; Hover, J.; Howell, D. F.; Hrivnac, J.; Hruska, I.; Hryn'ova, T.; Huang, G. S.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Huffman, B. T.; Hughes, E.; Hughes, G.; Hughes-Jones, R. E.; Hulsbergen, W.; Hurst, P.; Hurwitz, M.; Huse, T.; Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Ibbotson, M.; Ibragimov, I.; Ichimiya, R.; Iconomidou-Fayard, L.; Idarraga, J.; Idzik, M.; Iengo, P.; Iglesias Escudero, M. C.; Igonkina, O.; Ikegami, Y.; Ikeno, M.; Ilchenko, Y.; Ilyushenka, Y.; Imbault, D.; Imbert, P.; Imhaeuser, M.; Imori, M.; Ince, T.; Inigo-Golfin, J.; Inoue, K.; Ioannou, P.; Iodice, M.; Ionescu, G.; Ishii, K.; Ishino, M.; Ishizawa, Y.; Ishmukhametov, R.; Issever, C.; Ito, H.; Ivashin, A. V.; Iwanski, W.; Iwasaki, H.; Izen, J. M.; Izzo, V.; Jackson, J.; Jackson, J. N.; Jaekel, M.; Jagielski, S.; Jahoda, M.; Jain, V.; Jakobs, K.; Jakubek, J.; Jansen, E.; Jansweijer, P. P. M.; Jared, R. C.; Jarlskog, G.; Jarp, S.; Jarron, P.; Jelen, K.; Jen-La Plante, I.; Jenni, P.; Jeremie, A.; Jez, P.; Jézéquel, S.; Jiang, Y.; Jin, G.; Jin, S.; Jinnouchi, O.; Joffe, D.; Johansen, L. G.; Johansen, M.; Johansson, K. E.; Johansson, P.; Johns, K. A.; Jon-And, K.; Jones, M.; Jones, R.; Jones, R. W. L.; Jones, T. W.; Jones, T. J.; Jones, A.; Jonsson, O.; Joo, K. K.; Joos, D.; Joos, M.; Joram, C.; Jorgensen, S.; Joseph, J.; Jovanovic, P.; Junnarkar, S. S.; Juranek, V.; Jussel, P.; Kabachenko, V. V.; Kabana, S.; Kaci, M.; Kaczmarska, A.; Kado, M.; Kagan, H.; Kagawa, S.; Kaiser, S.; Kajomovitz, E.; Kakurin, S.; Kalinovskaya, L. 
V.; Kama, S.; Kambara, H.; Kanaya, N.; Kandasamy, A.; Kandasamy, S.; Kaneda, M.; Kantserov, V. A.; Kanzaki, J.; Kaplan, B.; Kapliy, A.; Kaplon, J.; Karagounis, M.; Karagoz Unel, M.; Karr, K.; Karst, P.; Kartvelishvili, V.; Karyukhin, A. N.; Kashif, L.; Kasmi, A.; Kass, R. D.; Kastanas, A.; Kataoka, M.; Kataoka, Y.; Katsoufis, E.; Katunin, S.; Kawagoe, K.; Kawai, M.; Kawamoto, T.; Kayumov, F.; Kazanin, V. A.; Kazarinov, M. Y.; Kazarov, A.; Kazi, S. I.; Keates, J. R.; Keeler, R.; Keener, P. T.; Kehoe, R.; Keil, M.; Kekelidze, G. D.; Kelly, M.; Kennedy, J.; Kenyon, M.; Kepka, O.; Kerschen, N.; Kerševan, B. P.; Kersten, S.; Ketterer, C.; Khakzad, M.; Khalilzade, F.; Khandanyan, H.; Khanov, A.; Kharchenko, D.; Khodinov, A.; Kholodenko, A. G.; Khomich, A.; Khomutnikov, V. P.; Khoriauli, G.; Khovanskiy, N.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kieft, G.; Kierstead, J. A.; Kilvington, G.; Kim, H.; Kim, H.; Kim, S. H.; Kind, P.; King, B. T.; Kirk, J.; Kirsch, G. P.; Kirsch, L. E.; Kiryunin, A. E.; Kisielewska, D.; Kisielewski, B.; Kittelmann, T.; Kiver, A. M.; Kiyamura, H.; Kladiva, E.; Klaiber-Lodewigs, J.; Kleinknecht, K.; Klier, A.; Klimentov, A.; Kline, C. R.; Klingenberg, R.; Klinkby, E. B.; Klioutchnikova, T.; Klok, P. F.; Klous, S.; Kluge, E.-E.; Kluit, P.; Klute, M.; Kluth, S.; Knecht, N. K.; Kneringer, E.; Knezo, E.; Knobloch, J.; Ko, B. R.; Kobayashi, T.; Kobel, M.; Kodys, P.; König, A. C.; König, S.; Köpke, L.; Koetsveld, F.; Koffas, T.; Koffeman, E.; Kohout, Z.; Kohriki, T.; Kokott, T.; Kolachev, G. M.; Kolanoski, H.; Kolesnikov, V.; Koletsou, I.; Kollefrath, M.; Kolos, S.; Kolya, S. D.; Komar, A. A.; Komaragiri, J. R.; Kondo, T.; Kondo, Y.; Kondratyeva, N. V.; Kono, T.; Kononov, A. I.; Konoplich, R.; Konovalov, S. P.; Konstantinidis, N.; Kootz, A.; Koperny, S.; Kopikov, S. V.; Korcyl, K.; Kordas, K.; Koreshev, V.; Korn, A.; Korolkov, I.; Korotkov, V. A.; Korsmo, H.; Kortner, O.; Kostrikov, M. E.; Kostyukhin, V. V.; Kotamäki, M. J.; Kotchetkov, D.; Kotov, S.; Kotov, V. M.; Kotov, K. Y.; Kourkoumelis, C.; Koutsman, A.; Kovalenko, S.; Kowalewski, R.; Kowalski, H.; Kowalski, T. Z.; Kozanecki, W.; Kozhin, A. S.; Kral, V.; Kramarenko, V.; Kramberger, G.; Kramer, A.; Krasel, O.; Krasny, M. W.; Krasznahorkay, A.; Krepouri, A.; Krieger, P.; Krivkova, P.; Krobath, G.; Kroha, H.; Krstic, J.; Kruchonak, U.; Krüger, H.; Kruger, K.; Krumshteyn, Z. V.; Kubik, P.; Kubischta, W.; Kubota, T.; Kudin, L. G.; Kudlaty, J.; Kugel, A.; Kuhl, T.; Kuhn, D.; Kukhtin, V.; Kulchitsky, Y.; Kundu, N.; Kupco, A.; Kupper, M.; Kurashige, H.; Kurchaninov, L. L.; Kurochkin, Y. A.; Kus, V.; Kuykendall, W.; Kuzhir, P.; Kuznetsova, E. K.; Kvasnicka, O.; Kwee, R.; La Marra, D.; La Rosa, M.; La Rotonda, L.; Labarga, L.; Labbe, J. A.; Lacasta, C.; Lacava, F.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Lamanna, E.; Lambacher, M.; Lambert, F.; Lampl, W.; Lancon, E.; Landgraf, U.; Landon, M. P. J.; Landsman, H.; Langstaff, R. R.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Lanza, A.; Lapin, V. V.; Laplace, S.; Laporte, J. F.; Lara, V.; Lari, T.; Larionov, A. V.; Lasseur, C.; Lau, W.; Laurelli, P.; Lavorato, A.; Lavrijsen, W.; Lazarev, A. B.; LeBihan, A.-C.; LeDortz, O.; LeManer, C.; LeVine, M.; Leahu, L.; Leahu, M.; Lebel, C.; Lechowski, M.; LeCompte, T.; Ledroit-Guillon, F.; Lee, H.; Lee, J. S. H.; Lee, S. C.; Lefebvre, M.; Lefevre, R. P.; Legendre, M.; Leger, A.; LeGeyt, B. 
C.; Leggett, C.; Lehmacher, M.; Lehmann Miotto, G.; Lehto, M.; Leitner, R.; Lelas, D.; Lellouch, D.; Leltchouk, M.; Lendermann, V.; Leney, K. J. C.; Lenz, T.; Lenzen, G.; Lepidis, J.; Leroy, C.; Lessard, J.-R.; Lesser, J.; Lester, C. G.; Letheren, M.; Fook Cheong, A. Leung; Levêque, J.; Levin, D.; Levinson, L. J.; Levitski, M. S.; Lewandowska, M.; Leyton, M.; Li, J.; Li, W.; Liabline, M.; Liang, Z.; Liang, Z.; Liberti, B.; Lichard, P.; Liebig, W.; Lifshitz, R.; Liko, D.; Lim, H.; Limper, M.; Lin, S. C.; Lindahl, A.; Linde, F.; Lindquist, L.; Lindsay, S. W.; Linhart, V.; Lintern, A. J.; Liolios, A.; Lipniacka, A.; Liss, T. M.; Lissauer, A.; List, J.; Litke, A. M.; Liu, S.; Liu, T.; Liu, Y.; Livan, M.; Lleres, A.; Llosá Llácer, G.; Lloyd, S. L.; Lobkowicz, F.; Loch, P.; Lockman, W. S.; Loddenkoetter, T.; Loebinger, F. K.; Loginov, A.; Loh, C. W.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Loken, J.; Lokwitz, S.; Long, M. C.; Lopes, L.; Lopez Mateos, D.; Losty, M. J.; Lou, X.; Loureiro, K. F.; Lovas, L.; Love, J.; Lowe, A.; Lozano Fantoba, M.; Lu, F.; Lu, J.; Lu, L.; Lubatti, H. J.; Lucas, S.; Luci, C.; Lucotte, A.; Ludwig, A.; Ludwig, I.; Ludwig, J.; Luehring, F.; Lüke, D.; Luijckx, G.; Luisa, L.; Lumb, D.; Luminari, L.; Lund, E.; Lund-Jensen, B.; Lundberg, B.; Lundquist, J.; Lupi, A.; Lupu, N.; Lutz, G.; Lynn, D.; Lynn, J.; Lys, J.; Lysan, V.; Lytken, E.; López-Amengual, J. M.; Ma, H.; Ma, L. L.; Maaß en, M.; Maccarrone, G.; Mace, G. G. R.; Macina, D.; Mackeprang, R.; Macpherson, A.; MacQueen, D.; Macwaters, C.; Madaras, R. J.; Mader, W. F.; Maenner, R.; Maeno, T.; Mättig, P.; Mättig, S.; Magrath, C. A.; Mahalalel, Y.; Mahboubi, K.; Mahout, G.; Maidantchik, C.; Maio, A.; Mair, G. M.; Mair, K.; Makida, Y.; Makowiecki, D.; Malecki, P.; Maleev, V. P.; Malek, F.; Malon, D.; Maltezos, S.; Malychev, V.; Malyukov, S.; Mambelli, M.; Mameghani, R.; Mamuzic, J.; Manabe, A.; Manara, A.; Manca, G.; Mandelli, L.; Mandić, I.; Mandl, M.; Maneira, J.; Maneira, M.; Mangeard, P. S.; Mangin-Brinet, M.; Manjavidze, I. D.; Mann, W. A.; Manolopoulos, S.; Manousakis-Katsikakis, A.; Mansoulie, B.; Manz, A.; Mapelli, A.; Mapelli, L.; March, L.; Marchand, J. F.; Marchesotti, M.; Marcisovsky, M.; Marin, A.; Marques, C. N.; Marroquim, F.; Marshall, R.; Marshall, Z.; Martens, F. K.; Garcia, S. Marti i.; Martin, A. J.; Martin, B.; Martin, B.; Martin, F. F.; Martin, J. P.; Martin, Ph; Martinez, G.; Martínez Lacambra, C.; Martinez Outschoorn, V.; Martini, A.; Martins, J.; Maruyama, T.; Marzano, F.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. L.; Maß, M.; Massa, I.; Massaro, G.; Massol, N.; Mathes, M.; Matheson, J.; Matricon, P.; Matsumoto, H.; Matsunaga, H.; Maugain, J. M.; Maxfield, S. J.; May, E. N.; Mayer, J. K.; Mayri, C.; Mazini, R.; Mazzanti, M.; Mazzanti, P.; Mazzoni, E.; Mazzucato, F.; McKee, S. P.; McCarthy, R. L.; McCormick, C.; McCubbin, N. A.; McDonald, J.; McFarlane, K. W.; McGarvie, S.; McGlone, H.; McLaren, R. A.; McMahon, S. J.; McMahon, T. R.; McMahon, T. J.; McPherson, R. A.; Mechtel, M.; Meder-Marouelli, D.; Medinnis, M.; Meera-Lebbai, R.; Meessen, C.; Mehdiyev, R.; Mehta, A.; Meier, K.; Meinhard, H.; Meinhardt, J.; Meirosu, C.; Meisel, F.; Melamed-Katz, A.; Mellado Garcia, B. R.; Mendes Jorge, P.; Mendez, P.; Menke, S.; Menot, C.; Meoni, E.; Merkl, D.; Merola, L.; Meroni, C.; Merritt, F. S.; Messmer, I.; Metcalfe, J.; Meuser, S.; Meyer, J.-P.; Meyer, T. C.; Meyer, W. 
T.; Mialkovski, V.; Michelotto, M.; Micu, L.; Middleton, R.; Miele, P.; Migliaccio, A.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikestikova, M.; Mikulec, B.; Mikuž, M.; Miller, D. W.; Miller, R. J.; Miller, W.; Milosavljevic, M.; Milstead, D. A.; Mima, S.; Minaenko, A. A.; Minano, M.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Mir, L. M.; Mirabelli, G.; Miralles Verge, L.; Misawa, S.; Miscetti, S.; Misiejuk, A.; Mitra, A.; Mitrofanov, G. Y.; Mitsou, V. A.; Miyagawa, P. S.; Miyazaki, Y.; Mjörnmark, J. U.; Mkrtchyan, S.; Mladenov, D.; Moa, T.; Moch, M.; Mochizuki, A.; Mockett, P.; Modesto, P.; Moed, S.; Mönig, K.; Möser, N.; Mohn, B.; Mohr, W.; Mohrdieck-Möck, S.; Moisseev, A. M.; Moles Valls, R. M.; Molina-Perez, J.; Moll, A.; Moloney, G.; Mommsen, R.; Moneta, L.; Monnier, E.; Montarou, G.; Montesano, S.; Monticelli, F.; Moore, R. W.; Moore, T. B.; Moorhead, G. F.; Moraes, A.; Morel, J.; Moreno, A.; Moreno, D.; Morettini, P.; Morgan, D.; Morii, M.; Morin, J.; Morley, A. K.; Mornacchi, G.; Morone, M.-C.; Morozov, S. V.; Morris, E. J.; Morris, J.; Morrissey, M. C.; Moser, H. G.; Mosidze, M.; Moszczynski, A.; Mouraviev, S. V.; Mouthuy, T.; Moye, T. H.; Moyse, E. J. W.; Mueller, J.; Müller, M.; Muijs, A.; Muller, T. R.; Munar, A.; Munday, D. J.; Murakami, K.; Murillo Garcia, R.; Murray, W. J.; Myagkov, A. G.; Myska, M.; Nagai, K.; Nagai, Y.; Nagano, K.; Nagasaka, Y.; Nairz, A. M.; Naito, D.; Nakamura, K.; Nakamura, Y.; Nakano, I.; Nanava, G.; Napier, A.; Nassiakou, M.; Nasteva, I.; Nation, N. R.; Naumann, T.; Nauyock, F.; Nderitu, S. K.; Neal, H. A.; Nebot, E.; Nechaeva, P.; Neganov, A.; Negri, A.; Negroni, S.; Nelson, C.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Nesterov, S. Y.; Neukermans, L.; Nevski, P.; Newcomer, F. M.; Nichols, A.; Nicholson, C.; Nicholson, R.; Nickerson, R. B.; Nicolaidou, R.; Nicoletti, G.; Nicquevert, B.; Niculescu, M.; Nielsen, J.; Niinikoski, T.; Niinimaki, M. J.; Nikitin, N.; Nikolaev, K.; Nikolic-Audit, I.; Nikolopoulos, K.; Nilsen, H.; Nilsson, B. S.; Nilsson, P.; Nisati, A.; Nisius, R.; Nodulman, L. J.; Nomachi, M.; Nomoto, H.; Noppe, J.-M.; Nordberg, M.; Norniella Francisco, O.; Norton, P. R.; Novakova, J.; Nowak, M.; Nozaki, M.; Nunes, R.; Nunes Hanninger, G.; Nunnemann, T.; Nyman, T.; O'Connor, P.; O'Neale, S. W.; O'Neil, D. C.; O'Neill, M.; O'Shea, V.; Oakham, F. G.; Oberlack, H.; Obermaier, M.; Oberson, P.; Ochi, A.; Ockenfels, W.; Odaka, S.; Odenthal, I.; Odino, G. A.; Ogren, H.; Oh, S. H.; Ohshima, T.; Ohshita, H.; Okawa, H.; Olcese, M.; Olchevski, A. G.; Oliver, C.; Oliver, J.; Olivo Gomez, M.; Olszewski, A.; Olszowska, J.; Omachi, C.; Onea, A.; Onofre, A.; Oram, C. J.; Ordonez, G.; Oreglia, M. J.; Orellana, F.; Oren, Y.; Orestano, D.; Orlov, I. O.; Orr, R. S.; Orsini, F.; Osborne, L. S.; Osculati, B.; Osuna, C.; Otec, R.; Othegraven, R.; Ottewell, B.; Ould-Saada, F.; Ouraou, A.; Ouyang, Q.; Øye, O. K.; Ozcan, V. E.; Ozone, K.; Ozturk, N.; Pacheco Pages, A.; Padhi, S.; Padilla Aranda, C.; Paganis, E.; Paige, F.; Pailler, P. M.; Pajchel, K.; Palestini, S.; Palla, J.; Pallin, D.; Palmer, M. J.; Pan, Y. B.; Panikashvili, N.; Panin, V. N.; Panitkin, S.; Pantea, D.; Panuskova, M.; Paolone, V.; Paoloni, A.; Papadopoulos, I.; Papadopoulou, T.; Park, I.; Park, W.; Parker, M. A.; Parker, S.; Parkman, C.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pasqualucci, E.; Passardi, G.; Passeri, A.; Passmore, M. S.; Pastore, F.; Pastore, Fr; Pataraia, S.; Pate, D.; Pater, J. R.; Patricelli, S.; Pauly, T.; Pauna, E.; Peak, L. S.; Peeters, S. 
J. M.; Peez, M.; Pei, E.; Peleganchuk, S. V.; Pellegrini, G.; Pengo, R.; Pequenao, J.; Perantoni, M.; Perazzo, A.; Pereira, A.; Perepelkin, E.; Perera, V. J. O.; Perez Codina, E.; Perez Reale, V.; Peric, I.; Perini, L.; Pernegger, H.; Perrin, E.; Perrino, R.; Perrodo, P.; Perrot, G.; Perus, P.; Peshekhonov, V. D.; Petereit, E.; Petersen, J.; Petersen, T. C.; Petit, P. J. F.; Petridou, C.; Petrolo, E.; Petrucci, F.; Petti, R.; Pezzetti, M.; Pfeifer, B.; Phan, A.; Phillips, A. W.; Phillips, P. W.; Piacquadio, G.; Piccinini, M.; Pickford, A.; Piegaia, R.; Pier, S.; Pilcher, J. E.; Pilkington, A. D.; Pimenta Dos Santos, M. A.; Pina, J.; Pinfold, J. L.; Ping, J.; Pinhão, J.; Pinto, B.; Pirotte, O.; Placakyte, R.; Placci, A.; Plamondon, M.; Plano, W. G.; Pleier, M.-A.; Pleskach, A. V.; Podkladkin, S.; Podlyski, F.; Poffenberger, P.; Poggioli, L.; Pohl, M.; Polak, I.; Polesello, G.; Policicchio, A.; Polini, A.; Polychronakos, V.; Pomarede, D. M.; Pommès, K.; Ponsot, P.; Pontecorvo, L.; Pope, B. G.; Popescu, R.; Popovic, D. S.; Poppleton, A.; Popule, J.; Portell Bueso, X.; Posch, C.; Pospelov, G. E.; Pospichal, P.; Pospisil, S.; Postranecky, M.; Potrap, I. N.; Potter, C. J.; Poulard, G.; Pousada, A.; Poveda, J.; Prabhu, R.; Pralavorio, P.; Prasad, S.; Prast, J.; Prat, S.; Prata, M.; Pravahan, R.; Preda, T.; Pretzl, K.; Pribyl, L.; Price, D.; Price, L. E.; Price, M. J.; Prichard, P. M.; Prieur, D.; Primavera, M.; Primor, D.; Prokofiev, K.; Prosso, E.; Proudfoot, J.; Przysiezniak, H.; Puigdengoles, C.; Purdham, J.; Purohit, M.; Puzo, P.; Pylaev, A. N.; Pylypchenko, Y.; Qi, M.; Qian, J.; Qian, W.; Qian, Z.; Qing, D.; Quadt, A.; Quarrie, D. R.; Quayle, W. B.; Rabbers, J. J.; Radeka, V.; Rafi, J. M.; Ragusa, F.; Rahimi, A. M.; Rahm, D.; Raine, C.; Raith, B.; Rajagopalan, S.; Rajek, S.; Rammer, H.; Ramstedt, M.; Rangod, S.; Ratoff, P. N.; Raufer, T.; Rauscher, F.; Rauter, E.; Raymond, M.; Reads, A. L.; Rebuzzi, D.; Redlinger, G. R.; Reeves, K.; Rehak, M.; Reichold, A.; Reinherz-Aronis, E.; Reisinger, I.; Reljic, D.; Rembser, C.; Ren, Z.; Renaudin-Crepe, S. R. C.; Renkel, P.; Rensch, B.; Rescia, S.; Rescigno, M.; Resconi, S.; Resende, B.; Rewiersma, P.; Rey, J.; Rey-Campagnolle, M.; Rezaie, E.; Reznicek, P.; Richards, R. A.; Richer, J.-P.; Richter, R. H.; Richter, R.; Richter-Was, E.; Ridel, M.; Riegler, W.; Rieke, S.; Rijpstra, M.; Rijssenbeek, M.; Rimoldi, A.; Rios, R. R.; Riu Dachs, I.; Rivline, M.; Rivoltella, G.; Rizatdinova, F.; Robertson, S. H.; Robichaud-Veronneau, A.; Robins, S.; Robinson, D.; Robson, A.; Rochford, J. H.; Roda, C.; Rodier, S.; Roe, S.; Røhne, O.; Rohrbach, F.; Roldán, J.; Rolli, S.; Romance, J. B.; Romaniouk, A.; Romanov, V. M.; Romeo, G.; Roos, L.; Ros, E.; Rosati, S.; Rosenbaum, F.; Rosenbaum, G. A.; Rosenberg, E. I.; Rosselet, L.; Rossi, L. P.; Rossi, L.; Rotaru, M.; Rothberg, J.; Rottländer, I.; Rousseau, D.; Rozanov, A.; Rozen, Y.; Ruber, R.; Ruckert, B.; Rudolph, G.; Rühr, F.; Ruggieri, F.; Ruggiero, G.; Ruiz, H.; Ruiz-Martinez, A.; Rulikowska-Zarebska, E.; Rumiantsev, V.; Rumyantsev, L.; Runge, K.; Runolfsson, O.; Rusakovich, N. A.; Rust, D. R.; Rutherfoord, J. P.; Ruwiedel, C.; Ryabov, Y. F.; Ryadovikov, V.; Ryan, P.; Rybkine, G.; da Costa, J. Sá; Saavedra, A. F.; Saboumazrag, S.; F-W Sadrozinski, H.; Sadykov, R.; Sakamoto, H.; Sala, P.; Salamon, A.; Saleem, M.; Salihagic, D.; Salt, J.; Saltó Bauza, O.; Salvachúa Ferrando, B. M.; Salvatore, D.; Salzburger, A.; Sampsonidis, D.; Samset, B. H.; Sánchez Sánchez, C. A.; Sanchis Lozano, M. 
A.; Sanchis Peris, E.; Sandaker, H.; Sander, H. G.; Sandhoff, M.; Sandvoss, S.; Sankey, D. P. C.; Sanny, B.; Sansone, S.; Sansoni, A.; Santamarina Rios, C.; Santander, J.; Santi, L.; Santoni, C.; Santonico, R.; Santos, J.; Sapinski, M.; Saraiva, J. G.; Sarri, F.; Sasaki, O.; Sasaki, T.; Sasao, N.; Satsounkevitch, I.; Sauvage, D.; Sauvage, G.; Savard, P.; Savine, A. Y.; Savinov, V.; Savoy-Navarro, A.; Savva, P.; Saxon, D. H.; Says, L. P.; Sbarra, C.; Sbrissa, E.; Sbrizzi, A.; Scannicchio, D. A.; Schaarschmidt, J.; Schacht, P.; Schäfer, U.; Schaffer, A. C.; Schaile, D.; Schaller, M.; Schamov, A. G.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Scherzer, M. I.; Schiavi, C.; Schick, H.; Schieck, J.; Schieferdecker, P.; Schioppa, M.; Schlager, G.; Schlenker, S.; Schlereth, J. L.; Schmid, P.; Schmidt, M. P.; Schmitt, C.; Schmitt, K.; Schmitz, M.; Schmücker, H.; Schoerner, T.; Scholte, R. C.; Schott, M.; Schouten, D.; Schram, M.; Schricker, A.; Schroff, D.; Schuh, S.; Schuijlenburg, H. W.; Schuler, G.; Schultes, J.; Schultz-Coulon, H.-C.; Schumacher, J.; Schumacher, M.; Schune, Ph; Schwartzman, A.; Schweiger, D.; Schwemling, Ph; Schwick, C.; Schwienhorst, R.; Schwierz, R.; Schwindling, J.; Scott, W. G.; Secker, H.; Sedykh, E.; Seguin-Moreau, N.; Segura, E.; Seidel, S. C.; Seiden, A.; Seixas, J. M.; Sekhniaidze, G.; Seliverstov, D. M.; Selldén, B.; Seman, M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Seuster, R.; Severini, H.; Sevior, M. E.; Sexton, K. A.; Sfyrla, A.; Shah, T. P.; Shan, L.; Shank, J. T.; Shapiro, M.; Shatalov, P. B.; Shaver, L.; Shaw, C.; Shears, T. G.; Sherwood, P.; Shibata, A.; Shield, P.; Shilov, S.; Shimojima, M.; Shin, T.; Shiyakova, M.; Shmeleva, A.; Shoa, M.; Shochet, M. J.; Shupe, M. A.; Sicho, P.; Sidoti, A.; Siebel, A.; Siebel, M.; Siegrist, J.; Sijacki, D.; Silva, J.; Silverstein, S. B.; Simak, V.; Simic, Lj; Simion, S.; Simmons, B.; Simonyan, M.; Sinervo, P.; Sipica, V.; Siragusa, G.; Sisakyan, A. N.; Sivoklokov, S.; Sjölin, J.; Skubic, P.; Skvorodnev, N.; Slattery, P.; Slavicek, T.; Sliwa, K.; Sloan, T. J.; Sloper, J.; Smakhtin, V.; Small, A.; Smirnov, S. Yu; Smirnov, Y.; Smirnova, L.; Smirnova, O.; Smith, N. A.; Smith, B. C.; Smith, D. S.; Smith, J.; Smith, K. M.; Smith, B.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snow, S. W.; Snow, J.; Snuverink, J.; Snyder, S.; Soares, M.; Soares, S.; Sobie, R.; Sodomka, J.; Söderberg, M.; Soffer, A.; Solans, C. A.; Solar, M.; Sole, D.; Solfaroli Camillocci, E.; Solodkov, A. A.; Solov'yanov, O. V.; Soloviev, I.; Soluk, R.; Sondericker, J.; Sopko, V.; Sopko, B.; Sorbi, M.; Soret Medel, J.; Sosebee, M.; Sosnovtsev, V. V.; Sospedra Suay, L.; Soukharev, A.; Soukup, J.; Spagnolo, S.; Spano, F.; Speckmayer, P.; Spegel, M.; Spencer, E.; Spighi, R.; Spigo, G.; Spila, F.; Spiriti, E.; Spiwoks, R.; Spogli, L.; Spousta, M.; Sprachmann, G.; Spurlock, B.; St. Denis, R. D.; Stahl, T.; Staley, R. J.; Stamen, R.; Stancu, S. N.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stapnes, S.; Starchenko, E. A.; Staroba, P.; Stastny, J.; Staude, A.; Stavina, P.; Stavrianakou, M.; Stavropoulos, G.; Stefanidis, E.; Steffens, J. L.; Stekl, I.; Stelzer, H. J.; Stenzel, H.; Stewart, G.; Stewart, T. D.; Stiller, W.; Stockmanns, T.; Stodulski, M.; Stonjek, S.; Stradling, A.; Straessner, A.; Strandberg, J.; Strandlie, A.; Strauss, M.; Strickland, V.; Striegel, D.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Strong, J. A.; Stroynowski, R.; Stugu, B.; Stumer, I.; Su, D.; Subramania, S.; Suchkov, S. I.; Sugaya, Y.; Sugimoto, T.; Suk, M.; Sulin, V. 
V.; Sultanov, S.; Sun, Z.; Sundal, B.; Sushkov, S.; Susinno, G.; Sutcliffe, P.; Sutton, M. R.; Sviridov, Yu M.; Sykora, I.; Szczygiel, R. R.; Szeless, B.; Szymocha, T.; Sánchez, J.; Ta, D.; Taboada Gameiro, S.; Tadel, M.; Tafirout, R.; Taga, A.; Takai, H.; Takashima, R.; Takeda, H.; Takeshita, T.; Talby, M.; Talyshev, A.; Tamsett, M. C.; Tanaka, J.; Tanaka, K.; Tanaka, R.; Tanaka, S.; Tanaka, S.; Tanaka, Y.; Tappern, G. P.; Tapprogge, S.; Tarem, S.; Tarrade, F.; Tarrant, J.; Tartarelli, G.; Tas, P.; Tasevsky, M.; Tayalati, Y.; Taylor, F. E.; Taylor, G.; Taylor, G. N.; Taylor, R. P.; Tcherniatine, V.; Tegenfeldt, F.; Teixeira-Dias, P.; Ten Kate, H.; Teng, P. K.; Ter-Antonyan, R.; Terada, S.; Terron, J.; Terwort, M.; Teuscher, R. J.; Tevlin, C. M.; Thadome, J.; Thion, J.; Thioye, M.; Thomas, A.; Thomas, J. P.; Thomas, T. L.; Thomas, E.; Thompson, R. J.; Thompson, A. S.; Thun, R. P.; Tic, T.; Tikhomirov, V. O.; Tikhonov, Y. A.; Timm, S.; Timmermans, C. J. W. P.; Tipton, P.; Tique Aires Viegas, F. J.; Tisserant, S.; Titov, M.; Tobias, J.; Tocut, V. M.; Toczek, B.; Todorova-Nova, S.; Tojo, J.; Tokár, S.; Tokushuku, K.; Tomasek, L.; Tomasek, M.; Tomasz, F.; Tomoto, M.; Tompkins, D.; Tompkins, L.; Toms, K.; Tonazzo, A.; Tong, G.; Tonoyan, A.; Topfel, C.; Topilin, N. D.; Torrence, E.; Torres Pais, J. G.; Toth, J.; Touchard, F.; Tovey, D. R.; Tovey, S. N.; Towndrow, E. F.; Trefzger, T.; Treichel, M.; Treis, J.; Tremblet, L.; Tribanek, W.; Tricoli, A.; Trigger, I. M.; Trilling, G.; Trincaz-Duvoid, S.; Tripiana, M. F.; Trischuk, W.; Trka, Z.; Trocmé, B.; Troncon, C.; C-L Tseng, J.; Tsiafis, I.; Tsiareshka, P. V.; Tsipolitis, G.; Tskhadadze, E. G.; Tsukerman, I. I.; Tsulaia, V.; Tsuno, S.; Turala, M.; Turk Cakir, I.; Turlay, E.; Tuts, P. M.; Twomey, M. S.; Tyndel, M.; Typaldos, D.; Tyrvainen, H.; Tzamarioudaki, E.; Tzanakos, G.; Ueda, I.; Uhrmacher, M.; Ukegawa, F.; Ullán Comes, M.; Unal, G.; Underwood, D. G.; Undrus, A.; Unel, G.; Unno, Y.; Urkovsky, E.; Usai, G.; Usov, Y.; Vacavant, L.; Vacek, V.; Vachon, B.; Vahsen, S.; Valderanis, C.; Valenta, J.; Valente, P.; Valero, A.; Valkar, S.; Valls Ferrer, J. A.; Van der Bij, H.; van der Graaf, H.; van der Kraaij, E.; Van Eijk, B.; van Eldik, N.; van Gemmeren, P.; van Kesteren, Z.; van Vulpen, I.; Van Berg, R.; Vandelli, W.; Vandoni, G.; Vaniachine, A.; Vannucci, F.; Varanda, M.; Varela Rodriguez, F.; Vari, R.; Varnes, E. W.; Varouchas, D.; Vartapetian, A.; Varvell, K. E.; Vassilakopoulos, V. I.; Vassilieva, L.; Vataga, E.; Vaz, L.; Vazeille, F.; Vedrine, P.; Vegni, G.; Veillet, J. J.; Vellidis, C.; Veloso, F.; Veness, R.; Veneziano, S.; Ventura, A.; Ventura, S.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vertogardov, L.; Vetterli, M. C.; Vichou, I.; Vickey, T.; Viehhauser, G. H. A.; Vigeolas, E.; Villa, M.; Villani, E. G.; Villate, J.; Villella, I.; Vilucchi, E.; Vincent, P.; Vincke, H.; Vincter, M. G.; Vinogradov, V. B.; Virchaux, M.; Viret, S.; Virzi, J.; Vitale, A.; Vivarelli, I.; Vives, R.; Vives Vaques, F.; Vlachos, S.; Vogt, H.; Vokac, P.; Vollmer, C. F.; Volpi, M.; Volpini, G.; von Boehn-Buchholz, R.; von der Schmitt, H.; von Toerne, E.; Vorobel, V.; Vorobiev, A. P.; Vorozhtsov, A. S.; Vorozhtsov, S. B.; Vos, M.; Voss, K. C.; Voss, R.; Vossebeld, J. H.; Vovenko, A. S.; Vranjes, N.; Vrba, V.; Vreeswijk, M.; Anh, T. Vu; Vuaridel, B.; Vudragovic, M.; Vuillemin, V.; Vuillermet, R.; Wänanen, A.; Wahlen, H.; Walbersloh, J.; Walker, R.; Walkowiak, W.; Wall, R.; Wallny, R. S.; Walsh, S.; Wang, C.; Wang, J. 
C.; Wappler, F.; Warburton, A.; Ward, C. P.; Warner, G. P.; Warren, M.; Warsinsky, M.; Wastie, R.; Watkins, P. M.; Watson, A. T.; Watts, G.; Waugh, A. T.; Waugh, B. M.; Weaverdyck, C.; Webel, M.; Weber, G.; Weber, J.; Weber, M.; Weber, P.; Weidberg, A. R.; Weilhammer, P. M.; Weingarten, J.; Weiser, C.; Wellenstein, H.; Wellisch, H. P.; Wells, P. S.; Wemans, A.; Wen, M.; Wenaus, T.; Wendler, S.; Wengler, T.; Wenig, S.; Wermes, N.; Werneke, P.; Werner, P.; Werthenbach, U.; Wheeler-Ellis, S. J.; Whitaker, S. P.; White, A.; White, M. J.; White, S.; Whittington, D.; Wicek, F.; Wicke, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiesmann, M.; Wiesmann, M.; Wijnen, T.; Wildauer, A.; Wilhelm, I.; Wilkens, H. G.; Williams, H. H.; Willis, W.; Willocq, S.; Wilmut, I.; Wilson, J. A.; Wilson, A.; Wingerter-Seez, I.; Winton, L.; Witzeling, W.; Wlodek, T.; Woehrling, E.; Wolter, M. W.; Wolters, H.; Wosiek, B.; Wotschack, J.; Woudstra, M. J.; Wright, C.; Wu, S. L.; Wu, X.; Wuestenfeld, J.; Wunstorf, R.; Xella-Hansen, S.; Xiang, A.; Xie, S.; Xie, Y.; Xu, G.; Xu, N.; Yamamoto, A.; Yamamoto, S.; Yamaoka, H.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, J. C.; Yang, S.; Yang, U. K.; Yang, Y.; Yang, Z.; Yao, W.-M.; Yao, Y.; Yarradoddi, K.; Yasu, Y.; Ye, J.; Yilmaz, M.; Yoosoofmiya, R.; Yorita, K.; Yoshida, H.; Yoshida, R.; Young, C.; Youssef, S. P.; Yu, D.; Yu, J.; Yu, M.; Yu, X.; Yuan, J.; Yurkewicz, A.; Zaets, V. G.; Zaidan, R.; Zaitsev, A. M.; Zajac, J.; Zajacova, Z.; Zalite, A. Yu; Zalite, Yo K.; Zanello, L.; Zarzhitsky, P.; Zaytsev, A.; Zdrazil, M.; Zeitnitz, C.; Zeller, M.; Zema, P. F.; Zendler, C.; Zenin, A. V.; Zenis, T.; Zenonos, Z.; Zenz, S.; Zerwas, D.; Zhang, H.; Zhang, J.; Zheng, W.; Zhang, X.; Zhao, L.; Zhao, T.; Zhao, X.; Zhao, Z.; Zhelezko, A.; Zhemchugov, A.; Zheng, S.; Zhichao, L.; Zhou, B.; Zhou, N.; Zhou, S.; Zhou, Y.; Zhu, C. G.; Zhu, H. Z.; Zhuang, X. A.; Zhuravlov, V.; Zilka, B.; Zimin, N. I.; Zimmermann, S.; Ziolkowski, M.; Zitoun, R.; Zivkovic, L.; Zmouchko, V. V.; Zobernig, G.; Zoccoli, A.; Zoeller, M. M.; Zolnierowski, Y.; Zsenei, A.; zur Nedden, M.; Zychacek, V.

    2008-08-01

    The ATLAS detector as installed in its experimental cavern at point 1 at CERN is described in this paper. A brief overview of the expected performance of the detector when the Large Hadron Collider begins operation is also presented.

  20. CERN Irradiation Facilities

    PubMed

    Pozzi, Fabio; Garcia Alia, Ruben; Brugger, Markus; Carbonez, Pierre; Danzeca, Salvatore; Gkotse, Blerina; Richard Jaekel, Martin; Ravotti, Federico; Silari, Marco; Tali, Maris

    2017-09-28

    CERN provides unique irradiation facilities for applications in dosimetry, metrology, the intercomparison of radiation protection devices, the benchmarking of Monte Carlo codes and radiation-damage studies of electronics. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Towards a 21st century telephone exchange at CERN

    NASA Astrophysics Data System (ADS)

    Valentín, F.; Hesnaux, A.; Sierra, R.; Chapron, F.

    2015-12-01

    The advent of mobile telephony and Voice over IP (VoIP) has significantly impacted the traditional telephone exchange industry—to such an extent that private branch exchanges are likely to disappear completely in the near future. For large organisations such as CERN, it is important to smooth this transition by implementing new multimedia platforms that protect past investments and provide the flexibility needed to securely interconnect emerging VoIP solutions and forthcoming developments such as Voice over LTE (VoLTE). We present the results of ongoing studies and tests at CERN of the latest technologies in this area.

  2. Ageing Studies on the First Resistive-MicroMeGaS Quadruplet at GIF++: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Alvarez Gonzalez, B.; Bianco, M.; Farina, E.; Iengo, P.; Kuger, F.; Lin, T.; Longo, L.; Sekhniaidze, G.; Sidiropoulou, O.; Schott, M.; Valderanis, C.; Wotschack, J.

    2018-02-01

    A resistive-MicroMeGaS quadruplet built at CERN has been installed at the new CERN Gamma Irradiation Facility (GIF++) with the aim of carrying out a long-term ageing study. Two smaller resistive bulk-MicroMeGaS detectors produced at the CERN PCB workshop have also been installed at GIF++ in order to allow a comparison of their ageing behavior with that of the MicroMeGaS quadruplet. We give an overview of the ongoing tests at GIF++ in terms of particle rate, integrated charge and spatial resolution of the MicroMeGaS detectors.

  3. Media Training

    ScienceCinema

    None

    2017-12-09

    With the LHC starting up soon, the world's media are again turning their attention to CERN. We're all likely to be called upon to explain what is happening at CERN to media, friends and neighbours. The seminar will be given by BBC television news journalists Liz Pike and Nadia Marchant, and will deal with the kind of questions we're likely to be confronted with through the restart period. The training is open for everybody. Make sure you arrive early enough to get a seat - there are only 200 seats in the Globe. The session will also be webcast: http://webcast.cern.ch/

  4. The significance of Cern

    ScienceCinema

    None

    2017-12-09

    Prof. V. Weisskopf, Director-General of CERN from 1961 to 1965, was born in Vienna, studied at Göttingen and had a particularly rich academic career. He worked in Berlin and Copenhagen before moving to the United States to take part in the Manhattan Project, and was a professor at MIT until 1960. Back in Europe, he served as Director-General of CERN and gave the laboratory the impetus for which it is known.

  5. HIGH ENERGY PHYSICS: CERN Link Breathes Life Into Russian Physics.

    PubMed

    Stone, R

    2000-10-13

    Without fanfare, 600 Russian scientists here at CERN, the European particle physics laboratory, are playing key roles in building the Large Hadron Collider (LHC), a machine that will explore fundamental questions such as why particles have mass, as well as search for exotic new particles whose existence would confirm supersymmetry, a popular theory that aims to unify the four forces of nature. In fact, even though Russia is not one of CERN's 20 member states, most top high-energy physicists in Russia are working on the LHC. Some say their work could prove the salvation of high-energy physics back home.

  6. Experience with procuring, deploying and maintaining hardware at remote co-location centre

    NASA Astrophysics Data System (ADS)

    Bärring, O.; Bonfillou, E.; Clement, B.; Coelho Dos Santos, M.; Dore, V.; Gentit, A.; Grossir, A.; Salter, W.; Valsan, L.; Xafi, A.

    2014-05-01

    In May 2012 CERN signed a contract with the Wigner Data Centre in Budapest for an extension of CERN's central computing facility beyond the current limits set by the electrical power and cooling available for computing. The centre is operated as a remote co-location site providing rack space, electrical power and cooling for server, storage and networking equipment acquired by CERN. The contract includes a 'remote-hands' service for physical handling of hardware (rack mounting, cabling, pushing power buttons, ...) and maintenance repairs (swapping disks, memory modules, ...). However, only CERN personnel have network and console access to the equipment for system administration. This report gives an insight into the adaptations of hardware architecture and of procurement and delivery procedures undertaken to enable remote physical handling of the hardware. We will also describe tools and procedures developed for automating the registration, burn-in testing, acceptance and maintenance of the equipment, as well as an independent but important change to IT asset management (ITAM) developed in parallel as part of the CERN IT Agile Infrastructure project. Finally, we will report on experience from the first large delivery of 400 servers and 80 SAS JBOD expansion units (24 drive bays) to Wigner in March 2013. A minimal, purely illustrative acceptance check is sketched below.
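    As an illustration of the kind of acceptance automation mentioned above (not CERN's actual tooling; device naming and pass criteria are assumptions), a minimal burn-in health check could simply query the SMART status of every disk with the standard smartctl utility:

        import glob
        import subprocess

        def disks_healthy() -> bool:
            """Report SMART health for every /dev/sd? device (illustrative only)."""
            ok = True
            for dev in sorted(glob.glob('/dev/sd?')):
                # 'smartctl -H' prints an overall health verdict for the drive
                result = subprocess.run(['smartctl', '-H', dev],
                                        capture_output=True, text=True)
                healthy = ('PASSED' in result.stdout) or ('OK' in result.stdout)
                print(f"{dev}: {'healthy' if healthy else 'FAILED'}")
                ok = ok and healthy
            return ok

        if __name__ == '__main__':
            raise SystemExit(0 if disks_healthy() else 1)

    A real burn-in pipeline would also run memory, CPU and network stress tests and feed the results back into the asset-management system.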

  7. Building an organic block storage service at CERN with Ceph

    NASA Astrophysics Data System (ADS)

    van der Ster, Daniel; Wiebalck, Arne

    2014-06-01

    Emerging storage requirements, such as the need for block storage for both OpenStack VMs and file services like AFS and NFS, have motivated the development of a generic backend storage service for CERN IT. The goals for such a service include (a) vendor neutrality, (b) horizontal scalability with commodity hardware, (c) fault tolerance at the disk, host, and network levels, and (d) support for geo-replication. Ceph is an attractive option due to its native block device layer RBD which is built upon its scalable, reliable, and performant object storage system, RADOS. It can be considered an "organic" storage solution because of its ability to balance and heal itself while living on an ever-changing set of heterogeneous disk servers. This work will present the outcome of a petabyte-scale test deployment of Ceph by CERN IT. We will first present the architecture and configuration of our cluster, including a summary of best practices learned from the community and discovered internally. Next the results of various functionality and performance tests will be shown: the cluster has been used as a backend block storage system for AFS and NFS servers as well as a large OpenStack cluster at CERN. Finally, we will discuss the next steps and future possibilities for Ceph at CERN.
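    As a minimal sketch of what such block storage looks like from a client's point of view, the upstream python-rados and python-rbd bindings can create and write an RBD image in a few lines (the configuration path, pool name, image name and size below are placeholders, not the deployment described in the paper):

        import rados
        import rbd

        # Connect to the cluster using a local client configuration file.
        cluster = rados.Rados(conffile='/etc/ceph/ceph.conf')
        cluster.connect()
        ioctx = cluster.open_ioctx('rbd')      # 'rbd' is the conventional default pool
        try:
            # Create a 1 GiB RBD image; RADOS stripes and replicates it across OSDs.
            rbd.RBD().create(ioctx, 'scratch-volume', 1 * 1024**3)
            with rbd.Image(ioctx, 'scratch-volume') as image:
                image.write(b'hello ceph', 0)  # write ten bytes at offset 0
                print('image size:', image.size())
        finally:
            ioctx.close()
            cluster.shutdown()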

  8. Self-service for software development projects and HPC activities

    NASA Astrophysics Data System (ADS)

    Husejko, M.; Høimyr, N.; Gonzalez, A.; Koloventzos, G.; Asbury, D.; Trzcinska, A.; Agtzidis, I.; Botrel, G.; Otto, J.

    2014-05-01

    This contribution describes how CERN has implemented several essential tools for agile software development processes, ranging from version control (Git) to issue tracking (Jira) and documentation (Wikis). Running such services in a large organisation like CERN requires many administrative actions both by users and service providers, such as creating software projects, managing access rights, users and groups, and performing tool-specific customisation. Dealing with these requests manually would be a time-consuming task. Another area of our CERN computing services that has required dedicated manual support has been clusters for specific user communities with special needs. Our aim is to move all our services to a layered approach, with server infrastructure running on the internal cloud computing infrastructure at CERN. This contribution illustrates how we plan to optimise the management of our services by means of an end-user facing platform acting as a portal into all the related services for software projects, inspired by popular portals for open-source development such as Sourceforge, GitHub and others. Furthermore, the contribution will discuss recent activities with tests and evaluations of High Performance Computing (HPC) applications on different hardware and software stacks, and plans to offer a dynamically scalable HPC service at CERN, based on affordable hardware.
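    A sketch of the sort of call such a self-service portal could make behind the scenes, here creating an issue-tracking project through the public Jira Server REST API with the requests library (the URL, credentials and payload fields are illustrative assumptions, not CERN's actual portal implementation):

        import requests

        JIRA_URL = "https://its.example.org"     # placeholder instance URL
        AUTH = ("svc-portal", "app-password")    # hypothetical service account

        def create_tracking_project(key: str, name: str, lead: str) -> dict:
            """Create a Jira project via POST /rest/api/2/project.

            Field names follow the upstream Jira Server REST documentation
            and may need adjusting for a specific deployment.
            """
            payload = {
                "key": key,                  # short project key, e.g. "DEMO"
                "name": name,
                "lead": lead,                # username of the project lead
                "projectTypeKey": "software",
            }
            resp = requests.post(f"{JIRA_URL}/rest/api/2/project",
                                 json=payload, auth=AUTH, timeout=30)
            resp.raise_for_status()
            return resp.json()

        if __name__ == "__main__":
            print(create_tracking_project("DEMO", "Demo project", "jdoe"))

    The same pattern (one REST call per underlying tool, driven by a single portal request) would apply to repository creation, group membership and access-rights management.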

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29 at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McAllister, Liam

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29 at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Ashoke

    Part 7. The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29 at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  17. PREFACE: International Conference on Computing in High Energy and Nuclear Physics (CHEP 2012)

    NASA Astrophysics Data System (ADS)

    Ernst, Michael; Düllmann, Dirk; Rind, Ofer; Wong, Tony

    2012-12-01

    The International Conference on Computing in High Energy and Nuclear Physics (CHEP) was held at New York University on 21- 25 May 2012. CHEP is a major series of international conferences for physicists and computing professionals from the High Energy and Nuclear Physics community and related scientific and technical fields. The CHEP conference provides a forum to exchange information on computing progress and needs for the community, and to review recent, ongoing and future activities. CHEP conferences are held at roughly 18-month intervals, alternating between Europe, Asia, the Americas and other parts of the world. Recent CHEP conferences have been held in Taipei, Taiwan (2010); Prague, Czech Republic (2009); Victoria, Canada (2007); Mumbai, India (2006); Interlaken, Switzerland (2004); San Diego, United States (2003); Beijing, China (2001); Padova, Italy (2000). CHEP 2012 was organized by Brookhaven National Laboratory (BNL) and co-sponsored by New York University. The organizational structure for CHEP consists of an International Advisory Committee (IAC) which sets the overall themes of the conference, a Program Organizing Committee (POC) that oversees the program content, and a Local Organizing Committee (LOC) that is responsible for local arrangements (lodging, transportation and social events) and conference logistics (registration, program scheduling, conference site selection and conference proceedings). There were over 500 attendees with a program that included plenary sessions of invited speakers, a number of parallel sessions comprising around 125 oral and 425 poster presentations and industrial exhibitions. We thank all the presenters for the excellent scientific content of their contributions to the conference. Conference tracks covered topics on Online Computing, Event Processing, Distributed Processing and Analysis on Grids and Clouds, Computer Facilities, Production Grids and Networking, Software Engineering, Data Stores and Databases and Collaborative Tools. We would like to thank Brookhaven Science Associates, New York University, Blue Nest Events, the International Advisory Committee, the Program Committee and the Local Organizing Committee members for all their support and assistance. We also would like to acknowledge the support provided by the following sponsors: ACEOLE, Data Direct Networks, Dell, the European Middleware Initiative and Nexsan. Special thanks to the Program Committee members for their careful choice of conference contributions and enormous effort in reviewing and editing the conference proceedings. The next CHEP conference will be held in Amsterdam, the Netherlands on 14-18 October 2013. 
Conference Chair Michael Ernst (BNL) Program Committee Daniele Bonacorsi, University of Bologna, Italy Simone Campana, CERN, Switzerland Philippe Canal, Fermilab, United States Sylvain Chapeland, CERN, Switzerland Dirk Düllmann, CERN, Switzerland Johannes Elmsheuser, Ludwig Maximilian University of Munich, Germany Maria Girone, CERN, Switzerland Steven Goldfarb, University of Michigan, United States Oliver Gutsche, Fermilab, United States Benedikt Hegner, CERN, Switzerland Andreas Heiss, Karlsruhe Institute of Technology, Germany Peter Hristov, CERN, Switzerland Tony Johnson, SLAC, United States David Lange, LLNL, United States Adam Lyon, Fermilab, United States Remigius Mommsen, Fermilab, United States Axel Naumann, CERN, Switzerland Niko Neufeld, CERN, Switzerland Rolf Seuster, TRIUMF, Canada Local Organizing Committee Maureen Anderson, John De Stefano, Mariette Faulkner, Ognian Novakov, Ofer Rind, Tony Wong (BNL) Kyle Cranmer (NYU) International Advisory Committee Mohammad Al-Turany, GSI, Germany Lothar Bauerdick, Fermilab, United States Ian Bird, CERN, Switzerland Dominique Boutigny, IN2P3, France Federico Carminati, CERN, Switzerland Marco Cattaneo, CERN, Switzerland Gang Chen, Institute of High Energy Physics, China Peter Clarke, University of Edinburgh, United Kingdom Sridhara Dasu, University of Wisconsin-Madison, United States Günter Duckeck, Ludwig Maximilian University of Munich, Germany Richard Dubois, SLAC, United States Michael Ernst, BNL, United States Ian Fisk, Fermilab, United States Gonzalo Merino, PIC, Spain John Gordon, STFC-RAL, United Kingdom Volker Gülzow, DESY, Germany Frederic Hemmer, CERN, Switzerland Viatcheslav Ilyin, Moscow State University, Russia Nobuhiko Katayama, KEK, Japan Alexei Klimentov, BNL, United States Simon C. Lin, Academia Sinica, Taiwan Milos Lokajícek, FZU Prague, Czech Republic David Malon, ANL, United States Pere Mato Vila, CERN, Switzerland Mauro Morandin, INFN CNAF, Italy Harvey Newman, Caltech, United States Farid Ould-Saada, University of Oslo, Norway Ruth Pordes, Fermilab, United States Hiroshi Sakamoto, University of Tokyo, Japan Alberto Santoro, UERJ, Brazil Jim Shank, Boston University, United States Dongchul Son, Kyungpook National University, South Korea Reda Tafirout, TRIUMF, Canada Stephen Wolbers, Fermilab, United States Frank Wuerthwein, UCSD, United States

  18. The feasibility of well-logging measurements of arsenic levels using neutron-activation analysis

    USGS Publications Warehouse

    Oden, C.P.; Schweitzer, J.S.; McDowell, G.M.

    2006-01-01

    Arsenic is an extremely toxic metal, which poses a significant problem in many mining environments. Arsenic contamination is also a major problem in ground and surface waters. A feasibility study was conducted to determine if neutron-activation analysis is a practical method of measuring in situ arsenic levels. The response of hypothetical well-logging tools to arsenic was simulated using a readily available Monte Carlo simulation code (MCNP). Simulations were made for probes with both hyperpure germanium (HPGe) and bismuth germanate (BGO) detectors using accelerator and isotopic neutron sources. Both sources produce similar results; however, the BGO detector is much more susceptible to spectral interference than the HPGe detector. Spectral interference from copper can preclude low-level arsenic measurements when using the BGO detector. Results show that a borehole probe could be built that would measure arsenic concentrations of 100 ppm by weight to an uncertainty of 50 ppm in about 15 min. © 2006 Elsevier Ltd. All rights reserved.
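
    As a hedged illustration of the counting-statistics reasoning behind the quoted figure (100 ppm measured to an uncertainty of 50 ppm in about 15 min), the sketch below assumes simple Poisson statistics, so that the uncertainty scales as one over the square root of the counting time. The function names and the reference values are only anchored to the abstract, not taken from the paper's simulations.

        # Minimal sketch, assuming Poisson counting statistics: the 1-sigma
        # uncertainty of the arsenic measurement scales as 1/sqrt(counting time),
        # anchored to 50 ppm at 15 minutes as quoted in the abstract.
        import math

        def uncertainty_ppm(t_min, sigma_ref_ppm=50.0, t_ref_min=15.0):
            """Projected 1-sigma uncertainty (ppm) after t_min minutes of counting."""
            return sigma_ref_ppm * math.sqrt(t_ref_min / t_min)

        def time_for_uncertainty(target_ppm, sigma_ref_ppm=50.0, t_ref_min=15.0):
            """Counting time (minutes) needed to reach a target uncertainty."""
            return t_ref_min * (sigma_ref_ppm / target_ppm) ** 2

        print(uncertainty_ppm(60.0))        # about 25 ppm after one hour of counting
        print(time_for_uncertainty(25.0))   # about 60 minutes to reach 25 ppm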

  19. Influence of smoking and packaging methods on lipid stability and microbial quality of Capelin (Mallotus villosus) and Sardine (Sardinella gibossa)

    PubMed Central

    Cyprian, Odoli O; Van Nguyen, Minh; Sveinsdottir, Kolbrun; Jonsson, Asbjorn; Tomasson, Tumi; Thorkelsson, Gudjon; Arason, Sigurjon

    2015-01-01

    Lipid and microbial quality of smoked capelin (two groups differing in lipid content) and sardine was studied, with the aim of introducing capelin into the smoked sardine markets. Lipid hydrolysis (phospholipid and free fatty acids) and oxidation index (hydroperoxides and thiobarbituric acid-reactive substances), fatty acid composition, and total viable count were measured in raw and packaged smoked fish during chilled storage (days 2, 10, 16, 22, 28). Lipid hydrolysis was more pronounced in low-lipid capelin, whereas accelerated lipid oxidation occurred in high-lipid capelin. Muscle lipid was less stable in sardine than in capelin. Essential polyunsaturated fatty acids (eicosapentaenoic acid and docosahexaenoic acid) constituted 12% of fatty acids in capelin and 19% in sardine. Vacuum packaging as well as hot smoking retarded bacterial growth, recording counts of ≤ log 5 CFU/g compared to ≥ log 7 CFU/g in cold-smoked, air-packaged fish. Smoked low-lipid capelin was considered an alternative for introduction into the smoked sardine markets. PMID:26405526

  20. Functional response of ungulate browsers in disturbed eastern hemlock forests

    USGS Publications Warehouse

    DeStefano, Stephen

    2015-01-01

    Ungulate browsing in predator depleted North American landscapes is believed to be causing widespread tree recruitment failures. However, canopy disturbances and variations in ungulate densities are sources of heterogeneity that can buffer ecosystems against herbivory. Relatively little is known about the functional response (the rate of consumption in relation to food availability) of ungulates in eastern temperate forests, and therefore how “top down” control of vegetation may vary with disturbance type, intensity, and timing. This knowledge gap is relevant in the Northeastern United States today with the recent arrival of hemlock woolly adelgid (HWA; Adelges tsugae) that is killing eastern hemlocks (Tsuga canadensis) and initiating salvage logging as a management response. We used an existing experiment in central New England begun in 2005, which simulated severe adelgid infestation and intensive logging of intact hemlock forest, to examine the functional response of combined moose (Alces americanus) and white-tailed deer (Odocoileus virginianus) foraging in two different time periods after disturbance (3 and 7 years). We predicted that browsing impacts would be linear or accelerating (Type I or Type III response) in year 3 when regenerating stem densities were relatively low and decelerating (Type II response) in year 7 when stem densities increased. We sampled and compared woody regeneration and browsing among logged and simulated insect attack treatments and two intact controls (hemlock and hardwood forest) in 2008 and again in 2012. We then used AIC model selection to compare the three major functional response models (Types I, II, and III) of ungulate browsing in relation to forage density. We also examined relative use of the different stand types by comparing pellet group density and remote camera images. In 2008, total and proportional browse consumption increased with stem density, and peaked in logged plots, revealing a Type I response. In 2012, stem densities were greatest in girdled plots, but proportional browse consumption was highest at intermediate stem densities in logged plots, exhibiting a Type III (rather than a Type II) functional response. Our results revealed shifting top–down control by herbivores at different stages of stand recovery after disturbance and in different understory conditions resulting from logging vs. simulated adelgid attack. If forest managers wish to promote tree regeneration in hemlock stands that is more resistant to ungulate browsers, leaving HWA-infested stands unmanaged may be a better option than preemptively logging them.
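
    The model comparison described above (Type I, II and III functional responses ranked by AIC) can be sketched as follows. This is an illustrative outline only, not the authors' code: the Holling-type model forms are the standard ones, and the stem-density and consumption values are hypothetical placeholders.

        # Illustrative sketch: fit Holling Type I, II and III functional-response
        # curves to browse consumption vs. stem density and rank them by AIC.
        import numpy as np
        from scipy.optimize import curve_fit

        def type1(N, a):                    # linear (proportional) response
            return a * N

        def type2(N, a, h):                 # Holling disc equation, decelerating
            return a * N / (1.0 + a * h * N)

        def type3(N, a, h):                 # sigmoidal, accelerating then saturating
            return a * N**2 / (1.0 + a * h * N**2)

        def aic(y, y_hat, k):               # least-squares AIC
            n = len(y)
            rss = np.sum((y - y_hat) ** 2)
            return n * np.log(rss / n) + 2 * k

        stem_density = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])   # hypothetical stems per m^2
        consumed = np.array([0.1, 0.3, 0.9, 2.0, 3.1, 3.6])        # hypothetical browsed stems

        for name, model, p0 in [("Type I", type1, [0.2]),
                                ("Type II", type2, [0.5, 0.2]),
                                ("Type III", type3, [0.5, 0.2])]:
            params, _ = curve_fit(model, stem_density, consumed, p0=p0, maxfev=10000)
            print(name, "AIC =", round(aic(consumed, model(stem_density, *params), len(params)), 2))

    The model with the lowest AIC is preferred, which is how a Type I fit in year 3 and a Type III fit in year 7 would be distinguished.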

  1. CERN goes iconic

    NASA Astrophysics Data System (ADS)

    2017-06-01

    There are more than 1800 emoji that can be sent and received in text messages and e-mails. Now, the CERN particle-physics lab near Geneva has got in on the act and released its own collection of 35 images that can be used by anyone with an Apple device.

  2. Neutrino Factory Plans at CERN

    NASA Astrophysics Data System (ADS)

    Riche, J. A.

    2002-10-01

    The considerable interest raised by the discovery of neutrino oscillations and recent progress in studies of muon colliders has triggered interest in considering a neutrino factory at CERN. This paper explains the reference scenario, indicates the other possible choices and mentions the R&D that are foreseen.

  3. Wi-Fi Service enhancement at CERN

    NASA Astrophysics Data System (ADS)

    Ducret, V.; Sosnowski, A.; Gonzalez Caballero, B.; Barrand, Q.

    2017-10-01

    Since the early 2000s, the number of mobile devices connected to CERN’s internal network has increased from just a handful to well over 10,000. Wireless access is no longer simply “nice to have” or just for conference and meeting rooms; support for mobility is expected by most, if not all, of the CERN community. In this context, a full renewal of the CERN Wi-Fi network has been launched to deliver a state-of-the-art campus-wide Wi-Fi infrastructure. We aim to deliver, in more than 200 office buildings with a surface area of over 400,000 m2 and including many high-priority and high-occupation zones, an end-user experience comparable, for most applications, to a wired connection, with seamless mobility support. We describe here the studies and tests performed at CERN to ensure that the solution we are deploying can meet these goals as well as delivering a single, simple, flexible and open management platform.

  4. Thermostructural characterization and structural elastic property optimization of novel high luminosity LHC collimation materials at CERN

    NASA Astrophysics Data System (ADS)

    Borg, M.; Bertarelli, A.; Carra, F.; Gradassi, P.; Guardia-Valenzuela, J.; Guinchard, M.; Izquierdo, G. Arnau; Mollicone, P.; Sacristan-de-Frutos, O.; Sammut, N.

    2018-03-01

    The CERN Large Hadron Collider is currently being upgraded to operate at a stored beam energy of 680 MJ through the High Luminosity upgrade. The LHC performance is dependent on the functionality of the beam collimation systems, essential for safe beam cleaning and machine protection. A dedicated beam experiment at the CERN High Radiation to Materials facility was created under the HRMT-23 experimental campaign. This experiment investigates the behavior of three collimation jaws with novel composite absorbers made of copper-diamond, molybdenum carbide-graphite, and carbon fiber-carbon, in accidental scenarios involving direct beam impact on the material. Material characterization is imperative for the design, execution, and analysis of such experiments. This paper presents new data and analysis of the thermostructural characteristics of some of the absorber materials commissioned within CERN facilities. In turn, the characterized elastic properties are optimized through the development and implementation of a mixed numerical-experimental optimization technique.

  5. Carbon Stocks and Fluxes in Tropical Lowland Dipterocarp Rainforests in Sabah, Malaysian Borneo

    PubMed Central

    Saner, Philippe; Loh, Yen Yee; Ong, Robert C.; Hector, Andy

    2012-01-01

    Deforestation in the tropics is an important source of carbon C release to the atmosphere. To provide a sound scientific base for efforts taken to reduce emissions from deforestation and degradation (REDD+) good estimates of C stocks and fluxes are important. We present components of the C balance for selectively logged lowland tropical dipterocarp rainforest in the Malua Forest Reserve of Sabah, Malaysian Borneo. Total organic C in this area was 167.9 Mg C ha−1±3.8 (SD), including: Total aboveground (TAGC: 55%; 91.9 Mg C ha−1±2.9 SEM) and belowground carbon in trees (TBGC: 10%; 16.5 Mg C ha−1±0.5 SEM), deadwood (8%; 13.2 Mg C ha−1±3.5 SEM) and soil organic matter (SOM: 24%; 39.6 Mg C ha−1±0.9 SEM), understory vegetation (3%; 5.1 Mg C ha−1±1.7 SEM), standing litter (<1%; 0.7 Mg C ha−1±0.1 SEM) and fine root biomass (<1%; 0.9 Mg C ha−1±0.1 SEM). Fluxes included litterfall, a proxy for leaf net primary productivity (4.9 Mg C ha−1 yr−1±0.1 SEM), and soil respiration, a measure for heterotrophic ecosystem respiration (28.6 Mg C ha−1 yr−1±1.2 SEM). The missing estimates necessary to close the C balance are wood net primary productivity and autotrophic respiration. Twenty-two years after logging TAGC stocks were 28% lower compared to unlogged forest (128 Mg C ha−1±13.4 SEM); a combined weighted average mean reduction due to selective logging of −57.8 Mg C ha−1 (with 95% CI −75.5 to −40.2). Based on the findings we conclude that selective logging decreased the dipterocarp stock by 55–66%. Silvicultural treatments may have the potential to accelerate the recovery of dipterocarp C stocks to pre-logging levels. PMID:22235319
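
    As a quick arithmetic check on the carbon budget quoted above, the component pools can be summed and compared with the reported total of 167.9 Mg C per hectare; the values below are copied from the abstract.

        # Sum of the component carbon pools reported in the abstract (Mg C per ha).
        pools = {
            "total aboveground (TAGC)": 91.9,
            "total belowground (TBGC)": 16.5,
            "deadwood": 13.2,
            "soil organic matter (SOM)": 39.6,
            "understory vegetation": 5.1,
            "standing litter": 0.7,
            "fine root biomass": 0.9,
        }

        total = sum(pools.values())
        print(f"total organic C = {total:.1f} Mg C/ha")   # 167.9, matching the abstract
        for name, value in pools.items():
            print(f"{name}: {100.0 * value / total:.0f}% of total")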

  6. Highlights from the CERN/ESO/NordForsk ''Gender in Physics Day''

    NASA Astrophysics Data System (ADS)

    Primas, F.; Guinot, G.; Strandberg, L.

    2017-03-01

    In their role as observers on the EU Gender Equality Network in the European Research Area (GENERA) project, funded under the Horizon 2020 framework, CERN, ESO and NordForsk joined forces and organised a Gender in Physics Day at the CERN Globe of Science and Innovation. The one-day conference aimed to examine innovative activities promoting gender equality, and to discuss gender-oriented policies and best practice in the European Research Area (with special emphasis on intergovernmental organisations), as well as the importance of building solid networks. The event was very well attended and was declared a success. The main highlights of the meeting are reported.

  7. Dissemination of data measured at the CERN n_TOF facility

    NASA Astrophysics Data System (ADS)

    Dupont, E.; Otuka, N.; Cabellos, O.; Aberle, O.; Aerts, G.; Altstadt, S.; Alvarez, H.; Alvarez-Velarde, F.; Andriamonje, S.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Badurek, G.; Balibrea, J.; Barbagallo, M.; Barros, S.; Baumann, P.; Bécares, V.; Bečvář, F.; Beinrucker, C.; Belloni, F.; Berthier, B.; Berthoumieux, E.; Billowes, J.; Boccone, V.; Bosnar, D.; Brown, A.; Brugger, M.; Caamaño, M.; Calviani, M.; Calviño, F.; Cano-Ott, D.; Capote, R.; Cardella, R.; Carrapiço, C.; Casanovas, A.; Castelluccio, D. M.; Cennini, P.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Chin, M.; Colonna, N.; Cortés, G.; Cortés-Giraldo, M. A.; Cosentino, L.; Couture, A.; Cox, J.; Damone, L. A.; David, S.; Deo, K.; Diakaki, M.; Dillmann, I.; Domingo-Pardo, C.; Dressler, R.; Dridi, W.; Duran, I.; Eleftheriadis, C.; Embid-Segura, M.; Fernández-Domínguez, B.; Ferrant, L.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Fraval, K.; Frost, R. J. W.; Fujii, K.; Furman, W.; Ganesan, S.; Garcia, A. R.; Gawlik, A.; Gheorghe, I.; Gilardoni, S.; Giubrone, G.; Glodariu, T.; Göbel, K.; Gomez-Hornillos, M. B.; Goncalves, I. F.; Gonzalez-Romero, E.; Goverdovski, A.; Gramegna, F.; Griesmayer, E.; Guerrero, C.; Gunsing, F.; Gurusamy, P.; Haight, R.; Harada, H.; Heftrich, T.; Heil, M.; Heinitz, S.; Hernández-Prieto, A.; Heyse, J.; Igashira, M.; Isaev, S.; Jenkins, D. G.; Jericha, E.; Kadi, Y.; Kaeppeler, F.; Kalamara, A.; Karadimos, D.; Karamanis, D.; Katabuchi, T.; Kavrigin, P.; Kerveno, M.; Ketlerov, V.; Khryachkov, V.; Kimura, A.; Kivel, N.; Kokkoris, M.; Konovalov, V.; Krtička, M.; Kroll, J.; Kurtulgil, D.; Lampoudis, C.; Langer, C.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Naour, C. Le; Lerendegui-Marco, J.; Leong, L. S.; Licata, M.; Meo, S. Lo; Lonsdale, S. J.; Losito, R.; Lozano, M.; Macina, D.; Manousos, A.; Marganiec, J.; Martinez, T.; Marrone, S.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Mingrone, F.; Mirea, M.; Mondelaers, W.; Montesano, S.; Moreau, C.; Mosconi, M.; Musumarra, A.; Negret, A.; Nolte, R.; O'Brien, S.; Oprea, A.; Palomo-Pinto, F. R.; Pancin, J.; Paradela, C.; Patronis, N.; Pavlik, A.; Pavlopoulos, P.; Perkowski, J.; Perrot, L.; Pigni, M. T.; Plag, R.; Plompen, A.; Plukis, L.; Poch, A.; Porras, I.; Praena, J.; Pretel, C.; Quesada, J. M.; Radeck, D.; Rajeev, K.; Rauscher, T.; Reifarth, R.; Riego, A.; Robles, M.; Roman, F.; Rout, P. C.; Rudolf, G.; Rubbia, C.; Rullhusen, P.; Ryan, J. A.; Sabaté-Gilarte, M.; Salgado, J.; Santos, C.; Sarchiapone, L.; Sarmento, R.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Sedyshev, P.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Stephan, C.; Suryanarayana, S. V.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tarrío, D.; Tassan-Got, L.; Tavora, L.; Terlizzi, R.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Versaci, R.; Vermeulen, M. J.; Villamarin, D.; Vicente, M. C.; Vlachoudis, V.; Vlastou, R.; Voss, F.; Wallner, A.; Walter, S.; Ware, T.; Warren, S.; Weigand, M.; Weiß, C.; Wolf, C.; Wiesher, M.; Wisshak, K.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    The n_TOF neutron time-of-flight facility at CERN is used for high quality nuclear data measurements from thermal energy up to hundreds of MeV. In line with the CERN open data policy, the n_TOF Collaboration takes actions to preserve its unique data, facilitate access to them in standardised format, and allow their re-use by a wide community in the fields of nuclear physics, nuclear astrophysics and various nuclear technologies. The present contribution briefly describes the n_TOF outcomes, as well as the status of dissemination and preservation of n_TOF final data in the international EXFOR library.

  8. Characterization of the mechanomyographic signal of three different muscles and at different levels of isometric contractions.

    PubMed

    Jotta, Bruno; Cavalcanti Garcia, Marco Antonio; Visintainer Pino, Alexandre; De Souza, Marcio Nogueira

    2015-01-01

    Lateral (X) and longitudinal (Y) mechanical oscillations of muscle fibers that take place during muscular contraction seem to contain information additional to the myoelectric activity, which can contribute to the interpretation of some muscle force gradation mechanisms. However, no previous study was found that had investigated the relationship between muscle force and features of the mechanomyographic (MMG) signal obtained by means of a biaxial accelerometer in three different muscles. Therefore, the aim of this study was to evaluate the relationship between the force output at different load levels (20% to 100%) of the maximum voluntary isometric contraction (%MVIC) and the two signals supplied by a biaxial accelerometer and, in addition, the so-called resultant (R) acceleration signal derived from the two signals mentioned previously. Twenty-seven male volunteers participated in this study. The force output related to the right biceps brachii, soleus and gastrocnemius medialis muscles was studied by means of linear regression models fitted to log-transformed root mean square (RMS) values of the MMG signals in the X, Y, and R axes versus each %MVIC. The phase angle of the R acceleration (PhaseR) and anthropometric data were also considered. The angular coefficient a and the antilog of the y-intercept b from the log-transformed MMG data versus force output were able to partially distinguish motor unit strategies during isometric contractions in the three muscles studied. The findings suggest that the biaxial accelerometer is an interesting approach for the assessment of muscle contraction properties.
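
    The fitting procedure described above can be sketched as a straight-line fit to log-transformed data; one plausible reading is a power law, RMS ≈ b · (%MVIC)^a, fitted in log-log space, and whether the original study used a log-log or a semi-log transform, the mechanics of recovering the slope a and the antilog of the intercept b are the same. The RMS values below are hypothetical.

        # Minimal sketch: a straight-line fit to log-transformed data recovers the
        # angular coefficient a (slope) and the antilog of the y-intercept b.
        import numpy as np

        mvic_percent = np.array([20.0, 40.0, 60.0, 80.0, 100.0])  # contraction levels (%MVIC)
        rms_mmg = np.array([0.05, 0.09, 0.14, 0.17, 0.22])        # hypothetical MMG RMS values (a.u.)

        slope, intercept = np.polyfit(np.log10(mvic_percent), np.log10(rms_mmg), 1)
        a = slope               # angular coefficient
        b = 10.0 ** intercept   # antilog of the y-intercept

        print(f"a = {a:.2f}, b = {b:.4f}")
        print(f"predicted RMS at 50% MVIC: {b * 50.0 ** a:.3f}")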

  9. The keys to CERN conference rooms - Managing local collaboration facilities in large organisations

    NASA Astrophysics Data System (ADS)

    Baron, T.; Domaracky, M.; Duran, G.; Fernandes, J.; Ferreira, P.; Gonzalez Lopez, J. B.; Jouberjean, F.; Lavrut, L.; Tarocco, N.

    2014-06-01

    For a long time HEP has been ahead of the curve in its usage of remote collaboration tools, like videoconference and webcast, while the local CERN collaboration facilities were somewhat behind the expected quality standards for various reasons. This time is now over with the creation by the CERN IT department in 2012 of an integrated conference room service which provides guidance and installation services for new rooms (either equipped for videoconference or not), as well as maintenance and local support. Managing now nearly half of the 246 meeting rooms available on the CERN sites, this service has been built to cope with the management of all CERN rooms with limited human resources. This has been made possible by the intensive use of professional software to manage and monitor all the room equipment, maintenance and activity. This paper focuses on presenting these packages, either off-the-shelf commercial products (asset and maintenance management tool, remote audio-visual equipment monitoring systems, local automation devices, new generation touch screen interfaces for interacting with the room) when available or locally developed integration and operational layers (generic audio-visual control and monitoring framework) and how they help overcoming the challenges presented by such a service. The aim is to minimise local human interventions while preserving the highest service quality and placing the end user back in the centre of this collaboration platform.

  10. Status and Roadmap of CernVM

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Blomer, J.; Buncic, P.; Charalampidis, I.; Ganis, G.; Meusel, R.

    2015-12-01

    Cloud resources nowadays contribute an essential share of resources for computing in high-energy physics. Such resources can be either provided by private or public IaaS clouds (e.g. OpenStack, Amazon EC2, Google Compute Engine) or by volunteer computers (e.g. LHC@Home 2.0). In any case, experiments need to prepare a virtual machine image that provides the execution environment for the physics application at hand. The CernVM virtual machine since version 3 is a minimal and versatile virtual machine image capable of booting different operating systems. The virtual machine image is less than 20 megabytes in size. The actual operating system is delivered on demand by the CernVM File System. CernVM 3 has matured from a prototype to a production environment. It is used, for instance, to run LHC applications in the cloud, to tune event generators using a network of volunteer computers, and as a container for the historic Scientific Linux 5 and Scientific Linux 4 based software environments in the course of long-term data preservation efforts of the ALICE, CMS, and ALEPH experiments. We present experience and lessons learned from the use of CernVM at scale. We also provide an outlook on the upcoming developments. These developments include adding support for Scientific Linux 7, the use of container virtualization, such as provided by Docker, and the streamlining of virtual machine contextualization towards the cloud-init industry standard.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher. This video is Part 11 in the series.

  12. A possible biomedical facility at the European Organization for Nuclear Research (CERN).

    PubMed

    Dosanjh, M; Jones, B; Myers, S

    2013-05-01

    A well-attended meeting, called "Brainstorming discussion for a possible biomedical facility at CERN", was held by the European Organization for Nuclear Research (CERN) at the European Laboratory for Particle Physics on 25 June 2012. This was concerned with adapting an existing, but little used, 78-m circumference CERN synchrotron to deliver a wide range of ion species, preferably from protons to at least neon ions, with beam specifications that match existing clinical facilities. The potential extensive research portfolio discussed included beam ballistics in humanoid phantoms, advanced dosimetry, remote imaging techniques and technical developments in beam delivery, including gantry design. In addition, a modern laboratory for biomedical characterisation of these beams would allow important radiobiological studies, such as relative biological effectiveness, in a dedicated facility with standardisation of experimental conditions and biological end points. A control photon and electron beam would be required nearby for relative biological effectiveness comparisons. Research beam time availability would far exceed that at other facilities throughout the world. This would allow more rapid progress in several biomedical areas, such as in charged hadron therapy of cancer, radioisotope production and radioprotection. The ethos of CERN, in terms of open access, peer-reviewed projects and governance has been so successful for High Energy Physics that application of the same to biomedicine would attract high-quality research, with possible contributions from Europe and beyond, along with potential new funding streams.

  13. Test of the CLAS12 RICH large-scale prototype in the direct proximity focusing configuration

    DOE PAGES

    Anefalos Pereira, S.; Baltzell, N.; Barion, L.; ...

    2016-02-11

    A large area ring-imaging Cherenkov detector has been designed to provide clean hadron identification capability in the momentum range from 3 GeV/c up to 8 GeV/c for the CLAS12 experiments at the upgraded 12 GeV continuous electron beam accelerator facility of Jefferson Laboratory. The adopted solution foresees a novel hybrid optics design based on an aerogel radiator, composite mirrors and highly packed, highly segmented photon detectors. Cherenkov light will either be imaged directly (forward tracks) or after two mirror reflections (large angle tracks). We report here the results of the tests of a large scale prototype of the RICH detector performed with the hadron beam of the CERN T9 experimental hall for the direct detection configuration. The tests demonstrated that the proposed design provides the required pion-to-kaon rejection factor of 1:500 in the whole momentum range.

  14. Design optimization of an ironless inductive position sensor for the LHC collimators

    NASA Astrophysics Data System (ADS)

    Danisi, A.; Masi, A.; Losito, R.; Perriard, Y.

    2013-09-01

    The Ironless Inductive Position Sensor (I2PS) is an air-cored displacement sensor which has been conceived to be totally immune to external DC/slowly-varying magnetic fields. It can thus be used as a valid alternative to Linear Variable Differential Transformers (LVDTs), which can show a position error in magnetic environments. In addition, since it retains the excellent properties of LVDTs, the I2PS can be used in harsh environments, such as nuclear plants, plasma control and particle accelerators. This paper focuses on the design optimization of the sensor, considering the CERN LHC Collimators as application. In particular, the optimization comes after a complete review of the electromagnetic and thermal modeling of the sensor, as well as the proper choice of the reading technique. The design optimization stage is firmly based on these preliminary steps. Therefore, the paper summarises the sensor's complete development, from its modeling to its actual implementation. A set of experimental measurements demonstrates the sensor's performances to be those expected in the design phase.

  15. Discovery of naked charm particles and lifetime differences among charm species using nuclear emulsion techniques innovated in Japan

    PubMed Central

    NIU, Kiyoshi

    2008-01-01

    This is a historical review of the discovery of naked charm particles and lifetime differences among charm species. These discoveries in the field of cosmic-ray physics were made by the innovation of nuclear emulsion techniques in Japan. A pair of naked charm particles was discovered in 1971 in a cosmic-ray interaction, three years prior to the discovery of the hidden charm particle, J/Ψ, in western countries. Lifetime differences between charged and neutral charm particles were pointed out in 1975, which were later re-confirmed by the collaborative Experiment E531 at Fermilab. Japanese physicists led by K.Niu made essential contributions to it with improved emulsion techniques, complemented by electronic detectors. This review also discusses the discovery of artificially produced naked charm particles by us in an accelerator experiment at Fermilab in 1975 and of multiple-pair productions of charm particles in a single interaction in 1987 by the collaborative Experiment WA75 at CERN. PMID:18941283

  16. Cosmic Radiation Detection and Observations

    NASA Astrophysics Data System (ADS)

    Ramirez Chavez, Juan; Troncoso, Maria

    Cosmic rays consist of high-energy particles accelerated in remote supernova remnant explosions that travel vast distances throughout the universe. Upon arriving at Earth, the majority of these particles ionize gases in the upper atmosphere, while others interact with gas molecules in the troposphere and produce secondary cosmic rays, which are the main focus of this research. To observe these secondary cosmic rays, a detector telescope was designed and equipped with two silicon photomultipliers (SiPMs). Each SiPM is coupled to a bundle of 4 wavelength-shifting optical fibers that are embedded inside a plastic scintillator sheet. The SiPM signals were amplified using a fast preamplifier, with coincidence between detectors established using a binary logic gate. The coincidence events were recorded with two devices: a digital counter and an Arduino micro-controller. For detailed analysis of the SiPM waveforms, a DRS4 waveform digitizer captured the waveforms for offline analysis with the CERN software package Physics Analysis Workstation in a Linux environment. Results from our experiments will be presented. Hartnell College STEM Internship Program.

  17. Analytical N beam position monitor method

    NASA Astrophysics Data System (ADS)

    Wegscheider, A.; Langner, A.; Tomás, R.; Franchi, A.

    2017-11-01

    Measurement and correction of focusing errors is of great importance for performance and machine protection of circular accelerators. Furthermore LHC needs to provide equal luminosities to the experiments ATLAS and CMS. High demands are also set on the speed of the optics commissioning, as the foreseen operation with β*-leveling on luminosity will require many operational optics. A fast measurement of the β -function around a storage ring is usually done by using the measured phase advance between three consecutive beam position monitors (BPMs). A recent extension of this established technique, called the N-BPM method, was successfully applied for optics measurements at CERN, ALBA, and ESRF. We present here an improved algorithm that uses analytical calculations for both random and systematic errors and takes into account the presence of quadrupole, sextupole, and BPM misalignments, in addition to quadrupolar field errors. This new scheme, called the analytical N-BPM method, is much faster, further improves the measurement accuracy, and is applicable to very pushed beam optics where the existing numerical N-BPM method tends to fail.
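
    For context, the phase-advance measurement that both the numerical and the analytical N-BPM methods generalize is the classic three-BPM estimate, which in one common form reads as follows, where φ denotes the betatron phase advance between the indicated BPMs and the superscripts distinguish measured from model quantities (this is the well-known baseline formula, quoted here as background rather than the paper's new result):

        % Three-BPM beta-function estimate (the baseline the N-BPM method extends)
        \beta_1^{\mathrm{meas}} = \beta_1^{\mathrm{model}}\,
          \frac{\cot\varphi_{12}^{\mathrm{meas}} - \cot\varphi_{13}^{\mathrm{meas}}}
               {\cot\varphi_{12}^{\mathrm{model}} - \cot\varphi_{13}^{\mathrm{model}}}

    The N-BPM extensions combine many such BPM triplets, and the analytical variant propagates random and systematic errors through this combination in closed form.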

  18. Tests of a Roman Pot prototype for the TOTEM experiment

    NASA Astrophysics Data System (ADS)

    Deile, M.; Alagoz, E.; Anelli, G.; Antchev, G.; Ayache, M.; Caspers, F.; Dimovasili, E.; Dinapoli, R.; Drouhin, F.; Eggert, K.; Escourrou, J. L.; Fochler, O.; Gill, K.; Grabit, R.; Haung, F.; Jarron, P.; Kaplon, J.; Kroyer, T.; Luntama, T.; Macina, D.; Mattelon, E.; Niewiadomski, H.; Mirabito, L.; Noschis, E.P.; Oriunno, M.; Park, A.; Perrot, A.-L.; Pirotte, O.; Quetsch, J.M.; Regnier, F.; Ruggiero, G.; Saramad, S.; Siegrist, P.; Snoeys, W.; Souissi, T.; Szczygiel, R.; Troska, J.; Vasey, F.; Verdier, A.; Da Vià, C.; Hasi, J.; Kok, A.; Watts, S.; Kašpar, J.; Kundrát, V.; Lokajíček, M.V.; Smotlacha, J.; Avati, V.; Järvinen, M.; Kalliokoski, M.; Kalliopuska, J.; Kurvinen, K.; Lauhakangas, R.; Oljemark, F.; Orava, R.; Österberg, K.; Palmieri, V.; Saarikko, H.; Soininen, A.; Boccone, V.; Bozzo, M.; Buzzo, A.; Cuneo, S.; Ferro, F.; Macrí, M.; Minutoli, S.; Morelli, A.; Musico, P.; Negri, M.; Santroni, A.; Sette, G.; Sobol, A.; Berardi, V.; Catanesi, M.G.; Radicioni, E.

    The TOTEM collaboration has developed and tested the first prototype of its Roman Pots to be operated in the LHC. TOTEM Roman Pots contain stacks of 10 silicon detectors with strips oriented in two orthogonal directions. To measure proton scattering angles of a few microradians, the detectors will approach the beam centre to a distance of 10 sigma + 0.5 mm (= 1.3 mm). Dead space near the detector edge is minimised by using two novel "edgeless" detector technologies. The silicon detectors are used both for precise track reconstruction and for triggering. The first full-sized prototypes of both detector technologies as well as their read-out electronics have been developed, built and operated. The tests took place first in a fixed-target muon beam at CERN's SPS, and then in the proton beam-line of the SPS accelerator ring. We present the test beam results demonstrating the successful functionality of the system despite slight technical shortcomings to be improved in the near future.

  19. The Beginning of the Physics of Leptons

    NASA Astrophysics Data System (ADS)

    Ting, Samuel C. C.

    Over the last 30 years the study of lepton pairs from both hadron and electron accelerators and colliders has led to the discovery of the J, ϒ, Z and W particles. The study of acoplanar eμ pairs + missing energy has led to the discovery of the heavy lepton, now called the τ lepton. Indeed, the study of lepton pairs with and without missing energy has become the main method at high energy colliders for searching for new particles. This paper presents some of the important contributions made by Antonino Zichichi over a 10-year period at CERN and Frascati in opening this new field of physics. These include the development of instrumentation to distinguish leptons from hadrons, the first experiment on lepton pair production at hadron machines, precision tests of electrodynamics at very small distances, the production of hadrons from e+e- collisions and, most importantly, his invention of a new method, e+e- → eμ + missing momenta, experimentally proving that, thanks to his new electron and muon detection technology, these signals have very little background.

  20. The discovery and measurements of a Higgs boson.

    PubMed

    Gianotti, F; Virdee, T S

    2015-01-13

    In July 2012, the ATLAS and CMS collaborations at CERN's Large Hadron Collider announced the discovery of a Higgs-like boson, a new heavy particle at a mass more than 130 times the mass of a proton. Since then, further data have revealed its properties to be strikingly similar to those of the Standard Model Higgs boson, a particle expected from the mechanism introduced almost 50 years ago by six theoreticians including British physicists Peter Higgs from Edinburgh University and Tom Kibble from Imperial College London. The discovery is the culmination of a truly remarkable scientific journey and undoubtedly the most significant scientific discovery of the twenty-first century so far. Its experimental confirmation turned out to be a monumental task requiring the creation of an accelerator and experiments of unprecedented capability and complexity, designed to discern the signatures that correspond to the Higgs boson. Thousands of scientists and engineers, in each of the ATLAS and CMS teams, came together from all four corners of the world to make this massive discovery possible.

  1. Dimensional changes of Nb3Sn Rutherford cables during heat treatment

    DOE PAGES

    Rochepault, E.; Ferracin, P.; Ambrosio, G.; ...

    2016-06-01

    In high field magnet applications, Nb3Sn coils undergo a heat treatment step after winding. During this stage, coils radially expand and longitudinally contract due to the Nb3Sn phase change. In order to prevent residual strain from altering superconducting performance, the tooling must provide adequate space for these dimensional changes. The aim of this paper is to understand the behavior of cable dimensions during heat treatment and to provide estimates of the space to be accommodated in the tooling for coil expansion and contraction. In addition, this paper summarizes measurements of dimensional changes on strands, single Rutherford cables, cable stacks, and coils performed between 2013 and 2015. These samples and coils were produced within a collaboration between CERN and the U.S. LHC Accelerator Research Program to develop Nb3Sn quadrupole magnets for the HiLumi LHC. The results are also compared with those of other high field magnet projects.

  2. Experience with case tools in the design of process-oriented software

    NASA Astrophysics Data System (ADS)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools and of object-oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  3. Nuclear spectroscopy with Geant4. The superheavy challenge

    NASA Astrophysics Data System (ADS)

    Sarmiento, Luis G.

    2016-12-01

    The simulation toolkit Geant4 was originally developed at CERN for high-energy physics. Over the years it has established itself as a Swiss army knife not only in particle physics, but has also seen an accelerated expansion towards nuclear physics and, more recently, medical imaging and γ- and ion-therapy, to mention but a handful of new applications. The validity of Geant4 is vast, spanning many particles, ions, materials, and physical processes, typically with various different models to choose from. Unfortunately, atomic nuclei with atomic number Z > 100 are not properly supported. This is likely due to the relative novelty of the field, its comparatively small user base, and scarce evaluated experimental data. To circumvent this situation, different workarounds have been used over the years. In this work the simulation toolkit Geant4 will be introduced with its different components, and the effort to bring the software to the heavy and superheavy region will be described.

  4. ENLIGHT and other EU-funded projects in hadron therapy.

    PubMed

    Dosanjh, M; Jones, B; Mayer, R; Meyer, R

    2010-10-01

    Following impressive results from early phase trials in Japan and Germany, there is a current expansion in European hadron therapy. This article summarises present European Union-funded projects for research and co-ordination of hadron therapy across Europe. Our primary focus will be on the research questions associated with carbon ion treatment of cancer, but these considerations are also applicable to treatments using proton beams and other light ions. The challenges inherent in this new form of radiotherapy require maximum interdisciplinary co-ordination. On the basis of its successful track record in particle and accelerator physics, the internationally funded CERN laboratories (otherwise known as the European Organisation for Nuclear Research) have been instrumental in promoting collaborations for research purposes in this area of radiation oncology. There will soon be increased opportunities for referral of patients across Europe for hadron therapy. Oncologists should be aware of these developments, which confer enhanced prospects for better cancer cure rates as well as improved quality of life in many cancer patients.

  5. Experiments with crystal deflectors for high energy ion beams: Electromagnetic dissociation probability for well channeled ions

    NASA Astrophysics Data System (ADS)

    Scandale, W.; Taratin, A. M.; Kovalenko, A. D.

    2013-01-01

    The paper presents the current status of the use of crystal deflectors for high energy ion beams. The channeling properties of multicharged ions are discussed. The results of the experiments on the deflection and extraction (collimation) of high energy ion beams with bent crystals performed at accelerator centers are briefly reviewed. The analysis of the recent collimation experiment with Pb nuclei of 270 GeV/c per charge at the CERN Super Proton Synchrotron showed that the channeling efficiency was as large as about 90%. For Pb ions at LHC energies, a new mechanism appears which can reduce the channeling efficiency: electromagnetic dissociation (ED) becomes possible for well channeled particles. However, the estimates presented in the paper show that the ED probability is small and should not visibly reduce the collimation efficiency. On the other hand, the aligned crystal offers the possibility to study ED processes of heavy nuclei under conditions where nuclear interactions are fully suppressed.

  6. Graphical processors for HEP trigger systems

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Di Lorenzo, S.; Fantechi, R.; Fiorini, M.; Frezza, O.; Lamanna, G.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Neri, I.; Paolucci, P. S.; Pastorelli, E.; Piandani, R.; Pontisso, L.; Rossetti, D.; Simula, F.; Sozzi, M.; Vicini, P.

    2017-02-01

    General-purpose computing on GPUs is emerging as a new paradigm in several fields of science, although so far applications have been tailored to employ GPUs as accelerators in offline computations. With the steady decrease of GPU latencies and the increase in link and memory throughputs, time is ripe for real-time applications using GPUs in high-energy physics data acquisition and trigger systems. We will discuss the use of online parallel computing on GPUs for synchronous low level trigger systems, focusing on tests performed on the trigger of the CERN NA62 experiment. Latencies of all components need analysing, networking being the most critical. To keep it under control, we envisioned NaNet, an FPGA-based PCIe Network Interface Card (NIC) enabling GPUDirect connection. Moreover, we discuss how specific trigger algorithms can be parallelised and thus benefit from a GPU implementation, in terms of increased execution speed. Such improvements are particularly relevant for the foreseen LHC luminosity upgrade where highly selective algorithms will be crucial to maintain sustainable trigger rates with very high pileup.

  7. Measuring gravitational effects on antimatter in space

    NASA Astrophysics Data System (ADS)

    Piacentino, Giovanni Maria; Gioiosa, Antonio; Palladino, Anthony; Venanzoni, Graziano

    2017-04-01

    A direct measurement of the gravitational acceleration of antimatter has never been performed to date. Recently, such an experiment has been proposed, using antihydrogen with an atom interferometer, and antihydrogen confinement has been achieved at CERN. As an alternative, we propose an experimental test of the gravitational interaction with antimatter by measuring the branching fraction of the CP-violating decay of KL in space. In fact, even though the Standard Model explains CP violation by the presence of a pure phase in the Kobayashi-Maskawa-Cabibbo (CKM) matrix, ample room is left for contributions by other interactions and forces to generate CP violation in the mixing of the neutral K and B mesons. Gravitation is a good candidate, and we show that at the altitude of the International Space Station, gravitational effects may change the level of CP violation such that a 5 sigma discrimination may be obtained by collecting the KL produced by the cosmic proton flux within a few years.

  8. The LHCb Run Control

    NASA Astrophysics Data System (ADS)

    Alessio, F.; Barandela, M. C.; Callot, O.; Duval, P.-Y.; Franek, B.; Frank, M.; Galli, D.; Gaspar, C.; Herwijnen, E. v.; Jacobsson, R.; Jost, B.; Neufeld, N.; Sambade, A.; Schwemmer, R.; Somogyi, P.

    2010-04-01

    LHCb has designed and implemented an integrated Experiment Control System. The Control System uses the same concepts and the same tools to control and monitor all parts of the experiment: the Data Acquisition System, the Timing and the Trigger Systems, the High Level Trigger Farm, the Detector Control System, the Experiment's Infrastructure and the interaction with the CERN Technical Services and the Accelerator. LHCb's Run Control, the main interface used by the experiment's operator, provides access in a hierarchical, coherent and homogeneous manner to all areas of the experiment and to all its sub-detectors. It allows for automated (or manual) configuration and control, including error recovery, of the full experiment in its different running modes. Different instances of the same Run Control interface are used by the various sub-detectors for their stand-alone activities: test runs, calibration runs, etc. The architecture and the tools used to build the control system, the guidelines and components provided to the developers, as well as the first experience with the usage of the Run Control will be presented.

  9. Status and operation of the Linac4 ion source prototypes

    NASA Astrophysics Data System (ADS)

    Lettry, J.; Aguglia, D.; Andersson, P.; Bertolo, S.; Butterworth, A.; Coutron, Y.; Dallocchio, A.; Chaudet, E.; Gil-Flores, J.; Guida, R.; Hansen, J.; Hatayama, A.; Koszar, I.; Mahner, E.; Mastrostefano, C.; Mathot, S.; Mattei, S.; Midttun, Ø.; Moyret, P.; Nisbet, D.; Nishida, K.; O'Neil, M.; Ohta, M.; Paoluzzi, M.; Pasquino, C.; Pereira, H.; Rochez, J.; Sanchez Alvarez, J.; Sanchez Arias, J.; Scrivens, R.; Shibata, T.; Steyaert, D.; Thaus, N.; Yamamoto, T.

    2014-02-01

    CERN's Linac4 45 kV H- ion source prototypes are installed at a dedicated ion source test stand and in the Linac4 tunnel. The operation of the pulsed hydrogen injection, RF-sustained plasma, and pulsed high voltages is described. The first experimental results of two prototypes relying on 2 MHz RF-plasma heating are presented. The plasma is ignited via capacitive coupling and sustained by inductive coupling. The light emitted from the plasma is collected by viewports pointing to the plasma chamber wall in the middle of the RF solenoid and to the plasma chamber axis. Preliminary measurements of optical emission spectroscopy and photometry of the plasma have been performed. The design of a cesiated ion source is presented. The volume source has produced a 45 keV H- beam of 16-22 mA which has been successfully used for the commissioning of the Low Energy Beam Transport (LEBT), Radio Frequency Quadrupole (RFQ) accelerator, and chopper of Linac4.

  10. ENLIGHT and other EU-funded projects in hadron therapy

    PubMed Central

    Dosanjh, M; Jones, B; Meyer, R

    2010-01-01

    Following impressive results from early phase trials in Japan and Germany, there is a current expansion in European hadron therapy. This article summarises present European Union-funded projects for research and co-ordination of hadron therapy across Europe. Our primary focus will be on the research questions associated with carbon ion treatment of cancer, but these considerations are also applicable to treatments using proton beams and other light ions. The challenges inherent in this new form of radiotherapy require maximum interdisciplinary co-ordination. On the basis of its successful track record in particle and accelerator physics, the internationally funded CERN laboratories (otherwise known as the European Organisation for Nuclear Research) have been instrumental in promoting collaborations for research purposes in this area of radiation oncology. There will soon be increased opportunities for referral of patients across Europe for hadron therapy. Oncologists should be aware of these developments, which confer enhanced prospects for better cancer cure rates as well as improved quality of life in many cancer patients. PMID:20846982

  11. A Security Monitoring Framework For Virtualization Based HEP Infrastructures

    NASA Astrophysics Data System (ADS)

    Gomez Ramirez, A.; Martinez Pedreira, M.; Grigoras, C.; Betev, L.; Lara, C.; Kebschull, U.; ALICE Collaboration

    2017-10-01

    High Energy Physics (HEP) distributed computing infrastructures require automatic tools to monitor, analyze and react to potential security incidents. These tools should collect and inspect data such as resource consumption, logs and sequence of system calls for detecting anomalies that indicate the presence of a malicious agent. They should also be able to perform automated reactions to attacks without administrator intervention. We describe a novel framework that accomplishes these requirements, with a proof of concept implementation for the ALICE experiment at CERN. We show how we achieve a fully virtualized environment that improves the security by isolating services and Jobs without a significant performance impact. We also describe a collected dataset for Machine Learning based Intrusion Prevention and Detection Systems on Grid computing. This dataset is composed of resource consumption measurements (such as CPU, RAM and network traffic), logfiles from operating system services, and system call data collected from production Jobs running in an ALICE Grid test site and a big set of malware samples. This malware set was collected from security research sites. Based on this dataset, we will proceed to develop Machine Learning algorithms able to detect malicious Jobs.
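
    As a purely illustrative sketch (not the framework or the dataset pipeline described above), an unsupervised anomaly detector trained on per-job resource-consumption features of the kind listed in the abstract could look like the following; the feature set, baseline distributions and numbers are hypothetical.

        # Illustrative only: flag anomalous Grid jobs from resource-consumption
        # features (CPU %, RAM MB, network kB/s) with an Isolation Forest.
        import numpy as np
        from sklearn.ensemble import IsolationForest

        rng = np.random.default_rng(42)

        # Hypothetical baseline of benign production jobs.
        benign_jobs = np.column_stack([
            rng.normal(60.0, 10.0, 500),     # CPU utilisation (%)
            rng.normal(1800.0, 200.0, 500),  # resident memory (MB)
            rng.normal(40.0, 15.0, 500),     # network traffic (kB/s)
        ])

        detector = IsolationForest(contamination=0.01, random_state=0).fit(benign_jobs)

        # A job with unusually heavy network traffic, e.g. a miner or data exfiltration.
        suspicious_job = np.array([[95.0, 2500.0, 900.0]])
        print(detector.predict(suspicious_job))   # -1 flags an anomaly, +1 looks normal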

  12. Mapping Remote and Multidisciplinary Learning Barriers: Lessons from "Challenge-Based Innovation" at CERN

    ERIC Educational Resources Information Center

    Jensen, Matilde Bisballe; Utriainen, Tuuli Maria; Steinert, Martin

    2018-01-01

    This paper presents the experienced difficulties of students participating in the multidisciplinary, remote collaborating engineering design course challenge-based innovation at CERN. This is with the aim to identify learning barriers and improve future learning experiences. We statistically analyse the rated differences between distinct design…

  13. DG's New Year's presentation

    ScienceCinema

    Heuer, R.-D.

    2018-05-22

    CERN general staff meeting. Looking back at key messages: Highest priority: LHC physics in 2009; Increase diversity of the scientific program; Prepare for future projects; Establish open and direct communication; Prepare CERN towards a global laboratory; Increase consolidation efforts; Financial situation--tight; Knowledge and technology transfer--proactive; Contract policy and internal mobility--lessons learned.

  14. WorldWide Web: Hypertext from CERN.

    ERIC Educational Resources Information Center

    Nickerson, Gord

    1992-01-01

    Discussion of software tools for accessing information on the Internet focuses on the WorldWideWeb (WWW) system, which was developed at the European Particle Physics Laboratory (CERN) in Switzerland to build a worldwide network of hypertext links using available networking technology. Its potential for use with multimedia documents is also…

  15. Deployment and Operational Experiences with CernVM-FS at the GridKa Tier-1 Center

    NASA Astrophysics Data System (ADS)

    Alef, Manfred; Jäger, Axel; Petzold, Andreas; Verstege, Bernhard

    2012-12-01

    In 2012 the GridKa Tier-1 computing center hosts 130 kHS06 of computing resources, 14 PB of disk space and 17 PB of tape space. These resources are shared between the four LHC VOs and a number of national and international VOs from high energy physics and other sciences. CernVM-FS has been deployed at GridKa to supplement the existing NFS-based system for accessing VO software on the worker nodes. It provides a solution tailored to the requirements of the LHC VOs. We will focus on the first operational experiences and the monitoring of CernVM-FS on the worker nodes and the squid caches.

  16. Open Media Training Session

    ScienceCinema

    None

    2017-12-09

    Have you ever wondered how the media work and why some topics make it into the news and others don't? Would you like to know how to (and how not to) give an interview to a journalist? With the LHC preparing for first collisions at high energies, the world's media are again turning their attention to CERN. We're all likely to be called upon to explain what is happening at CERN to the media, friends and neighbours. The seminar will be given by BBC television news journalists Liz Pike and Nadia Marchant, and will deal with the kind of questions we're likely to be confronted with during the restart period. Follow the webcast: http://webcast.cern.ch/

  17. CERN - Six Decades of Science, Innovation, Cooperation, and Inspiration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quigg, Chris

    The European Laboratory for Particle Physics, which straddles the Swiss-French border northwest of Geneva, celebrates its sixtieth birthday in 2014. CERN is the preeminent particle-physics institution in the world, currently emphasizing the study of collisions of protons and heavy nuclei at very high energies and the exploration of physics on the electroweak scale (energies where electromagnetism and the weak nuclear force merge). With brilliant accomplishments in research, innovation, and education, and a sustained history of cooperation among people from different countries and cultures, CERN ranks as one of the signal achievements of the postwar European Project. For physicists the world over, the laboratory is a source of pride and inspiration.

  18. More "Hands-On" Particle Physics: Learning with ATLAS at CERN

    ERIC Educational Resources Information Center

    Long, Lynne

    2011-01-01

    This article introduces teachers and students to a new portal of resources called Learning with ATLAS at CERN (http://learningwithatlas-portal.eu/), which has been developed by a European consortium of academic researchers and schools' liaison and outreach providers from countries across Europe. It includes the use of some of the mind-boggling…

  19. History of Cern

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2007-12-20

    Ceremony marking the publication of the first volume of the book on the history of CERN, attended by several people who played an important role in this European organisation, whose success owes much to the spirit of its founding members, a spirit that is and will remain essential.

  20. Stabilities of Dried Suspensions of Influenza Virus Sealed in a Vacuum or Under Different Gases

    PubMed Central

    Greiff, Donald; Rightsel, Wilton A.

    1969-01-01

    Suspensions of purified influenza virus, dried to a 1.4% content of residual moisture by sublimation of ice in vacuo, were sealed in a vacuum or under different gases of high purity. The stabilities of the several preparations were determined by an accelerated storage test. Based on the times predicted for the dried preparations stored at different temperatures to lose 1 log of infectivity titer, the order of stabilities in relation to sealing in vacuum or under different gases was as follows: helium > hydrogen > vacuum > argon > nitrogen > oxygen > carbon dioxide. Images PMID:5797938

  1. Graphical and PC-software analysis of volcano eruption precursors according to the Materials Failure Forecast Method (FFM)

    NASA Astrophysics Data System (ADS)

    Cornelius, Reinold R.; Voight, Barry

    1995-03-01

    The Materials Failure Forecasting Method for volcanic eruptions (FFM) analyses the rate of precursory phenomena. The time of eruption onset is derived from the time of "failure" implied by an accelerating rate of deformation. The approach attempts to fit data, Ω, to the differential relationship Ω̈ = A(Ω̇)^α, where the dot superscript represents the time derivative and the data Ω may be any of several parameters describing the accelerating deformation or energy release of the volcanic system. The rate coefficients, A and α, may be derived from appropriate data sets to provide an estimate of the time to "failure". As the method is still an experimental technique, it should be used with appropriate judgment during times of volcanic crisis. Limitations of the approach are identified and discussed. Several kinds of eruption precursory phenomena, all simulating accelerating creep during the mechanical deformation of the system, can be used with FFM. Among these are tilt data, slope-distance measurements, crater fault movements and seismicity. The use of seismic coda, seismic amplitude-derived energy release and time-integrated amplitudes or coda lengths is examined. Using cumulative coda length directly has some practical advantages over more rigorously derived parameters, and RSAM and SSAM technologies appear to be well suited to real-time applications. One graphical and four numerical techniques of applying FFM are discussed. The graphical technique is based on an inverse representation of rate versus time. For α = 2, the inverse rate plot is linear; it is concave upward for α < 2 and concave downward for α > 2. The eruption time is found by simple extrapolation of the data set toward the time axis. Three numerical techniques are based on linear least-squares fits to linearized data sets. The "linearized least-squares technique" is the most robust and is expected to be the most practical numerical technique. This technique is based on an iterative linearization of the given rate-time series. The hindsight technique is disadvantaged by a bias towards too early an eruption time in foresight applications. The "log rate versus log acceleration technique", utilizing a logarithmic representation of the fundamental differential equation, is disadvantaged by large data scatter after interpolation of accelerations. One further numerical technique, a nonlinear least-squares fit to rate data, requires special and more complex software. PC-oriented computer codes were developed for data manipulation, application of the three linearizing numerical methods, and curve fitting. Separate software is required for graphing purposes. All three linearizing techniques facilitate an eruption window based on a data envelope from the linear least-squares fit, at a specific level of confidence, and an estimated rate at the time of failure.
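
    As a concrete illustration of the graphical inverse-rate technique summarised above, the following Python sketch fits a straight line to inverse rate versus time (the α = 2 case) and extrapolates it to the time axis to estimate the eruption onset. The synthetic data and function names are illustrative and do not come from the paper or its PC software.

    # Illustrative sketch of the FFM inverse-rate technique described above.
    # For alpha = 2, d(rate)/dt = A * rate**2 implies
    # 1/rate = 1/rate0 - A*t, so the inverse rate decays linearly with time
    # and reaches zero at the predicted "failure" (eruption onset) time.
    import numpy as np

    def forecast_failure_time(t, rate):
        """Fit a straight line to inverse rate vs. time (alpha = 2 case)
        and extrapolate it to zero to estimate the failure time."""
        inv_rate = 1.0 / np.asarray(rate, dtype=float)
        slope, intercept = np.polyfit(t, inv_rate, 1)   # least-squares line
        return -intercept / slope                        # time where 1/rate -> 0

    # Synthetic accelerating precursor (e.g. rate of cumulative coda length)
    A, t_true = 0.05, 20.0                               # "eruption" at t = 20
    t = np.linspace(0.0, 15.0, 40)
    rate = 1.0 / (A * (t_true - t))                      # exact alpha = 2 solution
    rate *= 1.0 + 0.02 * np.random.default_rng(1).standard_normal(t.size)  # noise

    print("predicted eruption time:", forecast_failure_time(t, rate))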

  2. Commissioning the CERN IT Agile Infrastructure with experiment workloads

    NASA Astrophysics Data System (ADS)

    Medrano Llamas, Ramón; Barreiro Megino, Fernando Harald; Kucharczyk, Katarzyna; Denis, Marek Kamil; Cinquilli, Mattia

    2014-06-01

    In order to ease the management of their infrastructure, most of the WLCG sites are adopting cloud-based strategies. CERN, the Tier 0 of the WLCG, is completely restructuring the resource and configuration management of its computing center under the codename Agile Infrastructure. Its goal is to manage 15,000 Virtual Machines by means of an OpenStack middleware in order to unify all the resources in CERN's two data centers: the one located in Meyrin and the new one in Wigner, Hungary. During the commissioning of this infrastructure, CERN IT is offering an attractive amount of computing resources to the experiments (800 cores for ATLAS and CMS) through a private cloud interface. ATLAS and CMS have joined forces to exploit them by running stress tests and simulation workloads since November 2012. This work will describe the experience of the first deployments of the current experiment workloads on the CERN private cloud testbed. The paper is organized as follows: the first section will explain the integration of the experiment workload management systems (WMS) with the cloud resources. The second section will revisit the performance and stress testing performed with HammerCloud in order to evaluate and compare the suitability for the experiment workloads. The third section will go deeper into the dynamic provisioning techniques, such as the use of the cloud APIs directly by the WMS. The paper finishes with a review of the conclusions and the challenges ahead.
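
    As a hedged illustration of the "cloud APIs used directly by the WMS" idea mentioned above, the sketch below requests a batch of worker virtual machines from an OpenStack cloud with the openstacksdk library. The cloud name, image, flavor and network identifiers are placeholders, and this is not the code used in the commissioning exercise.

    # Minimal sketch (not the experiments' actual code) of requesting a batch
    # of worker VMs from an OpenStack cloud, as a WMS might do when its queue
    # of pending jobs grows. Cloud name, image, flavor and network IDs are
    # placeholders.
    import openstack

    def provision_workers(n_pending_jobs, jobs_per_vm=8,
                          cloud="private-cloud", image_id="IMAGE-UUID",
                          flavor_id="FLAVOR-UUID", network_id="NET-UUID"):
        conn = openstack.connect(cloud=cloud)          # reads clouds.yaml / env vars
        n_vms = -(-n_pending_jobs // jobs_per_vm)      # ceiling division
        servers = []
        for i in range(n_vms):
            server = conn.compute.create_server(
                name=f"worker-{i:03d}",
                image_id=image_id,
                flavor_id=flavor_id,
                networks=[{"uuid": network_id}],
            )
            servers.append(conn.compute.wait_for_server(server))
        return servers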

  3. Software Aspects of IEEE Floating-Point Computations for Numerical Applications in High Energy Physics

    ScienceCinema

    Arnold, Jeffrey

    2018-05-14

    Floating-point computations are at the heart of much of the computing done in high energy physics. The correctness, speed and accuracy of these computations are of paramount importance. The lack of any of these characteristics can mean the difference between new, exciting physics and an embarrassing correction. This talk will examine practical aspects of IEEE 754-2008 floating-point arithmetic as encountered in HEP applications. After describing the basic features of IEEE floating-point arithmetic, the presentation will cover: common hardware implementations (SSE, x87); techniques for improving the accuracy of summation, multiplication and data interchange; compiler options for gcc and icc affecting floating-point operations; and hazards to be avoided. About the speaker: Jeffrey M Arnold is a Senior Software Engineer in the Intel Compiler and Languages group at Intel Corporation. He has been part of the Digital->Compaq->Intel compiler organization for nearly 20 years; part of that time, he worked on both low- and high-level math libraries. Prior to that, he was in the VMS Engineering organization at Digital Equipment Corporation. In the late 1980s, Jeff spent 2½ years at CERN as part of the CERN/Digital Joint Project. In 2008, he returned to CERN to spend 10 weeks working with CERN/openlab. Since that time, he has returned to CERN multiple times to teach at openlab workshops and consult with various LHC experiments. Jeff received his Ph.D. in physics from Case Western Reserve University.
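
    One summation-accuracy technique of the kind covered in the talk is compensated (Kahan) summation; the short Python sketch below contrasts it with naive accumulation and with math.fsum as a reference. It illustrates the general technique and is not material from the presentation.

    # Compensated (Kahan) summation: a standard technique for reducing
    # round-off error when accumulating many floating-point terms.
    import math

    def kahan_sum(values):
        total = 0.0
        c = 0.0                       # compensation for lost low-order bits
        for x in values:
            y = x - c                 # apply the correction from the previous step
            t = total + y             # low-order bits of y may be lost here...
            c = (t - total) - y       # ...and are recovered into c
            total = t
        return total

    terms = [0.1] * 10_000_000
    print(sum(terms))                 # naive accumulation drifts noticeably
    print(kahan_sum(terms))           # compensated sum
    print(math.fsum(terms))           # Python's correctly rounded reference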

  4. Got Questions About the Higgs Boson? Ask a Scientist

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hinchliffe, Ian

    Ask a scientist about the Higgs boson. There's a lot of buzz this week over new data from CERN's Large Hadron Collider (LHC) and the final data from Fermilab's Tevatron about the Higgs boson. It raises questions about what scientists have found and what still remains to be found -- and what it all means. Berkeley Lab's Ian Hinchliffe invites you to send in questions about the Higgs. He'll answer a few of your questions in a follow-up video later this week. Hinchliffe is a theoretical physicist who heads Berkeley Lab's sizable contingent with the ATLAS experiment at CERN. • Post your questions in the comment box • E-mail your questions to askascientist@lbl.gov • Tweet to @BerkeleyLab • Or post on our facebook page: facebook/berkeleylab Update on July 5: Ian responds to several of your questions in this video: http://youtu.be/1BkpD1IS62g. Update on 7/04: Here's CERN's press release from earlier today on the latest preliminary results in the search for the long sought Higgs particle: http://press.web.cern.ch/press/PressReleases/Releases2012/PR17.12E.htm. And here's a Q&A on what the news tells us: http://cdsweb.cern.ch/journal/CERNBulletin/2012/28/News%20Articles/1459460?ln=en. CERN will present the new LHC data at a seminar July 4th at 9:00 in the morning Geneva time (3:00 in the morning Eastern Daylight Time, midnight on the Pacific Coast), where the ATLAS collaboration and their rivals in the CMS experiment will announce their results. Tevatron results were announced by Fermilab on Monday morning. For more background on the LHC's search for the Higgs boson, visit http://newscenter.lbl.gov/feature-stories/2012/06/28/higgs-2012/.

  5. Got Questions About the Higgs Boson? Ask a Scientist

    ScienceCinema

    Hinchliffe, Ian

    2017-12-12

    Ask a scientist about the Higgs boson. There's a lot of buzz this week over new data from CERN's Large Hadron Collider (LHC) and the final data from Fermilab's Tevatron about the Higgs boson. It raises questions about what scientists have found and what still remains to be found -- and what it all means. Berkeley Lab's Ian Hinchliffe invites you to send in questions about the Higgs. He'll answer a few of your questions in a follow-up video later this week. Hinchliffe is a theoretical physicist who heads Berkeley Lab's sizable contingent with the ATLAS experiment at CERN. • Post your questions in the comment box • E-mail your questions to askascientist@lbl.gov • Tweet to @BerkeleyLab • Or post on our facebook page: facebook/berkeleylab Update on July 5: Ian responds to several of your questions in this video: http://youtu.be/1BkpD1IS62g. Update on 7/04: Here's CERN's press release from earlier today on the latest preliminary results in the search for the long sought Higgs particle: http://press.web.cern.ch/press/PressReleases/Releases2012/PR17.12E.htm. And here's a Q&A on what the news tells us: http://cdsweb.cern.ch/journal/CERNBulletin/2012/28/News%20Articles/1459460?ln=en. CERN will present the new LHC data at a seminar July 4th at 9:00 in the morning Geneva time (3:00 in the morning Eastern Daylight Time, midnight on the Pacific Coast), where the ATLAS collaboration and their rivals in the CMS experiment will announce their results. Tevatron results were announced by Fermilab on Monday morning. For more background on the LHC's search for the Higgs boson, visit http://newscenter.lbl.gov/feature-stories/2012/06/28/higgs-2012/.

  6. Progress report on nuclear spectroscopic studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bingham, C.R.; Guidry, M.W.; Riedinger, L.L.

    1994-02-18

    The Nuclear Physics group at the University of Tennessee, Knoxville (UTK) is involved in several aspects of heavy-ion physics including both nuclear structure and reaction mechanisms. While the main emphasis is on experimental problems, the authors have maintained a strong collaboration with several theorists in order to best pursue the physics of their measurements. During the last year they have had several experiments at the ATLAS at Argonne National Laboratory, the GAMMASPHERE at the LBL 88 Cyclotron, and with the NORDBALL at the Niels Bohr Institute Tandem. Also, they continue to be very active in the WA93/98 collaboration studying ultra-relativistic heavy ion physics utilizing the SPS accelerator at CERN in Geneva, Switzerland and in the PHENIX Collaboration at the RHIC accelerator under construction at Brookhaven National Laboratory. During the last year their experimental work has been in three broad areas: (1) the structure of nuclei at high angular momentum, (2) the structure of nuclei far from stability, and (3) ultra-relativistic heavy-ion physics. The results of studies in these particular areas are described in this document. These studies concentrate on the structure of nuclear matter in extreme conditions of rotational motion, imbalance of neutrons and protons, or very high temperature and density. Another area of research is heavy-ion-induced transfer reactions, which utilize the transfer of nucleons to states with high angular momentum to learn about their structure and to understand the transfer of particles, energy, and angular momentum in collisions between heavy ions.

  7. Radiation tolerant power converter controls

    NASA Astrophysics Data System (ADS)

    Todd, B.; Dinius, A.; King, Q.; Uznanski, S.

    2012-11-01

    The Large Hadron Collider (LHC) at the European Organisation for Nuclear Research (CERN) is the world's most powerful particle collider. The LHC has several thousand magnets, both warm and super-conducting, which are supplied with current by power converters. Each converter is controlled by a purpose-built electronic module called a Function Generator Controller (FGC). The FGC allows remote control of the power converter and forms the central part of a closed-loop control system where the power converter voltage is set, based on the converter output current and magnet-circuit characteristics. Some power converters and FGCs are located in areas which are exposed to beam-induced radiation. There are numerous radiation induced effects, some of which lead to a loss of control of the power converter, having a direct impact upon the accelerator's availability. Following the first long shut down (LS1), the LHC will be able to run with higher intensity beams and higher beam energy. This is expected to lead to significantly increased radiation induced effects in materials close to the accelerator, including the FGC. Recent radiation tests indicate that the current FGC would not be sufficiently reliable. A so-called FGClite is being designed to work reliably in the radiation environment in the post-LS1 era. This paper outlines the concepts of power converter controls for machines such as the LHC, introduces the risks related to radiation and a radiation tolerant project flow. The FGClite is then described, with its key concepts and challenges: aiming for high reliability in a radiation field.
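
    To make the closed-loop idea concrete, the following Python sketch shows a generic digital PI current regulator driving a crude first-order magnet model. The gains, limits and circuit values are invented for illustration; this is not the FGC's actual regulation algorithm.

    # Generic sketch of a digital closed-loop current regulator of the kind the
    # FGC implements: each cycle, the voltage reference sent to the power
    # converter is computed from the current error. NOT the FGC's actual code;
    # gains, limits and the first-order magnet model are purely illustrative.
    def regulate(i_ref, i_meas, integ, kp=2.0, ki=50.0, dt=1e-3, v_max=20.0):
        """One regulation period: return (voltage reference, updated integrator)."""
        error = i_ref - i_meas
        integ = max(-v_max, min(v_max, integ + ki * error * dt))  # crude anti-windup
        v_ref = max(-v_max, min(v_max, kp * error + integ))       # converter limit
        return v_ref, integ

    # Toy magnet circuit L di/dt = v - R i (values purely illustrative)
    R, L, dt = 0.1, 0.1, 1e-3
    i, integ = 0.0, 0.0
    for _ in range(2000):                                # 2 s of simulated time
        v, integ = regulate(i_ref=100.0, i_meas=i, integ=integ, dt=dt)
        i += dt * (v - R * i) / L                        # explicit Euler plant step
    print(f"current after 2 s: {i:.2f} A")               # settles near 100 A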

  8. Evaluation of a commercial system for CAMAC-based control of the Chalk River Laboratories tandem-accelerator-superconducting-cyclotron complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greiner, B.F.; Caswell, D.J.; Slater, W.R.

    1992-04-01

    This paper discusses the control system of the Tandem Accelerator Superconducting Cyclotron (TASCC) of AECL Research at its Chalk River Laboratories, which is presently based on a PDP-11 computer and the IAS operating system. The estimated expense of a custom conversion of the system to a current, equivalent operating system is prohibitive. The authors have evaluated a commercial control package from VISTA Control Systems based on VAX microcomputers and the VMS operating system. Vsystem offers a modern, graphical operator interface, an extensive software toolkit for configuration of the system and a multi-feature data-logging capability, all of which far surpass the functionality of the present control system. However, the implementation of some familiar, practical features that TASCC operators find to be essential has proven to be challenging. The assessment of Vsystem, which is described in terms of presently perceived strengths and weaknesses, is, on balance, very positive.

  9. Of people, particles and prejudice

    NASA Astrophysics Data System (ADS)

    Jackson, Penny; Greene, Anne; Mears, Matt; Spacecadet1; Green, Christian; Hunt, Devin J.; Berglyd Olsen, Veronica K.; Komarov, Ilya; Pierpont, Elaine; Gillman, Matthew

    2016-05-01

    In reply to Louise Mayor's feature article “Where people and particles collide”, about the experiences of researchers at CERN who are lesbian, gay, bisexual or transgender (LGBT), efforts to make LGBT CERN an officially recognized club, and incidents where posters advertising the club have been torn down or defaced (March pp31-36, http://ow.ly/YVP2Z).

  10. The Secret Chambers in the Chephren Pyramid

    ERIC Educational Resources Information Center

    Gutowski, Bartosz; Józwiak, Witold; Joos, Markus; Kempa, Janusz; Komorowska, Kamila; Krakowski, Kamil; Pijus, Ewa; Szymczak, Kamil; Trojanowska, Malgorzata

    2018-01-01

    In 2016, we (seven high school students from a school in Plock, Poland) participated in the CERN Beamline for Schools competition. Together with our team coach, Mr. Janusz Kempa, we submitted a proposal to CERN that was selected as one of two winning proposals that year. This paper describes our experiment from the early days of brainstorming to…

  11. Lead Ions and Coulomb's Law at the LHC (CERN)

    ERIC Educational Resources Information Center

    Cid-Vidal, Xabier; Cid, Ramon

    2018-01-01

    Although for most of the time the Large Hadron Collider (LHC) at CERN collides protons, for around one month every year lead ions are collided, to expand the diversity of the LHC research programme. Furthermore, in an effort not originally foreseen, proton-lead collisions are also taking place, with results of high interest to the physics…

  12. From strangeness enhancement to quark-gluon plasma discovery

    NASA Astrophysics Data System (ADS)

    Koch, Peter; Müller, Berndt; Rafelski, Johann

    2017-11-01

    This is a short survey of signatures and characteristics of the quark-gluon plasma in the light of experimental results that have been obtained over the past three decades. In particular, we present an in-depth discussion of the strangeness observable, including a chronology of the experimental effort to detect QGP at CERN-SPS, BNL-RHIC, and CERN-LHC.

  13. Ceremony for CERN's 25th birthday

    ScienceCinema

    None

    2018-05-18

    Celebration of CERN's 25th birthday with speeches by L. Van Hove and J.B. Adams and musical interludes by Ms Mey and her colleagues (starting with Beethoven). The Directors-General then present souvenirs to members of the personnel with 25 years of service in the Organization. A gesture of recognition is also extended to Zwerner.

  14. Committees

    NASA Astrophysics Data System (ADS)

    2004-10-01

    Fritz Caspers (CERN, Switzerland), Michel Chanel (CERN, Switzerland), Håkan Danared (MSL, Sweden), Bernhard Franzke (GSI, Germany), Manfred Grieser (MPI für Kernphysik, Germany), Dieter Habs (LMU München, Germany), Jeffrey Hangst (University of Aarhus, Denmark), Takeshi Katayama (RIKEN/Univ. Tokyo, Japan), H.-Jürgen Kluge (GSI, Germany), Shyh-Yuan Lee (Indiana University, USA), Rudolf Maier (FZ Jülich, Germany), John Marriner (FNAL, USA), Igor Meshkov (JINR, Russia), Dieter Möhl (CERN, Switzerland), Vasily Parkhomchuk (BINP, Russia), Robert Pollock (Indiana University), Dieter Prasuhn (FZ Jülich, Germany), Dag Reistad (TSL, Sweden), John Schiffer (ANL, USA), Andrew Sessler (LBNL, USA), Alexander Skrinsky (BINP, Russia), Markus Steck (GSI, Germany), Jie Wei (BNL, USA), Andreas Wolf (MPI für Kernphysik, Germany), Hongwei Zhao (IMP, People's Rep. of China).

  15. Across Europe to CERN: Taking students on the ultimate physics experience

    NASA Astrophysics Data System (ADS)

    Wheeler, Sam

    2018-05-01

    In 2013, I was an Einstein Fellow with the U.S. Department of Energy and I was asked by a colleague, working in a senator's office, if I would join him in a meeting with a physicist to "translate" the science into something more understandable. That meeting turned out to be a wonderful opportunity I would never have otherwise had. During the meeting I met Michael Tuts, a physicist who was working on the ATLAS project at CERN. Afterwards, I walked with him out of the Senate office building to Union Station and, in parting, he gave me his card and told me that if I were in Geneva he could help me get a tour of CERN and the LHC.

  16. User and group storage management at the CMS CERN T2 centre

    NASA Astrophysics Data System (ADS)

    Cerminara, G.; Franzoni, G.; Pfeiffer, A.

    2015-12-01

    A wide range of detector commissioning, calibration and data analysis tasks is carried out by CMS using dedicated storage resources available at the CMS CERN Tier-2 centre. Relying on the functionalities of the EOS disk-only storage technology, the optimal exploitation of the CMS user/group resources has required the introduction of policies for data access management, data protection, cleanup campaigns based on access patterns, and long-term tape archival. The resource management has been organised around the definition of working groups, with the composition of each group delegated to an identified responsible person. In this paper we illustrate the user/group storage management, and the development and operational experience at the CMS CERN Tier-2 centre in the 2012-2015 period.
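
    As an illustration of a cleanup campaign driven by access pattern, the hypothetical Python sketch below walks a user area and lists files that have not been accessed for a given number of days. Real campaigns at the CMS CERN Tier-2 rely on EOS-specific metadata and tooling; this generic POSIX-based version, including the example path, is only indicative.

    # Hypothetical sketch of a cleanup-campaign helper of the kind described
    # above: list files under a user/group area that have not been accessed
    # for a given number of days. Paths and thresholds are placeholders.
    import os
    import time

    def stale_files(root, days=180):
        cutoff = time.time() - days * 86400
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    st = os.stat(path)
                except OSError:
                    continue                      # file vanished or unreadable
                if st.st_atime < cutoff:
                    yield path, st.st_size

    if __name__ == "__main__":
        total = 0
        for path, size in stale_files("/eos/cms/store/user/someuser", days=365):
            total += size
            print(path)
        print(f"candidate volume: {total / 1e12:.2f} TB")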

  17. [CERN-MEDICIS (Medical Isotopes Collected from ISOLDE): a new facility].

    PubMed

    Viertl, David; Buchegger, Franz; Prior, John O; Forni, Michel; Morel, Philippe; Ratib, Osman; Bühler, Léo H; Stora, Thierry

    2015-06-17

    CERN-MEDICIS is a facility dedicated to research and development in life science and medical applications. The research platform was inaugurated in October 2014 and will produce an increasing range of innovative isotopes using the proton beam of ISOLDE for fundamental studies in cancer research, for new imaging and therapy protocols in cell and animal models and for preclinical trials, possibly extended to specific early-phase clinical studies (phase 0) up to phase I trials. CERN, the University Hospital of Geneva (HUG), the University Hospital of Lausanne (CHUV) and the Swiss Institute for Experimental Cancer Research (ISREC) at the Swiss Federal Institute of Technology in Lausanne (EPFL), which currently support the project, will benefit from the initial production, which will then be extended to other centers.

  18. Biological hydrolysis pretreatment on secondary sludge: Enhancement of anaerobic digestion and mechanism study.

    PubMed

    Ding, Huihuang H; Chang, Sheng; Liu, Yi

    2017-11-01

    The performance of biological hydrolysis (BH) pretreatment on municipal secondary sludge was evaluated in this study. During 6-day BH at 42°C (BH42), soluble chemical oxygen demand (sCOD) increased from 175.2 ± 38.2 mg/L to 3314.5 ± 683.4 mg/L; the dominant volatile fatty acid (VFA) was acetic acid, whose concentration increased from 41.5 ± 2.1 mg/L to 786.0 ± 133.2 mg/L. The extracellular polymeric substances (EPS) extracted from untreated secondary sludge contained three main fractions, and Fraction I gradually decreased from 133.9 kDa to 24.9 kDa during the 6-day BH42. BH pretreatment at 42°C and at 55°C both achieved more than a 4-log reduction of total coliforms and a 3-log reduction of E. coli. The 15-day biochemical methane potential (BMP) of the BH-pretreated secondary sludge was comparable with that of the untreated secondary sludge after a 30-day BMP test, showing that BH pretreatment significantly accelerates biogas production. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Unified high-temperature behavior of thin-gauge superalloys

    NASA Astrophysics Data System (ADS)

    England, Raymond Oliver

    This research proposes a methodology for accelerated testing in the area of high-temperature creep and oxidation resistance for thin-gauge superalloy materials. Traditional long-term creep (stress-relaxation) and oxidation tests are completed to establish a baseline. The temperature range used in this study is between 1200 and 1700°F. The alloys investigated are Incoloy MA 956, Waspaloy, Haynes 214, Haynes 242, Haynes 230, and Incoloy 718. The traditional creep test involves loading the specimens to a constant test mandrel radius of curvature, and measuring the retained radius of curvature as a function of time. The accelerated creep test uses a servohydraulic test machine to conduct single specimen, variable strain-rate load relaxation experiments. Standard metallographic evaluations are used to determine extent and morphology of attack in the traditional oxidation tests, while the accelerated oxidation test utilizes thermogravimetric analysis to obtain oxidation rate data. The traditional long-term creep testing indicates that the mechanically-alloyed material Incoloy MA 956 and Haynes alloy 214 may be suitable for long-term, high-temperature (above 1400°F) structural applications. The accelerated creep test produced a continuous linear function of log stress versus strain rate which can be used to calculate creep rate. The long-term and traditional oxidation tests indicate that Al2O3 scale formers such as Incoloy MA 956 and Haynes 214 are much more resistant to high-temperature oxidation than Cr2O3 scale formers such as Waspaloy. Both accelerated tests can be completed within roughly one day, and can evaluate multiple test temperatures using standardized single specimens. These simple experiments can be correlated with traditional long-term tests which require years to complete.
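
    Assuming the log-stress-versus-strain-rate relationship mentioned above is used in the usual power-law (Norton-type) form, the Python sketch below shows how such load-relaxation data could be fitted to extract a stress exponent and predict a creep rate. The numbers are synthetic and the fit is illustrative, not the author's analysis.

    # Illustrative fit of a Norton-type power law, strain_rate = A * stress**n,
    # to data of the kind the accelerated creep test produces. Its linearised
    # form, log(strain_rate) = log(A) + n*log(stress), is a straight line in
    # log-log space. The values below are invented, not measured data.
    import numpy as np

    stress = np.array([100.0, 150.0, 200.0, 300.0, 400.0])   # MPa (synthetic)
    strain_rate = 1e-20 * stress**4.5                          # 1/s  (synthetic)

    n, logA = np.polyfit(np.log(stress), np.log(strain_rate), 1)
    A = np.exp(logA)
    print(f"stress exponent n = {n:.2f}")
    print(f"predicted creep rate at 250 MPa: {A * 250.0**n:.3e} 1/s")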

  20. Hair Cortisol Concentrations in Adolescent Girls with Anorexia Nervosa are Lower Compared to Healthy and Psychiatric Controls.

    PubMed

    Föcker, Manuel; Stalder, Tobias; Kirschbaum, Clemens; Albrecht, Muriel; Adams, Frederike; de Zwaan, Martina; Hebebrand, Johannes; Peters, Triinu; Albayrak, Özgür

    2016-11-01

    In anorexia nervosa (AN), hypercortisolism has been described using urine, plasma and saliva samples as short-term markers of the hypothalamic-pituitary-adrenal (HPA) axis. Here, for the first time, we analyse hair cortisol concentration (HCC) as a marker for long-term integrated cortisol secretion in female patients with AN compared to female healthy controls (HC) and female psychiatric controls (PC). HCC was assessed in 22 female adolescent psychiatric inpatients with AN compared to 20 female HC and to 117 female PC of the same age range. For further analyses we examined the associations of age and body mass index (BMI) with HCC. Log HCC was lower in AN patients compared to HC (p = 0.030). BMI standard deviation scores (SDS) but not age correlated with log HCC (BMI-SDS: r = 0.19, bias corrected accelerated 95% confidence interval: [.04, .34], p = 0.015; age: r = 0.10, bias corrected accelerated 95% confidence interval: [-.07, .25], p = 0.213) when combining the AN, HC and PC samples. We find lower HCC in AN compared to HC and PC, respectively. Based on the relationship between HCC and BMI-SDS across AN, HC and PC, we argue that HCC might not capture endocrine alterations due to AN pathology-related processes but rather shows a consistent relationship with BMI, which extends even to the very low range of BMI values present in AN patients. Alternatively, incorporation of cortisol into the hair follicle might have been compromised by the trophic hair follicle disturbances that have previously been reported in AN patients. Copyright © 2016 John Wiley & Sons, Ltd and Eating Disorders Association.
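
    For readers unfamiliar with the bias-corrected and accelerated (BCa) bootstrap intervals quoted above, the following Python sketch shows how such an interval for a correlation can be computed with SciPy on synthetic data; it is not the study's analysis code, and the variable names and simulated effect size are assumptions.

    # Sketch of a bias-corrected and accelerated (BCa) bootstrap confidence
    # interval for a correlation, of the kind quoted above. Data are synthetic.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    bmi_sds = rng.normal(size=150)
    log_hcc = 0.2 * bmi_sds + rng.normal(size=150)   # weak positive association

    def pearson_r(x, y):
        return stats.pearsonr(x, y)[0]

    res = stats.bootstrap((bmi_sds, log_hcc), pearson_r,
                          paired=True, vectorized=False,
                          confidence_level=0.95, method="BCa")
    print("r =", round(pearson_r(bmi_sds, log_hcc), 2))
    print("BCa 95% CI:", res.confidence_interval)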
