Bruning, Oliver
2018-05-23
Overview of the operation and upgrade plans for the machine. Upgrade studies and task forces. The Chamonix 2010 discussions led to five new task forces: planning for a long shutdown in 2012 for splice consolidation; long-term consolidation planning for the injector complex; an SPS upgrade task force (accelerated program for the SPS upgrade); the PSB upgrade and its implications for the PS (e.g. radiation); the LHC High Luminosity project (investigate planning for ONE upgrade by 2018-2020); and the launch of a dedicated study on doubling the beam energy in the LHC (HE-LHC).
Toivanen, V; Bellodi, G; Dimov, V; Küchler, D; Lombardi, A M; Maintrot, M
2016-02-01
Linac3 is the first accelerator in the heavy ion injector chain of the Large Hadron Collider (LHC), providing multiply charged heavy ion beams for the CERN experimental program. The ion beams are produced with GTS-LHC, a 14.5 GHz electron cyclotron resonance ion source, operated in afterglow mode. Improvement of the GTS-LHC beam formation and beam transport along Linac3 is part of the upgrade program of the injector chain in preparation for the future high luminosity LHC. A mismatch between the ion beam properties in the ion source extraction region and the acceptance of the following Low Energy Beam Transport (LEBT) section has been identified as one of the factors limiting the Linac3 performance. The installation of a new focusing element, an einzel lens, into the GTS-LHC extraction region is foreseen as a part of the Linac3 upgrade, as well as a redesign of the first section of the LEBT. Details of the upgrade and results of a beam dynamics study of the extraction region and LEBT modifications will be presented.
HL-LHC and HE-LHC Upgrade Plans and Opportunities for US Participation
NASA Astrophysics Data System (ADS)
Apollinari, Giorgio
2017-01-01
The US HEP community has identified the exploitation of physics opportunities at the High Luminosity-LHC (HL-LHC) as the highest near-term priority. Thanks to multi-year R&D programs, US National Laboratories and Universities have taken the leadership in the development of technical solutions to increase the LHC luminosity, enabling the HL-LHC Project and uniquely positioning this country to make critical contributions to the LHC luminosity upgrade. This talk will describe the shaping of the US Program to contribute in the next decade to HL-LHC through newly developed technologies such as Nb3Sn focusing magnets or superconducting crab cavities. The experience gained through the execution of the HL-LHC Project in the US will constitute a pool of knowledge and capabilities allowing further developments in the future. Opportunities for US participations in proposed hadron colliders, such as a possible High Energy-LHC (HE-LHC), will be described as well.
Radiation Hard Silicon Particle Detectors for Phase-II LHC Trackers
NASA Astrophysics Data System (ADS)
Oblakowska-Mucha, A.
2017-02-01
The major LHC upgrade is planned after ten years of accelerator operation. It is foreseen to significantly increase the luminosity of the current machine, up to 10^35 cm^-2 s^-1, and to operate it as the upcoming High Luminosity LHC (HL-LHC). The major detector upgrade, called the Phase-II Upgrade, is also planned, a main reason being the aging caused by severe particle radiation. Within the RD50 Collaboration, a large Research and Development program has been underway to develop silicon sensors with sufficient radiation tolerance for HL-LHC trackers. In this summary, several results obtained during the testing of the devices after irradiation to HL-LHC levels are presented. Among the studied structures are advanced sensor types such as 3D silicon detectors, High-Voltage CMOS technologies, and sensors with intrinsic gain (LGADs). Based on these results, the RD50 Collaboration gives recommendations for the silicon detectors to be used in the detector upgrade.
Upgrade of the beam extraction system of the GTS-LHC electron cyclotron resonance ion source at CERN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toivanen, V., E-mail: ville.aleksi.toivanen@cern.ch; Bellodi, G.; Dimov, V.
2016-02-15
Linac3 is the first accelerator in the heavy ion injector chain of the Large Hadron Collider (LHC), providing multiply charged heavy ion beams for the CERN experimental program. The ion beams are produced with GTS-LHC, a 14.5 GHz electron cyclotron resonance ion source, operated in afterglow mode. Improvement of the GTS-LHC beam formation and beam transport along Linac3 is part of the upgrade program of the injector chain in preparation for the future high luminosity LHC. A mismatch between the ion beam properties in the ion source extraction region and the acceptance of the following Low Energy Beam Transport (LEBT) section has been identified as one of the factors limiting the Linac3 performance. The installation of a new focusing element, an einzel lens, into the GTS-LHC extraction region is foreseen as a part of the Linac3 upgrade, as well as a redesign of the first section of the LEBT. Details of the upgrade and results of a beam dynamics study of the extraction region and LEBT modifications will be presented.
NASA Astrophysics Data System (ADS)
Brodzinski, K.; Claudet, S.; Ferlin, G.; Tavian, L.; Wagner, U.; Van Weelderen, R.
The discovery of a Higgs boson at CERN in 2012 is the start of a major program of work to measure this particle's properties with the highest possible precision for testing the validity of the Standard Model and to search for further new physics at the energy frontier. The LHC is in a unique position to pursue this program. Europe's top priority is the exploitation of the full potential of the LHC, including the high-luminosity upgrade of the machine and detectors with an objective to collect ten times more data than in the initial design, by around 2030. To reach this objective, the LHC cryogenic system must be upgraded to withstand higher beam current and higher luminosity at top energy while keeping the same operation availability by improving the collimation system and the protection of electronics sensitive to radiation. This paper will present the conceptual design of the cryogenic system upgrade with recent updates in performance requirements, the corresponding layout and architecture of the system as well as the main technical challenges which have to be met in the coming years.
NASA Astrophysics Data System (ADS)
Tavian, L.; Brodzinski, K.; Claudet, S.; Ferlin, G.; Wagner, U.; van Weelderen, R.
The discovery of a Higgs boson at CERN in 2012 is the start of a major program of work to measure this particle's properties with the highest possible precision for testing the validity of the Standard Model and to search for further new physics at the energy frontier. The LHC is in a unique position to pursue this program. Europe's top priority is the exploitation of the full potential of the LHC, including the high-luminosity upgrade of the machine and detectors with an objective to collect ten times more data than in the initial design, by around 2030. To reach this objective, the LHC cryogenic system must be upgraded to withstand higher beam current and higher luminosity at top energy while keeping the same operation availability by improving the collimation system and the protection of electronics sensitive to radiation. This chapter will present the conceptual design of the cryogenic system upgrade with recent updates in performance requirements, the corresponding layout and architecture of the system as well as the main technical challenges which have to be met in the coming years.
NASA Astrophysics Data System (ADS)
Schmidt, Burkhard
2016-04-01
In the second phase of the LHC physics program, the accelerator will provide an additional integrated luminosity of about 2500/fb over 10 years of operation to the general purpose detectors ATLAS and CMS. This will substantially enlarge the mass reach in the search for new particles and will also greatly extend the potential to study the properties of the Higgs boson discovered at the LHC in 2012. In order to meet the experimental challenges of unprecedented pp luminosity, the experiments will need to address the aging of the present detectors and to improve the ability to isolate and precisely measure the products of the most interesting collisions. The lectures gave an overview of the physics motivation and described the conceptual designs and the expected performance of the upgrades of the four major experiments, ALICE, ATLAS, CMS and LHCb, along with the plans to develop the appropriate experimental techniques and a brief overview of the accelerator upgrade. Only some key points of the upgrade program of the four major experiments are discussed in this report; more information can be found in the references given at the end.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoynev, S.; et al.
The development of $$Nb_3Sn$$ quadrupole magnets for the High-Luminosity LHC upgrade is a joint venture between the US LHC Accelerator Research Program (LARP) and CERN with the goal of fabricating large aperture quadrupoles for the LHC interaction regions (IR). The inner triplet (low-β) NbTi quadrupoles in the IR will be replaced by the stronger Nb3Sn magnets, boosting the LHC program towards a 10-fold increase in integrated luminosity after the foreseen upgrades. Previously LARP conducted successful tests of short and long models with up to 120 mm aperture. The first short 150 mm aperture quadrupole model MQXFS1 was assembled with coils fabricated by both CERN and LARP. The magnet demonstrated strong performance at Fermilab’s vertical magnet test facility, reaching the LHC operating limits. This paper reports the latest results from MQXFS1 tests with changed pre-stress levels. The overall magnet performance, including quench training and memory, ramp rate and temperature dependence, is also summarized.
Current Lead Design for the Accelerator Project for Upgrade of LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandt, Jeffrey S.; Cheban, Sergey; Feher, Sandor
2010-01-01
The Accelerator Project for Upgrade of LHC (APUL) is a U.S. project participating in and contributing to CERN's Large Hadron Collider (LHC) upgrade program. In collaboration with Brookhaven National Laboratory, Fermilab is developing sub-systems for an upgrade of the LHC final focus magnet systems. A concept of main and auxiliary helium flow was developed that allows the superconductor to remain cold while the lead body warms up to prevent upper section frosting. The auxiliary flow will subsequently cool the thermal shields of the feed box and the transmission line cryostats. A thermal analysis of the current lead central heat exchange section was performed using analytic and FEA techniques. A method of remote soldering was developed that allows the current leads to be field replaceable. The remote solder joint was designed to be made without flux or additional solder, and able to be remade for up to ten full cycles. A method of upper section attachment was developed that allows high pressure sealing of the helium volume. Test fixtures for both remote soldering and upper section attachment for the 13 kA lead were produced. The cooling concept, thermal analyses, and test results from both remote soldering and upper section attachment fixtures are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chlachidze, G.; et al.
2016-08-30
The US LHC Accelerator Research Program (LARP) and CERN combined their efforts in developing Nb3Sn magnets for the High-Luminosity LHC upgrade. The ultimate goal of this collaboration is to fabricate large aperture Nb3Sn quadrupoles for the LHC interaction regions (IR). These magnets will replace the present 70 mm aperture NbTi quadrupole triplets for an expected increase of the LHC peak luminosity by a factor of 5. Over the past decade LARP successfully fabricated and tested short and long models of 90 mm and 120 mm aperture Nb3Sn quadrupoles. Recently the first short model of the 150 mm diameter quadrupole MQXFS was built with coils fabricated by both LARP and CERN. The magnet performance was tested at Fermilab’s vertical magnet test facility. This paper reports the test results, including the quench training at 1.9 K, ramp rate and temperature dependence studies.
Data acquisition and processing in the ATLAS tile calorimeter phase-II upgrade demonstrator
NASA Astrophysics Data System (ADS)
Valero, A.; Tile Calorimeter System, ATLAS
2017-10-01
The LHC has planned a series of upgrades culminating in the High Luminosity LHC which will have an average luminosity 5-7 times larger than the nominal Run 2 value. The ATLAS Tile Calorimeter will undergo an upgrade to accommodate the HL-LHC parameters. The TileCal readout electronics will be redesigned, introducing a new readout strategy. A Demonstrator program has been developed to evaluate the proposed new readout architecture and prototypes of all the components. In the Demonstrator, the detector data received in the Tile PreProcessors (PPr) are stored in pipeline buffers and, upon the reception of an external trigger signal, the data events are processed, packed and read out in parallel through the legacy ROD system, the new Front-End Link eXchange system and an Ethernet connection for monitoring purposes. This contribution describes in detail the data processing and the hardware, firmware and software components of the TileCal Demonstrator readout system.
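As a rough illustration of the pipeline-buffer readout described above, the following minimal sketch stores samples for every bunch crossing in a fixed-depth buffer and extracts one event on an external trigger; the buffer depth, data format and class names are invented for the example and are not the actual TileCal Demonstrator parameters.

```python
from collections import deque

class PipelineBuffer:
    """Minimal sketch of trigger-driven pipeline readout (illustrative only)."""

    def __init__(self, depth=256):
        # Fixed-depth circular buffer: one slot per bunch crossing,
        # oldest entries are overwritten automatically.
        self.buffer = deque(maxlen=depth)

    def store(self, bcid, samples):
        # Continuously store the digitized samples for every bunch crossing.
        self.buffer.append((bcid, samples))

    def read_out(self, triggered_bcid):
        # On an external trigger, find the matching bunch crossing and pack it.
        for bcid, samples in self.buffer:
            if bcid == triggered_bcid:
                return {"bcid": bcid, "payload": list(samples)}
        return None  # event already overwritten: trigger latency exceeded the depth

# Usage: fill the pipeline continuously, then read out one triggered crossing.
ppr = PipelineBuffer(depth=256)
for bcid in range(1000):
    ppr.store(bcid, samples=[bcid % 7] * 4)  # dummy ADC samples
print(ppr.read_out(triggered_bcid=998))
```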
LARP Nb3Sn Quadrupole Magnets for the LHC Luminosity Upgrade
NASA Astrophysics Data System (ADS)
Ferracin, P.
2010-04-01
The US LHC Accelerator Research Program (LARP) is a collaboration between four US laboratories (BNL, FNAL, LBNL, and SLAC) aimed at contributing to the commissioning and operation of the LHC and conducting R&D on its luminosity upgrade. Within LARP, the Magnet Program's main goal is to demonstrate that Nb3Sn superconducting magnets are a viable option for a future upgrade of the LHC Interaction Regions. Over the past four years, LARP has successfully fabricated and tested several R&D magnets: 1) the subscale quadrupole magnet SQ, to perform technology studies with 300 mm long racetrack coils, 2) the technology quadrupole TQ, to investigate support structure behavior with 1 m long cos 2θ coils, and 3) the long racetrack magnet LR, to test 3.6 m long racetrack coils. The next milestone consists of the fabrication and test of the 3.7 m long quadrupole magnet LQ, with the goal of demonstrating that Nb3Sn technology is mature for use in high energy accelerators. After an overview of the design features and test results of the LARP magnets fabricated so far, this paper focuses on the status of the fabrication of LQ: we describe the production of the 3.4 m long cos 2θ coils and the qualification of the support structure. Finally, the status of the development of the next 1 m long model HQ, conceived to explore stress and field limits of Nb3Sn superconducting magnets, is presented.
Final Report: High Energy Physics at the Energy Frontier at Louisiana Tech
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sawyer, Lee; Wobisch, Markus; Greenwood, Zeno D.
The Louisiana Tech University High Energy Physics group has developed a research program aimed at experimentally testing the Standard Model of particle physics and searching for new phenomena through a focused set of analyses in collaboration with the ATLAS experiment at the Large Hadron Collider (LHC) at the CERN laboratory in Geneva. This research program includes involvement in the current operation and maintenance of the ATLAS experiment and full involvement in Phase 1 and Phase 2 upgrades in preparation for future high luminosity (HL-LHC) operation of the LHC. Our focus is solely on the ATLAS experiment at the LHC, with some related detector development and software efforts. We have established important service roles on ATLAS in five major areas: Triggers, especially jet triggers; Data Quality monitoring; grid computing; GPU applications for upgrades; and radiation testing for upgrades. Our physics research is focused on multijet measurements and top quark physics in final states containing tau leptons, which we propose to extend into related searches for new phenomena. Focusing on closely related topics in the jet and top analyses and coordinating these analyses in our group has led to high efficiency and increased visibility inside the ATLAS collaboration and beyond. Based on our work in the DØ experiment in Run II of the Fermilab Tevatron Collider, Louisiana Tech has developed a reputation as one of the leading institutions pursuing jet physics studies. Currently we are applying this expertise to the ATLAS experiment, with several multijet analyses in progress.
Electron Lenses for the Large Hadron Collider
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stancari, Giulio; Valishev, Alexander; Bruce, Roderik
Electron lenses are pulsed, magnetically confined electron beams whose current-density profile is shaped to obtain the desired effect on the circulating beam. Electron lenses were used in the Fermilab Tevatron collider for bunch-by-bunch compensation of long-range beam-beam tune shifts, for removal of uncaptured particles in the abort gap, for preliminary experiments on head-on beam-beam compensation, and for the demonstration of halo scraping with hollow electron beams. Electron lenses for beam-beam compensation are being commissioned in RHIC at BNL. Within the US LHC Accelerator Research Program and the European HiLumi LHC Design Study, hollow electron beam collimation was studied as an option to complement the collimation system for the LHC upgrades. This project is moving towards a technical design in 2014, with the goal to build the devices in 2015-2017, after resuming LHC operations and re-assessing needs and requirements at 6.5 TeV. Because of their electric charge and the absence of materials close to the proton beam, electron lenses may also provide an alternative to wires for long-range beam-beam compensation in LHC luminosity upgrade scenarios with small crossing angles.
Upgrade of the ATLAS Hadronic Tile Calorimeter for the High Luminosity LHC
NASA Astrophysics Data System (ADS)
Tortajada, Ignacio Asensi
2018-01-01
The Large Hadron Collider (LHC) has envisaged a series of upgrades towards a High Luminosity LHC (HL-LHC) delivering five times the LHC nominal instantaneous luminosity. The ATLAS Phase II upgrade, in 2024, will accommodate the upgrade of the detector and data acquisition system for the HL-LHC. The Tile Calorimeter (TileCal) will undergo a major replacement of its on- and off-detector electronics. In the new architecture, all signals will be digitized and then transferred directly to the off-detector electronics, where the signals will be reconstructed, stored, and sent to the first level of trigger at a rate of 40 MHz. This will provide better precision of the calorimeter signals used by the trigger system and will allow the development of more complex trigger algorithms. Changes to the electronics will also contribute to the reliability and redundancy of the system. Three different front-end options are presently being investigated for the upgrade, two of them based on ASICs, and a final solution will be chosen after extensive laboratory and test beam studies that are in progress. A hybrid demonstrator module is being developed using the new electronics while maintaining compatibility with the current system. The status of the developments will be presented, including results from several tests with particle beams.
Three Generations of FPGA DAQ Development for the ATLAS Pixel Detector
NASA Astrophysics Data System (ADS)
Mayer, Joseph A., II
The Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) follows a schedule of long physics runs, followed by periods of inactivity known as Long Shutdowns (LS). During these LS phases, both the LHC and the experiments around its ring undergo maintenance and upgrades. For the LHC, these upgrades improve its ability to deliver data to physicists; the more data the LHC can produce, the more opportunities there are for rare events of interest to physicists to appear. The experiments upgrade so that they can record the data and ensure such events are not missed. Currently the LHC is in Run 2, having completed the first LS of three. This thesis focuses on the development of Field-Programmable Gate Array (FPGA)-based readout systems spanning three major tasks of the ATLAS Pixel data acquisition (DAQ) system. The evolution of the Pixel DAQ's Readout Driver (ROD) card is presented, starting from improvements made to the new Insertable B-Layer (IBL) ROD design, which was part of the LS1 upgrade, and continuing with upgrades to the old RODs from Run 1 to help them run more efficiently in Run 2. It also includes the research and development of FPGA-based DAQs and integrated circuit emulators for the ITk upgrade, which will occur during LS3 in 2025.
Muon Physics at Run-I and its upgrade plan
NASA Astrophysics Data System (ADS)
Benekos, Nektarios Chr.
2015-05-01
The Large Hadron Collider (LHC) and its multi-purpose detector, ATLAS, have been operated successfully at record centre-of-mass energies of 7 and 8 TeV. After this successful LHC Run-1, plans are actively advancing for a series of upgrades, culminating roughly 10 years from now in the high luminosity LHC (HL-LHC) project, delivering on the order of five times the LHC nominal instantaneous luminosity along with luminosity leveling. The final goal is to extend the data set from the few hundred fb^-1 expected for LHC running to 3000 fb^-1 by around 2030. To cope with the corresponding rate increase, the ATLAS detector needs to be upgraded. The upgrade will proceed in two steps: Phase I in the LHC shutdown 2018/19 and Phase II in 2023-25. The largest of the ATLAS Phase-1 upgrades concerns the replacement of the first muon station of the high-rapidity region, the so-called New Small Wheel. This configuration copes with the highest rates expected in Phase II and considerably enhances the performance of the forward muon system by adding triggering functionality to the first muon station. This article presents the main muon physics results from LHC Run-1, based on a total luminosity of 30 fb^-1. Prospects for the ongoing and future data taking are also presented. We will conclude with an update of the status of the project and the steps towards a complete operational system, ready to be installed in ATLAS in 2018/19.
NASA Astrophysics Data System (ADS)
Perin, A.; Dhalla, F.; Gayet, P.; Serio, L.
2017-12-01
SM18 is CERN's main facility for testing superconducting accelerator magnets and superconducting RF cavities. Its cryogenic infrastructure will have to be significantly upgraded in the coming years, starting in 2019, to meet the testing requirements for the LHC High Luminosity project and for the R&D program for superconducting magnets and RF equipment until 2023 and beyond. This article presents the assessment of the cryogenic needs based on the foreseen test program and on past testing experience. The current configuration of the cryogenic infrastructure is presented and several possible upgrade scenarios are discussed. The chosen upgrade configuration is then described and the characteristics of the main newly required cryogenic equipment, in particular a new 35 g/s helium liquefier, are presented. The upgrade implementation strategy and plan to meet the required schedule are then described.
Calibration techniques and strategies for the present and future LHC electromagnetic calorimeters
NASA Astrophysics Data System (ADS)
Aleksa, M.
2018-02-01
This document describes the different calibration strategies and techniques applied by the two general purpose experiments at the LHC, ATLAS and CMS, and discusses them, underlining their respective strengths and weaknesses from the author's point of view. The resulting performances of both calorimeters are described and compared on the basis of selected physics results. Future upgrade plans for the High Luminosity LHC (HL-LHC) are briefly introduced and planned calibration strategies for the upgraded detectors are shown.
Support Structure Design of the Nb$$_3$$Sn Quadrupole for the High Luminosity LHC
Juchno, M.; Ambrosio, G.; Anerella, M.; ...
2014-10-31
New low-β quadrupole magnets are being developed within the scope of the High Luminosity LHC (HL-LHC) project in collaboration with the US LARP program. The aim of the HL-LHC project is to study and implement machine upgrades necessary for increasing the luminosity of the LHC. The new quadrupoles, which are based on the Nb₃Sn superconducting technology, will be installed in the LHC Interaction Regions and will have to generate a gradient of 140 T/m in a coil aperture of 150 mm. In this paper, we describe the design of the short model magnet support structure and discuss results of the detailed 3D numerical analysis performed in preparation for the first short model test.
Readout Electronics for the ATLAS LAr Calorimeter at HL-LHC
NASA Astrophysics Data System (ADS)
Chen, Hucheng; ATLAS Liquid Argon Calorimeter Group
The ATLAS Liquid Argon (LAr) calorimeters are high precision, high sensitivity and high granularity detectors designed to provide precision measurements of electrons, photons, jets and missing transverse energy. ATLAS and its LAr calorimeters have been operating and collecting proton-proton collisions at the LHC since 2009. The current front-end electronics of the LAr calorimeters need to be upgraded to sustain the higher radiation levels and data rates expected at the upgraded high luminosity LHC machine (HL-LHC), which will have 5 times more luminosity than the LHC in its ultimate configuration. The complexity of the present electronics and the obsolescence of some of the components of which it is made will not allow a partial replacement of the system. A completely new readout architecture scheme is under study and many components are being developed in various R&D programs of the LAr Calorimeter Group. The new front-end readout electronics will send data continuously at each bunch crossing through high speed radiation resistant optical links. The data will be processed in real time, with the possibility of implementing trigger algorithms for clusters and electron/photon identification at a higher granularity than that which is currently implemented. The new architecture will eliminate the intrinsic limitation presently existing on the Level-1 trigger acceptance. This article is an overview of the R&D activities, covering architectural design aspects of the new electronics as well as some detailed progress on the development of several ASICs needed, and preliminary studies with FPGAs to cover the backend functions including part of the Level-1 trigger requirements. A recently proposed staged upgrade with a hybrid Tower Builder Board (TBB) is also described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Daniel W.; Ambrosio, Giorgio; Anderssen, Eric C.
Here, the LHC Accelerator Research Program (LARP), in collaboration with CERN and under the scope of the high luminosity upgrade of the Large Hadron Collider, is in the prototyping stage in the development of a 150 mm aperture high-field Nb3Sn quadrupole magnet called MQXF. This magnet is mechanically supported using a shell-based support structure, which has been extensively demonstrated on several R&D models within LARP, as well as in the more recent short (1.2 m magnetic length) MQXF model program. The MQXFA magnets are each 4.2 m magnetic length, and the first mechanical long model, MQXFA1M (using aluminum surrogate coils), and the MQXFAP1 prototype magnet (the first prototype with Nb3Sn coils) have been assembled at LBNL. In this paper, we summarize the tooling and the assembly processes, and discuss the mechanical performance of these first two assemblies, comparing strain gauge data with finite element model analysis, as well as the near-term plans for the long MQXF magnet program.
Cheng, Daniel W.; Ambrosio, Giorgio; Anderssen, Eric C.; ...
2018-01-30
Here, the LHC Accelerator Research Program (LARP), in collaboration with CERN and under the scope of the high luminosity upgrade of the Large Hadron Collider, is in the prototyping stage in the development of a 150 mm aperture high-field Nb3Sn quadrupole magnet called MQXF. This magnet is mechanically supported using a shell-based support structure, which has been extensively demonstrated on several R&D models within LARP, as well as in the more recent short (1.2 m magnetic length) MQXF model program. The MQXFA magnets are each 4.2 m magnetic length, and the first mechanical long model, MQXFA1M (using aluminum surrogate coils), and the MQXFAP1 prototype magnet (the first prototype with Nb3Sn coils) have been assembled at LBNL. In this paper, we summarize the tooling and the assembly processes, and discuss the mechanical performance of these first two assemblies, comparing strain gauge data with finite element model analysis, as well as the near-term plans for the long MQXF magnet program.
SLHC, the High-Luminosity Upgrade (public event)
None
2017-12-09
On the morning of June 23rd, a public event is being organised in CERN's Council Chamber with the aim of providing the particle physics community with up-to-date information about the strategy for the LHC luminosity upgrade and of describing the current status of preparation work. The presentations will provide an overview of the various accelerator sub-projects, the LHC physics prospects and the upgrade plans of ATLAS and CMS. This event is organised in the framework of the SLHC-PP project, which receives funding from the European Commission for the preparatory phase of the LHC High Luminosity Upgrade project. Informing the public is among the objectives of this EU-funded project. A simultaneous transmission of this meeting will be broadcast, available at the following address: http://webcast.cern.ch/
Field Quality Study of a 1-m-Long Single-Aperture 11-T Nb$$_3$$Sn Dipole Model for LHC Upgrades
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chlachidze, G.; DiMarco, J.; Andreev, N.
2014-01-01
FNAL and CERN are carrying out a joint R&D program with the goal of building a 5.5-m-long twin-aperture 11-T Nb3Sn dipole prototype that is suitable for installation in the LHC. An important part of the program is the development and test of a series of short single-aperture and twin-aperture dipole models with a nominal field of 11 T at the LHC operation current of 11.85 kA and 20% margin. This paper presents the results of magnetic measurements of a 1-m-long single-aperture Nb3Sn dipole model fabricated and tested recently at FNAL, including geometrical field harmonics and the effects of coil magnetization and iron yoke saturation.
The 11 T dipole for HL-LHC: Status and plan
Savary, F.; Barzi, E.; Bordini, B.; ...
2016-06-01
The upgrade of the Large Hadron Collider (LHC) collimation system includes additional collimators in the LHC lattice. The longitudinal space for these collimators will be created by replacing some of the LHC main dipoles with shorter but stronger dipoles compatible with the LHC lattice and main systems. The project plan comprises the construction of two cryoassemblies, each containing two 11-T dipoles of 5.5-m length, for possible installation on either side of interaction point 2 of the LHC in the years 2018-2019 for ion operation, and the installation of two cryoassemblies on either side of interaction point 7 of the LHC in the years 2023-2024 for proton operation. The development program conducted jointly by the Fermilab and CERN magnet groups is progressing well. The development activities carried out on the side of Fermilab were concluded in the middle of 2015 with the fabrication and test of a 1-m-long two-in-one model, and those on the CERN side are ramping up with the construction of 2-m-long models and the preparation of the tooling for the fabrication of the first full-length prototype. The engineering design of the cryomagnet is well advanced, including the definition of the various interfaces, e.g., with the collimator, powering, protection, and vacuum systems. Several practice coils of 5.5-m length have already been fabricated. This paper describes the overall progress of the project, the final design of the cryomagnet, and the performance of the most recent models. Furthermore, the overall plan toward the fabrication of the series magnets for the two phases of the upgrade of the LHC collimation system is also presented.
First Test Results of the 150 mm Aperture IR Quadrupole Models for the High Luminosity LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosio, G.; Chlachidze, G.; Wanderer, P.
2016-10-06
The High Luminosity upgrade of the LHC at CERN will use large aperture (150 mm) quadrupole magnets to focus the beams at the interaction points. The high field in the coils requires Nb3Sn superconductor technology, which has been brought to maturity by the LHC Accelerator Research Program (LARP) over the last 10 years. The key design targets for the new IR quadrupoles were established in 2012, and fabrication of model magnets started in 2014. This paper discusses the results from the first single short coil test and from the first short quadrupole model test. Remaining challenges and plans to address them are also presented and discussed.
NASA Astrophysics Data System (ADS)
Nellist, C.; Dinu, N.; Gkougkousis, E.; Lounis, A.
2015-06-01
The LHC accelerator complex will be upgraded between 2020 and 2022 to the High-Luminosity LHC, to considerably increase the statistics available for the various physics analyses. To operate under these challenging new conditions and maintain excellent performance in track reconstruction and vertex location, the ATLAS pixel detector must be substantially upgraded and a full replacement is expected. Processing techniques for novel pixel designs are optimised through characterisation of test structures in a clean room and also through simulations with Technology Computer Aided Design (TCAD). A method to study non-perpendicular tracks through a pixel device is discussed. A comparison of TCAD simulations with Secondary Ion Mass Spectrometry (SIMS) measurements to investigate the doping profile of structures and validate the simulation process is also presented.
The CMS High Granularity Calorimeter for the High Luminosity LHC
NASA Astrophysics Data System (ADS)
Sauvan, J.-B.
2018-02-01
The High Luminosity LHC (HL-LHC) will integrate 10 times more luminosity than the LHC, posing significant challenges for radiation tolerance and event pileup on detectors, especially for forward calorimetry, and hallmarks the issue for future colliders. As part of its HL-LHC upgrade program, the CMS collaboration is designing a High Granularity Calorimeter to replace the existing endcap calorimeters. It features unprecedented transverse and longitudinal segmentation for both electromagnetic (ECAL) and hadronic (HCAL) compartments. This will facilitate particle-flow calorimetry, where the fine structure of showers can be measured and used to enhance pileup rejection and particle identification, whilst still achieving good energy resolution. The ECAL and a large fraction of HCAL will be based on hexagonal silicon sensors of 0.5-1 cm^2 cell size, with the remainder of the HCAL based on highly-segmented scintillators with silicon photomultiplier (SiPM) readout. The intrinsic high-precision timing capabilities of the silicon sensors will add an extra dimension to event reconstruction, especially in terms of pileup rejection.
Electronics for CMS Endcap Muon Level-1 Trigger System Phase-1 and HL LHC upgrades
NASA Astrophysics Data System (ADS)
Madorsky, A.
2017-07-01
To accommodate high-luminosity LHC operation at a 13 TeV collision energy, the CMS Endcap Muon Level-1 Trigger system had to be significantly modified. To provide robust track reconstruction, the trigger system must now import all available trigger primitives generated by the Cathode Strip Chambers and by certain other subsystems, such as Resistive Plate Chambers (RPC). In addition to massive input bandwidth, this also required a significant increase in logic and memory resources. To satisfy these requirements, a new Sector Processor unit has been designed. It consists of three modules. The Core Logic module houses the large FPGA that contains the track-finding logic and multi-gigabit serial links for data exchange. The Optical module contains optical receivers and transmitters; it communicates with the Core Logic module via a custom backplane section. The Pt Lookup table (PTLUT) module contains 1 GB of low-latency memory that is used to assign the final Pt to reconstructed muon tracks. The μTCA architecture (adopted by CMS) was used for this design. The talk presents the details of the hardware and firmware design of the production system based on the Xilinx Virtex-7 FPGA family. The next round of LHC and CMS upgrades starts in 2019, followed by a major High-Luminosity (HL) LHC upgrade starting in 2024. In the course of these upgrades, new Gas Electron Multiplier (GEM) detectors and more RPC chambers will be added to the Endcap Muon system. In order to keep up with all these changes, a new Advanced Processor unit is being designed. This device will be based on Xilinx UltraScale+ FPGAs. It will be able to accommodate up to 100 serial links with bit rates of up to 25 Gb/s, and provide up to 2.5 times more logic resources than the device used currently. The amount of PTLUT memory will be significantly increased to provide more flexibility for the Pt assignment algorithm. The talk presents preliminary details of the hardware design program.
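To make the lookup-table idea above concrete, the sketch below packs a few quantized track quantities into a memory address and assigns pT with a single table access; the address layout, table size and the toy parametrization are invented for illustration and do not reflect the actual Sector Processor format.

```python
import numpy as np

# Hypothetical address layout: 6 bits of |dphi|, 5 bits of |dtheta|, 4 bits of track mode.
DPHI_BITS, DTHETA_BITS, MODE_BITS = 6, 5, 4
TABLE_SIZE = 1 << (DPHI_BITS + DTHETA_BITS + MODE_BITS)

def make_address(dphi, dtheta, mode):
    # Pack the quantized track quantities into a single table address.
    return (mode << (DPHI_BITS + DTHETA_BITS)) | (dtheta << DPHI_BITS) | dphi

# Fill the table offline with a toy rule: small bending angle -> high pT.
pt_lut = np.empty(TABLE_SIZE, dtype=np.uint16)
for dphi in range(1 << DPHI_BITS):
    for dtheta in range(1 << DTHETA_BITS):
        for mode in range(1 << MODE_BITS):
            pt_gev = min(255.0, 100.0 / (dphi + 1))                     # toy parametrization
            pt_lut[make_address(dphi, dtheta, mode)] = int(pt_gev * 2)  # stored in 0.5 GeV units

# Online, pT assignment is a single low-latency memory access per track.
print(pt_lut[make_address(dphi=3, dtheta=10, mode=5)] * 0.5, "GeV")
```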
Introduction to the HL-LHC Project
NASA Astrophysics Data System (ADS)
Rossi, L.; Brüning, O.
The Large Hadron Collider (LHC) is one of the largest scientific instruments ever built. It has been exploring the new energy frontier since 2010, gathering a global user community of 7,000 scientists. To extend its discovery potential, the LHC will need a major upgrade in the 2020s to increase its luminosity (rate of collisions) by a factor of five beyond its design value and the integrated luminosity by a factor of ten. As a highly complex and optimized machine, such an upgrade of the LHC must be carefully studied and requires about ten years to implement. The novel machine configuration, called High Luminosity LHC (HL-LHC), will rely on a number of key innovative technologies, representing exceptional technological challenges, such as cutting-edge 11-12 tesla superconducting magnets, very compact superconducting cavities for beam rotation with ultra-precise phase control, new technology for beam collimation and 300-meter-long high-power superconducting links with negligible energy dissipation. HL-LHC federates efforts and R&D of a large community in Europe, in the US and in Japan, which will facilitate the implementation of the construction phase as a global project.
Induced activation studies for the LHC upgrade to High Luminosity LHC
NASA Astrophysics Data System (ADS)
Adorisio, C.; Roesler, S.
2018-06-01
The Large Hadron Collider (LHC) will be upgraded in 2019/2020 to increase its luminosity (rate of collisions) by a factor of five beyond its design value and the integrated luminosity by a factor of ten, in order to maintain scientific progress and exploit its full capacity. The novel machine configuration, called High Luminosity LHC (HL-LHC), will consequently increase the level of activation of its components. The evaluation of the radiological impact of the HL-LHC operation in the Long Straight Sections of the Insertion Region 1 (ATLAS) and Insertion Region 5 (CMS) is presented. Using the Monte Carlo code FLUKA, ambient dose equivalent rate estimations have been performed on the basis of two announced operating scenarios and using the latest available machine layout. The HL-LHC project requires new technical infrastructure with caverns and 300 m long tunnels along the Insertion Regions 1 and 5. The new underground service galleries will be accessible during the operation of the accelerator machine. The radiological risk assessment for the Civil Engineering work foreseen to start excavating the new galleries in the next LHC Long Shutdown and the radiological impact of the machine operation will be discussed.
NASA Astrophysics Data System (ADS)
Castro, Andrew; Alice-Usa Collaboration; Alice-Tpc Collaboration
2017-09-01
The Time Projection Chamber (TPC) currently used for ALICE (A Large Ion Collider Experiment at CERN) is a gaseous tracking detector used to study both proton-proton and heavy-ion collisions at the Large Hadron Collider (LHC). In order to accommodate the higher-luminosity collisions planned for LHC Run-3 starting in 2021, the ALICE-TPC will undergo a major upgrade during the next LHC shutdown. The TPC is limited to a readout rate of 1000 Hz for minimum bias events due to the intrinsic dead time associated with ion back-flow in the multi-wire proportional chambers (MWPCs) of the TPC. The TPC upgrade will handle the increase in event readout to 50 kHz for heavy-ion minimum bias triggered events expected with the Run-3 luminosity by replacing the MWPCs with a stack of four Gaseous Electron Multiplier (GEM) foils. The GEM layers will combine different hole pitches to reduce the dead time while maintaining the current spatial and energy resolution of the existing TPC. Undertaking the upgrade of the TPC represents a massive endeavor in terms of design, production, construction, quality assurance, and installation, and the upgrade is therefore coordinated across a number of institutes worldwide. The talk will cover the physics motivation for the upgrade, the ALICE-USA contribution to the construction of the Inner Readout Chambers (IROCs), and QA of the first chambers built in the U.S.
Implications of the 750 GeV γγ Resonance as a Case Study for the International Linear Collider
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fujii, Keisuke; Grojean, Christophe; Peskin, Michael E.
If the γγ resonance at 750 GeV suggested by 2015 LHC data turns out to be a real effect, what are the implications for the physics case and upgrade path of the International Linear Collider? Whether or not the resonance is confirmed, this question provides an interesting case study testing the robustness of the ILC physics case. In this note, we address this question with two points: (1) Almost all models proposed for the new 750 GeV particle require additional new particles with electroweak couplings. The key elements of the 500 GeV ILC physics program - precision measurements of the Higgs boson, the top quark, and 4-fermion interactions - will powerfully discriminate among these models. This information will be important in conjunction with new LHC data, or alone, if the new particles accompanying the 750 GeV resonance are beyond the mass reach of the LHC. (2) Over a longer term, the energy upgrade of the ILC to 1 TeV already discussed in the ILC TDR will enable experiments in γγ and e+e- collisions to directly produce and study the 750 GeV particle from these unique initial states.
Magnetic analysis of the Nb$$_3$$Sn low-beta quadrupole for the high luminosity LHC
Bermudez, Susana Izquierdo; Ambrosio, G.; Chlachidze, G.; ...
2017-01-10
As part of the Large Hadron Collider Luminosity upgrade (HiLumi-LHC) program, the US LARP collaboration and CERN are working together to design and build 150 mm aperture Nb3Sn quadrupoles for the LHC interaction regions. A first series of 1.5 m long coils was fabricated, assembled and tested in the first short model. This paper presents the magnetic analysis, comparing magnetic field measurements with the expectations and the field quality requirements. The analysis is focused on the geometrical harmonics, the iron saturation effect and the cold-warm correlation. Three-dimensional effects, such as the variability of the field harmonics along the magnet axis and the contribution of the coil ends, are also discussed. Furthermore, we present the influence of the conductor magnetization and the dynamic effects.
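For orientation, the geometrical harmonics discussed here are the coefficients of the standard multipole expansion used in accelerator-magnet field-quality analyses (a textbook convention, not a formula quoted from the paper), with the normal and skew harmonics b_n and a_n expressed in units of 10^-4 of the main field at a reference radius R_ref:

$$ B_y + i\,B_x \;=\; B_{\mathrm{ref}}\,10^{-4} \sum_{n=1}^{\infty} \left(b_n + i\,a_n\right) \left(\frac{x + i\,y}{R_{\mathrm{ref}}}\right)^{n-1} $$

For a quadrupole the main component is b_2, and R_ref is conventionally taken as about two-thirds of the aperture radius, i.e. 50 mm for the 150 mm aperture magnets described above.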
Velev, G. V.; Chlachidze, G.; DiMarco, J.; ...
2016-01-06
In the past 10 years, Fermilab has been executing an intensive R&D program on accelerator magnets based on Nb3Sn superconductor technology. This R&D effort includes dipole and quadrupole models for different programs, such as LARP and 11 T dipoles for the LHC high-luminosity upgrade. Before the Nb3Sn R&D program, Fermilab was involved in the production of the low-beta quadrupole magnets for the LHC based on the NbTi superconductor. Additionally, during the 2003-2005 campaign to optimize the operation of the Tevatron, a large number of Tevatron magnets were re-measured. As a result of this field analysis, a systematic study of the persistent current decay and snapback effect in these magnets was performed. This paper summarizes the results of this study and presents a comparison between Nb3Sn and NbTi dipoles and quadrupoles.
The LHCb trigger and its upgrade
NASA Astrophysics Data System (ADS)
Dziurda, A.; LHCb Trigger Group
2016-07-01
The current LHCb trigger system consists of a hardware level, which reduces the LHC inelastic collision rate of 30 MHz to 1 MHz, at which the entire detector is read out. In a second level, implemented in a farm of 20k parallel-processing CPUs, the event rate is reduced to about 5 kHz. We review the performance of the LHCb trigger system during Run I of the LHC. Special attention is given to the use of multivariate analyses in the High Level Trigger. The major bottleneck for hadronic decays is the hardware trigger. LHCb plans a major upgrade of the detector and DAQ system in the LHC shutdown of 2018, enabling a purely software-based trigger to process the full 30 MHz of inelastic collisions delivered by the LHC. We demonstrate that the planned architecture will be able to meet this challenge.
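The rejection factors implied by the rates quoted above can be checked with a few lines of arithmetic; the 1 MHz intermediate (hardware-level output) rate is the nominal Run I read-out rate and is included here for orientation.

```python
# Back-of-the-envelope rejection factors for the Run I LHCb trigger chain.
lhc_inelastic_rate_hz = 30e6   # visible inelastic collision rate at LHCb
l0_output_rate_hz     = 1e6    # hardware (L0) trigger output / detector read-out rate
hlt_output_rate_hz    = 5e3    # software High Level Trigger output to storage

print(f"hardware rejection : x{lhc_inelastic_rate_hz / l0_output_rate_hz:.0f}")   # x30
print(f"software rejection : x{l0_output_rate_hz / hlt_output_rate_hz:.0f}")      # x200
print(f"overall rejection  : x{lhc_inelastic_rate_hz / hlt_output_rate_hz:.0f}")  # x6000
```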
NASA Astrophysics Data System (ADS)
Senkin, Sergey
2018-01-01
The ATLAS Collaboration has started a vast programme of upgrades in the context of the high-luminosity LHC (HL-LHC) foreseen in 2024. We present here one of the front-end readout options, an ASIC called FATALIC, proposed for the high-luminosity LHC upgrade of the ATLAS Tile Calorimeter. Based on a 130 nm CMOS technology, FATALIC performs the complete signal processing, including amplification, shaping and digitisation. We describe the full characterisation of FATALIC and also the Optimal Filtering signal reconstruction method adapted to fully exploit the FATALIC three-range layout. Additionally we present the resolution performance of the whole chain measured using the charge injection system designed for calibration. Finally we discuss the results of the signal reconstruction used on real data collected during a preliminary beam test at CERN.
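Optimal Filtering, mentioned above, estimates the pulse amplitude (and time) as a weighted sum of the pedestal-subtracted samples; the weights and samples below are placeholders rather than the actual FATALIC pulse-shape coefficients, so the sketch only illustrates the structure of the method.

```python
import numpy as np

def optimal_filter_amplitude(samples, pedestal, weights):
    """Amplitude estimate A = sum_i a_i * (s_i - pedestal).

    The weights a_i are normally derived offline from the known pulse shape
    and the noise autocorrelation; the values used here are illustrative.
    """
    return float(np.dot(weights, np.asarray(samples, dtype=float) - pedestal))

# Seven consecutive ADC samples around the pulse peak (dummy values).
samples  = [51, 60, 112, 230, 180, 95, 62]
pedestal = 50.0
weights  = np.array([-0.10, 0.05, 0.30, 0.55, 0.25, 0.05, -0.10])  # placeholder weights

print(f"reconstructed amplitude ~ {optimal_filter_amplitude(samples, pedestal, weights):.1f} ADC counts")
```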
Development of superconducting links for the Large Hadron Collider machine
NASA Astrophysics Data System (ADS)
Ballarino, Amalia
2014-04-01
In the framework of the upgrade of the Large Hadron Collider (LHC) machine, new superconducting lines are being developed for the feeding of the LHC magnets. The proposed electrical layout envisages the location of the power converters in surface buildings, and the transfer of the current from the surface to the LHC tunnel, where the magnets are located, via superconducting links containing tens of cables feeding different circuits and transferring altogether more than 150 kA. Depending on the location, the links will have a length ranging from 300 m to 500 m, and they will span a vertical distance of about 80 m. An overview of the R&D program that has been launched by CERN is presented, with special attention to the development of novel types of cables made from MgB2 and high temperature superconductors (Bi-2223 and REBCO) and to the results of the tests performed on prototype links. Plans for future activities are presented, together with a timeline for potential future integration in the LHC machine.
Detector Developments for the High Luminosity LHC Era (4/4)
Bortoletto, Daniela
2018-02-09
Tracking Detectors - Part II. Calorimetry, muon detection, vertexing, and tracking will play a central role in determining the physics reach for the High Luminosity LHC era. In these lectures we will cover the requirements, options, and the R&D efforts necessary to upgrade the current LHC detectors and enable discoveries.
Detector Developments for the High Luminosity LHC Era (3/4)
Bortoletto, Daniela
2018-01-23
Tracking Detectors - Part I. Calorimetry, muon detection, vertexing, and tracking will play a central role in determining the physics reach for the High Luminosity LHC era. In these lectures we will cover the requirements, options, and the R&D efforts necessary to upgrade the current LHC detectors and enable discoveries.
NASA Astrophysics Data System (ADS)
Massironi, A.
2018-04-01
The upgrade of the Compact Muon Solenoid (CMS) crystal electromagnetic calorimeter (ECAL), which will operate at the High Luminosity Large Hadron Collider (HL-LHC), will achieve a timing resolution of around 30 ps for high energy photons and electrons. In this talk we will discuss the benefits of precision timing for the ECAL event reconstruction at HL-LHC. Simulation studies focused on the timing properties of PbWO4 crystals, as well as the impact of the photosensors and the readout electronics on the timing performance, will be presented. Test beam studies intended to measure the timing performance of the PbWO4 crystals with different photosensors and readout electronics will be shown.
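As a rough back-of-the-envelope conversion (not a number from the talk itself), a 30 ps time resolution corresponds to a distance scale of

$$ c\,\sigma_t \approx \left(3\times 10^{8}\ \mathrm{m/s}\right)\times\left(30\ \mathrm{ps}\right) \approx 9\ \mathrm{mm}, $$

which is small compared with the several-centimetre longitudinal spread of the luminous region; since the collisions within a bunch crossing are themselves spread over roughly 150-200 ps, per-object timing at the 30 ps level lets spatially overlapping pileup vertices be separated in the time dimension.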
Fermilab Heroes of the LHC: Steve Nahn and Vivian O’Dell
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nahn, Steve; O’Dell, Vivian
2017-09-11
The experiments based at the Large Hadron Collider in Switzerland are undergoing a constant series of upgrades. Fermilab scientists Steve Nahn and Vivian O’Dell lead these upgrade efforts in the United States.
Pixel sensors with slim edges and small pitches for the CMS upgrades for HL-LHC
Vernieri, Caterina; Bolla, Gino; Rivera, Ryan; ...
2016-06-07
Here, planar n-in-n silicon detectors with small pitches and slim edges are being investigated for the innermost layers of tracking devices for the foreseen upgrades of the LHC experiments. Sensor prototypes compatible with the CMS readout, fabricated by Sintef, were tested in the laboratory and with a 120 GeV/c proton beam at the Fermilab test beam facility before and after irradiation up to a fluence of 2 × 10^15 n_eq/cm^2. Preliminary results of the data analysis are presented.
The CMS Level-1 trigger for LHC Run II
NASA Astrophysics Data System (ADS)
Tapper, A.
2018-02-01
During LHC Run II the centre-of-mass energy of pp collisions has increased from 8 TeV up to 13 TeV and the instantaneous luminosity has progressed towards 2 × 10^34 cm^-2 s^-1. In order to guarantee a successful and ambitious physics programme under these conditions, the CMS trigger system has been upgraded. The upgraded CMS Level-1 trigger is designed to improve performance at high luminosity and a large number of simultaneous inelastic collisions per crossing. The trigger design, implementation and commissioning are summarised, and performance results are described.
Modeling radiation damage to pixel sensors in the ATLAS detector
NASA Astrophysics Data System (ADS)
Ducourthial, A.
2018-03-01
Silicon pixel detectors are at the core of the current and planned upgrade of the ATLAS detector at the Large Hadron Collider (LHC). As the closest detector component to the interaction point, these detectors will be subject to a significant amount of radiation over their lifetime: prior to the High-Luminosity LHC (HL-LHC) [1], the innermost layers will receive a fluence in excess of 10^15 n_eq/cm^2, and the HL-LHC detector upgrades must cope with an order of magnitude higher fluence integrated over their lifetimes. Simulating radiation damage is essential in order to make accurate predictions for current and future detector performance that will enable searches for new particles and forces as well as precision measurements of Standard Model particles such as the Higgs boson. We present a digitization model that includes radiation damage effects on the ATLAS pixel sensors for the first time. In addition to thoroughly describing the setup, we present first predictions for basic pixel cluster properties alongside early studies with LHC Run 2 proton-proton collision data.
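A standard ingredient of such digitization models (quoted here as a generic parametrization, not necessarily the exact form used in the ATLAS model described above) is charge trapping, in which the drifting charge decays with an effective trapping time that scales inversely with the 1 MeV neutron-equivalent fluence:

$$ Q(t) = Q_0\,e^{-t/\tau_{\mathrm{eff}}}, \qquad \frac{1}{\tau_{\mathrm{eff}}} = \beta_{e,h}\,\Phi_{\mathrm{eq}}, $$

with measured trapping constants β_{e,h} of order a few × 10^-16 cm^2/ns, so that τ_eff drops to a few nanoseconds at the 10^15 n_eq/cm^2 fluences quoted above, comparable to the carrier drift time across a planar pixel sensor.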
The CMS Data Acquisition - Architectures for the Phase-2 Upgrade
NASA Astrophysics Data System (ADS)
Andre, J.-M.; Behrens, U.; Branson, J.; Brummer, P.; Chaze, O.; Cittolin, S.; Contescu, C.; Craigs, B. G.; Darlea, G.-L.; Deldicque, C.; Demiragli, Z.; Dobson, M.; Doualot, N.; Erhan, S.; Fulcher, J. F.; Gigi, D.; Gładki, M.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Janulis, M.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; O'Dell, V.; Orsini, L.; Paus, C.; Petrova, P.; Pieri, M.; Racz, A.; Reis, T.; Sakulin, H.; Schwick, C.; Simelevicius, D.; Zejdl, P.
2017-10-01
The upgraded High Luminosity LHC, after the third Long Shutdown (LS3), will provide an instantaneous luminosity of 7.5 × 10^34 cm^-2 s^-1 (levelled), at the price of extreme pileup of up to 200 interactions per crossing. In LS3, the CMS Detector will also undergo a major upgrade to prepare for the phase-2 of the LHC physics program, starting around 2025. The upgraded detector will be read out at an unprecedented data rate of up to 50 Tb/s and an event rate of 750 kHz. Complete events will be analysed by software algorithms running on standard processing nodes, and selected events will be stored permanently at a rate of up to 10 kHz for offline processing and analysis. In this paper we discuss the baseline design of the DAQ and HLT systems for the phase-2, taking into account the projected evolution of high speed network fabrics for event building and distribution, and the anticipated performance of general purpose CPU. Implications on hardware and infrastructure requirements for the DAQ “data center” are analysed. Emerging technologies for data reduction are considered. Novel possible approaches to event building and online processing, inspired by trending developments in other areas of computing dealing with large masses of data, are also examined. We conclude by discussing the opportunities offered by reading out and processing parts of the detector, wherever the front-end electronics allows, at the machine clock rate (40 MHz). This idea presents interesting challenges and its physics potential should be studied.
The CMS Data Acquisition - Architectures for the Phase-2 Upgrade
Andre, J-M; Behrens, U.; Branson, J.; ...
2017-10-01
The upgraded High Luminosity LHC, after the third Long Shutdown (LS3), will provide an instantaneous luminosity of 7.5 × 10^34 cm^-2 s^-1 (levelled), at the price of extreme pileup of up to 200 interactions per crossing. In LS3, the CMS Detector will also undergo a major upgrade to prepare for the phase-2 of the LHC physics program, starting around 2025. The upgraded detector will be read out at an unprecedented data rate of up to 50 Tb/s and an event rate of 750 kHz. Complete events will be analysed by software algorithms running on standard processing nodes, and selected events will be stored permanently at a rate of up to 10 kHz for offline processing and analysis. In this paper we discuss the baseline design of the DAQ and HLT systems for the phase-2, taking into account the projected evolution of high speed network fabrics for event building and distribution, and the anticipated performance of general purpose CPU. Implications on hardware and infrastructure requirements for the DAQ “data center” are analysed. Emerging technologies for data reduction are considered. Novel possible approaches to event building and online processing, inspired by trending developments in other areas of computing dealing with large masses of data, are also examined. We conclude by discussing the opportunities offered by reading out and processing parts of the detector, wherever the front-end electronics allows, at the machine clock rate (40 MHz). This idea presents interesting challenges and its physics potential should be studied.
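The readout and storage figures quoted above fix the average event size and the storage bandwidth by simple arithmetic (no additional design numbers are assumed):

```python
# CMS Phase-2 DAQ back-of-the-envelope numbers, using only the figures quoted above.
readout_throughput_bps = 50e12   # up to 50 Tb/s delivered by the detector readout
l1_accept_rate_hz      = 750e3   # 750 kHz event rate into the HLT farm
storage_rate_hz        = 10e3    # up to 10 kHz written out for offline analysis

event_size_bits = readout_throughput_bps / l1_accept_rate_hz
event_size_mb   = event_size_bits / 8 / 1e6
storage_bw_gbps = event_size_bits * storage_rate_hz / 1e9

print(f"average event size ~ {event_size_mb:.1f} MB")       # ~8.3 MB per event
print(f"storage bandwidth  ~ {storage_bw_gbps:.0f} Gb/s")   # ~670 Gb/s to permanent storage
```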
The front-end data conversion and readout electronics for the CMS ECAL upgrade
NASA Astrophysics Data System (ADS)
Mazza, G.; Cometti, S.
2018-03-01
The High Luminosity LHC (HL-LHC) will require a significant upgrade of the readout electronics for the CMS Electromagnetic Calorimeter (ECAL). The Very Front-End (VFE) output signal will be sampled at 160 MS/s (i.e. four times the current sampling rate) with 13-bit resolution. Therefore, a high-speed, high-resolution ADC is required. Moreover, each readout channel will produce 2.08 Gb/s, thus requiring fast data transmission circuitry. A new readout architecture, based on two 12-bit, 160 MS/s ADCs, lossless data compression algorithms and fast serial links, has been developed for the ECAL upgrade. These functions will be integrated in a single ASIC which is currently under design in a commercial 65 nm CMOS technology using radiation damage mitigation techniques.
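The per-channel figure quoted above follows directly from the sampling parameters; the one-line check below simply multiplies the quoted sampling rate by the quoted resolution.

    # 160 MS/s at 13-bit resolution gives the quoted 2.08 Gb/s per channel before compression.
    sampling_rate = 160e6      # samples per second
    bits_per_sample = 13
    print(f"{sampling_rate * bits_per_sample / 1e9:.2f} Gb/s per channel")   # 2.08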
Quench protection studies of the 11 T Nb3Sn dipole for the LHC upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bermudez, Susana Izquierdo; Auchmann, Bernhard; Bajas, Hugues
The planned upgrade of the LHC collimation system foresees additional collimators to be installed in the dispersion suppressor areas. Fermilab and CERN are developing an 11 T Nb3Sn dipole to replace some of the 8.33 T, 15 m long Nb-Ti LHC main dipoles, providing longitudinal space for the collimators. In case of a quench, the large stored energy and the low copper stabilizer fraction make the protection of the 11 T Nb3Sn dipoles challenging. This paper presents the results of quench protection analysis, including quench protection heater design and efficiency, quench propagation and coil heating. The numerical results are compared with the experimental data from the 2 m long Nb3Sn dipole models. The validated model is used to predict the current decay and hot spot temperature under operating conditions in the LHC, and the presently foreseen magnet protection scheme is discussed.
Quench protection studies of the 11 T Nb3Sn dipole for the LHC upgrade
Bermudez, Susana Izquierdo; Auchmann, Bernhard; Bajas, Hugues; ...
2016-06-01
The planned upgrade of the LHC collimation system foresees additional collimators to be installed in the dispersion suppressor areas. Fermilab and CERN are developing an 11 T Nb3Sn dipole to replace some of the 8.33 T, 15 m long Nb-Ti LHC main dipoles, providing longitudinal space for the collimators. In case of a quench, the large stored energy and the low copper stabilizer fraction make the protection of the 11 T Nb3Sn dipoles challenging. This paper presents the results of quench protection analysis, including quench protection heater design and efficiency, quench propagation and coil heating. The numerical results are compared with the experimental data from the 2 m long Nb3Sn dipole models. The validated model is used to predict the current decay and hot spot temperature under operating conditions in the LHC, and the presently foreseen magnet protection scheme is discussed.
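A key quantity in such quench protection analyses is the quench load, the integral of I^2 over the current decay, which together with the cable cross section and material properties determines the hot-spot temperature. The toy numbers below (initial current and decay time constant) are illustrative assumptions, not values from these papers.

    # Quench load ("MIITs") for an assumed exponential current decay I(t) = I0*exp(-t/tau).
    I0 = 11.85e3     # A, assumed initial magnet current
    tau = 0.100      # s, assumed effective decay time constant
    quench_load = I0**2 * tau / 2.0            # integral of I^2 dt, in A^2 s
    print(f"quench load ~ {quench_load/1e6:.1f} MIITs")   # ~7.0 (1 MIIT = 1e6 A^2 s)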
Quench Protection Studies of 11 T Nb3Sn Dipole Models for LHC Upgrades
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zlobin, Alexander; Chlachidze, Guram; Nobrega, Alfred
CERN and FNAL are developing 11 T Nb3Sn dipole magnets for the LHC collimation system upgrade. Due to the large stored energy, protection of these magnets during a quench is a challenging problem. This paper reports the results of experimental studies of key quench protection parameters including longitudinal and radial quench propagation in the coil, coil heating due to a quench, and energy extraction and quench-back effect. The studies were performed using a 1 m long 11 T Nb3Sn dipole coil tested in a magnetic mirror configuration.
LHC Status and Upgrade Challenges
NASA Astrophysics Data System (ADS)
Smith, Jeffrey
2009-11-01
The Large Hadron Collider has had a trying start-up, and a challenging operational future lies ahead. Critical to the machine's performance is controlling a beam of particles whose stored energy is equivalent to 80 kg of TNT. Unavoidable beam losses result in energy deposition throughout the machine, and without adequate protection this power would quench the superconducting magnets. The machine layout and principles of operation will be briefly reviewed, including a summary of the September 2008 accident. The current status of the LHC, the startup schedule and the upgrade options to achieve the target luminosity will be presented.
Numerical Analysis of Parasitic Crossing Compensation with Wires in DAΦNE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valishev, A.; Shatilov, D.; Milardi, C.
2015-06-24
Current-bearing wire compensators were successfully used in the 2005-2006 run of the DAΦNE collider to mitigate the detrimental effects of parasitic beam-beam interactions. A marked improvement of the positron beam lifetime was observed in machine operation with the KLOE detector. In view of the possible application of wire beam-beam compensators for the High Luminosity LHC upgrade, we revisit the DAΦNE experiments. We use an improved model of the accelerator with the goal of validating the modern simulation tools and providing valuable input for the LHC upgrade project.
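The principle behind wire compensation is that a DC wire produces an integrated kick with the same 1/r dependence as the long-range beam-beam force it is meant to cancel. The sketch below evaluates this kick for illustrative numbers (the integrated current, distance and beam momentum are assumptions, not DAΦNE or HL-LHC parameters).

    import math

    MU0 = 4e-7 * math.pi    # T m / A

    def wire_kick(int_current_Am, distance_m, brho_Tm):
        """Integrated angular deflection (rad) from a wire with integrated current
        int_current_Am (A*m) seen at transverse distance distance_m (m)."""
        return MU0 * int_current_Am / (2.0 * math.pi * brho_Tm * distance_m)

    brho = 0.510 / 0.2998   # T m, rigidity of a ~510 MeV/c positron beam (assumed)
    print(f"{wire_kick(10.0, 0.02, brho) * 1e6:.0f} urad")   # ~59 urad for 10 A m at 2 cm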
NASA Astrophysics Data System (ADS)
Solovyanov, Oleg
2017-10-01
The Tile Calorimeter (TileCal) of the ATLAS experiment at the LHC is the central hadronic calorimeter designed for energy reconstruction of hadrons, jets, tau particles and missing transverse energy. TileCal is a scintillator-steel sampling calorimeter and it covers the region of pseudo-rapidity up to 1.7, with almost 10000 channels measuring energies ranging from ~30 MeV to ~2 TeV. Each stage of the signal production, from scintillation light to the signal reconstruction, is monitored and calibrated. The performance of the Tile calorimeter has been studied in situ employing cosmic ray muons and a large sample of proton-proton collisions acquired during the operation of the LHC. Prompt isolated muons of high momentum from electroweak boson decays are employed to study the energy response of the calorimeter at the electromagnetic scale. The calorimeter response to hadronic particles is evaluated with a sample of isolated hadrons. The modelling of the response by the Monte Carlo simulation is discussed. The calorimeter timing calibration and resolution are studied with a sample of multijet events. Results on the calorimeter operation and performance are presented, including the calibration, stability, absolute energy scale, uniformity and time resolution. TileCal performance satisfies the design requirements and has provided an essential contribution to physics results in ATLAS. The Large Hadron Collider (LHC) has envisaged a series of upgrades towards a High Luminosity LHC (HL-LHC), delivering five times the LHC nominal instantaneous luminosity. The ATLAS Phase II upgrade, in 2024, will adapt the detector and data acquisition system to the HL-LHC. In particular, the Tile Calorimeter will undergo a major replacement of its on- and off-detector electronics. All signals will be digitised and then transferred directly to the off-detector electronics, where the signals will be reconstructed, stored, and sent to the first level of trigger at a rate of 40 MHz. This will provide better precision for the calorimeter signals used by the trigger system and will allow the development of more complex trigger algorithms. Changes to the electronics will also contribute to the reliability and redundancy of the system. Three different front-end options are presently being investigated for the upgrade. Results of extensive laboratory and beam tests of the three options will be presented, as well as the latest results on the development of the power distribution and the off-detector electronics.
Second-generation coil design of the Nb3Sn low-β quadrupole for the high luminosity LHC
Bermudez, S. Izquierdo; Ambrosio, G.; Ballarino, A.; ...
2016-01-18
As part of the Large Hadron Collider Luminosity upgrade (HiLumi-LHC) program, the US LARP collaboration and CERN are working together to design and build a 150 mm aperture Nb3Sn quadrupole for the LHC interaction regions. A first series of 1.5 m long coils was fabricated and assembled into a first short model. A detailed visual inspection of the coils was carried out to investigate cable dimensional changes during heat treatment and the position of the windings in the coil straight section and in the end region. The analyses allowed the identification of a set of design changes which, combined with a fine tuning of the cable geometry and a field quality optimization, were implemented in a new, second-generation coil design. In this study, we review the main characteristics of the first-generation coils, describe the modifications in the coil layout, and discuss their impact on parts design and magnet analysis.
Development of micromegas muon chambers for the ATLAS upgrade
NASA Astrophysics Data System (ADS)
Wotschack, J.
2012-02-01
Large-area particle detectors based on the bulk-micromegas technology are an attractive choice for the upgrade of LHC detectors and/or detectors for the ILC or other experiments. In the context of the R&D for the ATLAS Muon System upgrade, we have built detectors of order 1 m^2. In order to overcome the spark problem in micromegas, a novel protection scheme using resistive strips above the readout electrode has been developed. This technology has undergone extensive tests with hadron beams at the CERN-SPS, with X-rays in the lab, as well as in a neutron beam. In addition, four 10 × 10 cm^2 micromegas chambers have been installed in the ATLAS cavern and are taking data under LHC conditions. We will discuss the underlying design of the chambers and present results on the performance of these chambers.
Detector Developments for the High Luminosity LHC Era (1/4)
Straessner, Arno
2018-04-27
Calorimetry and Muon Spectrometers - Part I: In the first part of the lecture series, the motivation for a high luminosity upgrade of the LHC will be quickly reviewed together with the challenges for the LHC detectors. In particular, the plans and ongoing research for new calorimeter detectors will be explained. The main issues in the high-luminosity era are the required radiation tolerance, the natural ageing of detector components, and challenging trigger and physics requirements. The new technological solutions for calorimetry at a high-luminosity LHC will be reviewed.
Searching for New Physics with Top Quarks and Upgrade to the Muon Spectrometer at ATLAS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwarz, Thomas Andrew
2015-06-29
Over the funding period of this award, my research has focused on searching for new physics with top quarks and in the Higgs sector. The highly energetic top quark events at the LHC are an excellent venue to search for new physics, as well as to make standard model measurements. Further, the recent discovery of the Higgs boson motivates searching for new physics that could be associated with it. This one-year award has facilitated the beginning of my research program, which has resulted in four publications, several conference talks, and multiple leadership positions within physics groups. Additionally, we are contributing to ATLAS upgrades and operations. As part of the Phase I upgrade, I have taken on the responsibility for the design, prototyping, and quality control of a signal packet router for the trigger electronics of the New Small Wheel. This is a critical component of the upgrade, as the router is the main switchboard for all trigger signals to the track finding processors. I am also leading the Phase II upgrade of the readout electronics of the muon spectrometer, and have been selected as the USATLAS Level-2 manager of the Phase II upgrade of the muon spectrometer. The award has been critical in these contributions to the experiment.
Large Hadron Collider commissioning and first operation.
Myers, S
2012-02-28
A history of the commissioning and the very successful early operation of the Large Hadron Collider (LHC) is described. The accident that interrupted the first commissioning, its repair and the enhanced protection system put in place are fully described. The LHC beam commissioning and operational performance are reviewed for the period from 2010 to mid-2011. Preliminary plans for operation and future upgrades for the LHC are given for the short and medium term.
Physics Goals and Experimental Challenges of the Proton-Proton High-Luminosity Operation of the LHC
NASA Astrophysics Data System (ADS)
Campana, P.; Klute, M.; Wells, P. S.
2016-10-01
The completion of Run 1 of the Large Hadron Collider (LHC) at CERN has seen the discovery of the Higgs boson and an unprecedented number of precise measurements of the Standard Model, and Run 2 has begun to provide the first data at higher energy. The high-luminosity upgrade of the LHC (HL-LHC) and the four experiments (ATLAS, CMS, ALICE, and LHCb) will exploit the full potential of the collider to discover and explore new physics beyond the Standard Model. We review the experimental challenges and the physics opportunities in proton-proton collisions at the HL-LHC.
P-Type Silicon Strip Sensors for the new CMS Tracker at HL-LHC
NASA Astrophysics Data System (ADS)
Adam, W.; Bergauer, T.; Brondolin, E.; Dragicevic, M.; Friedl, M.; Frühwirth, R.; Hoch, M.; Hrubec, J.; König, A.; Steininger, H.; Waltenberger, W.; Alderweireldt, S.; Beaumont, W.; Janssen, X.; Lauwers, J.; Van Mechelen, P.; Van Remortel, N.; Van Spilbeeck, A.; Beghin, D.; Brun, H.; Clerbaux, B.; Delannoy, H.; De Lentdecker, G.; Fasanella, G.; Favart, L.; Goldouzian, R.; Grebenyuk, A.; Karapostoli, G.; Lenzi, Th.; Léonard, A.; Luetic, J.; Postiau, N.; Seva, T.; Vanlaer, P.; Vannerom, D.; Wang, Q.; Zhang, F.; Abu Zeid, S.; Blekman, F.; De Bruyn, I.; De Clercq, J.; D'Hondt, J.; Deroover, K.; Lowette, S.; Moortgat, S.; Moreels, L.; Python, Q.; Skovpen, K.; Van Mulders, P.; Van Parijs, I.; Bakhshiansohi, H.; Bondu, O.; Brochet, S.; Bruno, G.; Caudron, A.; Delaere, C.; Delcourt, M.; De Visscher, S.; Francois, B.; Giammanco, A.; Jafari, A.; Komm, M.; Krintiras, G.; Lemaitre, V.; Magitteri, A.; Mertens, A.; Michotte, D.; Musich, M.; Piotrzkowski, K.; Quertenmont, L.; Szilasi, N.; Vidal Marono, M.; Wertz, S.; Beliy, N.; Caebergs, T.; Daubie, E.; Hammad, G. H.; Härkönen, J.; Lampén, T.; Luukka, P.; Peltola, T.; Tuominen, E.; Tuovinen, E.; Eerola, P.; Tuuva, T.; Baulieu, G.; Boudoul, G.; Caponetto, L.; Combaret, C.; Contardo, D.; Dupasquier, T.; Gallbit, G.; Lumb, N.; Mirabito, L.; Perries, S.; Vander Donckt, M.; Viret, S.; Agram, J.-L.; Andrea, J.; Bloch, D.; Bonnin, C.; Brom, J.-M.; Chabert, E.; Chanon, N.; Charles, L.; Conte, E.; Fontaine, J.-Ch.; Gross, L.; Hosselet, J.; Jansova, M.; Tromson, D.; Autermann, C.; Feld, L.; Karpinski, W.; Kiesel, K. M.; Klein, K.; Lipinski, M.; Ostapchuk, A.; Pierschel, G.; Preuten, M.; Rauch, M.; Schael, S.; Schomakers, C.; Schulz, J.; Schwering, G.; Wlochal, M.; Zhukov, V.; Pistone, C.; Fluegge, G.; Kuensken, A.; Pooth, O.; Stahl, A.; Aldaya, M.; Asawatangtrakuldee, C.; Beernaert, K.; Bertsche, D.; Contreras-Campana, C.; Eckerlin, G.; Eckstein, D.; Eichhorn, T.; Gallo, E.; Garay Garcia, J.; Hansen, K.; Haranko, M.; Harb, A.; Hauk, J.; Keaveney, J.; Kalogeropoulos, A.; Kleinwort, C.; Lohmann, W.; Mankel, R.; Maser, H.; Mittag, G.; Muhl, C.; Mussgiller, A.; Pitzl, D.; Reichelt, O.; Savitskyi, M.; Schuetze, P.; Walsh, R.; Zuber, A.; Biskop, H.; Buhmann, P.; Centis-Vignali, M.; Garutti, E.; Haller, J.; Hoffmann, M.; Lapsien, T.; Matysek, M.; Perieanu, A.; Scharf, Ch.; Schleper, P.; Schmidt, A.; Schwandt, J.; Sonneveld, J.; Steinbrück, G.; Vormwald, B.; Wellhausen, J.; Abbas, M.; Amstutz, C.; Barvich, T.; Barth, Ch.; Boegelspacher, F.; De Boer, W.; Butz, E.; Caselle, M.; Colombo, F.; Dierlamm, A.; Freund, B.; Hartmann, F.; Heindl, S.; Husemann, U.; Kornmayer, A.; Kudella, S.; Muller, Th.; Simonis, H. J.; Steck, P.; Weber, M.; Weiler, Th.; Anagnostou, G.; Asenov, P.; Assiouras, P.; Daskalakis, G.; Kyriakis, A.; Loukas, D.; Paspalaki, L.; Siklér, F.; Veszprémi, V.; Bhardwaj, A.; Dalal, R.; Jain, G.; Ranjan, K.; Bakhshiansohl, H.; Behnamian, H.; Khakzad, M.; Naseri, M.; Cariola, P.; Creanza, D.; De Palma, M.; De Robertis, G.; Fiore, L.; Franco, M.; Loddo, F.; Silvestris, L.; Maggi, G.; Martiradonna, S.; My, S.; Selvaggi, G.; Albergo, S.; Cappello, G.; Chiorboli, M.; Costa, S.; Di Mattia, A.; Giordano, F.; Potenza, R.; Saizu, M. A.; Tricomi, A.; Tuve, C.; Barbagli, G.; Brianzi, M.; Ciaranfi, R.; Ciulli, V.; Civinini, C.; D'Alessandro, R.; Focardi, E.; Latino, G.; Lenzi, P.; Meschini, M.; Paoletti, S.; Russo, L.; Scarlini, E.; Sguazzoni, G.; Strom, D.; Viliani, L.; Ferro, F.; Lo Vetere, M.; Robutti, E.; Dinardo, M. E.; Fiorendi, S.; Gennai, S.; Malvezzi, S.; Manzoni, R. 
A.; Menasce, D.; Moroni, L.; Pedrini, D.; Azzi, P.; Bacchetta, N.; Bisello, D.; Dall'Osso, M.; Pozzobon, N.; Tosi, M.; De Canio, F.; Gaioni, L.; Manghisoni, M.; Nodari, B.; Riceputi, E.; Re, V.; Traversi, G.; Comotti, D.; Ratti, L.; Alunni Solestizi, L.; Biasini, M.; Bilei, G. M.; Cecchi, C.; Checcucci, B.; Ciangottini, D.; Fanò, L.; Gentsos, C.; Ionica, M.; Leonardi, R.; Manoni, E.; Mantovani, G.; Marconi, S.; Mariani, V.; Menichelli, M.; Modak, A.; Morozzi, A.; Moscatelli, F.; Passeri, D.; Placidi, P.; Postolache, V.; Rossi, A.; Saha, A.; Santocchia, A.; Storchi, L.; Spiga, D.; Androsov, K.; Azzurri, P.; Arezzini, S.; Bagliesi, G.; Basti, A.; Boccali, T.; Borrello, L.; Bosi, F.; Castaldi, R.; Ciampa, A.; Ciocci, M. A.; Dell'Orso, R.; Donato, S.; Fedi, G.; Giassi, A.; Grippo, M. T.; Ligabue, F.; Lomtadze, T.; Magazzu, G.; Martini, L.; Mazzoni, E.; Messineo, A.; Moggi, A.; Morsani, F.; Palla, F.; Palmonari, F.; Raffaelli, F.; Rizzi, A.; Savoy-Navarro, A.; Spagnolo, P.; Tenchini, R.; Tonelli, G.; Venturi, A.; Verdini, P. G.; Bellan, R.; Costa, M.; Covarelli, R.; Da Rocha Rolo, M.; Demaria, N.; Rivetti, A.; Dellacasa, G.; Mazza, G.; Migliore, E.; Monteil, E.; Pacher, L.; Ravera, F.; Solano, A.; Fernandez, M.; Gomez, G.; Jaramillo Echeverria, R.; Moya, D.; Gonzalez Sanchez, F. J.; Vila, I.; Virto, A. L.; Abbaneo, D.; Ahmed, I.; Albert, E.; Auzinger, G.; Berruti, G.; Bianchi, G.; Blanchot, G.; Bonnaud, J.; Caratelli, A.; Ceresa, D.; Christiansen, J.; Cichy, K.; Daguin, J.; D'Auria, A.; Detraz, S.; Deyrail, D.; Dondelewski, O.; Faccio, F.; Frank, N.; Gadek, T.; Gill, K.; Honma, A.; Hugo, G.; Jara Casas, L. M.; Kaplon, J.; Kornmayer, A.; Kottelat, L.; Kovacs, M.; Krammer, M.; Lenoir, P.; Mannelli, M.; Marchioro, A.; Marconi, S.; Mersi, S.; Martina, S.; Michelis, S.; Moll, M.; Onnela, A.; Orfanelli, S.; Pavis, S.; Peisert, A.; Pernot, J.-F.; Petagna, P.; Petrucciani, G.; Postema, H.; Rose, P.; Tropea, P.; Troska, J.; Tsirou, A.; Vasey, F.; Vichoudis, P.; Verlaat, B.; Zwalinski, L.; Bachmair, F.; Becker, R.; di Calafiori, D.; Casal, B.; Berger, P.; Djambazov, L.; Donega, M.; Grab, C.; Hits, D.; Hoss, J.; Kasieczka, G.; Lustermann, W.; Mangano, B.; Marionneau, M.; Martinez Ruiz del Arbol, P.; Masciovecchio, M.; Meinhard, M.; Perozzi, L.; Roeser, U.; Starodumov, A.; Tavolaro, V.; Wallny, R.; Zhu, D.; Amsler, C.; Bösiger, K.; Caminada, L.; Canelli, F.; Chiochia, V.; de Cosa, A.; Galloni, C.; Hreus, T.; Kilminster, B.; Lange, C.; Maier, R.; Ngadiuba, J.; Pinna, D.; Robmann, P.; Taroni, S.; Yang, Y.; Bertl, W.; Deiters, K.; Erdmann, W.; Horisberger, R.; Kaestli, H.-C.; Kotlinski, D.; Langenegger, U.; Meier, B.; Rohe, T.; Streuli, S.; Cussans, D.; Flacher, H.; Goldstein, J.; Grimes, M.; Jacob, J.; Seif El Nasr-Storey, S.; Cole, J.; Hoad, C.; Hobson, P.; Morton, A.; Reid, I. D.; Auzinger, G.; Bainbridge, R.; Dauncey, P.; Hall, G.; James, T.; Magnan, A.-M.; Pesaresi, M.; Raymond, D. M.; Uchida, K.; Garabedian, A.; Heintz, U.; Narain, M.; Nelson, J.; Sagir, S.; Speer, T.; Swanson, J.; Tersegno, D.; Watson-Daniels, J.; Chertok, M.; Conway, J.; Conway, R.; Flores, C.; Lander, R.; Pellett, D.; Ricci-Tam, F.; Squires, M.; Thomson, J.; Yohay, R.; Burt, K.; Ellison, J.; Hanson, G.; Olmedo, M.; Si, W.; Yates, B. R.; Gerosa, R.; Sharma, V.; Vartak, A.; Yagil, A.; Zevi Della Porta, G.; Dutta, V.; Gouskos, L.; Incandela, J.; Kyre, S.; Mullin, S.; Patterson, A.; Qu, H.; White, D.; Dominguez, A.; Bartek, R.; Cumalat, J. P.; Ford, W. 
T.; Jensen, F.; Johnson, A.; Krohn, M.; Leontsinis, S.; Mulholland, T.; Stenson, K.; Wagner, S. R.; Apresyan, A.; Bolla, G.; Burkett, K.; Butler, J. N.; Canepa, A.; Cheung, H. W. K.; Chramowicz, J.; Christian, D.; Cooper, W. E.; Deptuch, G.; Derylo, G.; Gingu, C.; Grünendahl, S.; Hasegawa, S.; Hoff, J.; Howell, J.; Hrycyk, M.; Jindariani, S.; Johnson, M.; Kahlid, F.; Lei, C. M.; Lipton, R.; Lopes De Sá, R.; Liu, T.; Los, S.; Matulik, M.; Merkel, P.; Nahn, S.; Prosser, A.; Rivera, R.; Schneider, B.; Sellberg, G.; Shenai, A.; Spiegel, L.; Tran, N.; Uplegger, L.; Voirin, E.; Berry, D. R.; Chen, X.; Ennesser, L.; Evdokimov, A.; Evdokimov, O.; Gerber, C. E.; Hofman, D. J.; Makauda, S.; Mills, C.; Sandoval Gonzalez, I. D.; Alimena, J.; Antonelli, L. J.; Francis, B.; Hart, A.; Hill, C. S.; Parashar, N.; Stupak, J.; Bortoletto, D.; Bubna, M.; Hinton, N.; Jones, M.; Miller, D. H.; Shi, X.; Tan, P.; Baringer, P.; Bean, A.; Khalil, S.; Kropivnitskaya, A.; Majumder, D.; Wilson, G.; Ivanov, A.; Mendis, R.; Mitchell, T.; Skhirtladze, N.; Taylor, R.; Anderson, I.; Fehling, D.; Gritsan, A.; Maksimovic, P.; Martin, C.; Nash, K.; Osherson, M.; Swartz, M.; Xiao, M.; Bloom, K.; Claes, D. R.; Fangmeier, C.; Gonzalez Suarez, R.; Monroy, J.; Siado, J.; Hahn, K.; Sevova, S.; Sung, K.; Trovato, M.; Bartz, E.; Gershtein, Y.; Halkiadakis, E.; Kyriacou, S.; Lath, A.; Nash, K.; Osherson, M.; Schnetzer, S.; Stone, R.; Walker, M.; Malik, S.; Norberg, S.; Ramirez Vargas, J. E.; Alyari, M.; Dolen, J.; Godshalk, A.; Harrington, C.; Iashvili, I.; Kharchilava, A.; Nguyen, D.; Parker, A.; Rappoccio, S.; Roozbahani, B.; Alexander, J.; Chaves, J.; Chu, J.; Dittmer, S.; McDermott, K.; Mirman, N.; Rinkevicius, A.; Ryd, A.; Salvati, E.; Skinnari, L.; Soffi, L.; Tao, Z.; Thom, J.; Tucker, J.; Zientek, M.; Akgün, B.; Ecklund, K. M.; Kilpatrick, M.; Nussbaum, T.; Zabel, J.; Betchart, B.; Covarelli, R.; Demina, R.; Hindrichs, O.; Petrillo, G.; Eusebi, R.; Osipenkov, I.; Perloff, A.; Ulmer, K. A.
2017-06-01
The upgrade of the LHC to the High-Luminosity LHC (HL-LHC) is expected to increase the LHC design luminosity by an order of magnitude. This will require silicon tracking detectors with a significantly higher radiation hardness. The CMS Tracker Collaboration has conducted an irradiation and measurement campaign to identify suitable silicon sensor materials and strip designs for the future outer tracker at the CMS experiment. Based on these results, the collaboration has chosen to use n-in-p type silicon sensors and focus further investigations on the optimization of that sensor type. This paper describes the main measurement results and conclusions that motivated this decision.
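One way to see why radiation hardness dominates the sensor choice is the well-known scaling of the bulk leakage current with fluence, ΔI = α·Φ_eq·V; the damage constant and geometry below are typical literature values used for illustration, not numbers from this paper.

    # Leakage current increase of an irradiated silicon sensor (illustrative).
    alpha = 4e-17        # A/cm, current-related damage constant at ~20 C (typical value, assumed)
    phi_eq = 1e15        # neq/cm^2, HL-LHC-scale fluence
    volume = 1.0 * 0.03  # cm^3, 1 cm^2 of active area x 300 um thickness (assumed)

    delta_I = alpha * phi_eq * volume
    print(f"leakage current increase ~ {delta_I*1e3:.1f} mA at 20 C")   # ~1.2 mA
    # Operating the sensors well below 0 C suppresses this by more than an order of magnitude.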
NASA Astrophysics Data System (ADS)
Zhang, Zhicai
2018-04-01
Many physics analyses using the Compact Muon Solenoid (CMS) detector at the LHC require accurate, high-resolution electron and photon energy measurements. Following the excellent performance achieved during LHC Run I at center-of-mass energies of 7 and 8 TeV, the CMS electromagnetic calorimeter (ECAL) is operating at the LHC with proton-proton collisions at 13 TeV center-of-mass energy. The instantaneous luminosity delivered by the LHC during Run II has reached unprecedented levels. The average number of concurrent proton-proton collisions per bunch crossing (pileup) has reached up to 40 interactions in 2016 and may increase further in 2017. These high pileup levels necessitate a retuning of the ECAL readout and trigger thresholds and reconstruction algorithms. In addition, the energy response of the detector must be precisely calibrated and monitored. We present new reconstruction algorithms and calibration strategies that were implemented to maintain the excellent performance of the CMS ECAL throughout Run II. We will show performance results from the 2015-2016 data-taking periods and provide an outlook on the expected Run II performance in the years to come. Beyond the LHC, challenging running conditions for CMS are expected after the High-Luminosity upgrade of the LHC (HL-LHC). We review the design and R&D studies for the CMS ECAL upgrade and present first test beam studies. Particular challenges at the HL-LHC are the harsh radiation environment, the increasing data rates, and the extreme level of pileup, with up to 200 simultaneous proton-proton collisions. We present test beam results of hadron-irradiated PbWO4 crystals up to fluences expected at the HL-LHC. We also report on the R&D for the new readout and trigger electronics, which must be upgraded due to the increased trigger and latency requirements at the HL-LHC.
Physics with CMS and Electronic Upgrades
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohlf, James W.
2016-08-01
The current funding is for continued work on the Compact Muon Solenoid (CMS) at the CERN Large Hadron Collider (LHC) as part of the Energy Frontier experimental program. The current budget year covers the first year of physics running at 13 TeV (Run 2). During this period we have concentrated on commissioning of the μTCA electronics, a new standard for distribution of the CMS trigger and timing control signals and for high-bandwidth data acquisition, as well as participating in Run 2 physics.
NASA Astrophysics Data System (ADS)
Cavallari, Francesca
2015-09-01
The seminar presents an introduction to calorimetry in particle physics. First, the purpose of electromagnetic and hadronic calorimeters in particle physics is presented. The paper then focuses on electromagnetic calorimeters and describes the microscopic phenomena that drive the formation of electromagnetic showers. Homogeneous and sampling calorimeters are presented and the energy resolution of both is analyzed. A few examples of past and present electromagnetic calorimeters at particle colliders are presented, with particular attention to the ones employed in the ATLAS and CMS experiments at the LHC, their design constraints, challenges and adopted choices. Both these calorimeters were designed to operate for a minimum of ten years at the LHC, with an instantaneous luminosity of 1 × 10^34 cm^-2 s^-1 and for an integrated luminosity of 500 fb^-1. From 2023 a new program will start: the high luminosity LHC (HL-LHC), which is expected to provide an instantaneous luminosity of around 5 × 10^34 cm^-2 s^-1 and to integrate a total luminosity of around 3000 fb^-1 in ten years of data taking. The evolution of the CMS and ATLAS calorimeters is assessed and the needed upgrades are presented.
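The HL-LHC figures quoted above are mutually consistent, as a simple unit conversion shows (the calculation below ignores luminosity levelling and assumes the quoted peak value throughout).

    # How much equivalent peak-luminosity running does 3000 fb^-1 in ten years imply?
    peak_lumi = 5e34                # cm^-2 s^-1
    target_fb = 3000.0              # fb^-1 over ten years
    target_cm2 = target_fb * 1e39   # 1 fb^-1 = 1e39 cm^-2

    seconds_per_year = target_cm2 / peak_lumi / 10.0
    print(f"~{seconds_per_year/1e6:.0f} Ms of equivalent peak running per year")  # ~6 Ms
    print(f"~{seconds_per_year/86400:.0f} days per year at peak luminosity")      # ~69 days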
High-Luminosity Large Hadron Collider (HL-LHC): Preliminary Design Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apollinari, G.; Béjar Alonso, I.; Brüning, O.
2015-12-17
The Large Hadron Collider (LHC) is one of the largest scientific instruments ever built. Since opening up a new energy frontier for exploration in 2010, it has gathered a global user community of about 7,000 scientists working in fundamental particle physics and the physics of hadronic matter at extreme temperature and density. To sustain and extend its discovery potential, the LHC will need a major upgrade in the 2020s. This will increase its luminosity (rate of collisions) by a factor of five beyond the original design value and the integrated luminosity (total collisions created) by a factor of ten. The LHC is already a highly complex and exquisitely optimised machine, so this upgrade must be carefully conceived and will require about ten years to implement. The new configuration, known as High Luminosity LHC (HL-LHC), will rely on a number of key innovations that push accelerator technology beyond its present limits. Among these are cutting-edge 11-12 tesla superconducting magnets, compact superconducting cavities for beam rotation with ultra-precise phase control, new technology and physical processes for beam collimation, and 300 metre-long high-power superconducting links with negligible energy dissipation. The present document describes the technologies and components that will be used to realise the project and is intended to serve as the basis for the detailed engineering design of HL-LHC.
The new Inner Tracking System of the ALICE experiment
NASA Astrophysics Data System (ADS)
Martinengo, P.; Alice Collaboration
2017-11-01
The ALICE experiment will undergo a major upgrade during the next LHC Long Shutdown scheduled in 2019-20 that will enable a detailed study of the properties of the QGP, exploiting the increased Pb-Pb luminosity expected during Run 3 and Run 4. The replacement of the existing Inner Tracking System with a completely new ultra-light, high-resolution detector is one of the cornerstones within this upgrade program. The main motivation of the ITS upgrade is to provide ALICE with an improved tracking capability and impact parameter resolution at very low transverse momentum, as well as to enable a substantial increase of the readout rate. The new ITS will consist of seven layers of innovative Monolithic Active Pixel Sensors with the innermost layer sitting at only 23 mm from the interaction point. This talk will focus on the design and the physics performance of the new ITS, as well as the technology choices adopted. The status of the project and the results from the prototypes characterization will also be presented.
Cleaning Insertions and Collimation Challenges
NASA Astrophysics Data System (ADS)
Redaelli, S.; Appleby, R. B.; Bertarelli, A.; Bruce, R.; Jowett, J. M.; Lechner, A.; Losito, R.
High-performance collimation systems are essential for the efficient operation of modern hadron machines with large beam intensities. In particular, at the LHC the collimation system ensures a clean disposal of beam halos in the superconducting environment. The challenges of the HL-LHC study pose various demanding requests for beam collimation. In this paper we review the present collimation system and its performance during the LHC Run 1 in 2010-2013. Various collimation solutions under study to address the HL-LHC requirements are then reviewed, identifying the main upgrade baseline and pointing out advanced collimation concepts for further enhancement of the performance.
Challenges and Plans for Injection and Beam Dump
NASA Astrophysics Data System (ADS)
Barnes, M.; Goddard, B.; Mertens, V.; Uythoven, J.
The injection and beam dumping systems of the LHC will need to be upgraded to comply with the requirements of operation with the HL-LHC beams. The elements of the injection system concerned are the fixed and movable absorbers which protect the LHC in case of an injection kicker error and the injection kickers themselves. The beam dumping system elements under study are the absorbers which protect the aperture in case of an asynchronous beam dump and the beam absorber block. The operational limits of these elements and the new developments in the context of the HL-LHC project are described.
Development of CMOS pixel sensors for the upgrade of the ALICE Inner Tracking System
NASA Astrophysics Data System (ADS)
Molnar, L.
2014-12-01
The ALICE Collaboration is preparing a major upgrade of the current detector, planned for installation during the second long LHC shutdown in the years 2018-19, in order to enhance its low-momentum vertexing and tracking capability, and exploit the planned increase of the LHC luminosity with Pb beams. One of the cornerstones of the ALICE upgrade strategy is to replace the current Inner Tracking System in its entirety with a new, high resolution, low-material ITS detector. The new ITS will consist of seven concentric layers equipped with Monolithic Active Pixel Sensors (MAPS) implemented using the 0.18 μm CMOS technology of TowerJazz. In this contribution, the main key features of the ITS upgrade will be illustrated with emphasis on the functionality of the pixel chip. The ongoing developments on the readout architectures, which have been implemented in several fabricated prototypes, will be discussed. The operational features of these prototypes as well as the results of the characterisation tests before and after irradiation will also be presented.
P-Type Silicon Strip Sensors for the new CMS Tracker at HL-LHC
Adam, W.; Bergauer, T.; Brondolin, E.; ...
2017-06-27
The upgrade of the LHC to the High-Luminosity LHC (HL-LHC) is expected to increase the LHC design luminosity by an order of magnitude. This will require silicon tracking detectors with a significantly higher radiation hardness. The CMS Tracker Collaboration has conducted an irradiation and measurement campaign to identify suitable silicon sensor materials and strip designs for the future outer tracker at the CMS experiment. Based on these results, the collaboration has chosen to use n-in-p type silicon sensors and focus further investigations on the optimization of that sensor type. This paper describes the main measurement results and conclusions that motivated this decision.
P-Type Silicon Strip Sensors for the new CMS Tracker at HL-LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adam, W.; Bergauer, T.; Brondolin, E.
The upgrade of the LHC to the High-Luminosity LHC (HL-LHC) is expected to increase the LHC design luminosity by an order of magnitude. This will require silicon tracking detectors with a significantly higher radiation hardness. The CMS Tracker Collaboration has conducted an irradiation and measurement campaign to identify suitable silicon sensor materials and strip designs for the future outer tracker at the CMS experiment. Based on these results, the collaboration has chosen to use n-in-p type silicon sensors and focus further investigations on the optimization of that sensor type. This paper describes the main measurement results and conclusions that motivated this decision.
Readout systems for inner detectors at the LHC and SLHC
NASA Astrophysics Data System (ADS)
Issever, Cigdem
2006-12-01
A general overview of the optoelectronic readout and control systems of the ATLAS and CMS inner detectors is given. The talk will also cover challenges and issues of future optoelectronic readout systems at the upgraded LHC (SLHC). First results of radiation tests of VCSELs and optical fibres which were irradiated up to SLHC fluences will be presented.
Assembly Tests of the First Nb3Sn Low-Beta Quadrupole Short Model for the Hi-Lumi LHC
Pan, H.; Felice, H.; Cheng, D. W.; ...
2016-01-18
In preparation for the high-luminosity upgrade of the Large Hadron Collider (LHC), the LHC Accelerator Research Program (LARP) in collaboration with CERN is pursuing the development of MQXF, a 150-mm-aperture high-field Nb3Sn quadrupole magnet. The development phase starts with the fabrication and test of several short models (1.2-m magnetic length) and will continue with the development of several long prototypes. All of them are mechanically supported using a shell-based support structure, which has been extensively demonstrated on several R&D models within LARP. The first short model MQXFS-AT has been assembled at LBNL with coils fabricated by LARP and CERN. In this paper, we summarize the assembly process and show how it relies strongly on experience acquired during the LARP 120-mm-aperture HQ magnet series. We also present a comparison between strain gauge data and finite-element model analysis. Finally, we present the implications of the MQXFS-AT experience for the design of the long prototype support structure.
Conceptual design of the cryostat for the new high luminosity (HL-LHC) triplet magnets
NASA Astrophysics Data System (ADS)
Ramos, D.; Parma, V.; Moretti, M.; Eymin, C.; Todesco, E.; Van Weelderen, R.; Prin, H.; Berkowitz Zamora, D.
2017-12-01
The High Luminosity LHC (HL-LHC) is a project to upgrade the LHC collider after 2020-2025 to increase the integrated luminosity by about one order of magnitude and extend the physics production until 2035. An upgrade of the focusing triplets insertion system for the ATLAS and CMS experiments is foreseen using superconducting magnets operating in a pressurised superfluid helium bath at 1.9 K. This will require the design and construction of four continuous cryostats, each about sixty meters in length and one meter in diameter, for the final beam focusing quadrupoles, corrector magnets and beam separation dipoles. The design is constrained by the dimensions of the existing tunnel and accessibility restrictions imposing the integration of cryogenic piping inside the cryostat, thus resulting in a very compact integration. As the alignment and position stability of the magnets is crucial for the luminosity performance of the machine, the magnet support system must be carefully designed in order to cope with parasitic forces and thermo-mechanical load cycles. In this paper, we present the conceptual design of the cryostat and discuss the approach to address the stringent and often conflicting requirements of alignment, integration and thermal aspects.
Upgrade to the Birmingham Irradiation Facility
NASA Astrophysics Data System (ADS)
Dervan, P.; French, R.; Hodgson, P.; Marin-Reyes, H.; Parker, K.; Wilson, J.; Baca, M.
2015-10-01
The Birmingham Irradiation Facility was developed in 2013 at the University of Birmingham using the Medical Physics MC40 cyclotron. It can achieve High Luminosity LHC (HL-LHC) fluences of 10^15 1 MeV neutron equivalent (neq) cm^-2 in 80 s with proton beam currents of 1 μA, and so can effectively evaluate the performance and durability of detector technologies and new components to be used for the HL-LHC. Irradiations of silicon sensors and passive materials can be carried out in a temperature-controlled cold box which moves continuously through the homogeneous beamspot. This movement is provided by a pre-configured XY-axis Cartesian robot scanning system. In 2014 the cooling system and cold box were upgraded from a recirculating glycol chiller system to a liquid nitrogen evaporative system. The new cooling system achieves a stable temperature of -50 °C in 30 min and aims to maintain sub-0 °C temperatures on the sensors during irradiations. This paper reviews the design, development, commissioning and performance of the new cooling system.
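The quoted fluence follows from the beam current and exposure time once a beamspot area and a hardness factor are assumed; both assumptions below are illustrative and not taken from the paper.

    # Back-of-the-envelope check: 1 uA of protons for 80 s over ~1 cm^2 with hardness factor ~2.
    e_charge = 1.602e-19    # C
    current = 1e-6          # A
    exposure = 80.0         # s
    spot_area = 1.0         # cm^2 (assumed)
    hardness = 2.0          # neq per proton for low-energy protons (assumed)

    protons = current * exposure / e_charge
    fluence = protons / spot_area * hardness
    print(f"~{fluence:.1e} neq/cm^2")    # ~1e15, consistent with the quoted HL-LHC fluence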
ATLAS level-1 calorimeter trigger: Run-2 performance and Phase-1 upgrades
NASA Astrophysics Data System (ADS)
Carlson, Ben; Hong, Tae Min; Atlas Collaboration
2017-01-01
The Run-2 performance and Phase-1 upgrade are presented for the hardware-based level-1 calorimeter trigger (L1Calo) for the ATLAS Experiment. This trigger has a latency of about 2.2 microseconds to make a decision to help ATLAS select about 100 kHz of the most interesting collisions from the nominal LHC rate of 40 MHz. We summarize the upgrade after Run-1 (2009-2012) and discuss its performance in Run-2 (2015-current). We also outline the on-going Phase-1 upgrade for the next run (2021-2024) and its expected performance.
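Two numbers are implied by the rates and latency quoted above: the Level-1 rejection factor and the depth of the on-detector pipelines that must buffer data while the trigger decides (25 ns bunch spacing assumed).

    input_rate = 40e6     # Hz, LHC bunch-crossing rate
    accept_rate = 100e3   # Hz, Level-1 accept rate
    latency = 2.2e-6      # s, Level-1 decision latency

    print(f"rejection factor ~ {input_rate / accept_rate:.0f}")            # ~400
    print(f"pipeline depth ~ {latency * input_rate:.0f} bunch crossings")  # ~88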
LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN
NASA Astrophysics Data System (ADS)
Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor
2017-12-01
The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and has since 2011 been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.
Small-strip Thin Gap Chambers for the muon spectrometer upgrade of the ATLAS experiment
NASA Astrophysics Data System (ADS)
Perez Codina, E.; ATLAS Muon Collaboration
2016-07-01
The ATLAS muon system upgrade to be installed during the LHC long shutdown in 2018/19, the so-called New Small Wheel (NSW), is designed to cope with the increased instantaneous luminosity in LHC Run 3. The small-strip Thin Gap Chambers (sTGC) will provide the NSW with a fast trigger and high precision tracking. The construction protocol has been validated by test beam experiments on a full-size prototype sTGC detector, showing the performance requirements are met. The intrinsic spatial resolution for a single layer has been found to be about 45 μm for a perpendicular incident angle, and the transition region between pads has been measured to be about 4 mm.
Torsion limits from t t-bar production at the LHC
NASA Astrophysics Data System (ADS)
de Almeida, F. M. L.; de Andrade, F. R.; do Vale, M. A. B.; Nepomuceno, A. A.
2018-04-01
Torsion models constitute a well-known class of extended quantum gravity models. In this work, we investigate the phenomenological consequences of a torsion field interacting with top quarks at the LHC. A torsion field could appear as a new heavy state characterized by its mass and couplings to fermions. This new state would form a resonance decaying into a top-antitop pair. The latest ATLAS t t-bar production results from LHC 13 TeV data are used to set limits on the torsion parameters. The integrated luminosity needed to observe a torsion resonance at the next LHC upgrades is also evaluated, considering different values for the torsion mass and its couplings to Standard Model fermions. Finally, prospects for torsion exclusion at the future LHC phases II and III are obtained using fast detector simulations.
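A rough way to see how the required integrated luminosity is estimated in such searches is the simple S/sqrt(B) significance scaling; the cross sections and efficiency below are placeholders, not the torsion-model values used in the paper.

    # Integrated luminosity required for a Z-sigma observation with a simple counting estimate.
    def lumi_for_significance(z, sigma_s_fb, sigma_b_fb, eff):
        # S = eff*sigma_s*L, B = sigma_b*L, Z = S/sqrt(B)  =>  L = Z^2*sigma_b/(eff*sigma_s)^2
        return z**2 * sigma_b_fb / (eff * sigma_s_fb) ** 2    # fb^-1

    # e.g. 5 sigma with a 10 fb signal, 1000 fb background in the search window, 30% efficiency:
    print(f"~{lumi_for_significance(5.0, 10.0, 1000.0, 0.3):.0f} fb^-1")   # ~2800 fb^-1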
Machine Protection with a 700 MJ Beam
NASA Astrophysics Data System (ADS)
Baer, T.; Schmidt, R.; Wenninger, J.; Wollmann, D.; Zerlauth, M.
After the high luminosity upgrade of the LHC, the stored energy per proton beam will increase by a factor of two as compared to the nominal LHC. Therefore, many damage studies need to be revisited to ensure a safe machine operation with the new beam parameters. Furthermore, new accelerator equipment like crab cavities might cause new failure modes, which are not sufficiently covered by the current machine protection system of the LHC. These failure modes have to be carefully studied and mitigated by new protection systems. Finally the ambitious goals for integrated luminosity delivered to the experiments during the era of HL-LHC require an increase of the machine availability without jeopardizing equipment protection.
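The 700 MJ figure in the title follows from the beam parameters; the bunch numbers and intensities below are approximate design values and should be treated as assumptions.

    # Stored energy per beam: number of protons times energy per proton.
    e_charge = 1.602e-19          # J per eV
    E_proton = 7e12 * e_charge    # J, energy of one 7 TeV proton

    def stored_energy(bunches, protons_per_bunch):
        return bunches * protons_per_bunch * E_proton   # J

    nominal = stored_energy(2808, 1.15e11)   # approximate nominal LHC parameters
    hl_lhc = stored_energy(2750, 2.2e11)     # approximate HL-LHC parameters
    print(f"nominal ~ {nominal/1e6:.0f} MJ, HL-LHC ~ {hl_lhc/1e6:.0f} MJ")   # ~360 / ~680
    print(f"HL-LHC beam ~ {hl_lhc/4.184e6:.0f} kg TNT equivalent")           # ~160 kg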
Status and Plan for The Upgrade of The CMS Pixel Detector
NASA Astrophysics Data System (ADS)
Lu, Rong-Shyang; CMS Collaboration
2016-04-01
The silicon pixel detector is the innermost component of the CMS tracking system and plays a crucial role in the all-silicon CMS tracker. While the current pixel tracker is designed for, and performing well at, an instantaneous luminosity of up to 1 × 10^34 cm^-2 s^-1, it can no longer be operated efficiently at significantly higher values. Based on the strong performance of the LHC accelerator, it is anticipated that peak luminosities of two times the design luminosity are likely to be reached before 2018 and perhaps significantly exceeded in the running period until 2022, referred to as LHC Run 3. Therefore, an upgraded pixel detector, referred to as the phase 1 upgrade, is planned for the year-end technical stop in 2016. With a new pixel readout chip (ROC), an additional fourth layer, two additional endcap disks, and a significantly reduced material budget, the upgraded pixel detector will be able to sustain the efficiency of the pixel tracker under the increased requirements imposed by high luminosities and pile-up. The main new features of the upgraded pixel detector will be an ultra-light mechanical design, a digital readout chip with higher rate capability and a new cooling system. These and other design improvements, along with results of Monte Carlo simulation studies of the expected performance of the new pixel detector, will be discussed and compared to those of the current CMS detector.
The upgrade of the ATLAS first-level calorimeter trigger
NASA Astrophysics Data System (ADS)
Yamamoto, Shimpei; Atlas Collaboration
2016-07-01
The first-level calorimeter trigger (L1Calo) operated successfully through the first data-taking phase of the ATLAS experiment at the CERN Large Hadron Collider. Towards the forthcoming LHC runs, a series of upgrades is planned for L1Calo to face the new challenges posed by the upcoming increases of the beam energy and the luminosity. This paper reviews the ATLAS L1Calo trigger upgrade project, which introduces new architectures for the liquid-argon calorimeter trigger readout and the L1Calo trigger processing system.
Fabrication and Analysis of 150-mm-Aperture Nb3Sn MQXF Coils
Holik, E. F.; Ambrosio, G.; Anerella, M.; ...
2016-01-12
The U.S. LHC Accelerator Research Program (LARP) and CERN are combining efforts for the HiLumi-LHC upgrade to design and fabricate 150-mm-aperture interaction region quadrupoles with a nominal gradient of 130 T/m using Nb3Sn. To successfully produce the necessary long MQXF triplets, the HiLumi-LHC collaboration is systematically reducing risk and design modifications by relying heavily on the experience gained from the successful 120-mm-aperture LARP HQ program. First-generation MQXF short (MQXFS) coils were predominantly a scaling up of the HQ quadrupole design, allowing comparable cable expansion during the Nb3Sn formation heat treatment and an increased insulation fraction for electrical robustness. A total of 13 first-generation MQXFS coils were fabricated by LARP and CERN. Systematic differences in coil size, coil alignment symmetry, and coil length contraction during heat treatment are observed and are likely due to slight variances in tooling and insulation/cable systems. Analysis of coil cross sections indicates that the field-shaping wedges and adjacent coil turns are systematically displaced from their nominal locations and that the cable is expanding less than nominally designed. Lastly, a second-generation MQXF coil design seeks to correct the expansion and displacement discrepancies by increasing insulation and adding adjustable shims at the coil pole and midplanes to correct the allowed magnetic field harmonics.
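The choice of Nb3Sn follows directly from the quoted gradient and aperture: in a quadrupole the field grows linearly with radius, so the field at the bore edge is already close to the practical limit of Nb-Ti, and the conductor peak field is higher still. A one-line check:

    gradient = 130.0     # T/m, nominal MQXF gradient
    aperture = 0.150     # m, coil aperture (diameter)
    print(f"field at the aperture radius ~ {gradient * aperture / 2:.1f} T")   # ~9.8 T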
The upgrade of the CMS hadron calorimeter with silicon photomultipliers
Strobbe, N.
2017-01-26
The upgrade of the hadron calorimeter of the CMS experiment at the CERN Large Hadron Collider is currently underway. The endcap sections will be upgraded in the winter of 2016-2017 and the barrel sections during the second LHC long shutdown in 2019. The existing photosensors will be replaced with about 16 000 new silicon photomultipliers (SiPMs), resulting in the first large installation of SiPMs in a radiation environment. All associated front-end electronics will also be upgraded. This paper discusses the motivation for the upgrade and provides a description of the new system, including the SiPMs with the associated control electronics and the front-end readout cards.
Radiation Hard Active Media R&D for CMS Hadron Endcap Calorimetry
NASA Astrophysics Data System (ADS)
Tiras, Emrah; CMS-HCAL Collaboration
2015-04-01
The High Luminosity LHC era will impose unprecedented radiation conditions on the CMS detectors, targeting luminosities a factor of 5-10 higher than the LHC design value. The CMS detectors will need to be upgraded in order to withstand these conditions yet maintain or improve the physics measurement capabilities. One of the upgrade options is rebuilding the CMS Endcap Calorimeters with a shashlik-design electromagnetic section and replacing the active media of the hadronic section with radiation-hard scintillation materials. In this context, we have studied various radiation-hard materials such as Polyethylene Naphthalate (PEN), Polyethylene Terephthalate (PET), HEM, and quartz plates coated with various organic materials such as p-Terphenyl (pTp), Gallium-doped Zinc Oxide (ZnO:Ga) and Anthracene. Here we discuss the related test beam activities, laboratory measurements and recent developments.
Deployment of IPv6-only CPU resources at WLCG sites
NASA Astrophysics Data System (ADS)
Babik, M.; Chudoba, J.; Dewhurst, A.; Finnern, T.; Froy, T.; Grigoras, C.; Hafeez, K.; Hoeft, B.; Idiculla, T.; Kelsey, D. P.; López Muñoz, F.; Martelli, E.; Nandakumar, R.; Ohrenberg, K.; Prelz, F.; Rand, D.; Sciabà, A.; Tigerstedt, U.; Traynor, D.
2017-10-01
The fraction of Internet traffic carried over IPv6 continues to grow rapidly. IPv6 support from network hardware vendors and carriers is pervasive and becoming mature. A network infrastructure upgrade often offers sites an excellent window of opportunity to configure and enable IPv6. There is a significant overhead when setting up and maintaining dual-stack machines, so where possible sites would like to upgrade their services directly to IPv6 only. In doing so, they are also expediting the transition process towards its desired completion. While the LHC experiments accept there is a need to move to IPv6, it is currently not directly affecting their work. Sites are unwilling to upgrade if they will be unable to run LHC experiment workflows. This has resulted in a very slow uptake of IPv6 from WLCG sites. For several years the HEPiX IPv6 Working Group has been testing a range of WLCG services to ensure they are IPv6 compliant. Several sites are now running many of their services as dual-stack. The working group, driven by the requirements of the LHC VOs to be able to use IPv6-only opportunistic resources, continues to encourage wider deployment of dual-stack services to make the use of such IPv6-only clients viable. This paper presents the working group’s plan and progress so far to allow sites to deploy IPv6-only CPU resources. This includes making experiment central services dual-stack as well as a number of storage services. The monitoring, accounting and information services that are used by jobs also need to be upgraded. Finally the VO testing that has taken place on hosts connected via IPv6-only is reported.
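A minimal sketch of the kind of check involved when validating that a service is usable from IPv6-only clients: resolve the host restricted to AF_INET6 and attempt a TCP connection. The hostname below is hypothetical.

    import socket

    def ipv6_reachable(host, port, timeout=5.0):
        """Return True if host resolves over IPv6 and accepts a TCP connection on port."""
        try:
            for family, socktype, proto, _, addr in socket.getaddrinfo(
                    host, port, socket.AF_INET6, socket.SOCK_STREAM):
                with socket.socket(family, socktype, proto) as s:
                    s.settimeout(timeout)
                    s.connect(addr)
                    return True
        except OSError:
            pass
        return False

    print(ipv6_reachable("se01.example-site.org", 443))   # hypothetical storage endpoint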
LHC: The Large Hadron Collider
Lincoln, Don
2018-01-16
The Large Hadron Collider (or LHC) is the world's most powerful particle accelerator. In 2012, scientists used data taken by it to discover the Higgs boson, before pausing operations for upgrades and improvements. In the spring of 2015, the LHC will return to operations with 163% of the energy it had before and with three times as many collisions per second. It's essentially a new and improved version of itself. In this video, Fermilab's Dr. Don Lincoln explains some of the absolutely amazing scientific and engineering properties of this modern scientific wonder.
Study of new FNAL-NICADD extruded scintillator as active media of large EMCal of ALICE at LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oleg A. Grachov et al.
The current conceptual design of the proposed Large EMCal of ALICE at the LHC is based largely on the scintillating mega-tile/fiber technology implemented in the CDF end-plug upgrade project and in both the barrel and endcap electromagnetic calorimeters of STAR. The cost of scintillating material leads us to the choice of an extruded polystyrene-based scintillator, which is available from the new FNAL-NICADD facility. Results of optical measurements, such as light yield and light yield variation, show that it is possible to use this material as the active medium of the Large EMCal of ALICE at the LHC.
Run II of the LHC: The Accelerator Science
NASA Astrophysics Data System (ADS)
Redaelli, Stefano
2015-04-01
In 2015 the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) starts its Run II operation. After the successful Run I at 3.5 TeV and 4 TeV in the 2010-2013 period, a first long shutdown (LS1) was mainly dedicated to the consolidation of the LHC magnet interconnections, to allow the LHC to operate at its design beam energy of 7 TeV. Other key accelerator systems have also been improved to optimize the performance reach at higher beam energies. After a review of the LS1 activities, the status of the LHC start-up progress is reported, addressing in particular the status of the LHC hardware commissioning and of the training campaign of superconducting magnets that will determine the operation beam energy in 2015. Then, the plans for the Run II operation are reviewed in detail, covering choice of initial machine parameters and strategy to improve the Run II performance. Future prospects of the LHC and its upgrade plans are also presented.
Will there be energy frontier colliders after LHC?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiltsev, Vladimir
2016-09-15
High energy particle colliders have been at the forefront of particle physics for more than three decades. At present, the near-term US, European and international strategies of the particle physics community are centered on the full exploitation of the physics potential of the Large Hadron Collider (LHC) through its high-luminosity upgrade (HL-LHC). The future of the world-wide HEP community critically depends on the feasibility of possible post-LHC colliders. The concept of feasibility is complex and includes at least three factors: feasibility of energy, feasibility of luminosity and feasibility of cost. Here we overview all current options for post-LHC colliders from this perspective (ILC, CLIC, Muon Collider, plasma colliders, CEPC, FCC, HE-LHC) and discuss the major challenges and accelerator R&D required to demonstrate the feasibility of an energy frontier accelerator facility following the LHC. We conclude by taking a look at ultimate-energy-reach accelerators based on plasmas and crystals, and with a discussion of the perspectives for the far future of accelerator-based particle physics.
NASA Astrophysics Data System (ADS)
Chiuchiolo, A.; Bajas, H.; Bajko, M.; Castaldo, B.; Consales, M.; Cusano, A.; Giordano, M.; Giloux, C.; Perez, J. C.; Sansone, L.; Viret, P.
2017-12-01
The magnets for the next steps in accelerator physics, such as the High Luminosity upgrade of the LHC (HL-LHC) and the Future Circular Collider (FCC), require the development of new technologies for manufacturing and monitoring. To meet the new HL-LHC requirements, a large upgrade of the CERN SM18 cryogenic test facilities is ongoing, with the implementation of new cryostats and cryogenic instrumentation. The paper deals with the advances in the development and calibration of fiber optic sensors in the 300-4 K range using a dedicated closed-cycle refrigerator system composed of a pulse tube and a cryogen-free cryostat. The calibrated fiber optic sensors (FOS) have been installed in three vertical cryostats used for testing superconducting magnets down to 1.9 K or 4.2 K and in the variable-temperature test bench (100-4.2 K). Some examples of FOS measurements of the cryostat temperature evolution are presented, as well as strain measurements performed on a subscale High Temperature Superconducting magnet during its powering tests.
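The relation behind such fibre Bragg grating sensors is that the Bragg wavelength shifts with strain and temperature, Δλ/λ = (1 - p_e)·ε + k_T·ΔT. The coefficients below are typical room-temperature values for silica fibre at 1550 nm (assumptions); the temperature sensitivity falls strongly towards 4 K, which is precisely why the dedicated cryogenic calibration described above is needed.

    lam = 1550e-9    # m, Bragg wavelength
    p_e = 0.22       # effective photo-elastic coefficient (typical, assumed)
    k_T = 6.7e-6     # 1/K, combined thermo-optic + expansion coefficient near 300 K (assumed)

    strain = 100e-6  # 100 microstrain
    print(f"~{(1 - p_e) * strain * lam * 1e12:.0f} pm per 100 microstrain")  # ~120 pm
    print(f"~{k_T * lam * 1e12:.1f} pm per K near room temperature")         # ~10 pm/K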
NASA Astrophysics Data System (ADS)
Battilana, C.
2017-01-01
The CMS muon system has played a key role in many physics results obtained from the LHC Run-1 and Run-2 data. During the Long Shutdown (2013-2014), as well as during the last year-end technical stop (2015-2016), significant consolidation and upgrades have been carried out on the muon detectors and on the L1 muon trigger. The algorithms for muon reconstruction and identification have also been improved for both the High-Level Trigger and the offline reconstruction. Results on the performance of the muon detectors, reconstruction and trigger, obtained using data collected at 13 TeV centre-of-mass energy during the 2015 and 2016 LHC runs, will be presented. Comparisons of simulation with experimental data will also be discussed where relevant. The system's state-of-the-art performance will be shown, and the improvements foreseen to achieve excellent overall quality of muon reconstruction in CMS, in the conditions expected during the high-luminosity phase of Run-2, will be described.
Future HEP Accelerators: The US Perspective
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhat, Pushpalatha; Shiltsev, Vladimir
2015-11-02
Accelerator technology has advanced tremendously since the introduction of accelerators in the 1930s, and particle accelerators have become indispensable instruments in high energy physics (HEP) research to probe Nature at smaller and smaller distances. At present, accelerator facilities can be classified into Energy Frontier colliders that enable direct discoveries and studies of high mass scale particles and Intensity Frontier accelerators for exploration of extremely rare processes, usually at relatively low energies. The near term strategies of the global energy frontier particle physics community are centered on fully exploiting the physics potential of the Large Hadron Collider (LHC) at CERN through its high-luminosity upgrade (HL-LHC), while the intensity frontier HEP research is focused on studies of neutrinos at the MW-scale beam power accelerator facilities, such as Fermilab Main Injector with the planned PIP-II SRF linac project. A number of next generation accelerator facilities have been proposed and are currently under consideration for the medium- and long-term future programs of accelerator-based HEP research. In this paper, we briefly review the post-LHC energy frontier options, both for lepton and hadron colliders in various regions of the world, as well as possible future intensity frontier accelerator facilities.
Upgrade of the LHC magnet interconnections thermal shielding
NASA Astrophysics Data System (ADS)
Musso, Andrea; Barlow, Graeme; Bastard, Alain; Charrondiere, Maryline; Chrul, Anna; Damianoglou, Dimitrios; Deferne, Guy; Dib, Gaëlle; Duret, Max; Guinchard, Michael; Prin, Hervé; Strychalski, Michał; Craen, Arnaud Vande; Villiger, Gilles; Wright, Loren
2014-01-01
The approximately 1700 interconnections (ICs) between the Large Hadron Collider (LHC) superconducting magnets include thermal shielding at 50-75 K, providing continuity to the thermal shielding of the magnet cryostats to reduce the overall radiation heat loads to the 1.9 K helium bath of the magnets. The IC shield, made of aluminum, is conduction-cooled via a welded bridge to the thermal shield of the adjacent magnets, which is actively cooled. TIG welding of these bridges in the LHC tunnel at installation of the magnets induced a considerable fire hazard due to the proximity of the multi-layer insulation of the magnet shields. A fire incident occurred in one of the machine sectors during machine installation, fortunately with limited consequences thanks to the prompt intervention of the operators. The LHC is now undergoing a two-year technical stop during which all magnet ICs will have to be opened to consolidate the magnet electrical connections. The IC thermal shields will therefore have to be removed and re-installed after the work is completed. In order to eliminate the fire hazard when re-welding, it has been decided to review the design of the IC shields by replacing the welded bridges with a mechanical clamping that preserves the thermal function. Additional advantages of this new solution are the ease of dismantling for maintenance and the elimination of weld-grinding operations at removal, which require radioprotection measures because of material activation after long-term operation of the LHC. This paper describes the new design of the IC shields and in particular the theoretical and experimental validation of its thermal performance. Furthermore, a status report of the ongoing upgrade work in the LHC is given.
Field Quality Measurements in the FNAL Twin-Aperture 11 T Dipole for LHC Upgrades
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strauss, T.; Apollinari, G.
2016-11-08
FNAL and CERN are developing an 11 T Nb3Sn dipole suitable for installation in the LHC to provide room for additional collimators. Two 1 m long collared coils previously tested at FNAL in single-aperture dipole configuration were assembled into the twin-aperture configuration and tested including magnet quench performance and field quality. The results of magnetic measurements are reported and discussed in this paper.
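The field quality quoted here is normally reported as normalized harmonic coefficients. As background only, the sketch below (not taken from the paper) evaluates the transverse field from such harmonics in the multipole convention commonly used for LHC magnets; the 11 T main field comes from the title, while the 17 mm reference radius and the harmonic values are assumptions for illustration.

```python
# Illustrative only: evaluate a 2-D magnet field from normalized harmonics in
# the convention commonly used for LHC magnets,
#   B_y + i*B_x = B_1 * 1e-4 * sum_n (b_n + i*a_n) * ((x + i*y)/R_ref)^(n-1),
# where B_1 is the main dipole field and b_1 = 1e4 by definition.
# The numerical harmonic values below are hypothetical, not measured data.

def field_from_harmonics(x, y, b, a, B1=11.0, Rref=0.017):
    """Return (Bx, By) in tesla at transverse position (x, y) in metres.

    b, a : dicts {n: value} of normal/skew harmonics in 'units' of 1e-4,
           with b[1] = 1e4 representing the main dipole component.
    B1   : main dipole field (T); Rref: reference radius (m), assumed 17 mm.
    """
    z = complex(x, y) / Rref
    total = sum((b.get(n, 0.0) + 1j * a.get(n, 0.0)) * z ** (n - 1)
                for n in set(b) | set(a))
    field = B1 * 1e-4 * total           # complex number B_y + i*B_x
    return field.imag, field.real       # (Bx, By)

# Hypothetical example: main field plus a 2-unit normal sextupole (b3).
Bx, By = field_from_harmonics(0.010, 0.0, b={1: 1e4, 3: 2.0}, a={})
print(f"Bx = {Bx:.5f} T, By = {By:.5f} T")
```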
Upgrade project and plans for the ATLAS detector and trigger
NASA Astrophysics Data System (ADS)
Pastore, Francesca; Atlas Collaboration
2013-08-01
The LHC is expected to undergo upgrades over the coming years in order to extend its scientific potential. Through two different phases (namely Phase-I and Phase-II), the average luminosity will be increased by a factor 5-10 above the design luminosity, 10^34 cm^-2 s^-1. Consequently, the LHC experiments will need upgraded detectors and new infrastructure for the trigger and DAQ systems, to take into account the increase of radiation level and of particle rates foreseen at such high luminosity. In this paper we describe the planned changes and the investigations for the ATLAS experiment, focusing on the requirements for the trigger system to handle the increased rate of collisions per beam crossing, while maintaining widely inclusive selections. In different steps, the trigger detectors will improve their selectivity by benefiting from increased granularity. To improve the flexibility of the system, the use of the tracking information in the lower levels of the trigger selection is also discussed. Lastly, different scenarios are compared, based on the expected physics potential of ATLAS in this high luminosity regime.
Tests with beam setup of the TileCal phase-II upgrade electronics
NASA Astrophysics Data System (ADS)
Reward Hlaluku, Dingane
2017-09-01
The LHC has planned a series of upgrades culminating in the High Luminosity LHC which will have an average luminosity 5-7 times larger than the nominal Run-2 value. The ATLAS Tile calorimeter plans to introduce a new readout architecture by completely replacing the back-end and front-end electronics for the High Luminosity LHC. The photomultiplier signals will be fully digitized and transferred for every bunch crossing to the off-detector Tile PreProcessor. The Tile PreProcessor will further provide preprocessed digital data to the first level of trigger with improved spatial granularity and energy resolution in contrast to the current analog trigger signals. A single super-drawer module commissioned with the phase-II upgrade electronics is to be inserted into the real detector to evaluate and qualify the new readout and trigger concepts in the overall ATLAS data acquisition system. This new super-drawer, so-called hybrid Demonstrator, must provide analog trigger signals for backward compatibility with the current system. This Demonstrator drawer has been inserted into a Tile calorimeter module prototype to evaluate the performance in the lab. In parallel, one more module has been instrumented with two other front-end electronics options based on custom ASICs (QIE and FATALIC) which are under evaluation. These two modules together with three other modules composed of the current system electronics were exposed to different particles and energies in three test-beam campaigns during 2015 and 2016.
Detector Developments for the High Luminosity LHC Era (2/4)
Straessner, Arno
2018-04-16
Calorimetry and Muon Spectrometers - Part II: When upgrading the LHC to higher luminosities, the detector and trigger performance shall be preserved - if not improved - with respect to the nominal performance. The ongoing R&D for new radiation tolerant front-end electronics for calorimeters with higher read-out bandwidth is summarized and new possibilities for the trigger systems are presented. Similar developments are foreseen for the muon spectrometers, where radiation tolerance of the muon detectors and functioning at high background rates are also important. The corresponding plans and research work for the calorimeter and muon detectors at an LHC with the highest luminosity are presented.
Conductor Specification and Validation for High-Luminosity LHC Quadrupole Magnets
Cooley, L. D.; Ghosh, A. K.; Dietderich, D. R.; ...
2017-06-01
The High Luminosity Upgrade of the Large Hadron Collider (HL-LHC) at CERN will replace the main ring inner triplet quadrupoles, identified by the acronym MQXF, adjacent to the main ring intersection regions. For the past decade, the U.S. LHC Accelerator R&D Program, LARP, has been evaluating conductors for the MQXFA prototypes, which are the outer magnets of the triplet. Recently, the requirements for MQXF magnets and cables have been published in P. Ferracin et al., IEEE Trans. Appl. Supercond., vol. 26, no. 4, 2016, Art. no. 4000207, along with the final specification for Ti-alloyed Nb3Sn conductor determined jointly by CERN and LARP. This paper describes the rationale behind the 0.85 mm diameter strand's chief parameters, which are 108 or more sub-elements, a copper fraction not less than 52.4%, strand critical current at 4.22 K not less than 631 A at 12 T and 331 A at 15 T, and residual resistance ratio of not less than 150. This paper also compares the performance of ~100 km production lots of the five most recent LARP conductors to the first 163 km of strand made according to the HL-LHC specification. Two factors emerge as significant for optimizing performance and minimizing risk: a modest increase of the sub-element diameter from 50 to 55 μm, and a Nb:Sn molar ratio of 3.6 instead of 3.4. Furthermore, the statistics acquired so far give confidence that the present conductor can balance competing demands in production for the HL-LHC project.
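To make the quoted acceptance values concrete, here is a minimal sketch (not part of the paper) that checks a hypothetical strand measurement against the specification numbers listed above; the example measurement values are invented.

```python
# Illustrative only: check a hypothetical Nb3Sn strand measurement against the
# HL-LHC/LARP specification values quoted in the abstract above.
SPEC = {
    "subelements_min": 108,       # minimum sub-element count
    "cu_fraction_min": 0.524,     # minimum copper fraction
    "Ic_12T_min_A": 631.0,        # minimum critical current at 4.22 K, 12 T
    "Ic_15T_min_A": 331.0,        # minimum critical current at 4.22 K, 15 T
    "RRR_min": 150.0,             # minimum residual resistance ratio
}

def meets_spec(measurement: dict) -> dict:
    """Return a pass/fail flag for each specified quantity."""
    return {
        "subelements": measurement["subelements"] >= SPEC["subelements_min"],
        "cu_fraction": measurement["cu_fraction"] >= SPEC["cu_fraction_min"],
        "Ic_12T": measurement["Ic_12T_A"] >= SPEC["Ic_12T_min_A"],
        "Ic_15T": measurement["Ic_15T_A"] >= SPEC["Ic_15T_min_A"],
        "RRR": measurement["RRR"] >= SPEC["RRR_min"],
    }

# Hypothetical billet measurement (values invented for illustration).
example = {"subelements": 108, "cu_fraction": 0.53,
           "Ic_12T_A": 650.0, "Ic_15T_A": 345.0, "RRR": 180.0}
print(meets_spec(example))   # all True for this example
```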
Proposal to upgrade the MIPP data acquisition system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, W.; Carey, D.; Johnstone, C.
2005-03-01
The MIPP TPC is the largest contributor to the MIPP event size by far. Its readout system and electronics were designed in the 1990s and limit it to a readout rate of 60 Hz in simple events and approximately 20 Hz in complicated events. With the readout chips designed for the ALICE collaboration at the LHC, we propose a low-cost, effective scheme for upgrading the MIPP data acquisition speed to 3000 Hz.
High Luminosity LHC: challenges and plans
NASA Astrophysics Data System (ADS)
Arduini, G.; Barranco, J.; Bertarelli, A.; Biancacci, N.; Bruce, R.; Brüning, O.; Buffat, X.; Cai, Y.; Carver, L. R.; Fartoukh, S.; Giovannozzi, M.; Iadarola, G.; Li, K.; Lechner, A.; Medina Medrano, L.; Métral, E.; Nosochkov, Y.; Papaphilippou, Y.; Pellegrini, D.; Pieloni, T.; Qiang, J.; Redaelli, S.; Romano, A.; Rossi, L.; Rumolo, G.; Salvant, B.; Schenk, M.; Tambasco, C.; Tomás, R.; Valishev, S.; Van der Veken, F. F.
2016-12-01
The Large Hadron Collider (LHC) is one of the largest scientific instruments ever built. Since opening up a new energy frontier for exploration in 2010, it has gathered a global user community working in fundamental particle physics and the physics of hadronic matter at extreme temperature and density. To sustain and extend its discovery potential, the LHC will undergo a major upgrade in the 2020s. This will increase its rate of collisions by a factor of five beyond the original design value and the integrated luminosity by a factor of ten. The new configuration, known as High Luminosity LHC (HL-LHC), will rely on a number of key innovations that push accelerator technology beyond its present limits. Among these are cutting-edge 11-12 T superconducting magnets, including Nb3Sn-based magnets never used in accelerators before, compact superconducting cavities for longitudinal beam rotation, and new technology and physical processes for beam collimation. The dynamics of the HL-LHC beams will also be particularly challenging and this aspect is the main focus of this paper.
NASA Astrophysics Data System (ADS)
Ballarino, A.; Giannelli, S.; Jacquemod, A.; Leclercq, Y.; Ortiz Ferrer, C.; Parma, V.
2017-12-01
The High Luminosity LHC (HL-LHC) is a project aiming to upgrade the Large Hadron Collider (LHC) after 2020-2025 in order to increase the integrated luminosity by about one order of magnitude and extend the operational capabilities until 2035. The upgrade of the focusing triplet insertions for the Atlas and CMS experiments foresees using superconducting magnets operating in a pressurised superfluid helium bath at 1.9 K. The increased radiation levels from the particle debris produced by particle collisions in the experiments require that the power converters are placed in radiation shielded zones located in a service gallery adjacent to the main tunnel. The powering of the magnets from the gallery is achieved by means of MgB2 superconducting cables in a 100-m long flexible cryostat transfer line, actively cooled by 4.5 K to 20 K gaseous helium generated close to the magnets. At the highest temperature end, the helium flow cools the High Temperature Superconducting (HTS) current leads before being recovered at room temperature. At the magnet connection side, a dedicated connection box allows connection to the magnets and a controlled boil-off production of helium for the cooling needs of the powering system. This paper presents the overall concept of the cryostat system from the magnet connection boxes, through the flexible cryostat transfer line, to the connection box of the current leads.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lackner, Friedrich; Ferracin, Paolo; Todesco, Ezio
The High Luminosity LHC upgrade target is to increase the integrated luminosity by a factor 10, resulting in an integrated luminosity of 3000 fb-1. One major improvement foreseen is the reduction of the beam size at the collision points. This requires the development of 150 mm single aperture quadrupoles for the interaction regions. These quadrupoles are under development in a joint collaboration between CERN and the US-LHC Accelerator Research Program (LARP). The chosen approach for achieving a nominal quadrupole field gradient of 132.6 T/m is based on the Nb3Sn technology. The coils, with a length of 7281 mm, will be the longest Nb3Sn coils fabricated so far for accelerator magnets. The production of the long coils was launched in 2016 based on practice coils made from copper. This paper provides a status of the production of the first low grade and full performance coils and describes the production process and applied quality control. Furthermore, an outlook for the prototype assembly is provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rossi, Adriana; et al.
Long-range beam-beam (LRBB) interactions can be a source of emittance growth and beam losses in the LHC during physics and will become even more relevant with the smaller β* and higher bunch intensities foreseen for the High Luminosity LHC upgrade (HL-LHC), in particular if operated without crab cavities. Both beam losses and emittance growth could be mitigated by compensating the non-linear LRBB kick with a correctly placed current-carrying wire. Such a compensation scheme is currently being studied in the LHC through a demonstration test using current-bearing wires embedded into collimator jaws, installed on either side of the high-luminosity interaction regions. For HL-LHC two options are considered: a current-bearing wire as for the demonstrator, or electron lenses, as the ideal distance between the particle beam and the compensating current may be too small to allow the use of solid materials. This paper reports on the ongoing activities for both options, covering the progress of the wire-in-jaw collimators, the foreseen LRBB experiments at the LHC, and first considerations for the design of the electron lenses to ultimately replace material wires for HL-LHC.
NASA Astrophysics Data System (ADS)
Borges de Sousa, P.; Morrone, M.; Hovenga, N.; Garion, C.; van Weelderen, R.; Koettig, T.; Bremer, J.
2017-12-01
The High-Luminosity upgrade of the Large Hadron Collider (HL-LHC) will increase the accelerator's luminosity by a factor 10 beyond its original design value, giving rise to more collisions and generating an intense flow of debris. A new beam screen has been designed for the inner triplets that incorporates tungsten alloy blocks to shield the superconducting magnets and the 1.9 K superfluid helium bath from incoming radiation. These screens will operate between 60 K and 80 K and are designed to sustain a nominal heat load of 15 W/m, over 10 times the nominal heat load for the original LHC design. Their overall new and more complex design requires them and their constituent parts to be characterised from a thermal performance standpoint. In this paper we describe the experimental parametric study carried out on two principal thermal components: a representative sample of the beam screen with a tungsten-based alloy block and thermal link, and the supporting structure composed of an assembly of ceramic spheres and titanium springs. Results from both studies are shown and discussed regarding their impact on the baseline considerations for the thermal design of the beam screens.
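A rough way to see why intercepting the debris heating on a 60-80 K screen matters is to compare the ideal (Carnot) refrigeration work per watt removed at the screen temperature with that at 1.9 K. The sketch below is an illustration only: it assumes a 300 K heat-rejection temperature and ignores real cryoplant inefficiencies, and only the 15 W/m figure comes from the text above.

```python
# Illustrative only: why intercepting debris heating on a 60-80 K screen is far
# cheaper than letting it reach the 1.9 K helium bath. The figures use the
# ideal (Carnot) limit; real cryoplants are several times less efficient.
T_AMBIENT = 300.0  # K, assumed heat-rejection temperature

def carnot_work_per_watt(t_cold: float, t_warm: float = T_AMBIENT) -> float:
    """Minimum compressor work (W) needed to extract 1 W at temperature t_cold."""
    return (t_warm - t_cold) / t_cold

for t in (1.9, 60.0, 80.0):
    print(f"T = {t:5.1f} K -> >= {carnot_work_per_watt(t):6.1f} W of work per W removed")

# With the nominal 15 W/m screen load quoted above, the ideal work per metre is:
print(f"15 W/m at 70 K  -> >= {15 * carnot_work_per_watt(70.0):.0f} W/m")
print(f"15 W/m at 1.9 K -> >= {15 * carnot_work_per_watt(1.9):.0f} W/m")
```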
Diamond detectors for the TOTEM timing upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antchev, G.; Aspell, P.; Atanassov, I.
This paper describes the design and the performance of the timing detector developed by the TOTEM Collaboration for the Roman Pots (RPs) to measure the Time-Of-Flight (TOF) of the protons produced in central diffractive interactions at the LHC. The measurement of the TOF of the protons allows the determination of the longitudinal position of the proton interaction vertex and its association with one of the vertices reconstructed by the CMS detectors. The TOF detector is based on single crystal Chemical Vapor Deposition (scCVD) diamond plates and is designed to measure the protons' TOF with about 50 ps time precision. This upgrade to the TOTEM apparatus will be used in the LHC Run 2 and will tag the central diffractive events up to an interaction pileup of about 1. A dedicated fast and low noise electronics for the signal amplification has been developed. The digitization of the diamond signal is performed by sampling the waveform. After introducing the physics studies that will most profit from the addition of these new detectors, we discuss in detail the optimization and the performance of the first TOF detector installed in the LHC in November 2015.
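As a rough illustration of how the quoted ~50 ps precision translates into a vertex position measurement, the sketch below assumes one proton timed on each side of the interaction point with uncorrelated 50 ps resolutions and arms equidistant from the IP; these assumptions and the resulting ~1 cm figure are not taken from the paper.

```python
# Illustrative only: longitudinal vertex resolution from proton time-of-flight,
# assuming one proton is timed on each side of the interaction point with an
# uncorrelated resolution of ~50 ps per arm (the per-detector figure quoted above).
import math

C = 299_792_458.0          # speed of light, m/s
SIGMA_T_ARM = 50e-12       # assumed per-arm TOF resolution, s

# For arms at equal distance from the IP, z_vertex = c * (t_left - t_right) / 2,
# so sigma_z = c * sqrt(2) * sigma_t / 2.
sigma_z = C * math.sqrt(2) * SIGMA_T_ARM / 2.0
print(f"sigma_z ~ {sigma_z * 100:.1f} cm")   # roughly 1 cm
```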
Coil End Parts Development Using BEND and Design for MQXF by LARP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Miao; Ambrosio, G.; Bermudez, S. Izquierdo
2016-09-06
End parts are critical components for saddle-shaped coils. They have a structural function where the cables are deformed in order to cross over the magnet aperture. Based on the previous design of the US LARP program for 90 mm aperture quadrupoles (TQ/LQ) and 120 mm aperture quadrupoles (HQ/LHQ) using BEND, the coil ends of the low-β quadrupoles (MQXF) for the HiLumi LHC upgrade were developed. This paper shows the design of the MQXF coil ends, the analysis of the coil ends during the coil fabrication, the autopsy analysis of the coil ends and the feedback to BEND parameters.
Two-Layer 16 Tesla Cosθ Dipole Design for the FCC
Holik, Eddie Frank; Ambrosio, Giorgio; Apollinari, G.
2018-02-13
The Future Circular Collider or FCC is a study aimed at exploring the possibility to reach 100 TeV total collision energy, which would require 16 tesla dipoles. Upon the conclusion of the High Luminosity Upgrade, the US LHC Accelerator Upgrade Project, in collaboration with CERN, will have extensive Nb3Sn magnet fabrication experience. This experience includes robust Nb3Sn conductor and insulation schemes, 2-layer cos2θ coil fabrication, and bladder-and-key structure and assembly. By making improvements and modifications to existing technology, the feasibility of a two-layer 16 tesla dipole is investigated. Preliminary designs indicate that fields up to 16.6 tesla are feasible with conductor grading while satisfying the HE-LHC and FCC specifications. Key challenges include accommodating high-aspect-ratio conductor, narrow wedge design, Nb3Sn conductor grading, and especially quench protection of a 16 tesla device.
Module and electronics developments for the ATLAS ITk pixel system
NASA Astrophysics Data System (ADS)
Muñoz, F. J.
2018-03-01
The ATLAS experiment is preparing for an extensive modification of its detectors in the course of the planned HL-LHC accelerator upgrade around 2025. The ATLAS upgrade includes the replacement of the entire tracking system by an all-silicon detector (Inner Tracker, ITk). The five innermost layers of ITk will be a pixel detector built of new sensor and readout electronics technologies to improve the tracking performance and cope with the severe HL-LHC environment in terms of occupancy and radiation. The total area of the new pixel system could measure up to 14 m^2, depending on the final layout choice, which is expected to be made in 2018. In this paper an overview of the ongoing R&D activities on modules and electronics for the ATLAS ITk is given, including the main developments and achievements in silicon planar and 3D sensor technologies, readout and power challenges.
RICH upgrade in LHCb experiment
NASA Astrophysics Data System (ADS)
Pistone, A.; LHCb RICH Collaboration
2017-01-01
The LHCb experiment is dedicated to precision measurements of CP violation and rare decays of B hadrons at the Large Hadron Collider (LHC) at CERN (Geneva). The second long shutdown of the LHC is currently scheduled to begin in 2019. During this period the LHCb experiment with all its sub-detectors will be upgraded in order to run at an instantaneous luminosity of 2 × 10^33 cm^-2 s^-1, about a factor 5 higher than the current luminosity, and to read out data at a rate of 40 MHz into a flexible software-based trigger. The Ring Imaging CHerenkov (RICH) system will require new photon detectors and modifications to the optics of the upstream detector. Tests of the prototype of the smallest constituent of the new RICH system have been performed during testbeam sessions at the North Area test beam facility at CERN in recent years.
New high-precision drift-tube detectors for the ATLAS muon spectrometer
NASA Astrophysics Data System (ADS)
Kroha, H.; Fakhrutdinov, R.; Kozhin, A.
2017-06-01
Small-diameter muon drift tube (sMDT) detectors have been developed for upgrades of the ATLAS muon spectrometer. With a tube diameter of 15 mm, they provide about an order of magnitude higher rate capability than the present ATLAS muon tracking detectors, the MDT chambers with 30 mm tube diameter. The drift-tube design and the construction methods have been optimised for mass production and allow for complex shapes required for maximising the acceptance. A record sense wire positioning accuracy of 5 μm has been achieved with the new design. In the serial production, the wire positioning accuracy is routinely better than 10 μm. 14 new sMDT chambers are already operational in ATLAS, and a further 16 are under construction for installation in the 2019-2020 LHC shutdown. For the upgrade of the barrel muon spectrometer for High-Luminosity LHC, 96 sMDT chambers will be constructed between 2020 and 2024.
A PCIe Gen3 based readout for the LHCb upgrade
NASA Astrophysics Data System (ADS)
Bellato, M.; Collazuol, G.; D'Antone, I.; Durante, P.; Galli, D.; Jost, B.; Lax, I.; Liu, G.; Marconi, U.; Neufeld, N.; Schwemmer, R.; Vagnoni, V.
2014-06-01
The architecture of the data acquisition system foreseen for the LHCb upgrade, to be installed by 2018, is devised to read out events without a hardware trigger, synchronously with the LHC bunch-crossing rate of 40 MHz. Within this approach the readout boards act as a bridge between the front-end electronics and the High Level Trigger (HLT) computing farm. The baseline design for the LHCb readout is an ATCA board requiring dedicated crates. A local area standard network protocol is implemented in the on-board FPGAs to read out the data. The alternative solution proposed here consists in building the readout boards as PCIe peripherals of the event-builder servers. The main architectural advantage is that the protocol and link technology of the event-builder can be left open until very late, to profit from the most cost-effective industry technology available at the time of the LHC LS2.
Multi-Threaded Algorithms for GPGPU in the ATLAS High Level Trigger
NASA Astrophysics Data System (ADS)
Conde Muíño, P.; ATLAS Collaboration
2017-10-01
General purpose Graphics Processor Units (GPGPU) are being evaluated for possible future inclusion in an upgraded ATLAS High Level Trigger farm. We have developed a demonstrator including GPGPU implementations of Inner Detector and Muon tracking and Calorimeter clustering within the ATLAS software framework. ATLAS is a general purpose particle physics experiment located at the LHC collider at CERN. The ATLAS Trigger system consists of two levels, with Level-1 implemented in hardware and the High Level Trigger implemented in software running on a farm of commodity CPUs. The High Level Trigger reduces the trigger rate from the 100 kHz Level-1 acceptance rate to 1.5 kHz for recording, requiring an average per-event processing time of ∼ 250 ms for this task. The selection in the high level trigger is based on reconstructing tracks in the Inner Detector and Muon Spectrometer and clusters of energy deposited in the Calorimeter. Performing this reconstruction within the available farm resources presents a significant challenge that will grow with future LHC upgrades. During the LHC data taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further to 7.5 times the design value in 2026 following LHC and ATLAS upgrades. Corresponding improvements in the speed of the reconstruction code will be needed to provide the required trigger selection power within affordable computing resources. Key factors determining the potential benefit of including GPGPU as part of the HLT processor farm are: the relative speed of the CPU and GPGPU algorithm implementations; the relative execution times of the GPGPU algorithms and serial code remaining on the CPU; the number of GPGPUs required, and the relative financial cost of the selected GPGPU. We give a brief overview of the algorithms implemented and present new measurements that compare the performance of various configurations exploiting GPGPU cards.
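A back-of-envelope sizing of the HLT farm follows directly from the figures quoted above (100 kHz Level-1 accept rate, ~250 ms average processing time, 1.5 kHz output). The sketch below is an illustration only and ignores scheduling overheads and rate fluctuations.

```python
# Illustrative only: back-of-envelope HLT farm sizing from the figures quoted
# above (100 kHz Level-1 accept rate, ~250 ms average processing per event).
L1_RATE_HZ = 100_000        # events/s entering the HLT
T_EVENT_S = 0.250           # average processing time per event
OUTPUT_RATE_HZ = 1_500      # events/s written out

# By Little's law, the number of events being processed concurrently equals
# the arrival rate times the average processing time.
concurrent_events = L1_RATE_HZ * T_EVENT_S
rejection_factor = L1_RATE_HZ / OUTPUT_RATE_HZ
print(f"~{concurrent_events:,.0f} events in flight "
      f"(i.e. of order that many CPU cores kept busy)")
print(f"overall HLT rejection factor ~{rejection_factor:.0f}")
```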
Development of a modular test system for the silicon sensor R&D of the ATLAS Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, H.; Benoit, M.; Chen, H.
High Voltage CMOS sensors are a promising technology for tracking detectors in collider experiments. Extensive R&D studies are being carried out by the ATLAS Collaboration for a possible use of HV-CMOS in the High Luminosity LHC upgrade of the Inner Tracker detector. CaRIBOu (Control and Readout Itk BOard) is a modular test system developed to test Silicon based detectors. It currently includes five custom designed boards, a Xilinx ZC706 development board, FELIX (Front-End LInk eXchange) PCIe card and a host computer. A software program has been developed in Python to control the CaRIBOu hardware. CaRIBOu has been used in the testbeam of the HV-CMOS sensor AMS180v4 at CERN. Preliminary results have shown that the test system is very versatile. Further development is ongoing to adapt to different sensors, and to make it available to various lab test stands.
Interaction region design driven by energy deposition
NASA Astrophysics Data System (ADS)
Martin, Roman; Besana, Maria Ilaria; Cerutti, Francesco; Langner, Andy; Tomás, Rogelio; Cruz-Alaniz, Emilia; Dalena, Barbara
2017-08-01
The European Strategy Group for High Energy Physics recommends studying collider designs for the post-LHC era. Among the suggested projects there is the circular 100 TeV proton-proton collider FCC-hh. Starting from LHC and its proposed upgrade HL-LHC, this paper outlines the development of the interaction region design for FCC-hh. We identify energy deposition from debris of the collision events as a driving factor for the layout and draft the guiding principles to unify protection of the superconducting final focus magnets from radiation with high luminosity performance. Furthermore, we offer a novel strategy to mitigate the lifetime limitation of the first final focus magnet due to radiation load, the Q1 split.
MicroTCA-based Global Trigger Upgrade project for the CMS experiment at LHC
NASA Astrophysics Data System (ADS)
Rahbaran, B.; Arnold, B.; Bergauer, H.; Eichberger, M.; Rabady, D.
2011-12-01
The electronics of the first-level Global Trigger (GT) of CMS is the last stage of the Level-1 trigger system [1]. At the LHC, up to 40 million collisions of proton bunches occur every second, resulting in about 800 million proton-proton collisions per second. The CMS Level-1 Global Trigger [1], a custom designed electronics system based on FPGA technology and the VMEbus system, performs a quick on-line analysis of each collision every 25 ns and decides whether to reject or to accept it for further analysis. The CMS trigger group of the Institute of High Energy Physics in Vienna (HEPHY) is involved in the Level-1 trigger of the CMS experiment at CERN. As part of the Trigger Upgrade, the Level-1 Global Trigger will be redesigned and implemented in MicroTCA based technology, which allows engineers to detect all possible faults on plug-in boards, in the power supply and in the cooling system. The upgraded Global Trigger will be designed to have the same basic categories of functions as the present GT, but will have more algorithms and more possibilities for combining trigger candidates. Additionally, reconfigurability and testability will be supported based on the next system generation.
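For orientation, the rates quoted above already fix the average pile-up and the per-crossing time budget; the small sketch below (illustration only) spells out that arithmetic.

```python
# Illustrative only: average pile-up implied by the rates quoted above
# (40 million bunch crossings per second, ~800 million proton-proton
# collisions per second), and the time available per Level-1 decision.
BUNCH_CROSSING_RATE_HZ = 40e6
PP_COLLISION_RATE_HZ = 800e6

mean_pileup = PP_COLLISION_RATE_HZ / BUNCH_CROSSING_RATE_HZ
decision_period_ns = 1e9 / BUNCH_CROSSING_RATE_HZ
print(f"average pile-up ~{mean_pileup:.0f} pp collisions per bunch crossing")
print(f"one crossing every {decision_period_ns:.0f} ns")
```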
Upgraded Readout Electronics for the ATLAS Liquid Argon Calorimeters at the High Luminosity LHC
NASA Astrophysics Data System (ADS)
Andeen, Timothy R.; ATLAS Liquid Argon Calorimeter Group
2012-12-01
The ATLAS liquid-argon calorimeters produce a total of 182,486 signals which are digitized and processed by the front-end and back-end electronics at every triggered event. In addition, the front-end electronics sum analog signals to provide coarsely grained energy sums, called trigger towers, to the first-level trigger system, which is optimized for nominal LHC luminosities. However, the pile-up background expected during the high luminosity phases of the LHC will be increased by factors of 3 to 7. An improved spatial granularity of the trigger primitives is therefore proposed in order to improve the identification performance for trigger signatures, like electrons or photons, at high background rejection rates. For the first upgrade phase in 2018, new Liquid Argon Trigger Digitizer Boards are being designed to receive higher granularity signals, digitize them on detector and send them via fast optical links to a new, off-detector digital processing system. The digital processing system applies digital filtering and identifies significant energy depositions. The refined trigger primitives are then transmitted to the first level trigger system to extract improved trigger signatures. The general concept of the upgraded liquid-argon calorimeter readout together with the various electronics components to be developed for such a complex system is presented. The research activities and architectural studies undertaken by the ATLAS Liquid Argon Calorimeter Group are described, particularly details of the on-going design of mixed-signal front-end electronics, of radiation tolerant optical-links, and of the high-speed off-detector digital processing system.
Commissioning of the first chambers of the CMS GE1/1 muon station
NASA Astrophysics Data System (ADS)
Ressegotti, Martina; CMS Muon Group
2017-12-01
The upgrades of the LHC planned in the next years will increase the instantaneous luminosity up to 5 × 10^34 cm^-2 s^-1 after Long Shutdown 3, a value about five times higher than the nominal one for which the CMS experiment was designed. The resulting larger rate of interactions will produce a higher pileup environment that will challenge the trigger system of the CMS experiment in its original configuration, in particular in the endcap region. As part of the upgrade program of the CMS muon endcaps, additional muon detectors based on Gas Electron Multiplier (GEM) technology will be installed, in order to be able to sustain a physics program during high-luminosity operation without performance losses. The installation of the GE1/1 station is scheduled for Long Shutdown 2 in 2019-2020; already, a demonstrator composed of five superchambers has been installed during the Extended Year-End Technical Stop at the beginning of 2017. Its goal is to test the system's operational conditions and also to demonstrate the integration of the GE1/1 chambers into the CMS online system. The status of the installation and commissioning of the GE1/1 demonstrator is presented.
Trigger readout electronics upgrade for the ATLAS Liquid Argon Calorimeters
NASA Astrophysics Data System (ADS)
Dinkespiler, B.
2017-09-01
The upgrade of the Large Hadron Collider (LHC) scheduled for the 2019-2020 shutdown period, referred to as the Phase-I upgrade, will increase the instantaneous luminosity to about three times the design value. Since the current ATLAS trigger system does not allow a sufficient increase of the trigger rate, an improvement of the trigger system is required. The Liquid Argon (LAr) Calorimeter read-out will therefore be modified to deliver digital trigger signals with a higher spatial granularity in order to improve the identification efficiencies of electrons, photons, taus, jets and missing energy, at high background rejection rates, at the Level-1 trigger. The new trigger signals will be arranged in 34000 so-called Super Cells, which achieve 5-10 times better granularity than the trigger towers currently used and allow an improved background rejection. The readout of the trigger signals will process the signal of the Super Cells at every LHC bunch-crossing at 12-bit precision and a frequency of 40 MHz. The data will be transmitted to the Back End using a custom serializer and optical converter and 5.12 Gb/s optical links. In order to verify the full functionality of the future Liquid Argon trigger system, a demonstrator set-up has been installed on the ATLAS detector and is operated in parallel to the regular ATLAS data taking during the LHC Run-2 in 2015 and 2016. Noise levels and the linearity of the energy measurement have been verified to be within our requirements. In addition, we have collected data from 13 TeV proton collisions during the LHC 2015 and 2016 runs, and have observed real pulses from the detector through the demonstrator system. The talk will give an overview of the Phase-I Upgrade of the ATLAS Liquid Argon Calorimeter readout and present the custom developed hardware including their role in real-time data processing and fast data transfer. This contribution will also report on the performance of the newly developed ASICs, including their radiation tolerance, and on the performance of the prototype boards in the demonstrator system based on various measurements with the 13 TeV collision data. Results of the high-speed link test with the prototypes of the final electronic boards will also be reported.
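The quoted numbers (34000 Super Cells, 12-bit samples at 40 MHz, 5.12 Gb/s links) imply the raw trigger-path bandwidth sketched below; this is an illustration only and ignores headers, encoding and other protocol overhead, which reduce the usable payload per link.

```python
# Illustrative only: raw trigger-path bandwidth implied by the numbers above
# (34000 Super Cells digitised at 12 bits every 25 ns) and how many Super Cells
# a single 5.12 Gb/s link could carry before protocol overhead.
N_SUPER_CELLS = 34_000
BITS_PER_SAMPLE = 12
SAMPLING_RATE_HZ = 40e6
LINK_GBPS = 5.12

per_cell_gbps = BITS_PER_SAMPLE * SAMPLING_RATE_HZ / 1e9      # 0.48 Gb/s per Super Cell
total_tbps = N_SUPER_CELLS * per_cell_gbps / 1e3              # total, in Tb/s
cells_per_link = int(LINK_GBPS / per_cell_gbps)               # upper bound, no overhead
print(f"{per_cell_gbps:.2f} Gb/s per Super Cell, ~{total_tbps:.1f} Tb/s in total")
print(f"at most ~{cells_per_link} Super Cells per 5.12 Gb/s link before overhead")
```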
Upgrade of Tile Calorimeter of the ATLAS Detector for the High Luminosity LHC.
NASA Astrophysics Data System (ADS)
Valdes Santurio, Eduardo; Tile Calorimeter System, ATLAS
2017-11-01
The Tile Calorimeter (TileCal) is the hadronic calorimeter of ATLAS covering the central region of the ATLAS experiment. TileCal is a sampling calorimeter with steel as absorber and scintillators as active medium. The scintillators are read out by wavelength shifting fibers coupled to photomultiplier tubes (PMT). The analogue signals from the PMTs are amplified, shaped and digitized by sampling the signal every 25 ns. The High Luminosity Large Hadron Collider (HL-LHC) will have a peak luminosity of 5 × 10^34 cm^-2 s^-1, five times higher than the design luminosity of the LHC. TileCal will undergo a major replacement of its on- and off-detector electronics for the high luminosity programme of the LHC in 2026. The calorimeter signals will be digitized and sent directly to the off-detector electronics, where the signals are reconstructed and shipped to the first level of trigger at a rate of 40 MHz. This will provide a better precision of the calorimeter signals used by the trigger system and will allow the development of more complex trigger algorithms. Three different options are presently being investigated for the front-end electronic upgrade. Extensive test beam studies will determine which option will be selected. Field Programmable Gate Arrays (FPGAs) are extensively used for the logic functions of the off- and on-detector electronics. One hybrid demonstrator prototype module with the new calorimeter module electronics, but still compatible with the present system, may be inserted in ATLAS at the end of 2016.
Numerical simulations of a proposed hollow electron beam collimator for the LHC upgrade at CERN.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Previtali, V.; Stancari, G.; Valishev, A.
2013-07-12
In the last years the LHC collimation system has been performing beyond expectations, providing the machine with a nearly perfectly efficient cleaning system [1]. Nonetheless, when trying to push the existing accelerators to - and over - their design limits, all accelerator components are required to boost their performance. In particular, in view of the high luminosity frontier for the LHC, the increased intensity calls for a more efficient cleaning system. In this framework innovative collimation solutions are under evaluation [2]: one option is the use of a hollow electron lens for beam halo cleaning. This work intends to study the applicability of the hollow electron lens for LHC collimation by evaluating the case of the existing Tevatron e-lens applied to the nominal LHC 7 TeV beam. New e-lens operation modes are proposed here to enhance the halo removal effect of the standard electron lens.
Considerations on Energy Frontier Colliders after LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiltsev, Vladimir
2016-11-15
Since the 1960s, particle colliders have been in the forefront of particle physics; 29 in total have been built and operated, and 7 are in operation now. At present the near term US, European and international strategies of the particle physics community are centered on full exploitation of the physics potential of the Large Hadron Collider (LHC) through its high-luminosity upgrade (HL-LHC). The future of the world-wide HEP community critically depends on the feasibility of possible post-LHC colliders. The concept of feasibility is complex and includes at least three factors: feasibility of energy, feasibility of luminosity and feasibility of cost. Here we overview all current options for post-LHC colliders from such a perspective (ILC, CLIC, Muon Collider, plasma colliders, CEPC, FCC, HE-LHC) and discuss major challenges and accelerator R&D required to demonstrate the feasibility of an energy frontier accelerator facility following the LHC. We conclude by taking a look at ultimate energy reach accelerators based on plasmas and crystals, and a discussion of the perspectives for the far future of accelerator-based particle physics. This paper largely follows a previous study [1] and the presentation given at the ICHEP’2016 conference in Chicago [2].
UPR/Mayaguez High Energy Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendez, Hector
This year the University of Puerto Rico at Mayaguez (UPRM) High Energy Physics (HEP) group continued the ongoing research program outlined in the grant proposal. The program is centered on the Compact Muon Solenoid (CMS) experiment studying proton-proton (pp) collisions at the Large Hadron Collider (LHC) at CERN in Geneva, Switzerland. The main research focus is on data analysis and on preparation for the High Luminosity (HL) LHC detector upgrade. The physics data analysis included a Higgs doublet search and measurements of (1) the Λ0b branching fraction, (2) the B meson mass, and (3) the θ-b hyperon lifetime. The detector upgrade work included preparations for Forward Pixel (FPIX) detector silicon sensor testing in a production run at Fermilab. In addition, the group took on software-release responsibilities through our former research associate Dr. Eric Brownson, who acted until last December as a Level Two Offline Manager for the CMS Upgrade. In support of the CMS data analysis activities carried out locally, the UPRM group has built and maintains an excellent Tier-3 analysis center in Mayaguez. This allowed us to analyze large data samples and to continue the development of algorithms for upgrade tracking robustness, which we started several years ago and plan to resume in the near future. This project involves computer simulation of the radiation damage expected at the higher luminosities of the upgraded LHC. This year we continued to serve as a source of outstanding students for the field of high energy physics. Three of our graduate students finished their MS work in May 2014; their thesis research was on data analysis of heavy-quark b-physics. All of them are currently enrolled in Ph.D. physics programs across the nation: one (Hector Moreno) at New Mexico University, one (Sandra Santiesteban) at the University of New Hampshire, and one (Carlos Malca) at the University of Puerto Rico-Rio Piedras. H. Moreno and C. Malca were directly supervised by Dr. Mendez, and S. Santiesteban by Dr. Ramirez. During the last 13 years, our group has graduated 23 MS students in experimental high energy physics data analysis and applied hardware techniques. Most of the students have been supported by DOE grants, including this grant. Since 2001, Dr. Mendez has directly supervised eleven students, Dr. Ramirez three students, and the former PI (Dr. Lopez) nine students. These theses are fully documented on the group web page (http://charma.uprm.edu). The High Energy Physics group at Mayaguez is small and presently consists of three physics faculty members: the Senior Investigators Dr. Hector Mendez (Professor) and Dr. Juan Eduardo Ramirez (Professor), and Dr. Sudhir Malik, who was hired in July 2014. Dr. Ramirez is in charge of the UPRM Tier-3 computing and will be building the network bandwidth infrastructure for the campus, while Dr. Mendez will continue his effort to finish the heavy-quark physics data analysis and move to SUSY analysis of the 2015 data. Our last grant application in 2012 was awarded only for 2013-2014; as a result, our postdoc position was lost in March. Since then, we have hired Dr. Malik as a new faculty member in order to reinforce the group and to continue our efforts with the CMS experiment. Our plan is to hire another junior faculty member in the next two years to strengthen the HEP group even further. Dr. Mendez continues with QuarkNet activities involving an ever larger group of high school physics teachers from all around Puerto Rico.
A portable gas recirculation unit for gaseous detectors
NASA Astrophysics Data System (ADS)
Guida, R.; Mandelli, B.
2017-10-01
The use of greenhouse gases (usually C2H2F4, CF4 and SF6) is sometimes necessary to achieve the required performance for some gaseous detectors. The consumption of these gases in the LHC systems is reduced by recycling the gas mixture thanks to a complex gas recirculation system. Beyond greenhouse gas consumption due to LHC systems, a considerable contribution is generated by setups used for LHC detector upgrade projects, R&D activities, detector quality assurance or longevity tests. In order to minimise this emission, a new flexible and portable gas recirculation unit has been developed. Thanks to its low price, flexibility and user-friendly operation it can be easily adapted for the different types of detector systems and set-ups.
The future of the Large Hadron Collider and CERN.
Heuer, Rolf-Dieter
2012-02-28
This paper presents the Large Hadron Collider (LHC) and its current scientific programme and outlines options for high-energy colliders at the energy frontier for the years to come. The immediate plans include the exploitation of the LHC at its design luminosity and energy, as well as upgrades to the LHC and its injectors. This may be followed by a linear electron-positron collider, based on the technology being developed by the Compact Linear Collider and the International Linear Collider collaborations, or by a high-energy electron-proton machine. This contribution describes the past, present and future directions, all of which have a unique value to add to experimental particle physics, and concludes by outlining key messages for the way forward.
Powering the High-Luminosity Triplets
NASA Astrophysics Data System (ADS)
Ballarino, A.; Burnet, J. P.
The powering of the magnets in the LHC High-Luminosity Triplets requires production and transfer of more than 150 kA of DC current. High precision power converters will be adopted, and novel High Temperature Superconducting (HTS) current leads and MgB2 based transfer lines will provide the electrical link between the power converters and the magnets. This chapter gives an overview of the systems conceived in the framework of the LHC High-Luminosity upgrade for feeding the superconducting magnet circuits. The focus is on requirements, challenges and novel developments.
Upgrade of the ATLAS Tile Calorimeter Electronics
NASA Astrophysics Data System (ADS)
Moreno, Pablo; ATLAS Tile Calorimeter System
2016-04-01
The Tile Calorimeter (TileCal) is the hadronic calorimeter covering the central region of the ATLAS experiment at the LHC. The TileCal readout consists of 9852 channels. The bulk of its upgrade will occur for the High Luminosity LHC phase (Phase II), where the peak luminosity will increase by a factor of 5 compared to the design luminosity (10^34 cm^-2 s^-1) at a centre-of-mass energy of 14 TeV. The TileCal upgrade aims at replacing the majority of the on- and off-detector electronics to the extent that all calorimeter signals will be digitized and sent to the off-detector electronics in the counting room. To achieve the required reliability, redundancy has been introduced at different levels. Three different options are presently being investigated for the front-end electronic upgrade. Extensive test beam studies will determine which option will be selected. 10.24 Gbps optical links are used to read out all digitized data to the counting room, while 4.8 Gbps down-links are used for synchronization, configuration and detector control. For the off-detector electronics a pre-processor (sROD) is being developed, which takes care of the initial trigger processing while temporarily storing the main data flow in pipeline and de-randomizer memories. Field Programmable Gate Arrays are extensively used for the logic functions off- and on-detector. One demonstrator prototype module with the new calorimeter module electronics, but still compatible with the present system, is planned to be inserted in ATLAS at the end of 2015.
A new strips tracker for the upgraded ATLAS ITk detector
NASA Astrophysics Data System (ADS)
David, C.
2018-01-01
The ATLAS detector has been designed and developed to function in the environment of the present Large Hadron Collider (LHC). At the next-generation tracking detector proposed for the High Luminosity LHC (HL-LHC), the so-called ATLAS Phase-II Upgrade, the fluences and radiation levels will be higher by as much as a factor of ten. The new sub-detectors must thus be faster, of larger area, more segmented and more radiation hard while the amount of inactive material should be minimized and the power supply to the front-end systems should be increased. For those reasons, the current inner tracker of the ATLAS detector will be fully replaced by an all-silicon tracking system that consists of a pixel detector at small radius close to the beam line and a large area strip tracker surrounding it. This document gives an overview of the design of the strip inner tracker (Strip ITk) and summarises the intensive R&D activities performed over the last years by the numerous institutes within the Strips ITk collaboration. These studies are accompanied with a strong prototyping effort to contribute to the optimisation of the Strip ITk's structure and components. This effort culminated recently in the release of the ATLAS Strips ITk Technical Design Report (TDR).
Flexible trigger menu implementation on the Global Trigger for the CMS Level-1 trigger upgrade
NASA Astrophysics Data System (ADS)
MATSUSHITA, Takashi;
2017-10-01
The CMS experiment at the Large Hadron Collider (LHC) has continued to explore physics at the high-energy frontier in 2016. The integrated luminosity delivered by the LHC in 2016 was 41 fb^-1 with a peak luminosity of 1.5 × 10^34 cm^-2 s^-1 and a peak mean pile-up of about 50, all exceeding the initial estimations for 2016. The CMS experiment has upgraded its hardware-based Level-1 trigger system to maintain its performance for new physics searches and precision measurements at high luminosities. The Global Trigger is the final step of the CMS Level-1 trigger and implements a trigger menu, a set of selection requirements applied to the final list of objects from calorimeter and muon triggers, for reducing the 40 MHz collision rate to 100 kHz. The Global Trigger has been upgraded with state-of-the-art FPGA processors on Advanced Mezzanine Cards with optical links running at 10 GHz in a MicroTCA crate. The powerful processing resources of the upgraded system enable implementation of more algorithms at a time than previously possible, allowing CMS to be more flexible in how it handles the available trigger bandwidth. Algorithms for a trigger menu, including topological requirements on multi-objects, can be realised in the Global Trigger using the newly developed trigger menu specification grammar. Analysis-like trigger algorithms can be represented in an intuitive manner and the algorithms are translated to corresponding VHDL code blocks to build a firmware. The grammar can be extended in future as the needs arise. The experience of implementing trigger menus on the upgraded Global Trigger system will be presented.
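To convey the flavour of an "analysis-like" trigger algorithm of the kind such a menu grammar can express, here is a hypothetical example written as a plain Python predicate: two muons above staggered pT thresholds with a dimuon invariant mass inside a window. This is not the actual CMS grammar or its VHDL translation; the object format, thresholds and event content are invented for illustration.

```python
# Illustrative only: an "analysis-like" topological trigger condition written
# as a plain Python predicate. NOT the actual CMS trigger-menu grammar or its
# VHDL translation; muon format, thresholds and event content are hypothetical.
import math

def dimuon_mass(mu1, mu2):
    """Invariant mass (GeV) of two (pt, eta, phi) candidates, massless approximation."""
    pt1, eta1, phi1 = mu1
    pt2, eta2, phi2 = mu2
    return math.sqrt(2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2)))

def double_mu_mass_algo(muons, pt_cuts=(4.0, 3.0), mass_window=(2.8, 3.4)):
    """Fire if any muon pair passes staggered pT thresholds and the mass window."""
    for i, m1 in enumerate(muons):
        for m2 in muons[i + 1:]:
            lead, sub = (m1, m2) if m1[0] >= m2[0] else (m2, m1)
            if (lead[0] >= pt_cuts[0] and sub[0] >= pt_cuts[1]
                    and mass_window[0] <= dimuon_mass(m1, m2) <= mass_window[1]):
                return True
    return False

# Hypothetical event with two muon candidates, each given as (pt [GeV], eta, phi).
event = [(5.2, 0.3, 1.1), (3.6, 0.5, 1.802)]
print(double_mu_mass_algo(event))   # True: this pair passes both pT cuts and the mass window
```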
Readout of the upgraded ALICE-ITS
NASA Astrophysics Data System (ADS)
Szczepankiewicz, A.; ALICE Collaboration
2016-07-01
The ALICE experiment will undergo a major upgrade during the second long shutdown of the CERN LHC. As part of this program, the present Inner Tracking System (ITS), which employs different layers of hybrid pixels, silicon drift and strip detectors, will be replaced by a completely new tracker composed of seven layers of monolithic active pixel sensors. The upgraded ITS will have more than twelve billion pixels in total, producing 300 Gbit/s of data when tracking 50 kHz Pb-Pb events. Two families of pixel chips realized with the TowerJazz CMOS imaging process have been developed as candidate sensors: the ALPIDE, which uses a proprietary readout and sparsification mechanism, and the MISTRAL-O, based on a proven rolling shutter architecture. Both chips can operate in continuous mode, with the ALPIDE also supporting triggered operation. As the communication IP blocks are shared between the two chip families, it has been possible to develop a common Readout Electronics. All the sensor components (analog stages, state machines, buffers, FIFOs, etc.) have been modelled in a system level simulation, which has been extensively used to optimize both the sensor and the whole readout chain design in an iterative process. This contribution covers the progress of the R&D efforts and the overall expected performance of the ALICE-ITS readout system.
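For scale, the quoted throughput and interaction rate fix the average ITS data volume per Pb-Pb interaction; the one-line estimate below is an illustration only.

```python
# Illustrative only: average event size implied by the figures quoted above
# (300 Gbit/s of ITS data while reading out 50 kHz Pb-Pb interactions).
THROUGHPUT_GBIT_S = 300.0
INTERACTION_RATE_HZ = 50_000

bits_per_event = THROUGHPUT_GBIT_S * 1e9 / INTERACTION_RATE_HZ
print(f"~{bits_per_event / 8 / 1e6:.2f} MB of ITS data per Pb-Pb interaction on average")
```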
Cryogenic Design of the New High Field Magnet Test Facility at CERN
NASA Astrophysics Data System (ADS)
Benda, V.; Pirotte, O.; De Rijk, G.; Bajko, M.; Craen, A. Vande; Perret, Ph.; Hanzelka, P.
In the framework of the R&D program related to the Large Hadron Collider (LHC) upgrades, a new High Field Magnet (HFM) vertical test bench is required. This facility, located in the SM18 cryogenic test hall, shall allow the testing of superconducting magnets weighing up to 15 tons, with a stored energy of up to 10 MJ, in a temperature range between 1.9 K and 4.5 K. The article describes the cryogenic architecture to be inserted into the general infrastructure of SM18, including the process and instrumentation diagram, the different operating phases (including the strategy for magnet cool-down and warm-up at controlled speed and for quench management), as well as the design of the main components.
The INFN-FBK pixel R&D program for HL-LHC
NASA Astrophysics Data System (ADS)
Meschini, M.; Dalla Betta, G. F.; Boscardin, M.; Calderini, G.; Darbo, G.; Giacomini, G.; Messineo, A.; Ronchin, S.
2016-09-01
We report on the joint ATLAS and CMS research activity aimed at the development of new thin silicon pixel detectors for the Large Hadron Collider Phase-2 detector upgrades. This R&D is performed under a special agreement between the Istituto Nazionale di Fisica Nucleare (INFN) and the Fondazione Bruno Kessler (FBK, Trento, Italy). New generations of 3D and planar pixel sensors with active edges are being developed in the R&D project and will be fabricated at FBK. A first planar pixel batch, produced by the end of 2014, will be described in this paper. First clean-room measurement results on planar sensors, obtained before and after neutron irradiation, will be presented.
NASA Astrophysics Data System (ADS)
Strizenec, P.
2014-09-01
The ATLAS experiment is designed to study the proton-proton collisions produced at the Large Hadron Collider (LHC) at CERN. Liquid Argon sampling calorimeters are used for all electromagnetic calorimetry covering the pseudorapidity region up to 3.2, as well as for hadronic calorimetry in the range 1.4-4.9. The electromagnetic calorimeters use lead as passive material and are characterized by an accordion geometry that allows a fast and uniform azimuthal response. Copper and tungsten were chosen as passive material for the hadronic calorimetry; whereas a parallel-plate geometry was adopted at large polar angles, an innovative one based on cylindrical electrodes with thin argon gaps was designed for the coverage at low angles, where the particle flux is higher. All detectors are housed in three cryostats kept at 88.5 K. After installation in 2004-2006, the calorimeters were extensively commissioned over the three-year period prior to first collisions in 2009, using cosmic rays and single LHC beams. Since then, around 27 fb^-1 of data have been collected at unprecedented center-of-mass energies of 7 TeV and 8 TeV. During all these stages, the calorimeter and its electronics have been operating with performance very close to specifications. After 2019, the instantaneous luminosity will reach 2-3 × 10^34 cm^-2 s^-1, well above the luminosity for which the calorimeter was designed. In order to preserve its triggering capabilities, the detector will be upgraded with a new fully digital trigger system with refined granularity. In 2023, the instantaneous luminosity will ultimately reach 5-7 × 10^34 cm^-2 s^-1, requiring a complete replacement of the readout electronics. Moreover, with an increased particle flux, several phenomena (liquid-argon boiling, space-charge effects, ...) will affect the performance of the forward calorimeter (FCal). A replacement with a new FCal with smaller LAr gaps, or a new calorimeter module, is being considered. The performance of these new calorimeters is being studied in high-intensity particle beams. This contribution covers all aspects of the first three years of operation. The excellent performance achieved is detailed in particular in the context of the discovery of the Higgs boson announced in July 2012. The future plans to preserve this performance until the end of the LHC program are also presented.
The evolution of the Trigger and Data Acquisition System in the ATLAS experiment
NASA Astrophysics Data System (ADS)
Krasznahorkay, A.; Atlas Collaboration
2014-06-01
The ATLAS experiment, aimed at recording the results of LHC proton-proton collisions, is upgrading its Trigger and Data Acquisition (TDAQ) system during the current first LHC long shutdown. The purpose of the upgrade is to add robustness and flexibility to the selection and conveyance of the physics data, simplify the maintenance of the infrastructure, exploit new technologies and, overall, make ATLAS data-taking capable of dealing with increasing event rates. The TDAQ system used to date is organised in a three-level selection scheme, including a hardware-based first-level trigger and second- and third-level triggers implemented as separate software systems distributed on separate commodity hardware nodes. While this architecture was successfully operated well beyond the original design goals, the accumulated experience stimulated interest in exploring possible evolutions. The hardware of the TDAQ system will also be upgraded by introducing new elements to it. For the high-level trigger, the current plan is to deploy a single homogeneous system, which merges the execution of the second and third trigger levels, still logically separated, on a single hardware node. Prototyping efforts have already demonstrated many benefits of the simplified design. In this paper we report on the design and the development status of this new system.
Testbeam results of irradiated ams H18 HV-CMOS pixel sensor prototypes
NASA Astrophysics Data System (ADS)
Benoit, M.; Braccini, S.; Casse, G.; Chen, H.; Chen, K.; Di Bello, F. A.; Ferrere, D.; Golling, T.; Gonzalez-Sevilla, S.; Iacobucci, G.; Kiehn, M.; Lanni, F.; Liu, H.; Meng, L.; Merlassino, C.; Miucci, A.; Muenstermann, D.; Nessi, M.; Okawa, H.; Perić, I.; Rimoldi, M.; Ristić, B.; Barrero Pinto, M. Vicente; Vossebeld, J.; Weber, M.; Weston, T.; Wu, W.; Xu, L.; Zaffaroni, E.
2018-02-01
HV-CMOS pixel sensors are a promising option for the tracker upgrade of the ATLAS experiment at the LHC, as well as for other future tracking applications in which large areas are to be instrumented with radiation-tolerant silicon pixel sensors. We present results of testbeam characterisations of the 4th generation of Capacitively Coupled Pixel Detectors (CCPDv4) produced with the ams H18 HV-CMOS process that have been irradiated with different particles (reactor neutrons and 18 MeV protons) to fluences between 1 × 10^14 and 5 × 10^15 1-MeV n_eq cm^-2. The sensors were glued to ATLAS FE-I4 pixel readout chips and measured at the CERN SPS H8 beamline using the FE-I4 beam telescope. Results for all fluences are very encouraging, with all hit efficiencies being better than 97% at a bias voltage of 85 V. The sample irradiated to a fluence of 1 × 10^15 n_eq cm^-2 (a relevant value for a large volume of the upgraded tracker) exhibited a 99.7% average hit efficiency. The results give strong evidence for the radiation tolerance of HV-CMOS sensors and their suitability as sensors for the experimental HL-LHC upgrades and future large-area silicon-based tracking detectors in high-radiation environments.
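For context, a hit efficiency of this kind is simply the fraction of telescope tracks with a matched hit in the device under test; a minimal sketch with invented counts and a simple binomial uncertainty is given below.

    # Minimal sketch of a testbeam hit efficiency and its binomial uncertainty,
    # computed from telescope-matched tracks. The counts are invented for
    # illustration; they are not the measured CCPDv4 numbers.
    import math

    def hit_efficiency(n_matched, n_tracks):
        """Efficiency and simple binomial uncertainty for matched hits / tracks."""
        eff = n_matched / n_tracks
        err = math.sqrt(eff * (1.0 - eff) / n_tracks)
        return eff, err

    eff, err = hit_efficiency(n_matched=99_700, n_tracks=100_000)
    print(f"hit efficiency = {100*eff:.2f} +/- {100*err:.2f} %")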
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooley, L. D.; Ghosh, A. K.; Dietderich, D. R.
The High Luminosity Upgrade of the Large Hadron Collider (HL-LHC) at CERN will replace the main ring inner triplet quadrupoles, identified by the acronym MQXF, adjacent to the main ring interaction regions. For the past decade, the U.S. LHC Accelerator R&D Program, LARP, has been evaluating conductors for the MQXFA prototypes, which are the outer magnets of the triplet. Recently, the requirements for MQXF magnets and cables have been published in P. Ferracin et al., IEEE Trans. Appl. Supercond., vol. 26, no. 4, 2016, Art. no. 4000207, along with the final specification for Ti-alloyed Nb3Sn conductor determined jointly by CERN and LARP. This paper describes the rationale behind the 0.85 mm diameter strand's chief parameters, which are 108 or more sub-elements, a copper fraction not less than 52.4%, a strand critical current at 4.22 K not less than 631 A at 12 T and 331 A at 15 T, and a residual resistance ratio of not less than 150. This paper also compares the performance of ~100 km production lots of the five most recent LARP conductors to the first 163 km of strand made according to the HL-LHC specification. Two factors emerge as significant for optimizing performance and minimizing risk: a modest increase of the sub-element diameter from 50 to 55 μm, and a Nb:Sn molar ratio of 3.6 instead of 3.4. Furthermore, the statistics acquired so far give confidence that the present conductor can balance competing demands in production for the HL-LHC project.
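A small sketch of how the quoted strand specification could be checked against measured values is given below; the limits are taken from the numbers in the abstract, while the example measurement values are invented.

    # Sketch of a compliance check against the strand parameters quoted above
    # (>=108 sub-elements, Cu fraction >= 52.4%, Ic(4.22 K) >= 631 A at 12 T and
    # >= 331 A at 15 T, RRR >= 150). The example measurement values are invented.

    SPEC = {
        "subelements": 108,
        "cu_fraction_pct": 52.4,
        "ic_12T_A": 631.0,
        "ic_15T_A": 331.0,
        "rrr": 150.0,
    }

    def check_strand(measurement):
        """Return a list of (parameter, value, passes) tuples for one strand."""
        return [(key, measurement[key], measurement[key] >= limit)
                for key, limit in SPEC.items()]

    example = {"subelements": 108, "cu_fraction_pct": 53.1,
               "ic_12T_A": 645.0, "ic_15T_A": 338.0, "rrr": 162.0}
    for key, value, ok in check_strand(example):
        print(f"{key:16s} {value:8.1f}  {'PASS' if ok else 'FAIL'}")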
BPM Design and Impedance Considerations for a Rotatable Collimator for the LHC Collimation Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Jeffrey Claiborne; /SLAC; Keller, Lewis
2010-08-26
The Phase II upgrade of the LHC collimation system calls for complementing the 30 highly robust Phase I graphite secondary collimators with 30 high-Z Phase II collimators. This paper reports on BPM and impedance considerations and measurements of the integrated BPMs in the prototype rotatable collimator to be installed in the Super Proton Synchrotron (SPS) at CERN. The BPMs are necessary to align the jaws with the beam. Without careful design, the beam impedance can result in unacceptable heating of the chamber wall or beam instabilities. The impedance measurements involve utilizing both a single displaced wire and two wires excited in opposite phase to disentangle the driving and detuning transverse impedances. Trapped-mode resonances and the longitudinal impedance are also to be measured and compared with simulations. These measurements, when completed, will demonstrate that the device is fully operational and has the impedance characteristics and BPM performance acceptable for installation in the SPS.
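Schematically, the disentangling works because the two-wire (differential) measurement is sensitive only to the driving (dipolar) transverse impedance, while a single wire displaced from the axis sees the sum of driving and detuning (quadrupolar) contributions. The sketch below shows only this subtraction step, with placeholder numbers and with the displacement- and geometry-dependent normalisation factors omitted; it is not the actual analysis used for these measurements.

    # Schematic disentangling of driving and detuning transverse impedances from
    # wire measurements, with placeholder values and normalisation factors omitted.

    # complex transverse impedances in Ohm/m at one frequency (placeholder numbers)
    z_two_wire = 1.8e3 + 0.4e3j          # driving impedance from the two-wire measurement
    z_single_displaced = 2.5e3 + 0.6e3j  # driving + detuning seen by a displaced single wire

    z_driving = z_two_wire
    z_detuning = z_single_displaced - z_driving

    print(f"driving  impedance: {z_driving.real:.0f} + {z_driving.imag:.0f}j Ohm/m")
    print(f"detuning impedance: {z_detuning.real:.0f} + {z_detuning.imag:.0f}j Ohm/m")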
Numerical Investigation on Electron and Ion Transmission of GEM-based Detectors
NASA Astrophysics Data System (ADS)
Bhattacharya, Purba; Sahoo, Sumanya Sekhar; Biswas, Saikat; Mohanty, Bedangadas; Majumdar, Nayana; Mukhopadhyay, Supratik
2018-02-01
ALICE at the LHC is planning a major upgrade of its detector systems, including the TPC, to cope with an increase of the LHC luminosity after 2018. Different R&D activities are currently concentrated on the adoption of the Gas Electron Multiplier (GEM) as the gas amplification stage of the ALICE-TPC upgrade. The major challenge is to achieve low ion feedback into the drift volume while ensuring the collection of a large fraction of the primary electrons in the signal-generation process. In the present work, the Garfield simulation framework has been adopted to numerically estimate the electron transparency and the ion backflow fraction of GEM-based detectors. In this process, extensive simulations have been carried out to enrich our understanding of the complex physical processes occurring within single, triple and quadruple GEM detectors. A detailed study has been performed to observe the effect of detector geometry, field configuration and magnetic field on the above-mentioned characteristics.
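For reference, the two figures of merit studied here reduce to simple ratios of the counts that a Garfield-type microscopic simulation provides; a minimal sketch with invented counts is shown below.

    # Minimal sketch of the two figures of merit discussed above, computed from
    # simulated counts. The numbers here are invented placeholders.

    def electron_transparency(n_primaries_collected, n_primaries_total):
        """Fraction of primary electrons that reach the amplification stage."""
        return n_primaries_collected / n_primaries_total

    def ion_backflow_fraction(n_ions_to_drift, n_electrons_at_anode):
        """Ions escaping into the drift volume per electron collected at the readout."""
        return n_ions_to_drift / n_electrons_at_anode

    print(f"electron transparency: {electron_transparency(915, 1000):.2%}")
    print(f"ion backflow fraction: {ion_backflow_fraction(640, 100_000):.2%}")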
NASA Astrophysics Data System (ADS)
Borg, M.; Bertarelli, A.; Carra, F.; Gradassi, P.; Guardia-Valenzuela, J.; Guinchard, M.; Izquierdo, G. Arnau; Mollicone, P.; Sacristan-de-Frutos, O.; Sammut, N.
2018-03-01
The CERN Large Hadron Collider is currently being upgraded through the High Luminosity upgrade to operate at a stored beam energy of 680 MJ. The LHC performance depends on the functionality of the beam collimation systems, essential for safe beam cleaning and machine protection. A dedicated beam experiment at the CERN High Radiation to Materials facility was created under the HRMT-23 experimental campaign. This experiment investigates the behavior of three collimation jaws with novel composite absorbers made of copper-diamond, molybdenum carbide-graphite, and carbon fiber-carbon, under accidental scenarios involving direct beam impact on the material. Material characterization is imperative for the design, execution, and analysis of such experiments. This paper presents new data and analysis of the thermostructural characteristics of some of the absorber materials, obtained within CERN facilities. In turn, the characterized elastic properties are refined through the development and implementation of a mixed numerical-experimental optimization technique.
Study of the dE/dx resolution of a GEM Readout Chamber prototype for the upgrade of the ALICE TPC
NASA Astrophysics Data System (ADS)
Mathis, Andreas
2018-02-01
The ALICE Collaboration is planning a major upgrade of its central-barrel detectors to be able to cope with the increased LHC luminosity beyond 2020. For the TPC, this implies a replacement of the currently used gated MWPCs (Multi-Wire Proportional Chambers) by GEM (Gas Electron Multiplier) based readout chambers. In order to prove that the present particle identification capabilities via the measurement of the specific energy loss are retained after the upgrade, a prototype of the ALICE IROC (Inner Readout Chamber) has been evaluated in a test beam campaign at the CERN PS. The dE/dx resolution of the prototype has been shown to be fully compatible with that of the current MWPCs.
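The specific energy loss in a TPC is commonly estimated with a truncated mean of the per-cluster charges, which tames the Landau tail before averaging; whether exactly this estimator and truncation fraction are used for the IROC prototype is an assumption. A minimal sketch:

    # Truncated-mean dE/dx estimator: discard the highest charge samples along a
    # track, then average the rest. Charges and truncation fraction are illustrative.

    def truncated_mean(cluster_charges, keep_fraction=0.7):
        """Average the lowest keep_fraction of the per-cluster charges."""
        ordered = sorted(cluster_charges)
        n_keep = max(1, int(round(keep_fraction * len(ordered))))
        kept = ordered[:n_keep]
        return sum(kept) / len(kept)

    # toy per-cluster charges (arbitrary units) with a few Landau-like outliers
    charges = [48, 52, 50, 47, 55, 49, 51, 46, 53, 120, 50, 48, 52, 95, 49]
    print(f"truncated-mean dE/dx estimate: {truncated_mean(charges):.1f} a.u.")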
NASA Astrophysics Data System (ADS)
Anderson, J.; Bauer, K.; Borga, A.; Boterenbrood, H.; Chen, H.; Chen, K.; Drake, G.; Dönszelmann, M.; Francis, D.; Guest, D.; Gorini, B.; Joos, M.; Lanni, F.; Lehmann Miotto, G.; Levinson, L.; Narevicius, J.; Panduro Vazquez, W.; Roich, A.; Ryu, S.; Schreuder, F.; Schumacher, J.; Vandelli, W.; Vermeulen, J.; Whiteson, D.; Wu, W.; Zhang, J.
2016-12-01
The ATLAS Phase-I upgrade (2019) requires a Trigger and Data Acquisition (TDAQ) system able to trigger and record data from up to three times the nominal LHC instantaneous luminosity. The Front-End LInk eXchange (FELIX) system provides an infrastructure to achieve this in a scalable, detector agnostic and easily upgradeable way. It is a PC-based gateway, interfacing custom radiation tolerant optical links from front-end electronics, via PCIe Gen3 cards, to a commodity switched Ethernet or InfiniBand network. FELIX enables reducing custom electronics in favour of software running on commercial servers. The FELIX system, the design of the PCIe prototype card and the integration test results are presented in this paper.
Heavy flavor results at RHIC - A comparative overview
Dong, Xin
2012-01-01
I review the latest heavy flavor measurements from the RHIC experiments. Measurements from RHIC, together with preliminary results from the LHC, offer us an opportunity to systematically study the properties of the sQGP medium. Finally, I give an outlook on the prospects for precision heavy flavor measurements with detector upgrades at RHIC.
Precision Timing with Silicon Sensors for Use in Calorimetry
NASA Astrophysics Data System (ADS)
Bornheim, A.; Ronzhin, A.; Kim, H.; Bolla, G.; Pena, C.; Xie, S.; Apresyan, A.; Los, S.; Spiropulu, M.; Ramberg, E.
2017-11-01
The high luminosity upgrade of the Large Hadron Collider (HL-LHC) at CERN is expected to provide instantaneous luminosities of 5 × 10^34 cm^-2 s^-1. The high luminosities expected at the HL-LHC will be accompanied by a factor of 5 to 10 more pileup compared with LHC conditions in 2015, causing general confusion for particle identification and event reconstruction. Precision timing allows calorimetric measurements to be extended into such a high-density environment by subtracting the energy deposits from pileup interactions. Calorimeters employing silicon as the active component have recently become a popular choice for the HL-LHC and future collider experiments which face very high radiation environments. We present studies of basic calorimetric and precision timing measurements using a prototype composed of a tungsten absorber and silicon sensors as the active medium. We show that for the bulk of electromagnetic showers induced by electrons in the range of 20 GeV to 30 GeV, we can achieve time resolutions better than 25 ps per single pad sensor.
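For context, the per-sensor time resolution in such studies is typically extracted from the spread of the time difference with respect to a reference counter, subtracting the reference resolution in quadrature. The sketch below uses synthetic Gaussian data and invented resolutions, not the testbeam measurements.

    # Time-resolution extraction from a time-difference distribution, on synthetic data.
    import random, statistics

    random.seed(0)
    TRUE_SIGMA_SENSOR_PS = 22.0   # hypothetical pad-sensor resolution
    SIGMA_REFERENCE_PS = 10.0     # hypothetical reference-counter resolution

    delta_t = [random.gauss(0.0, TRUE_SIGMA_SENSOR_PS) - random.gauss(0.0, SIGMA_REFERENCE_PS)
               for _ in range(20_000)]
    sigma_delta = statistics.pstdev(delta_t)
    sigma_sensor = (sigma_delta**2 - SIGMA_REFERENCE_PS**2) ** 0.5  # subtract reference in quadrature

    print(f"sigma(delta t) = {sigma_delta:.1f} ps, inferred sensor resolution = {sigma_sensor:.1f} ps")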
Measurement of electrodynamics characteristics of higher order modes for harmonic cavity at 2400 MHz
NASA Astrophysics Data System (ADS)
Shashkov, Ya V.; Sobenin, N. P.; Gusarova, M. A.; Lalayan, M. V.; Bazyl, D. S.; Donetskiy, R. V.; Orlov, A. I.; Zobov, M. M.; Zavadtsev, A. A.
2016-09-01
In the framework of the High Luminosity Large Hadron Collider (HL-LHC) upgrade program, the application of additional superconducting harmonic cavities operating at 800 MHz is currently under discussion. As a possible candidate, an assembly of two cavities with grooved beam pipes, connected by a drift tube and housed in a common cryomodule, was proposed. In this article we discuss measurements of the loaded Q-factors of higher-order modes (HOMs) performed on a scaled aluminium single-cell cavity prototype with a fundamental frequency of 2400 MHz, and on an array of two such cavities connected by a narrow beam pipe. The measurements were performed for the system with and without the matching load in the drift tube.
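A loaded Q-factor is conventionally obtained from the resonance frequency and the -3 dB bandwidth of a transmission measurement; whether exactly this procedure was applied to the 2400 MHz prototype is an assumption, and the frequencies below are placeholders.

    # Loaded Q from a resonance curve: Q_L = f0 / (3 dB bandwidth). Placeholder values.

    def loaded_q(f0_hz, f_lower_3db_hz, f_upper_3db_hz):
        """Loaded quality factor from resonance frequency and -3 dB points."""
        return f0_hz / (f_upper_3db_hz - f_lower_3db_hz)

    # hypothetical higher-order mode around 3.1 GHz with a 150 kHz-wide resonance
    print(f"Q_L ~ {loaded_q(3.100e9, 3.099925e9, 3.100075e9):.0f}")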
Progress on the Development of the Nb3Sn 11 T Dipole for the High Luminosity Upgrade of LHC
Savary, Frederic; Bajko, Marta; Bordini, Bernardo; ...
2017-02-08
The High-Luminosity Large Hadron Collider (HL-LHC) project at CERN entered the production phase in October 2015, after the completion of the design study phase. In the meantime, the development of the 11 T dipole needed for the upgrade of the collimation system of the machine made significant progress, with very good performance of the first two-in-one magnet model of 2-m length made at CERN. The 11 T dipole, which is more powerful than the current main dipoles of the LHC, can be made shorter with an equivalent integrated field. This will allow creating space for the installation of additional collimators in specific locations of the dispersion suppressor regions. Following tests carried out during the heavy-ion runs of the LHC at the end of 2015, and a more recent review of the project budget, the installation plan for the 11 T dipole was revised. Consequently, one 11 T dipole full assembly containing two 11 T dipoles of 5.5-m length will be installed on either side of interaction point 7. These two units shall be installed during Long Shutdown 2 in 2019-2020. After a brief reminder of the design features of the magnet, this paper describes the current status of the development activities, in particular the short-model programme and the construction of the first full-scale prototype at CERN. Finally, critical operations such as the reaction treatment and the coil impregnation are discussed, the quench performance test results of the two-in-one model are reviewed, and the plan toward production for Long Shutdown 2 is described.
Transverse emittance growth due to rf noise in the high-luminosity LHC crab cavities
NASA Astrophysics Data System (ADS)
Baudrenghien, P.; Mastoridis, T.
2015-10-01
The high-luminosity LHC (HiLumi LHC) upgrade, with planned operation from 2025 onward, has the goal of achieving a tenfold increase in the number of recorded collisions thanks to a doubling of the intensity per bunch (2.2 × 10^11 protons) and a reduction of β* to 15 cm. Such an increase would significantly expedite new discoveries and exploration. To avoid detrimental effects from long-range beam-beam interactions, the half crossing angle must be increased to 295 μrad. Without bunch crabbing, this large crossing angle and small transverse beam size would result in a luminosity reduction factor of 0.3 (Piwinski angle). Therefore, crab cavities are an important component of the LHC upgrade and will contribute strongly to achieving an increase in the number of recorded collisions. The proposed crab cavities are electromagnetic devices with a resonance in the radio frequency (rf) region of the spectrum (400.789 MHz). They apply a kick perpendicular to the direction of motion (transverse kick) to restore an effective head-on collision between the particle beams, thereby restoring the geometric factor to 0.8 [K. Oide and K. Yokoya, Phys. Rev. A 40, 315 (1989)]. Noise injected through the rf/low-level rf (llrf) system could cause significant transverse emittance growth and limit the luminosity lifetime. In this work, a theoretical relationship between the phase and amplitude rf noise spectrum and the transverse emittance growth rate is derived for a hadron machine, assuming zero synchrotron radiation damping and broadband rf noise, excluding infinitely narrow spectral lines. This derivation is for a single beam. Both amplitude and phase noise are investigated. The potential improvement in the presence of the transverse damper is also investigated.
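For reference, the geometric reduction factor quoted here is the standard crossing-angle expression, written below under the usual assumptions of Gaussian bunches and a bunch length much larger than the transverse beam size:

    R = \frac{1}{\sqrt{1+\Phi^2}}, \qquad \Phi = \frac{\theta_c}{2}\,\frac{\sigma_z}{\sigma_x^*}

where θ_c is the full crossing angle, σ_z the bunch length and σ_x^* the transverse beam size at the interaction point. With the half crossing angle of 295 μrad and the HL-LHC beam parameters quoted above, this factor is about 0.3 without crabbing; the crab cavities restore an effective head-on geometry and bring it back to about 0.8.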
NASA Astrophysics Data System (ADS)
Rubbia, André
2009-06-01
The current focus of the CERN program is the Large Hadron Collider (LHC); however, CERN is also engaged in long-baseline neutrino physics with the CNGS project and supports T2K as recognized experiment CERN RE13, and for good reasons: a number of observed phenomena in high-energy physics and cosmology lack a resolution within the Standard Model of particle physics. These puzzles include the origin of neutrino masses, CP violation in the leptonic sector, and the baryon asymmetry of the Universe. They will only partially be addressed at the LHC. A positive measurement of sin^2 2θ13 > 0.01 would certainly give a tremendous boost to neutrino physics by opening the possibility to study CP violation in the lepton sector and to determine the neutrino mass hierarchy with upgraded conventional super-beams. These experiments (so-called 'Phase II') require, in addition to an upgraded beam power, next-generation very massive neutrino detectors with excellent energy resolution and high detection efficiency over a wide neutrino energy range, to cover the 1st and 2nd oscillation maxima, as well as excellent particle identification and π0 background suppression. Two generations of large water Cherenkov detectors at Kamioka (Kamiokande and Super-Kamiokande) have been extremely successful, and there are good reasons to consider a third-generation water Cherenkov detector with an order of magnitude larger mass than Super-Kamiokande for both non-accelerator (proton decay, supernovae, ...) and accelerator-based physics. On the other hand, a very massive underground liquid-argon detector of about 100 kton could represent a credible alternative for the precision measurements of 'Phase II' and aim at significantly new results in neutrino astroparticle and non-accelerator-based particle physics (e.g. proton decay).
A new detector at RHIC, sPHENIX goals and status
NASA Astrophysics Data System (ADS)
Reed, Rosi;
2017-01-01
The study of heavy-ion collisions, which can create a new form of matter, a nearly ideal strongly interacting fluid in which quarks and gluons are no longer confined into nucleons, called the Quark Gluon Plasma (QGP), is at the frontier of QCD studies. The Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Lab (BNL) has had a long and successful program of QGP study since 2000, with many upgrades that have increased the delivered luminosity considerably over the last decade. The sPHENIX proposal is for a second-generation experiment at RHIC, which will take advantage of the increased luminosity and allow measurements of jets, jet correlations and Upsilons (ϒ), with a kinematic reach that will overlap with measurements made at the Large Hadron Collider (LHC). Complementary measurements at RHIC and at the LHC probe the QGP at different temperatures and densities, which is necessary to determine the temperature dependence of the QGP transport coefficients. The sPHENIX detector will have large-acceptance electromagnetic and hadronic calorimetry, as well as precision tracking and the high rate capability necessary for precision jet and ϒ observables. The experiment will enable a program of systematic measurements at RHIC, with a detector capable of acquiring a large sample of events in p+p, p+A, and A+A collisions. This proceedings outlines the key measurements enabled by the new detector and the status of the project itself.
Hints from Run 1 and prospects from Run 2 at ATLAS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernius, Catrin, E-mail: Catrin.Bernius@cern.ch
2016-06-21
The Large Hadron Collider at CERN has allowed the ATLAS experiment to collect a large amount of proton-proton collision data at 7 TeV and 8 TeV centre-of-mass energies throughout Run 1. This dataset was used to discover a Higgs boson with Standard Model-like properties at a mass of about 125 GeV. Furthermore, an impressive number of searches for deviations from the Standard Model expectations have been carried out. To date, no evidence for new physics beyond the SM has been found; however, a few hints in the form of 2-3 σ deviations have been observed. After an 18-month shutdown, in which the ATLAS detector underwent various upgrades, the LHC has again started to deliver collision data at an increased centre-of-mass energy of 13 TeV, providing much improved sensitivity for various searches, in particular for high-mass particles. Some representative hints from the LHC Run 1 are presented, a brief overview of the ATLAS upgrades is given, and prospects for SUSY searches with early Run 2 data are discussed.
The CMS electron and photon trigger for the LHC Run 2
NASA Astrophysics Data System (ADS)
Dezoort, Gage; Xia, Fan
2017-01-01
The CMS experiment implements a sophisticated two-level triggering system composed of the Level-1 trigger, instrumented by custom-designed hardware boards, and a software High-Level Trigger. A new Level-1 trigger architecture with improved performance is now being used to maintain the thresholds that were used in LHC Run 1 under the more challenging luminosity conditions experienced during Run 2. The upgrades to the calorimeter trigger will be described along with performance data. The algorithms for the selection of final states with electrons and photons, both for precision measurements and for searches for new physics beyond the Standard Model, will be described in detail.
NASA Astrophysics Data System (ADS)
Quast, Thorben
2018-02-01
As part of its HL-LHC upgrade program, CMS is developing a High Granularity Calorimeter (HGCAL) to replace the existing endcap calorimeters. The HGCAL will be realised as a sampling calorimeter, including an electromagnetic compartment comprising 28 layers of silicon pad detectors with pad areas of 0.5-1.0 cm^2 interspersed with absorbers. Prototype modules, based on 6-inch hexagonal silicon pad sensors with 128 channels, have been constructed and include many of the features required for this challenging detector. In 2016, beam tests of sampling configurations made from these modules were conducted both at FNAL and at CERN, using the Skiroc2 front-end ASIC (designed by the CALICE collaboration for the ILC). In 2017, the setup was extended with CALICE's AHCAL prototype, a scintillator-based sampling calorimeter, and further tested in dedicated beam tests at CERN, where the new Skiroc2-CMS front-end ASIC was used for the first time. We highlight final results from our 2016 studies, including position resolution as well as precision timing measurements. Furthermore, the extended 2017 setup is discussed and first results from beam tests with electrons and pions are shown.
Mechanical Design Studies of the MQXF Long Model Quadrupole for the HiLumi LHC
Pan, Heng; Anderssen, Eric; Ambrosio, Giorgio; ...
2016-12-20
The Large Hadron Collider luminosity upgrade (HiLumi) program requires new low-β triplet quadrupole magnets, called MQXF, in the Interaction Regions (IR) to increase the LHC peak and integrated luminosity. The MQXF magnets, designed and fabricated in collaboration between CERN and the U.S. LARP, will all have the same cross section. The MQXF long model, referred to as MQXFA, is a quadrupole using Nb3Sn superconducting technology with a 150 mm aperture and a 4.2 m magnetic length, and is the first long prototype of the final MQXF design. The MQXFA magnet is based on the previous LARP HQ and MQXFS designs. In this paper we present the baseline design of the MQXFA structure with a detailed 3D numerical analysis. A detailed tolerance analysis of the baseline case has been performed using a 3D finite element model, which allows fast computation of structures modelled with actual tolerances. The tolerance sensitivity of each component is discussed to verify the actual tolerances to be achieved by vendors. A tolerance stack-up analysis is presented at the end of this paper.
Real time analysis with the upgraded LHCb trigger in Run III
NASA Astrophysics Data System (ADS)
Szumlak, Tomasz
2017-10-01
The current LHCb trigger system consists of a hardware level, which reduces the LHC bunch-crossing rate of 40 MHz to 1.1 MHz, a rate at which the entire detector is read out, and a second level, implemented in a farm of around 20k parallel-processing CPUs, at which the event rate is reduced to around 12.5 kHz. The LHCb experiment plans a major upgrade of the detector and DAQ system in LHC Long Shutdown 2 (2018-2019). In this upgrade, a purely software-based trigger system is being developed, which will have to process the full 30 MHz rate of bunch crossings with inelastic collisions. LHCb will also receive a factor of 5 increase in the instantaneous luminosity, which further contributes to the challenge of reconstructing and selecting events in real time with the CPU farm. We discuss the plans and progress towards achieving efficient reconstruction and selection with a 30 MHz throughput. Another challenge is to exploit the increased signal rate that results from removing the 1.1 MHz readout bottleneck, combined with the higher instantaneous luminosity. Many charm hadron signals can be recorded at up to 50 times higher rates. LHCb is implementing a new paradigm in the form of real-time data analysis, in which abundant signals are recorded in a reduced event format that can be fed directly to the physics analyses. These data do not need any further offline event reconstruction, which allows a larger fraction of the grid computing resources to be devoted to Monte Carlo production. We discuss how this real-time analysis model is absolutely critical to the LHCb upgrade, and how it will evolve during Run 2.
5-year operation experience with the 1.8 K refrigeration units of the LHC cryogenic system
NASA Astrophysics Data System (ADS)
Ferlin, G.; Tavian, L.; Claudet, S.; Pezzetti, M.
2015-12-01
Since 2009, the Large Hadron Collider (LHC) has been in operation at CERN. The LHC superconducting magnets, distributed over eight sectors of 3.3-km length, are cooled at 1.9 K in pressurized superfluid helium. The nominal operating temperature of 1.9 K is produced by eight 1.8-K refrigeration units based on centrifugal cold compressors (3 or 4 stages depending on the vendor) combined with warm volumetric screw compressors with sub-atmospheric suction. After about 5 years of continuous operation, we present results concerning the availability of these refrigeration units for the final user and the impact of the design choice on the recovery time after a system trip. We also present individual results for each rotating machine in terms of failure origin and Mean Time Between Failures (MTBF), as well as the consolidations and upgrades applied to these refrigeration units.
A review of advances in pixel detectors for experiments with high rate and radiation
NASA Astrophysics Data System (ADS)
Garcia-Sciveres, Maurice; Wermes, Norbert
2018-06-01
The Large Hadron Collider (LHC) experiments ATLAS and CMS have established hybrid pixel detectors as the instrument of choice for particle tracking and vertexing in high-rate and high-radiation environments, as they operate close to the LHC interaction points. With the High Luminosity LHC upgrade now in sight, for which the tracking detectors will be completely replaced, new generations of pixel detectors are being devised. They have to address enormous challenges in terms of data throughput and radiation levels, ionizing and non-ionizing, that harm the sensing and readout parts of pixel detectors alike. Advances in microelectronics and microprocessing technologies now enable large-scale detector designs with unprecedented performance in measurement precision (space and time), radiation-hard sensors and readout chips, hybridization techniques, lightweight supports, and fully monolithic approaches to meet these challenges. This paper reviews the world-wide effort on these developments.
L1 track triggers for ATLAS in the HL-LHC
Lipeles, E.
2012-01-01
The HL-LHC, the planned high-luminosity upgrade of the LHC, will increase the collision rate in the ATLAS detector to approximately a factor of 5 beyond the luminosity for which the detectors were designed, while also increasing the number of pile-up collisions in each event by a similar factor. This means that the level-1 trigger must achieve a higher rejection factor in a more difficult environment. This presentation discusses the challenges that arise in this environment and the strategies being considered by ATLAS to include information from the tracking systems in the level-1 decision. The main challenge involves reducing the data volume exported from the tracking system, for which two options are under consideration: a region-of-interest based system and an intelligent-sensor method which filters on hits likely to come from higher transverse momentum tracks.
Performance of the Prototype Readout System for the CMS Endcap Hadron Calorimeter Upgrade
NASA Astrophysics Data System (ADS)
Chaverin, Nate; Dittmann, Jay; Hatakeyama, Kenichi; Pastika, Nathaniel; CMS Collaboration
2016-03-01
The Compact Muon Solenoid (CMS) experiment at the CERN Large Hadron Collider (LHC) will upgrade the photodetectors and readout systems of the endcap hadron calorimeter during the technical stop scheduled for late 2016 and early 2017. A major milestone for this project was a highly successful testbeam run at CERN in August 2015. The testbeam run served as a full integration test of the electronics, allowing a study of the response of the preproduction electronics to the true detector light profile, as well as a test of the light yield of various new plastic scintillator materials. We present implications for the performance of the hadron calorimeter front-end electronics based on testbeam data, and we report on the production status of various components of the system in preparation for the upgrade.
Test results of the LARP Nb3Sn quadrupole HQ03a
DiMarco, J.; Ambrosio, G.; Chlachidze, G.; ...
2016-03-09
The US LHC Accelerator Research Program (LARP) has been developing Nb3Sn quadrupoles of progressively increasing performance for the high luminosity upgrade of the Large Hadron Collider. The 120 mm aperture High-field Quadrupole (HQ) models are the last step in the R&D phase supporting the development of the new IR quadrupoles (MQXF). Three series of HQ coils were fabricated and assembled in a shell-based support structure, progressively optimizing the design and fabrication process. The final set of coils consistently applied the optimized design solutions and was assembled in the HQ03a model. This paper reports a summary of the HQ03a test results, including training, mechanical performance, field quality and quench studies.
NASA Astrophysics Data System (ADS)
Barbier, G.; Cadoux, F.; Clark, A.; Endo, M.; Favre, Y.; Ferrere, D.; Gonzalez-Sevilla, S.; Hanagaki, K.; Hara, K.; Iacobucci, G.; Ikegami, Y.; Jinnouchi, O.; La Marra, D.; Nakamura, K.; Nishimura, R.; Perrin, E.; Seez, W.; Takubo, Y.; Takashima, R.; Terada, S.; Todome, K.; Unno, Y.; Weber, M.
2014-04-01
It is expected that after several years of data-taking, the Large Hadron Collider (LHC) physics programme will be extended to the so-called High-Luminosity LHC, where the instantaneous luminosity will be increased up to 5 × 10^34 cm^-2 s^-1. For the general-purpose ATLAS experiment at the LHC, a complete replacement of its internal tracking detector will be necessary, as the existing detector will not provide the required performance due to the accumulated radiation damage and the increase in detector occupancy. The baseline layout for the new ATLAS tracker is an all-silicon-based detector, with pixel sensors in the inner layers and silicon micro-strip detectors at intermediate and outer radii. The super-module (SM) is an integration concept proposed for the barrel strip region of the future ATLAS tracker, in which double-sided stereo silicon micro-strip modules (DSM) are assembled into a low-mass local support (LS) structure. Mechanical aspects of the proposed LS structure are described.
Test beam studies of silicon timing for use in calorimetry
Apresyan, A.; Bolla, G.; Bornheim, A.; ...
2016-04-12
The high luminosity upgrade of the Large Hadron Collider (HL-LHC) at CERN is expected to provide instantaneous luminosities of 5 × 10^34 cm^-2 s^-1. The high luminosities expected at the HL-LHC will be accompanied by a factor of 5 to 10 more pileup compared with LHC conditions in 2015, causing general confusion for particle identification and event reconstruction. Precision timing allows calorimetric measurements to be extended into such a high-density environment by subtracting the energy deposits from pileup interactions. Calorimeters employing silicon as the active component have recently become a popular choice for the HL-LHC and future collider experiments which face very high radiation environments. In this article, we present studies of basic calorimetric and precision timing measurements using a prototype composed of a tungsten absorber and silicon sensors as the active medium. We show that for the bulk of electromagnetic showers induced by electrons in the range of 20 GeV to 30 GeV, we can achieve time resolutions better than 25 ps per single pad sensor.
New Fast Beam Conditions Monitoring (BCM1F) system for CMS
NASA Astrophysics Data System (ADS)
Zagozdzinska, A. A.; Bell, A. J.; Dabrowski, A. E.; Hempel, M.; Henschel, H. M.; Karacheban, O.; Przyborowski, D.; Leonard, J. L.; Penno, M.; Pozniak, K. T.; Miraglia, M.; Lange, W.; Lohmann, W.; Ryjov, V.; Lokhovitskiy, A.; Stickland, D.; Walsh, R.
2016-01-01
The CMS Beam Radiation Instrumentation and Luminosity (BRIL) project is composed of several systems providing protection of the experiment from adverse beam conditions while also measuring the online luminosity and beam background. Although the readout bandwidth of the Fast Beam Conditions Monitoring system (BCM1F), one of the faster monitoring systems of CMS BRIL, was sufficient for the initial LHC conditions, the foreseen enhancement of the beam parameters after the LHC Long Shutdown 1 (LS1) required an upgrade of the system. This paper presents the new BCM1F, which is designed to provide real-time fast diagnosis of beam conditions and instantaneous luminosity, with a readout able to resolve the 25 ns bunch structure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milic, A.
The high luminosities of L > 10^34 cm^-2 s^-1 at the Large Hadron Collider (LHC) at CERN produce an intense radiation environment that the detectors and their electronics must withstand. The ATLAS detector is a multi-purpose apparatus constructed to explore the new particle physics regime opened by the LHC. Of the many decay particles observed by the ATLAS detector, the energy of the created electrons and photons is measured by a sampling calorimeter technique that uses Liquid Argon (LAr) as its active medium. The front-end (FE) electronic readout of the ATLAS LAr calorimeter, located on the detector itself, consists of a combined analog and digital processing system. In order to exploit the higher luminosity while keeping the same trigger bandwidth of 100 kHz, higher transverse granularity, higher resolution and longitudinal shower-shape information will be provided from the LAr calorimeter to the Level-1 trigger processors. New trigger readout electronics have been designed for this purpose, which will withstand the radiation dose levels expected for an integrated luminosity of 3000 fb^-1 during the high-luminosity LHC (HL-LHC), well above the original LHC design qualifications.
Investigation of thin n-in-p planar pixel modules for the ATLAS upgrade
NASA Astrophysics Data System (ADS)
Savic, N.; Beyer, J.; La Rosa, A.; Macchiolo, A.; Nisius, R.
2016-12-01
In view of the High Luminosity upgrade of the Large Hadron Collider (HL-LHC), planned to start around 2023-2025, the ATLAS experiment will undergo a replacement of its Inner Detector. A higher luminosity implies higher irradiation levels and hence demands more radiation hardness, especially in the inner layers of the pixel system. The n-in-p silicon technology is a promising candidate to instrument this region, thanks also to its cost-effectiveness, because it requires only single-sided processing, in contrast to the n-in-n pixel technology presently employed in the LHC experiments. In addition, thin sensors were found to ensure radiation hardness at high fluences. An overview is given of recent results obtained with non-irradiated and irradiated n-in-p planar pixel modules. The focus is on n-in-p planar pixel sensors with an active thickness of 100 and 150 μm, recently produced at ADVACAM. To maximize the active area of the sensors, slim and active edges are implemented. The performance of these modules is investigated in beam tests, and results on edge efficiency are shown.
L1 track trigger for the CMS HL-LHC upgrade using AM chips and FPGAs
NASA Astrophysics Data System (ADS)
Fedi, Giacomo
2017-08-01
The increase of luminosity at the HL-LHC will require the introduction of tracker information in CMS's Level-1 trigger system to maintain an acceptable trigger rate when selecting interesting events, despite the order-of-magnitude increase in minimum bias interactions. To meet the latency requirements, dedicated hardware has to be used. This paper presents the results of tests of a prototype system (pattern recognition mezzanine) as the core of pattern recognition and track fitting for the CMS experiment, combining the power of both associative memory custom ASICs and modern Field Programmable Gate Array (FPGA) devices. The mezzanine uses the latest available associative memory devices (AM06) and the most modern Xilinx UltraScale FPGAs. The results of the test for a complete tower comprising about 0.5 million patterns are presented, using simulated events traversing the upgraded CMS detector as input. The paper shows the performance of the pattern matching, track finding and track fitting, along with the latency and processing time needed. The pT resolution over pT of the muons measured using the reconstruction algorithm is of the order of 1% in the range 3-100 GeV/c.
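As an illustration of the associative-memory principle used here, the toy Python sketch below coarsens hits into "superstrip" addresses per layer and matches them against a small pattern bank; matched roads would then be passed to the FPGA track fit. The binning, bank contents and layer count are invented.

    # Toy associative-memory style pattern matching: hits are coarsened into
    # superstrip addresses per layer and compared against a precomputed bank.

    SUPERSTRIP_WIDTH = 8   # hypothetical coarse binning of strip addresses

    def to_superstrips(hits_per_layer):
        """Map fine strip addresses to coarse superstrip addresses, layer by layer."""
        return [{hit // SUPERSTRIP_WIDTH for hit in hits} for hits in hits_per_layer]

    def match_patterns(pattern_bank, superstrips, min_layers=5):
        """Return patterns whose superstrip is present in at least min_layers layers."""
        roads = []
        for pattern in pattern_bank:
            n_matched = sum(1 for layer, ss in enumerate(pattern) if ss in superstrips[layer])
            if n_matched >= min_layers:
                roads.append(pattern)
        return roads

    # toy bank of two 6-layer patterns and one event's hits (fine strip addresses)
    bank = [(3, 3, 4, 4, 5, 5), (10, 11, 11, 12, 12, 13)]
    event_hits = [[25, 26], [27], [33, 90], [36], [41], [44, 100]]
    print("matched roads:", match_patterns(bank, to_superstrips(event_hits)))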
The upgraded ATLAS and CMS detectors and their physics capabilities.
Wells, Pippa S
2015-01-13
The update of the European Strategy for Particle Physics from 2013 states that Europe's top priority should be the exploitation of the full potential of the LHC, including the high-luminosity upgrade of the machine and detectors with a view to collecting 10 times more data than in the initial design. The plans for upgrading the ATLAS and CMS detectors so as to maintain their performance and meet the challenges of increasing luminosity are presented here. A cornerstone of the physics programme is to measure the properties of the 125 GeV Higgs boson with the highest possible precision, to test its consistency with the Standard Model. The high-luminosity data will allow precise measurements of the dominant production and decay modes, and offer the possibility of observing rare modes including Higgs boson pair production. Direct and indirect searches for additional Higgs bosons beyond the Standard Model will also continue.
Development of a timing detector for the TOTEM experiment at the LHC
NASA Astrophysics Data System (ADS)
Minafra, Nicola
2017-09-01
The upgrade program of the TOTEM experiment will include the installation of timing detectors inside vertical Roman Pots to allow the reconstruction of the longitudinal vertex position in the presence of event pile-up in dedicated high-β* runs. The small space available inside the Roman Pot, optimized for high-intensity LHC runs, and the required time precision led to the study of a solution using single-crystal CVD diamonds. The sensors are read out using fast low-noise front-end electronics developed by the TOTEM Collaboration, achieving a signal-to-noise ratio larger than 20 for MIPs. A prototype was designed, manufactured and tested during a test beam campaign, proving a time precision below 100 ps and an efficiency above 99%. The geometry of the detector has been designed to guarantee uniform occupancy in the expected running conditions while keeping the number of channels below 12. The read-out electronics was developed during an extensive campaign of beam tests dedicated first to the characterization of existing solutions and then to the optimization of the electronics designed within the Collaboration. The detectors were designed to be read out using the SAMPIC chip, a fast sampler designed specifically for picosecond timing measurements with high-rate capability; later, a modified version was realized using the HPTDC to achieve the higher trigger rates required by the CT-PPS experiment. The first set of prototypes was successfully installed and tested in the LHC in November 2015; moreover, the detectors modified for CT-PPS have been part of global CMS data taking since October 2016.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larbalestier, David C.; Lee, Peter J.; Tarantini, Chiara
All present circular accelerators use superconducting magnets to bend and to focus the particle beams. The most powerful of these machines is the Large Hadron Collider (LHC) at CERN. The main ring dipole magnets of the LHC are made from Nb-Ti but, as the machine is upgraded to higher luminosity, more powerful magnets made of Nb3Sn will be required. Our work addresses how to make Nb3Sn conductors more effective and more suitable for use in the LHC. The most important property of the superconducting conductor used for an accelerator magnet is that it must have a very high critical current density, the property that allows the generation of high magnetic fields in small spaces. Nb3Sn is the original high-field superconductor, the material which was discovered in 1960 to allow a high current density in fields of about 9 T. For the high-luminosity upgrade of the LHC, much higher current densities in fields of about 12 T will be required. The critical value of the current density is of order 2600 A/mm^2 in a field of 12 T. But there are very important secondary factors that complicate the attainment of this critical current density. The first is that the effective filament diameter must be no larger than about 40 µm. The second is that the 50% of the cross-section of the Nb3Sn conductor that is pure copper must be protected from poisoning by any Sn leakage through the diffusion barrier that protects the package of niobium and tin from which the Nb3Sn is formed by a high-temperature reaction. These three somewhat conflicting requirements mean that optimization of the conductor is complex. The work described in this contract report addresses these conflicting requirements. It shows that very sophisticated characterizations can uncover the way to satisfy all three requirements, and it also suggests that the ultimate optimization of Nb3Sn is still not in sight.
Characterisation of novel thin n-in-p planar pixel modules for the ATLAS Inner Tracker upgrade
NASA Astrophysics Data System (ADS)
Beyer, J.-C.; La Rosa, A.; Macchiolo, A.; Nisius, R.; Savic, N.; Taibah, R.
2018-01-01
In view of the high-luminosity phase of the LHC (HL-LHC), due to start operation around 2026, a major upgrade of the tracker system for the ATLAS experiment is in preparation. The expected neutron-equivalent fluence of up to 2.4 × 10^16 1-MeV n_eq/cm^2 at the innermost layer of the pixel detector poses the most severe challenge. Thanks to their low material budget and high charge collection efficiency after irradiation, modules made of thin planar pixel sensors are promising candidates to instrument these layers. To optimise the sensor layout for the decreased pixel cell size of 50 × 50 μm^2, TCAD device simulations are being performed to investigate the charge collection efficiency before and after irradiation. In addition, sensors of 100-150 μm thickness, interconnected to FE-I4 read-out chips featuring the previous-generation pixel cell size of 50 × 250 μm^2, are characterised in testbeams at the CERN SPS and DESY facilities. The performance of sensors with various designs, irradiated up to a fluence of 1 × 10^16 n_eq/cm^2, is compared in terms of charge collection and hit efficiency. A replacement of the two innermost pixel layers is foreseen during the lifetime of the HL-LHC. The replacement will require several months of intervention, during which the remaining detector modules cannot be cooled. They will be kept at room temperature, thus undergoing annealing. The performance of irradiated modules will be investigated with testbeam campaigns and with the method of accelerated annealing at higher temperatures.
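The accelerated-annealing equivalence mentioned above is commonly estimated with an Arrhenius-type time-temperature scaling; the sketch below uses an assumed activation energy of 1.3 eV, a typical order of magnitude for silicon bulk-damage annealing, and is not a parameter taken from this study.

    # Arrhenius scaling between annealing at room temperature and at an elevated
    # temperature. The activation energy is an assumed, typical value.
    import math

    K_BOLTZMANN_EV = 8.617e-5   # eV/K
    E_ACTIVATION_EV = 1.3       # assumed activation energy

    def equivalent_time(t_ref_days, temp_ref_c, temp_acc_c):
        """Time at temp_acc_c equivalent to t_ref_days at temp_ref_c (Arrhenius)."""
        t_ref_k, t_acc_k = temp_ref_c + 273.15, temp_acc_c + 273.15
        factor = math.exp(E_ACTIVATION_EV / K_BOLTZMANN_EV * (1.0 / t_acc_k - 1.0 / t_ref_k))
        return t_ref_days * factor

    # e.g. several months at ~20 C compressed into a short lab annealing run at 60 C
    print(f"{equivalent_time(120, 20, 60):.1f} days at 60 C ~ 120 days at 20 C")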
Design and implementation of a crystal collimation test stand at the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Mirarchi, D.; Hall, G.; Redaelli, S.; Scandale, W.
2017-06-01
Future upgrades of the CERN Large Hadron Collider (LHC) demand improved cleaning performance of its collimation system. Very efficient collimation is required during regular operation at high intensities, because even a small amount of energy deposited on superconducting magnets can cause an abrupt loss of superconducting conditions (a quench). The possibility of using a crystal-based collimation system represents an option for improving both cleaning performance and impedance compared to the present system. Before relying on crystal collimation for the LHC, a demonstration under LHC conditions (energy, beam parameters, etc.) and a comparison against the present system are considered mandatory. Thus, a prototype crystal collimation system was designed and installed in the LHC during Long Shutdown 1 (LS1), to perform feasibility tests during Run 2 at energies up to 6.5 TeV. The layout is suitable for operation with proton as well as heavy-ion beams. In this paper, the design constraints and the solutions proposed for this test stand for the feasibility demonstration of crystal collimation at the LHC are presented. The expected cleaning performance achievable with this test stand, as assessed in simulations, is presented and compared to that of the present LHC collimation system. The first experimental observation of crystal channeling in the LHC at the record beam energy of 6.5 TeV was obtained in 2015 using the layout presented (Scandale et al., Phys. Lett. B 758:129, 2016). First tests to measure the cleaning performance of this test stand were carried out in 2016 and the detailed data analysis is still ongoing.
The phase 1 upgrade of the CMS Pixel Front-End Driver
NASA Astrophysics Data System (ADS)
Friedl, M.; Pernicka, M.; Steininger, H.
2010-12-01
The pixel detector of the CMS experiment at the LHC is read out by analog optical links, sending the data to 9U VME Front-End Driver (FED) boards located in the electronics cavern. There are plans for the phase 1 upgrade of the pixel detector (2016) to add one more layer, while significantly cutting down the overall material budget. At the same time, the optical data transmission will be replaced by a serialized digital scheme. A plug-in board solution with a high-speed digital optical receiver has been developed for the Pixel-FED readout boards and will be presented along with first tests of the future optical link.
NASA Astrophysics Data System (ADS)
Dumont Dayot, Nicolas
2012-01-01
In the context of the LHC upgrade, we develop a new Read Out Driver (ROD) for the ATLAS Liquid Argon (LAr) community. ATCA and μTCA (Advanced/Micro Telecom Computing Architecture) are becoming standards in high energy physics and strong candidates for boards and crates. Our work aims to master ATCA and to integrate a large number of high-speed links (96 links at 8.5 Gbps) on a ROD evaluation ATCA board. A versatile IPMI controller for ATCA boards, compliant with the FPGA Mezzanine Card (FMC) standard, has been developed to control the ROD evaluation board.
A Roadmap for HEP Software and Computing R&D for the 2020s
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alves, Antonio Augusto, Jr; et al.
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer volume of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
A programming framework for data streaming on the Xeon Phi
NASA Astrophysics Data System (ADS)
Chapeland, S.;
2017-10-01
ALICE (A Large Ion Collider Experiment) is the dedicated heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). After the second long shut-down of the LHC, the ALICE detector will be upgraded to cope with an interaction rate of 50 kHz in Pb-Pb collisions, producing in the online computing system (O2) a sustained throughput of 3.4 TB/s. This data will be processed on the fly so that the stream to permanent storage does not exceed 90 GB/s peak, the raw data being discarded. In the context of assessing different computing platforms for the O2 system, we have developed a framework for the Intel Xeon Phi processors (MIC). It provides the components to build a processing pipeline streaming the data from the PC memory to a pool of permanent threads running on the MIC, and back to the host after processing. It is based on explicit offloading mechanisms (data transfer, asynchronous tasks) and basic building blocks (FIFOs, memory pools, C++11 threads). The user only needs to implement the processing method to be run on the MIC. We present in this paper the architecture, implementation, and performance of this system.
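The offload pattern described above (FIFOs feeding a pool of permanent worker threads backed by a reusable buffer pool, with results streamed back to the host) can be illustrated with a small host-only sketch. This is not the ALICE O2 framework or Intel Xeon Phi offload code; the queue depths, buffer size and the process() body are hypothetical placeholders.

```python
# Minimal sketch of a streaming pipeline: a memory pool of reusable buffers,
# a FIFO towards the workers, permanent worker threads, and a FIFO back to the
# host. All sizes are illustration values.
import queue
import threading

N_WORKERS = 4
N_BUFFERS = 8
BUFFER_SIZE = 1024          # hypothetical payload size in bytes

free_buffers = queue.Queue()   # memory pool: reusable buffers
to_device = queue.Queue()      # FIFO: host -> workers
to_host = queue.Queue()        # FIFO: workers -> host

for _ in range(N_BUFFERS):
    free_buffers.put(bytearray(BUFFER_SIZE))

def process(buf):
    # Placeholder for the user-supplied processing method.
    return sum(buf) % 256

def worker():
    while True:
        buf = to_device.get()
        if buf is None:            # sentinel: shut the worker down
            break
        to_host.put(process(buf))
        free_buffers.put(buf)      # return the buffer to the pool

threads = [threading.Thread(target=worker) for _ in range(N_WORKERS)]
for t in threads:
    t.start()

# Host side: stream 100 "events" through the pipeline.
for i in range(100):
    buf = free_buffers.get()       # blocks if the pool is exhausted
    buf[:] = bytes([i % 256]) * BUFFER_SIZE
    to_device.put(buf)

for _ in threads:
    to_device.put(None)
for t in threads:
    t.join()

print("processed results:", to_host.qsize())
```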
HVMUX, a high voltage multiplexing for the ATLAS Tracker upgrade
NASA Astrophysics Data System (ADS)
Giulio Villani, E.; Phillips, P.; Matheson, J.; Zhang, Z.; Lynn, D.; Kuczewski, P.; Hommels, L. B. A.; Gregor, I.; Bessner, M.; Tackmann, K.; Newcomer, F. M.; Spencer, E.; Greenall, A.
2017-01-01
The HV biasing solution adopted in the current ATLAS detector uses one HV conductor for each sensor. This approach easily allows disabling of malfunctioning sensors without affecting the others, but space constraints and material budget considerations render it impractical for the upgraded detector. In fact, the increased luminosity of the upgraded LHC will require more channels in the upgraded ATLAS Tracker, as a result of the finer detector segmentation. Different approaches to bringing the HV biasing to the detectors, including the use of a shared HV line to bias several sensors and employing semiconductor switches for the HV routing (HVMUX), have been investigated. Besides the size constraints, particular attention must be paid to the radiation tolerance of any proposed solution, which, for the strips detector, requires proper operation up to fluences of the order of 2×10^15 1 MeV neq/cm^2 and a TID in excess of 300 kGy. In this paper, a description of the proposed HVMUX solution, along with electrical and radiation test results, will be presented and discussed.
CO2 evaporative cooling: The future for tracking detector thermal management
NASA Astrophysics Data System (ADS)
Tropea, P.; Daguin, J.; Petagna, P.; Postema, H.; Verlaat, B.; Zwalinski, L.
2016-07-01
In the last few years, CO2 evaporative cooling has been one of the favourite technologies chosen for the thermal management of tracking detectors at the LHC. The ATLAS Insertable B-Layer and the CMS Pixel phase 1 upgrade have adopted it and their systems are now operational or under commissioning. The CERN PH-DT team is now merging the lessons learnt on these two systems in order to prepare the design and construction of the cooling systems for the new Upstream Tracker and the VELO upgrade in LHCb, due by 2018. Meanwhile, the preliminary design of the ATLAS and CMS full tracker upgrades has started, and both concepts heavily rely on CO2 evaporative cooling. This paper highlights the performance of the systems now in operation and the challenges to overcome in order to scale them up to the requirements of the future generations of trackers. In particular, it focuses on the conceptual design of a new cooling system suited for the large phase 2 upgrade programmes, which will be validated with the construction of a common prototype in the coming years.
Preliminary Mechanical Design Study of the Hollow Electron Lens for HL-LHC
NASA Astrophysics Data System (ADS)
Zanoni, Carlo; Gobbi, Giorgia; Perini, Diego; Stancari, Giulio
2017-07-01
A Hollow Electron Lens (HEL) has been proposed in order to improve performance of halo control and collimation in the Large Hadron Collider in view of its High Luminosity upgrade (HL-LHC). The concept is based on a hollow beam of electrons that travels around the protons for a few meters. The electron beam is produced by a cathode and then guided by a strong magnetic field. The first step of the design is the definition of the magnetic field that drives the electron trajectories. The estimation of such trajectories by means of a dedicated MATLAB tool is presented. The influence of the main geometrical and electrical parameters is analyzed and discussed. Then, the main mechanical design choices for the solenoids, cryostats gun and collector are described. The aim of this paper is to provide an overview of the feasibility study of the Electron Lens for LHC. The methods used in this study also serve as examples for future mechanical and integration designs of similar devices.
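The trajectory estimation mentioned above (electrons guided by a strong solenoidal field) can be sketched numerically with a standard charged-particle integrator. The sketch below is not the dedicated MATLAB tool of the paper; it applies the generic Boris pusher to a single electron in a uniform solenoid field, with field strength, energy and step size chosen purely for illustration.

```python
# Boris-pusher sketch: one electron in a uniform 5 T solenoid field (assumed),
# starting 5 mm off axis with mostly longitudinal velocity (~10 keV).
import numpy as np

Q_E = -1.602176634e-19   # electron charge [C]
M_E = 9.1093837015e-31   # electron mass [kg]

def boris_step(x, v, E, B, dt, q=Q_E, m=M_E):
    """Advance position x and velocity v by one time step dt (non-relativistic)."""
    qmdt2 = q * dt / (2.0 * m)
    v_minus = v + qmdt2 * E                  # half electric kick
    t = qmdt2 * B                            # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)  # magnetic rotation
    v_new = v_plus + qmdt2 * E               # second half electric kick
    return x + v_new * dt, v_new

E_field = np.zeros(3)
B_field = np.array([0.0, 0.0, 5.0])          # solenoid field along z (assumed)
x = np.array([5.0e-3, 0.0, 0.0])             # start 5 mm off axis
v = np.array([1.0e6, 0.0, 5.9e7])            # m/s, mostly longitudinal

dt = 1.0e-13
for _ in range(2000):
    x, v = boris_step(x, v, E_field, B_field, dt)

r = np.hypot(x[0], x[1])
print(f"final radius = {r * 1e3:.3f} mm")    # stays close to 5 mm (small Larmor radius)
```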
Superconducting Magnet Technology for the Upgrade
NASA Astrophysics Data System (ADS)
Todesco, E.; Ambrosio, G.; Ferracin, P.; Rifflet, J. M.; Sabbi, G. L.; Segreti, M.; Nakamoto, T.; van Weelderen, R.; Xu, Q.
In this section we present the magnet technology for the High Luminosity LHC. After a short review of the project targets and constraints, we discuss the main guidelines used to determine the technology, the field/gradients, the operational margins, and the choice of the current density for each type of magnet. Then we discuss the peculiar aspects of each class of magnet, with special emphasis on the triplet.
Resistive-strips micromegas detectors with two-dimensional readout
NASA Astrophysics Data System (ADS)
Byszewski, M.; Wotschack, J.
2012-02-01
Micromegas detectors show very good performance for charged particle tracking in high-rate environments such as at the LHC. It is shown that two coordinates can be extracted from a single gas gap in these detectors. Several micromegas chambers with spark protection by resistive strips and two-dimensional readout have been tested in the context of the R&D work for the ATLAS Muon System upgrade.
Progress on the upgrade of the CMS Hadron Calorimeter Front-End electronics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Jake; Whitmore, Juliana; /Fermilab
2011-11-01
We present a scheme to upgrade the CMS HCAL front-end electronics in the second long shutdown of the LHC (LS2), which is expected to occur around 2018. The HCAL electronics upgrade is required to handle the major instantaneous luminosity increase (up to 5×10^34 cm^-2 s^-1) and an expected integrated luminosity of ~3000 fb^-1. A key aspect of the HCAL upgrade is to read out longitudinal segmentation information to improve background rejection, energy resolution, and electron isolation at the L1 trigger. This paper focuses on the requirements for the new electronics and on the proposed solutions. The requirements include increased channel count, additional timing capabilities, and additional redundancy. The electronics are required to operate in a harsh environment and are constrained by the existing infrastructure. The proposed solutions span from chip level to system level. They include the development of a new ASIC ADC, the design and testing of higher speed transmitters to handle the increased data volume, the evaluation and use of circuits from other developments, evaluation of commercial FPGAs, better thermal design, and improvements in the overall readout architecture. We will report on the progress of the designs for these upgraded systems, along with performance requirements and initial design studies.
Upgrade plans for the ATLAS Forward Calorimeter at the HL-LHC
NASA Astrophysics Data System (ADS)
Rutherfoord, John; ATLAS Liquid Argon Calorimeter Group
2012-12-01
Although data-taking at CERN's Large Hadron Collider (LHC) is expected to continue for a number of years, plans are already being developed for operation of the LHC and associated detectors at an increased instantaneous luminosity about 5 times the original design value of 10^34 cm^-2 s^-1. The increased particle flux at this high luminosity (HL) will have an impact on many sub-systems of the ATLAS detector. In particular, in the liquid argon forward calorimeter (FCal), which was designed for operation at LHC luminosities, the associated increase in the ionization load at HL-LHC luminosities creates a number of problems which can degrade its performance. These include space-charge effects in the liquid argon gaps, excessive drop in potential across the gaps due to large HV supply currents through the protection resistors, and an increase in temperature which may cause the liquid argon to boil. One solution, which would require opening both End-Cap cryostats, is the construction and installation of new FCals with narrower liquid argon gaps, lowering the values of the protection resistors, and the addition of cooling loops. A second proposed solution, which does not require opening the cryostat cold volume, is the addition of a small, warm calorimeter in front of each existing FCal, resulting in a reduction of the particle flux to levels at which the existing FCal can operate normally.
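The protection-resistor effect mentioned above is simple Ohm's-law arithmetic: the ionisation current drawn through the series resistor reduces the potential actually applied across the liquid-argon gap. The sketch below illustrates the scaling only; the resistor value, supply voltage and currents are assumed for illustration and are not FCal parameters.

```python
# Back-of-the-envelope sketch: gap voltage = supply voltage - I * R_protect.
R_PROTECT = 1.0e6        # series protection resistor [ohm] (assumed)
V_SUPPLY = 250.0         # nominal HV supply voltage [V] (assumed)

def gap_voltage(i_gap_amps, r_protect=R_PROTECT, v_supply=V_SUPPLY):
    """Potential remaining across the gap for a given ionisation current."""
    return v_supply - i_gap_amps * r_protect

for current_uA in (1, 10, 50, 100):      # hypothetical HV supply currents
    v = gap_voltage(current_uA * 1e-6)
    print(f"I = {current_uA:4d} uA -> gap voltage = {v:6.1f} V "
          f"({100 * v / V_SUPPLY:.0f}% of nominal)")
```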
NASA Astrophysics Data System (ADS)
Riegel, C.; Backhaus, M.; Van Hoorne, J. W.; Kugathasan, T.; Musa, L.; Pernegger, H.; Riedler, P.; Schaefer, D.; Snoeys, W.; Wagner, W.
2017-01-01
A part of the upcoming HL-LHC upgrade of the ATLAS Detector is the construction of a new Inner Tracker. This upgrade opens new possibilities, but also presents challenges in terms of occupancy and radiation tolerance. For the pixel detector inside the inner tracker, hybrid modules containing passive silicon sensors and connected readout chips are presently used, but require expensive assembly techniques like fine-pitch bump bonding. Silicon devices fabricated in standard commercial CMOS technologies, which include part or all of the readout chain, are also investigated, offering a reduced cost as they are cheaper per unit area than traditional silicon detectors. If they contain the full readout chain, as for a fully monolithic approach, there is no need for the expensive flip-chip assembly, resulting in a further cost reduction and material savings. In the outer pixel layers of the ATLAS Inner Tracker, the pixel sensors must withstand non-ionising energy losses of up to 10^15 n/cm^2 and offer a timing resolution of 25 ns or less. This paper presents test results obtained on a monolithic test chip, the TowerJazz 180 nm Investigator, towards these specifications. The presented program of radiation hardness and timing studies has been launched to investigate this technology's potential for the new ATLAS Inner Tracker.
Lattice QCD and physics beyond the Standard Model: an experimentalist perspective
NASA Astrophysics Data System (ADS)
Artuso, Marina
2017-01-01
The new frontier in elementary particle physics is to find evidence for new physics that may lead to a deeper understanding of observations such as the baryon-antibaryon asymmetry of the universe, the mass hierarchy, dark matter, or dark energy, to name a few. Flavor physics provides a wealth of opportunities to find such signatures, and a vast body of data taken at e+e- B-factories and at hadron machines has provided valuable information, and a few tantalizing "tensions" with respect to the Standard Model predictions. While the window for new physics is still open, the chance that its manifestations will be subtle is very real. A vibrant experimental program is ongoing, and significant upgrades, such as the upgraded LHCb experiment at the LHC and Belle II at KEKB, are imminent. One of the challenges in extracting new physics from flavor physics data is the need to relate observed hadron decays to fundamental particles and interactions. The continuous improvement of Lattice QCD predictions is a key element to achieve success in this quest. Improvements in algorithms and hardware have led to predictions of increasing precision on several fundamental matrix elements, and to the continuous breaking of new ground, thus allowing a broader spectrum of measurements to become relevant to this quest. An important aspect of the experiment-lattice synergy is a comparison between lattice predictions and experiment for a variety of hadronic quantities. This talk summarizes current synergies between lattice QCD theory and flavor physics experiments, and gives some highlights of expectations from future upgrades. This work was supported by the NSF.
76 FR 23795 - Low-Power Television and Translator Upgrade Program: Notice of Final Closing Date
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-28
[Docket No. 110418247-1247-01] Low-Power Television and Translator Upgrade Program: Notice of Final Closing Date. Notice announcing the final closing date for receipt of applications for the Low-Power Television and Translator Upgrade Program (Upgrade Program), under the rules to establish digital low power television, television translator, and television booster stations.
GPU-accelerated track reconstruction in the ALICE High Level Trigger
NASA Astrophysics Data System (ADS)
Rohr, David; Gorbunov, Sergey; Lindenstruth, Volker;
2017-10-01
ALICE (A Large Ion Collider Experiment) is one of the four major experiments at the Large Hadron Collider (LHC) at CERN. The High Level Trigger (HLT) is an online compute farm which reconstructs events measured by the ALICE detector in real time. The most compute-intensive part is the reconstruction of particle trajectories, called tracking, and the most important detector for tracking is the Time Projection Chamber (TPC). The HLT uses a GPU-accelerated algorithm for TPC tracking that is based on the Cellular Automaton principle and on the Kalman filter. The GPU tracking has been running in 24/7 operation since 2012 in LHC Run 1 and 2. In order to better leverage the potential of the GPUs, and to speed up the overall HLT reconstruction, we plan to bring more reconstruction steps (e.g. the tracking for other detectors) onto the GPUs. Several tasks currently running on the CPU could benefit from cooperation with the tracking, which is hardly feasible at the moment due to the latency of the PCI Express transfers. Moving more steps onto the GPU, and processing them on the GPU at once, will reduce PCI Express transfers and free up CPU resources. On top of that, modern GPUs and GPU programming APIs provide new features which are not yet exploited by the TPC tracking. We present our new developments for GPU reconstruction, with a focus on the online reconstruction on GPU for the online-offline computing upgrade in ALICE during LHC Run 3, and also taking into account how the current HLT in Run 2 can profit from these improvements.
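The Kalman-filter building block mentioned above can be illustrated with the textbook formulation for a straight-line track in one projection (state = position and slope). This is a generic sketch, not the ALICE HLT implementation; the measurement noise, layer spacing and simulated track are hypothetical.

```python
# Kalman filter for a straight-line track: propagate the state by dz along the
# beam axis, then update with the measured hit position at that layer.
import numpy as np

def kf_step(x, P, z, dz, sigma_meas):
    F = np.array([[1.0, dz],
                  [0.0, 1.0]])               # straight-line transport
    H = np.array([[1.0, 0.0]])               # only the position is measured
    R = np.array([[sigma_meas ** 2]])

    x_pred = F @ x                            # prediction
    P_pred = F @ P @ F.T
    y = z - H @ x_pred                        # residual
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + (K @ y).ravel()
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

rng = np.random.default_rng(0)
true_pos, true_slope, sigma = 0.1, 0.02, 0.05
hits = [true_pos + true_slope * (i + 1) * 10.0 + rng.normal(0, sigma)
        for i in range(10)]                   # 10 layers, 10 cm apart (toy numbers)

x = np.array([0.0, 0.0])                      # initial guess: on axis, no slope
P = np.eye(2)                                 # loose initial covariance
for hit in hits:
    x, P = kf_step(x, P, hit, 10.0, sigma)

print(f"fitted slope = {x[1]:.4f} (true {true_slope})")
```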
Suppression of Higher Order Modes in an Array of Cavities Using Waveguides
NASA Astrophysics Data System (ADS)
Shashkov, Ya. V.; Sobenin, N. P.; Bazyl, D. S.; Kaminskiy, V. I.; Mitrofanov, A. A.; Zobov, M. M.
An application of additional harmonic cavities operating at multiples of the main RF system frequency of 400 MHz is currently under discussion in the framework of the High Luminosity LHC upgrade program [1,2]. A structure consisting of two 800 MHz single-cell superconducting cavities with grooved beam pipes coupled by drift tubes has been suggested for implementation. However, it is desirable to increase the number of single cells installed in one cryomodule in order to decrease the number of transitions between "warm" and "cold" parts of the collider vacuum chamber. Unfortunately, this can lead to the appearance of higher order modes (HOM) trapped between the cavities. In order to solve this problem, methods of HOM damping with rectangular waveguides connected to the drift tubes were investigated and compared. We describe the results obtained for arrays of 2, 4 and 8 cavities in this paper.
Working Group Report: Higgs Boson
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawson, Sally; Gritsan, Andrei; Logan, Heather
2013-10-30
This report summarizes the work of the Energy Frontier Higgs Boson working group of the 2013 Community Summer Study (Snowmass). We identify the key elements of a precision Higgs physics program and document the physics potential of future experimental facilities as elucidated during the Snowmass study. We study Higgs couplings to gauge boson and fermion pairs, double Higgs production for the Higgs self-coupling, its quantum numbers and CP-mixing in Higgs couplings, the Higgs mass and total width, and prospects for direct searches for additional Higgs bosons in extensions of the Standard Model. Our report includes projections of measurement capabilities from detailed studies of the Compact Linear Collider (CLIC), a Gamma-Gamma Collider, the International Linear Collider (ILC), the Large Hadron Collider High-Luminosity Upgrade (HL-LHC), Very Large Hadron Colliders up to 100 TeV (VLHC), a Muon Collider, and a Triple-Large Electron Positron Collider (TLEP).
Airborne Warning and Control System Block 40/45 Upgrade (AWACS Blk 40/45 Upgrade)
2015-12-01
Selected Acquisition Report (SAR), RCS: DD-A&T(Q&A)823-277, for the Airborne Warning and Control System Block 40/45 Upgrade (AWACS Blk 40/45 Upgrade), as of December 2015; published March 23, 2016.
MAPS development for the ALICE ITS upgrade
NASA Astrophysics Data System (ADS)
Yang, P.; Aglieri, G.; Cavicchioli, C.; Chalmet, P. L.; Chanlek, N.; Collu, A.; Gao, C.; Hillemanns, H.; Junique, A.; Kofarago, M.; Keil, M.; Kugathasan, T.; Kim, D.; Kim, J.; Lattuca, A.; Marin Tobon, C. A.; Marras, D.; Mager, M.; Martinengo, P.; Mazza, G.; Mugnier, H.; Musa, L.; Puggioni, C.; Rousset, J.; Reidt, F.; Riedler, P.; Snoeys, W.; Siddhanta, S.; Usai, G.; van Hoorne, J. W.; Yi, J.
2015-03-01
Monolithic Active Pixel Sensors (MAPS) offer the possibility to build pixel detectors and tracking layers with high spatial resolution and low material budget in commercial CMOS processes. Significant progress has been made in the field of MAPS in recent years, and they are now considered for the upgrades of the LHC experiments. This contribution will focus on MAPS detectors developed for the ALICE Inner Tracking System (ITS) upgrade and manufactured in the TowerJazz 180 nm CMOS imaging sensor process on wafers with a high resistivity epitaxial layer. Several sensor chip prototypes have been developed and produced to optimise both charge collection and readout circuitry. The chips have been characterised using electrical measurements, radioactive sources and particle beams. The tests indicate that the sensors satisfy the ALICE requirements and first prototypes with the final size of 1.5 × 3 cm^2 have been produced in the first half of 2014. This contribution summarises the characterisation measurements and presents first results from the full-scale chips.
NASA Astrophysics Data System (ADS)
Webster, Jordan
2017-01-01
Dense track environments in pp collisions at the Large Hadron Collider (LHC) motivate the use of triggers with dedicated hardware for fast track reconstruction. The ATLAS Collaboration is in the process of implementing a Fast Tracker (FTK) trigger upgrade, in which Content Addressable Memories (CAMs) will be used to rapidly match hit patterns with large banks of simulated tracks. The FTK CAMs are produced primarily at the University of Pisa. However, commercial CAM technology is rapidly developing due to applications in computer networking devices. This poster presents new studies comparing FTK CAMs to cutting-edge ternary CAMs developed by Cavium. The comparison is intended to guide the design of future track-based trigger systems for the next Phase at the LHC.
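The ternary pattern matching that such CAMs perform in hardware can be sketched in software: each stored pattern is a (value, mask) pair per detector layer, and masked bits act as "don't care". The pattern bank and hit words below are hypothetical toy values, not FTK patterns; hardware evaluates every pattern in parallel, whereas this sketch loops serially.

```python
# Toy ternary-CAM matching: a road matches when every layer's hit word agrees
# with the stored pattern on all unmasked bits.
from typing import List, Tuple

Pattern = List[Tuple[int, int]]   # one (value, mask) pair per layer

def matches(hits: List[int], pattern: Pattern) -> bool:
    return all((hit & mask) == (value & mask)
               for hit, (value, mask) in zip(hits, pattern))

def find_roads(hits: List[int], bank: List[Pattern]) -> List[int]:
    """Return indices of all bank patterns compatible with the event's hits."""
    return [i for i, p in enumerate(bank) if matches(hits, p)]

# Toy bank: 3 layers, 8-bit hit words; zero mask bits are "don't care".
bank = [
    [(0b00010100, 0xFF), (0b00110000, 0xF0), (0b01000000, 0xC0)],
    [(0b10000001, 0xFF), (0b10100000, 0xF0), (0b11000000, 0xC0)],
]
event_hits = [0b00010100, 0b00111010, 0b01011111]

print("matched roads:", find_roads(event_hits, bank))   # expect [0]
```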
DOE Office of Scientific and Technical Information (OSTI.GOV)
Syphers, M. J.; Chattopadhyay, S.
An overview is provided of the currently envisaged landscape of charged particle accelerators at the energy and intensity frontiers to explore particle physics beyond the standard model via 1-100 TeV-scale lepton and hadron colliders and multi-megawatt proton accelerators for short- and long-baseline neutrino experiments. The particle beam physics, associated technological challenges and progress to date for these accelerator facilities (LHC, HL-LHC, future 100 TeV p-p colliders, TeV-scale linear and circular electron-positron colliders, and the high intensity proton accelerator complex PIP-II for DUNE with its future upgrade to PIP-III) are outlined. The potential and prospects for advanced "nonlinear dynamic techniques" at the multi-megawatt intensity frontier and advanced "plasma-wakefield-based techniques" at the TeV-scale energy frontier are also described.
Developments in the ATLAS Tracking Software ahead of LHC Run 2
NASA Astrophysics Data System (ADS)
Styles, Nicholas; Bellomo, Massimiliano; Salzburger, Andreas; ATLAS Collaboration
2015-05-01
After a hugely successful first run, the Large Hadron Collider (LHC) is currently in a shut-down period, during which essential maintenance and upgrades are being performed on the accelerator. The ATLAS experiment, one of the four large LHC experiments, has also used this period for consolidation and further development of the detector and of its software framework, ahead of the new challenges that will be brought by the increased centre-of-mass energy and instantaneous luminosity in the next run period. This is of particular relevance for the ATLAS Tracking software, responsible for reconstructing the trajectories of charged particles through the detector, which faces a steep increase in CPU consumption due to the additional combinatorics of the high-multiplicity environment. The steps taken to mitigate this increase and stay within the available computing resources while maintaining the excellent performance of the tracking software in terms of the information provided to the physics analyses will be presented. Particular focus will be given to changes to the Event Data Model, the replacement of the maths library, and the adoption of a new persistent output format. The resulting CPU profiling results will be discussed, as well as the performance of the algorithms for physics processes under the expected conditions for the next LHC run.
NASA Astrophysics Data System (ADS)
Chiuchiolo, A.; Bajko, M.; Perez, J. C.; Bajas, H.; Consales, M.; Giordano, M.; Breglio, G.; Palmieri, L.; Cusano, A.
2014-08-01
The design, fabrication and tests of a new generation of superconducting magnets for the upgrade of the LHC require the support of an adequate, robust and reliable sensing technology. The use of Fiber Optic Sensors is becoming particularly challenging for applications in extremely harsh environments such as ultra-low temperatures, high electromagnetic fields and strong mechanical stresses, offering perspectives for the development of technological innovations in several applied disciplines.
3D sensors and micro-fabricated detector systems
NASA Astrophysics Data System (ADS)
Da Vià, Cinzia
2014-11-01
Micro-systems based on Micro Electro Mechanical Systems (MEMS) technology have been used in miniaturized, low-power and low-mass smart structures in medicine, biology and space applications. Recently, similar features have found their way into high energy physics, with applications in vertex detectors for high-luminosity LHC upgrades, using 3D sensors, 3D integration and efficient power management with silicon micro-channel cooling. This paper reports on the state of this development.
LARP Long Quadrupole: A "Long" Step Toward an LHC
Giorgio Ambrosio
2017-12-09
The development of Nb3Sn magnets for particle accelerators goes back to the 1960s, but only very recently has this development begun to face the challenges of fabricating Nb3Sn magnets which can meet the requirements of modern particle accelerators. LARP (the LHC Accelerator Research Program) is leading this effort, focusing on long models of the Interaction Region quadrupoles for a possible luminosity upgrade of the Large Hadron Collider. A major milestone in this development is to test, by the end of 2009, 4 m long quadrupole models, which will be the first Nb3Sn accelerator-type magnets approaching the length of real accelerator magnets. The Long Quadrupoles (LQ) are "Proof-of-Principle" magnets which are to demonstrate that Nb3Sn technology is sufficiently mature for use in high energy particle accelerators. Their design is based on the LARP Technological Quadrupole (TQ) models, under development at FNAL and LBNL, which have design gradients higher than 200 T/m and an aperture of 90 mm. Several challenges must be addressed for the successful fabrication of long Nb3Sn coils and magnets. These challenges and the solutions adopted will be presented together with the main features of the LQ magnets. Several R&D lines are participating in this effort and their contributions will also be presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stancari, Giulio
2015-03-01
Halo dynamics influences global accelerator performance: beam lifetimes, emittance growth, dynamic aperture, and collimation efficiency. Halo monitoring and control are also critical for the operation of high-power machines. For instance, in the high-luminosity upgrade of the LHC, the energy stored in the beam tails may reach several megajoules. Fast losses can result in superconducting magnet quenches, magnet damage, or even collimator deformation. The need arises to measure the beam halo and to remove it at controllable rates. In the Tevatron and in the LHC, halo population densities and diffusivities were measured with collimator scans by observing the time evolution of losses following small inward or outward collimator steps, under different experimental conditions: with single beams and in collision, and, in the case of the Tevatron, with a hollow electron lens acting on a subset of bunches. After the LHC resumes operations, it is planned to compare measured diffusivities with the known strength of transverse damper excitations. New proposals for nondestructive halo population density measurements are also briefly discussed.
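The collimator-scan principle described above can be illustrated with a toy random-walk model: halo particles diffuse in amplitude, the collimator absorbs everything beyond its jaw, and a small inward step produces a transient loss spike whose relaxation reflects the local diffusivity. This is not the analysis of the paper; the halo distribution, diffusion strength and jaw positions are arbitrary illustration values.

```python
# Toy halo-diffusion model with a collimator step halfway through the run.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
amp = np.abs(rng.normal(0.0, 1.0, n))     # initial halo amplitudes (arbitrary units)
step_rms = 0.002                          # per-turn diffusion kick (assumed)
jaw = 3.0                                 # collimator position, same units

losses = []
for turn in range(4000):
    if turn == 2000:
        jaw = 2.9                         # small inward collimator step
    amp += rng.normal(0.0, step_rms, amp.size)
    np.abs(amp, out=amp)                  # amplitudes are non-negative
    hit = amp >= jaw
    losses.append(int(hit.sum()))
    amp = amp[~hit]                       # absorbed particles are removed

before = np.mean(losses[1900:2000])
spike = max(losses[2000:2050])
after = np.mean(losses[3900:4000])
print(f"loss rate before step: {before:.2f}/turn, "
      f"peak after step: {spike}, new equilibrium: {after:.2f}/turn")
```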
Off-Shell Higgs Probe of Naturalness.
Gonçalves, Dorival; Han, Tao; Mukhopadhyay, Satyanarayan
2018-03-16
Examining the Higgs sector at high energy scales through off-shell Higgs production can potentially shed light on the naturalness problem of the Higgs boson mass. We propose such a study at the LHC by utilizing a representative model with a new scalar field (S) coupled to the standard model Higgs doublet (H) in a form |S|^2|H|^2. In the process pp → h* → ZZ, the dominant momentum-dependent part of the one-loop scalar singlet corrections, especially above the new threshold at 2m_S, leads to a measurable deviation in the differential distribution of the Z-pair invariant mass, in accordance with the quadratic divergence cancellation to the Higgs mass. We find that it is conceivable to probe such new physics at the 5σ level at the high-luminosity LHC, improving further with the upgraded 27 TeV LHC, without requiring the precise measurement of the Higgs boson total width. The discovery of such a Higgs portal could also have important implications for thermal dark matter as well as for electroweak baryogenesis.
Forward shower counters for diffractive physics at the LHC
NASA Astrophysics Data System (ADS)
Albrow, Michael; Collins, Paula; Penzo, Aldo
2014-11-01
The LHC detectors have incomplete angular coverage in the forward direction, for example in the region 6 ≲ |η| ≲ 8, which can be improved with the addition of simple scintillation counters around the beam pipes about 50 m to 120 m from the intersection point. These counters detect showers created by particles hitting the beam pipes and nearby material. The absence of signals in these counters in low pileup conditions is an indication of a forward rapidity gap as a signature of diffraction. In addition, they can be used to detect hadrons from low mass diffractive excitations of the proton, not accompanied by a leading proton but adjacent to a rapidity gap over (e.g.) 3 ≲ |η| ≲ 6. Such a set of forward shower counters, originally used at CDF, was used in CMS (FSC) for high-β* running with TOTEM during LHC Run-1. During LS1 the CMS FSC system is being upgraded for future low pileup runs. A similar system, called HERSCHEL is being installed in LHCb. ALICE is implementing scintillation counters, ADA and ADC, with 4.5 ≲ |η| ≲ 6.4.
Experimental High Energy Physics Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohlmann, Marcus
This final report summarizes activities of the Florida Tech High Energy Physics group supported by DOE under grant #DE-SC0008024 during the period June 2012 – March 2015. We focused on one of the main HEP research thrusts at the Energy Frontier by participating in the CMS experiment. We were exploiting the tremendous physics opportunities at the Large Hadron Collider (LHC) and prepared for physics at its planned extension, the High-Luminosity LHC. The effort comprised a physics component with analysis of data from the first LHC run and contributions to the CMS Phase-2 upgrades in the muon endcap system (EMU) for the High-Luminosity LHC. The emphasis of our hardware work was the development of large-area Gas Electron Multipliers (GEMs) for the CMS forward muon upgrade. We built a production and testing site for such detectors at Florida Tech to complement future chamber production at CERN. The first full-scale CMS GE1/1 chamber prototype ever built outside of CERN was constructed at Florida Tech in summer 2013. We conducted two beam tests with GEM prototype chambers at CERN in 2012 and at FNAL in 2013 and reported the results at conferences and in publications. Principal Investigator Hohlmann served as chair of the collaboration board of the CMS GEM collaboration and as co-coordinator of the GEM detector working group. He edited and authored sections of the detector chapter of the Technical Design Report (TDR) for the GEM muon upgrade, which was approved by the LHCC and the CERN Research Board in 2015. During the course of the TDR approval process, the GEM project was also established as an official subsystem of the muon system by the CMS muon institution board. On the physics side, graduate student Kalakhety performed a Z' search in the dimuon channel with the 2011 and 2012 CMS datasets that utilized 20.6 fb⁻¹ of p-p collisions at √s = 8 TeV. For the dimuon channel alone, the 95% CL lower limits obtained on the mass of a Z' resonance are 2770 GeV for a Z' with the same standard-model couplings as the Z boson. Our student team operated a Tier-3 cluster on the Open Science Grid (OSG) to support local CMS physics analysis and remote OSG activity. As a service to the HEP community, Hohlmann participated in the Snowmass effort over the course of 2013. Specifically, he acted as a liaison for gaseous detectors between the Instrumentation Frontier and the Energy Frontier and contributed to five papers and reports submitted to the summer study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milic, A.
The ATLAS Liquid Argon calorimeters are designed and built to study proton-proton collisions produced at the LHC at centre-of-mass energies up to 14 TeV. Liquid argon (LAr) sampling calorimeters are employed for all electromagnetic calorimetry in the pseudorapidity region |η|<3.2, and for hadronic calorimetry in the region from |η|=1.5 to |η|=4.9. Although the nominal LHC experimental programme is still in progress, an upgrade of the read-out electronics is being launched to cope with luminosities of up to 3×10^34 cm^-2 s^-1, which are beyond the original design by a factor of 3. An improved spatial granularity of the trigger primitives is therefore proposed in order to improve the identification performance for trigger signatures, like electrons, photons, tau leptons, jets, total and missing energy, at high background rejection rates. For the upgrade Phase-1 in 2018, new LAr Trigger Digitizer Boards (LTDB) are being designed to receive higher granularity signals, digitize them on detector and send them via fast optical links to a new LAr digital processing system (LDPS). The LDPS applies a digital filtering and identifies significant energy depositions in each trigger channel. The refined trigger primitives are then transmitted to the first level trigger system to extract improved trigger signatures. The read-out of the trigger signals will process 34000 so-called Super Cells at every LHC bunch-crossing at a frequency of 40 MHz. The new LTDB on-detector electronics is designed to be radiation tolerant in order to be operated for the remaining live-time of the ATLAS detector up to a total luminosity of 3000 fb^-1. For the analog-to-digital conversion (12-bit ADC at 40 MSPS), the data serialization and the fast optical link (5.44 Gb/s), custom components have been developed. They have been qualified for the expected radiation environment of a total ionization dose of 1.3 kGy and a hadron fluence of 6×10^13 h/cm^2 with energies above 20 MeV. For the digital components like the ADC, cross-sections for single event effects have been determined. This talk will present R&D results from tests of the radiation tolerant components, the fast data processing electronics and prototypes of the LTDB and LDPS boards. First experience from a Demonstrator setup will be reported, in which about 1/10 of the full Super Cell readout will be equipped with prototype versions of the LTDB and LDPS boards. The Demonstrator will be operated in parallel to the regular ATLAS trigger read-out during the upcoming LHC run. (authors)
NASA Astrophysics Data System (ADS)
Hennessy, Karol; LHCb VELO Upgrade Collaboration
2017-02-01
The upgrade of the LHCb experiment, scheduled for LHC Run 3, which is due to start in 2021, will transform the experiment into a trigger-less system reading out the full detector at a 40 MHz event rate. All data reduction algorithms will be executed in a high-level software farm, enabling the detector to run at luminosities of 2×10^33 cm^-2 s^-1. The Vertex Locator (VELO) is the silicon vertex detector surrounding the interaction region. The current detector will be replaced with a hybrid pixel system equipped with electronics capable of reading out at 40 MHz. The upgraded VELO will provide fast pattern recognition and track reconstruction to the software trigger. The silicon pixel sensors have 55×55 μm^2 pitch, and are read out by the VeloPix ASIC, from the Timepix/Medipix family. The hottest region will have pixel hit rates of 900 Mhits/s, yielding a total data rate of more than 3 Tbit/s for the upgraded VELO. The detector modules are located in a separate vacuum, separated from the beam vacuum by a thin custom-made foil. The foil will be manufactured through milling and possibly thinned further by chemical etching. The material budget will be minimised by the use of evaporative CO2 coolant circulating in microchannels within 400 μm thick silicon substrates. The current status of the VELO upgrade is described and the latest results from the operation of irradiated sensor assemblies are presented.
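The data-rate figures quoted above follow from a simple scaling of hit rate times packet size, summed over readout ASICs. The sketch below illustrates that arithmetic only; the bits-per-hit value and the hit-rate map are assumed for illustration and are not VeloPix specifications, although they are chosen so the totals land in the same ballpark as the figures above.

```python
# Rough bandwidth estimate: total readout rate = sum over ASICs of
# (hit rate) x (bits per hit packet). All inputs are hypothetical.
BITS_PER_HIT = 32                                       # assumed packet size
asic_hit_rates_mhz = [900] * 8 + [400] * 100 + [100] * 500   # hypothetical hit map

def total_bandwidth_tbps(hit_rates_mhz, bits_per_hit=BITS_PER_HIT):
    """Aggregate off-detector data rate in Tbit/s."""
    return sum(r * 1e6 * bits_per_hit for r in hit_rates_mhz) / 1e12

hottest_gbps = asic_hit_rates_mhz[0] * 1e6 * BITS_PER_HIT / 1e9
print(f"hottest ASIC  : {hottest_gbps:.1f} Gbit/s")
print(f"whole detector: {total_bandwidth_tbps(asic_hit_rates_mhz):.2f} Tbit/s")
```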
Commissioning of the upgraded CSC Endcap Muon Port Cards at CMS
NASA Astrophysics Data System (ADS)
Ecklund, K.; Liu, J.; Madorsky, A.; Matveev, M.; Michlin, B.; Padley, P.; Rorie, J.
2016-01-01
There are 180 1.6 Gbps optical links from 60 Muon Port Cards (MPC) to the Cathode Strip Chamber Track Finder (CSCTF) in the original system. Before the upgrade, each MPC was able to provide up to three trigger primitives from a cluster of nine CSC chambers to the Level 1 CSCTF. With an LHC luminosity increase to 10^35 cm^-2 s^-1 at the full energy of 7 TeV/beam, simulation studies suggest that we can expect two or three times more trigger primitives per bunch crossing from the front-end electronics. To comply with this requirement, the MPC, CSCTF, and optical cables need to be upgraded. The upgraded MPC allows transmission of up to 18 trigger primitives from the peripheral crate. This feature would allow searches for physics signatures of muon jets that require more trigger primitives per trigger sector. At the same time, it is very desirable to preserve all the old optical links for compatibility with the older Track Finder during the transition period at the beginning of Run 2. Installation of the upgraded MPC boards and the new optical cables was completed at the CMS detector in the summer of 2014. We describe the final design of the new MPC mezzanine FPGA, its firmware, and the results of tests in the laboratory and in situ with the old and new CSCTF boards.
Implementation of the ATLAS trigger within the multi-threaded software framework AthenaMT
NASA Astrophysics Data System (ADS)
Wynne, Ben; ATLAS Collaboration
2017-10-01
We present an implementation of the ATLAS High Level Trigger, HLT, that provides parallel execution of trigger algorithms within the ATLAS multithreaded software framework, AthenaMT. This development will enable the ATLAS HLT to meet future challenges due to the evolution of computing hardware and upgrades of the Large Hadron Collider, LHC, and ATLAS Detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will result in an increase in the HLT input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multiprocess framework, AthenaMP, manages a number of processes that each execute algorithms sequentially for different events. AthenaMT will provide a fully multi-threaded environment that will additionally enable concurrent execution of algorithms within an event. This has the potential to significantly reduce the memory footprint on future manycore devices. An additional benefit of the HLT implementation within AthenaMT is that it facilitates the integration of offline code into the HLT. The trigger must retain high rejection in the face of increasing numbers of pileup collisions. This will be achieved by greater use of offline algorithms that are designed to maximize the discrimination of signal from background. Therefore a unification of the HLT and offline reconstruction software environment is required. This has been achieved while at the same time retaining important HLT-specific optimisations that minimize the computation performed to reach a trigger decision. Such optimizations include early event rejection and reconstruction within restricted geometrical regions. We report on an HLT prototype in which the need for HLT-specific components has been reduced to a minimum. Promising results have been obtained with a prototype that includes the key elements of trigger functionality including regional reconstruction and early event rejection. We report on the first experience of migrating trigger selections to this new framework and present the next steps towards a full implementation of the ATLAS trigger.
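The two levels of concurrency described above, several events in flight at once plus concurrent execution of independent algorithms within one event, can be sketched schematically with a shared thread pool. This is not ATLAS or Gaudi code; the algorithm names, dependencies and timings are hypothetical.

```python
# Schematic multi-threaded scheduling: events are processed concurrently, and
# within one event the algorithms without mutual dependencies run concurrently.
import time
from concurrent.futures import ThreadPoolExecutor

def run_alg(name, event_id, duration):
    time.sleep(duration)                 # stand-in for reconstruction work
    return f"{name}(evt {event_id})"

def process_event(event_id, pool):
    # "calo" and "tracking" have no mutual dependency -> run them concurrently.
    calo = pool.submit(run_alg, "calo", event_id, 0.05)
    trk = pool.submit(run_alg, "tracking", event_id, 0.08)
    results = [calo.result(), trk.result()]
    # "matching" depends on both of the above, so it runs only after they finish.
    results.append(run_alg("matching", event_id, 0.02))
    return results

with ThreadPoolExecutor(max_workers=8) as pool:
    # Outer level: several events processed concurrently by the same pool.
    events = [pool.submit(process_event, evt, pool) for evt in range(4)]
    for done in events:
        print(done.result())
```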
Quench performance and field quality of FNAL twin-aperture 11 T Nb3Sn dipole model for LHC upgrades
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoynev, Stoyan; Andreev, Nikolai; Apollinari, Giorgio
A 2 m long single-aperture dipole demonstrator and two 1 m long single-aperture models based on Nb3Sn superconductor have been built and tested at FNAL. The two 1 m long collared coils were then assembled in a twin-aperture Nb3Sn dipole demonstrator compatible with the LHC main dipole and tested in two thermal cycles. This paper summarizes the quench performance of the FNAL twin-aperture Nb3Sn 11 T dipole in the temperature range of 1.9-4.5 K. The results of magnetic measurements for one of the two apertures are also presented. Test results are compared to the performance of coils in a single-aperture configuration. Lastly, a summary of quench propagation studies in both apertures is given.
Beyond the bump-hunt: A game plan for discovering dynamical dark matter at the LHC
NASA Astrophysics Data System (ADS)
Dienes, Keith R.; Su, Shufang; Thomas, Brooks
2016-06-01
Dynamical Dark Matter (DDM) is an alternative framework for dark-matter physics in which an ensemble of individual constituent fields with a spectrum of masses, lifetimes, and cosmological abundances collectively constitute the dark-matter candidate, and in which the traditional notion of dark-matter stability is replaced by a balancing between lifetimes and abundances across the ensemble. In this talk, we discuss the prospects for distinguishing between DDM ensembles and traditional dark-matter candidates at hadron colliders - and in particular, at the upgraded LHC - via the analysis of event-shape distributions of kinematic variables. We also examine the correlations between these kinematic variables and other relevant collider variables in order to assess how imposing cuts on these additional variables may distort - for better or worse - their event-shape distributions.
Status of the R&D activities for the upgrade of the ALICE TPC
NASA Astrophysics Data System (ADS)
Deisting, Alexander
2018-02-01
After the Long Shutdown 2 (LS2) the LHC will provide lead-lead collisions at interaction rates as high as 50 kHz. In order to cope with such conditions the ALICE Time Projection Chamber (TPC) needs to be upgraded. After the upgrade the TPC will run in a continuous mode, without any degradation of the momentum and dE/dx resolution compared to the performance of the present TPC. Since readout by multi-wire proportional chambers is no longer feasible with these requirements, new technologies have to be employed. In the new readout chambers the electron amplification is provided by a stack of four Gas Electron Multiplier (GEM) foils. Here, foils with a standard hole pitch of 140 μm as well as large-pitch foils (280 μm) are used. Their high-voltage settings and orientation have been optimised to provide an energy resolution of σE/E ≤ 12% at the photopeak of 55Fe. At the same settings, the ion backflow into the drift volume is less than 1% of the effective number of ions produced during gas amplification and the primary ionisation. This is necessary to prevent the accumulation of space charge, which would eventually distort the field in the drift volume. To ensure stable operation at the high loads during LHC Run 3, the chambers also have to be robust against discharges. With the selected configuration in a quadruple GEM stack, the discharge probability is kept at the level of 10^-12 discharges per incoming hadron. An overview of the ALICE TPC upgrade activities is given in these proceedings and the optimised settings foreseen for the GEM stacks of the future readout chambers are introduced. Furthermore, the outcome of two beam-time campaigns at the SPS and PS (at CERN) at the end of 2014 is shown. In these campaigns the stability against discharges and the dE/dx performance of a full-size readout chamber prototype were tested. In addition, results are reported from charging-up studies of quadruple GEM stacks and from tests of electromagnetic sagging of large GEM foils.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stancari, Giulio
Electron lenses are pulsed, magnetically confined electron beams whose current-density profile is shaped to obtain the desired effect on the circulating beam. Electron lenses were used in the Fermilab Tevatron collider for bunch-by-bunch compensation of long-range beam-beam tune shifts, for removal of uncaptured particles in the abort gap, for preliminary experiments on head-on beam-beam compensation, and for the demonstration of halo scraping with hollow electron beams. Electron lenses for beam-beam compensation are being commissioned in the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory (BNL). Hollow electron beam collimation and halo control were studied as an option to complement the collimation system for the upgrades of the Large Hadron Collider (LHC) at CERN; a conceptual design was recently completed. Because of their electric charge and the absence of materials close to the proton beam, electron lenses may also provide an alternative to wires for long-range beam-beam compensation in LHC luminosity upgrade scenarios with small crossing angles. At Fermilab, we are planning to install an electron lens in the Integrable Optics Test Accelerator (IOTA, a 40-m ring for 150-MeV electrons) as one of the proof-of-principle implementations of nonlinear integrable optics to achieve large tune spreads and more stable beams without loss of dynamic aperture.
Investigation of HV/HR-CMOS technology for the ATLAS Phase-II Strip Tracker Upgrade
NASA Astrophysics Data System (ADS)
Fadeyev, V.; Galloway, Z.; Grabas, H.; Grillo, A. A.; Liang, Z.; Martinez-Mckinney, F.; Seiden, A.; Volk, J.; Affolder, A.; Buckland, M.; Meng, L.; Arndt, K.; Bortoletto, D.; Huffman, T.; John, J.; McMahon, S.; Nickerson, R.; Phillips, P.; Plackett, R.; Shipsey, I.; Vigani, L.; Bates, R.; Blue, A.; Buttar, C.; Kanisauskas, K.; Maneuski, D.; Benoit, M.; Di Bello, F.; Caragiulo, P.; Dragone, A.; Grenier, P.; Kenney, C.; Rubbo, F.; Segal, J.; Su, D.; Tamma, C.; Das, D.; Dopke, J.; Turchetta, R.; Wilson, F.; Worm, S.; Ehrler, F.; Peric, I.; Gregor, I. M.; Stanitzki, M.; Hoeferkamp, M.; Seidel, S.; Hommels, L. B. A.; Kramberger, G.; Mandić, I.; Mikuž, M.; Muenstermann, D.; Wang, R.; Zhang, J.; Warren, M.; Song, W.; Xiu, Q.; Zhu, H.
2016-09-01
ATLAS has formed a strip CMOS project to study the use of CMOS MAPS devices as silicon strip sensors for the Phase-II Strip Tracker Upgrade. This choice of sensors promises several advantages over the conventional baseline design, such as better resolution, less material in the tracking volume, and faster construction. At the same time, many design features of the sensors are driven by the requirement of minimizing the impact on the rest of the detector. Hence the target devices feature long pixels which are grouped to form a virtual strip with binary-encoded z position. The key performance aspects are radiation hardness compatible with the HL-LHC environment, as well as extraction of the full hit position with a full-reticle readout architecture. To date, several test chips have been submitted using two different CMOS technologies. The AMS 350 nm process is a high-voltage CMOS (HV-CMOS) technology that allows sensor bias of up to 120 V. The TowerJazz 180 nm high-resistivity CMOS (HR-CMOS) process uses a high-resistivity epitaxial layer to provide the depletion region on top of the substrate. We have evaluated passive pixel performance and charge-collection projections. The results strongly support the radiation tolerance of these devices up to the doses expected at the HL-LHC in the strip tracker region. We also describe design features for the next chip submission that are motivated by our technology evaluation.
Long term dynamics of the high luminosity Large Hadron Collider with crab cavities
NASA Astrophysics Data System (ADS)
Barranco García, J.; De Maria, R.; Grudiev, A.; Tomás García, R.; Appleby, R. B.; Brett, D. R.
2016-10-01
The High Luminosity upgrade of the Large Hadron Collider (HL-LHC) aims to achieve an integrated luminosity of 200-300 fb^-1 per year, including the contribution from the upgrade of the injector chain. For the HL-LHC, the larger crossing angle together with a smaller beta function at the collision point would result in more than 70% luminosity loss due to the incomplete geometric overlap of colliding bunches. To recover head-on collisions at the high-luminosity particle-physics detectors ATLAS and CMS, and to benefit from the very low β* provided by the Achromatic Telescopic Squeezing (ATS) optics, a local crab cavity scheme provides transverse kicks to the proton bunches. The tight space constraints at the location of these cavities lead to designs which are not axially symmetric, giving rise to high-order multipole components of the main deflecting mode; since these kicks are harmonic in time, we expand them in a series of multipoles in a similar fashion as is done for static-field magnets. In this work we calculate, for the first time, the higher-order multipoles and their impact on beam dynamics for three different crab cavity prototypes. Different approaches to calculate the multipoles are presented. Furthermore, we perform the first calculation of their impact on the long-term stability of the machine using the concept of dynamic aperture.
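The multipole expansion referred to above can be written down in the standard static-magnet convention, which serves here as the template for the RF case. The sketch below evaluates the transverse field components from normal (b_n) and skew (a_n) coefficients; the transverse kick is proportional to these components. The coefficients are hypothetical illustration values, not the crab-cavity results of the paper.

```python
# Standard multipole expansion: B_y + i*B_x = sum_n (b_n + i*a_n)*(x + i*y)**(n-1),
# with n = 1 dipole, 2 quadrupole, 3 sextupole, ...  Coefficients are assumed.
def multipole_field(x, y, b, a):
    """Return (B_y, B_x) in arbitrary units at transverse position (x, y) [m]."""
    z = complex(x, y)
    f = sum((bn + 1j * an) * z ** (n - 1)
            for n, (bn, an) in enumerate(zip(b, a), start=1))
    return f.real, f.imag

# Dominant dipole component plus a small skew sextupole (illustration values).
b = [1.0, 0.0, 0.0]
a = [0.0, 0.0, 1.0e3]     # skew sextupole, exaggerated for visibility

for x_mm in (-10.0, 0.0, 10.0):
    by, bx = multipole_field(x_mm * 1e-3, 0.0, b, a)
    print(f"x = {x_mm:+5.1f} mm -> B_y = {by:+.6f}, B_x = {bx:+.6f}")
```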
Online Luminosity Measurement at CMS for Energy Frontier Physics after LS1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stickland, David P.
2015-09-20
This proposal was directed towards the measurement of bunch-by-bunch and total luminosity in the CMS experiment using single-crystal diamond (sCVD) sensors installed close to the interaction point, known as the Fast Beam Conditions Monitor, or BCM1F detector. The proposal was successfully carried out, and in February 2015 CMS installed its upgraded BCM1F detector. At the first collisions in June 2015 the BCM1F was used as the primary luminometer; in August 2015 a van der Meer scan was carried out and the detailed luminometer calibration is under study. In all aspects of performance measurement the upgraded detector has satisfied its design parameters and, as the overview of its performance in this report will show, we have high expectations that the detector will be a powerful addition to the luminosity measurement at CMS and the LHC. The proposed upgrade of BCM1F was a collaboration of CMS institutes in Germany (DESY-Zeuthen) and the USA (Princeton) and of CERN itself.
Modeling of beam-induced damage of the LHC tertiary collimators
NASA Astrophysics Data System (ADS)
Quaranta, E.; Bertarelli, A.; Bruce, R.; Carra, F.; Cerutti, F.; Lechner, A.; Redaelli, S.; Skordis, E.; Gradassi, P.
2017-09-01
Modern hadron machines with high beam intensity may suffer from material damage in the case of large beam losses and even beam-intercepting devices, such as collimators, can be harmed. A systematic method to evaluate thresholds of damage owing to the impact of high energy particles is therefore crucial for safe operation and for predicting possible limitations in the overall machine performance. For this, a three-step simulation approach is presented, based on tracking simulations followed by calculations of energy deposited in the impacted material and hydrodynamic simulations to predict the thermomechanical effect of the impact. This approach is applied to metallic collimators at the CERN Large Hadron Collider (LHC), which in standard operation intercept halo protons, but risk to be damaged in the case of extraction kicker malfunction. In particular, tertiary collimators protect the aperture bottlenecks, their settings constrain the reach in β* and hence the achievable luminosity at the LHC experiments. Our calculated damage levels provide a very important input on how close to the beam these collimators can be operated without risk of damage. The results of this approach have been used already to push further the performance of the present machine. The risk of damage is even higher in the upgraded high-luminosity LHC with higher beam intensity, for which we quantify existing margins before equipment damage for the proposed baseline settings.
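Behind damage thresholds of the kind discussed above sits a simple first estimate: the instantaneous (adiabatic) temperature rise from the energy density deposited by an impacting beam, dT = dE/(rho*c_p). The sketch below uses bulk tungsten material constants as a stand-in for a tungsten-based tertiary collimator jaw; the deposited energy densities are hypothetical illustration values, not results from the paper.

```python
# Adiabatic temperature-rise estimate from a given deposited energy density.
RHO_W = 19.3e3        # tungsten density [kg/m^3]
CP_W = 134.0          # tungsten specific heat [J/(kg K)], room temperature
T_MELT_W = 3695.0     # tungsten melting point [K]

def adiabatic_dT(edep_J_per_cm3, rho=RHO_W, cp=CP_W):
    """Temperature rise for an energy density given in J/cm^3."""
    return edep_J_per_cm3 * 1e6 / (rho * cp)

for edep in (10.0, 100.0, 1000.0, 5000.0):   # J/cm^3, assumed values
    dT = adiabatic_dT(edep)
    status = "above melting from 300 K" if 300.0 + dT > T_MELT_W else "still solid"
    print(f"{edep:7.1f} J/cm^3 -> dT = {dT:8.1f} K  ({status})")
```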
Design of the large hadron electron collider interaction region
NASA Astrophysics Data System (ADS)
Cruz-Alaniz, E.; Newton, D.; Tomás, R.; Korostelev, M.
2015-11-01
The large hadron electron collider (LHeC) is a proposed upgrade of the Large Hadron Collider (LHC) within the high luminosity LHC (HL-LHC) project, to provide electron-nucleon collisions and explore a new regime of energy and luminosity for deep inelastic scattering. The design of an interaction region for any collider is always a challenging task, given that the beams are brought into crossing with the smallest beam sizes in a region with tight detector constraints. In this case, integrating the LHeC into the existing HL-LHC lattice, to allow simultaneous proton-proton and electron-proton collisions, increases the difficulty of the task. A nominal design was presented in the LHeC conceptual design report in 2012, featuring an optical configuration that focuses one of the proton beams of the LHC to β*=10 cm at the LHeC interaction point to reach the desired luminosity of L = 1033 cm-2 s-1. This value is achieved with the aid of a new inner triplet of quadrupoles at a distance L*=10 m from the interaction point. However, the chromatic beta beating was found to be intolerable with regard to machine protection, so an advanced chromatic correction scheme was required. This paper explores the feasibility of extending a novel optical technique, the achromatic telescopic squeezing scheme, and the flexibility of the interaction region design, in order to find the optimal solution that produces the highest luminosity while controlling the chromaticity, minimizing the synchrotron radiation power and maintaining the dynamic aperture required for stability.
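A hedged reminder of why the small β* matters for the quoted target (the standard round-beam expression with matched transverse sizes; symbols are assumed, not taken from the paper):

\[
\mathcal{L} \;=\; \frac{f_{\mathrm{rev}}\, n_b\, N_p\, N_e}{4\pi\,\sigma_x\,\sigma_y},
\qquad \sigma_{x,y} \;=\; \sqrt{\varepsilon_{x,y}\,\beta^{*}_{x,y}},
\]

so squeezing the proton beam to β* = 10 cm shrinks the spot size at the interaction point and raises L towards 1033 cm-2 s-1, at the cost of the chromatic beta beating that the achromatic telescopic squeezing scheme is invoked to control.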
Upgrade of the cryogenic CERN RF test facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pirotte, O.; Benda, V.; Brunner, O.
2014-01-29
With the large number of superconducting radiofrequency (RF) cryomodules to be tested for the former LEP and the present LHC accelerator, an RF test facility was erected early in the 1990s in the largest cryogenic test facility at CERN, located at Point 18. This facility consisted of four vertical test stands for single cavities and originally one, and then two, horizontal test benches for RF cryomodules operating at 4.5 K in saturated helium. CERN is presently working on the upgrade of its accelerator infrastructure, which requires new superconducting cavities operating below 2 K in saturated superfluid helium. Consequently, the RF test facility has been renewed in order to allow efficient cavity and cryomodule tests in superfluid helium and to improve its thermal performance. The new RF test facility is described and its performance is presented.
A radiation tolerant Data link board for the ATLAS Tile Cal upgrade
NASA Astrophysics Data System (ADS)
Åkerstedt, H.; Bohm, C.; Muschter, S.; Silverstein, S.; Valdes, E.
2016-01-01
This paper describes the latest, full-functionality revision of the high-speed data link board developed for the Phase-2 upgrade of the ATLAS hadronic Tile Calorimeter. The link board design is highly redundant, with digital functionality implemented in two Xilinx Kintex-7 FPGAs and two Molex QSFP+ electro-optic modules with uplinks running at 10 Gbps. The FPGAs are remotely configured through two radiation-hard CERN GBTx deserialisers, which also provide the LHC-synchronous system clock. The redundant design eliminates virtually all single-point error modes, and a combination of triple modular redundancy (TMR), internal and external scrubbing will provide adequate protection against radiation-induced errors. The small portion of the FPGA design that cannot be protected by TMR will be the dominant source of radiation-induced errors, even though that area is small.
Upgrading Programs for Construction Journeymen. Final Report.
ERIC Educational Resources Information Center
Franklin, William S.
The report describes a study of industry-sponsored upgrading programs for journeymen in construction unions. Interviews with union and training officials, as well as 405 journeymen and 99 contractors, revealed that upgrading activities were concentrated in electrical work, carpentry, and the pipe trades, and that both the number of programs and…
Highlights and Perspectives from the CMS Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, Joel Nathan
2017-09-09
In 2016, the Large Hadron Collider provided proton-proton collisions at 13 TeV center-of-mass energy and achieved very high luminosity and reliability. The performance of the CMS Experiment in this running period and a selection of recent physics results are presented. These include precision measurements and searches for new particles. The status and prospects for data-taking in 2017 and a brief summary of the highlights of the High Luminosity (HL-LHC) upgrade of the CMS detector are also presented.
"Upgraded" physics at the LHC and RHIC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Llope, W. J.
2017-09-03
Closeout materials enclosed. This grant supported a postdoctoral scientist (S. Jowzaee) and the tuition for a graduate student (B. Erko), both working under the supervision of Prof. W.J. Llope at Wayne State University. Travel to a STAR Collaboration Meeting and the Quark Matter 2017 conference was also supported. The physics research concentrated on particle-identified two-particle correlations in the Beam Energy Scan data from the STAR experiment at RHIC. S. Jowzaee gave an oral presentation on this research at the Quark Matter 2017 conference.
Crab cavities: Past, present, and future of a challenging device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Q.
2015-05-03
In two-ring facilities operating with a crossing-angle collision scheme, luminosity can be limited by the incomplete overlap of the colliding bunches. Crab cavities are then introduced to restore head-on collisions by providing opposite deflections to the head and tail of each bunch. An increase in luminosity was demonstrated at KEKB with global crab crossing, while the Large Hadron Collider (LHC) at CERN is currently designing local crab crossing for the Hi-Lumi upgrade. Future colliders may investigate both approaches. In this paper, we review the challenges in the technology and the implementation of crab cavities, while discussing experience in earlier colliders, ongoing R&D, and proposed implementations for future facilities, such as HiLumi-LHC, CERN's compact linear collider (CLIC), the international linear collider (ILC), and the electron-ion collider under design at BNL (eRHIC).
Beam-dynamics driven design of the LHeC energy-recovery linac
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pellegrini, Dario; Latina, Andrea; Schulte, Daniel
The LHeC study is a possible upgrade of the LHC that aims at delivering an electron beam for collisions with the existing hadron beams. The current baseline design for the electron facility consists of a multi-pass superconducting energy-recovery linac (ERL) operating in continuous-wave mode. Here, we summarize the overall layout of such an ERL complex located on the LHC site and introduce the most recent developments. We review the lattice components, presenting their baseline design along with possible alternatives that aim at improving the overall machine performance. The detector bypass has been designed and integrated into the lattice. Tracking simulations allowed us to verify the high-current (~150 mA in the linacs) beam operation required for the LHeC to serve as a Higgs factory. The impact of single- and multi-bunch wake-fields, synchrotron radiation and beam-beam effects is assessed in this paper.
The management of large cabling campaigns during the Long Shutdown 1 of LHC
NASA Astrophysics Data System (ADS)
Meroli, S.; Machado, S.; Formenti, F.; Frans, M.; Guillaume, J. C.; Ricci, D.
2014-03-01
The Large Hadron Collider at CERN entered its first 18-month-long shutdown period in February 2013. During this period the entire CERN accelerator complex will undergo major consolidation and upgrade works, preparing the machines for LHC operation at nominal energy (7 TeV/beam). One of the most challenging activities concerns the cabling infrastructure (copper and optical fibre cables) serving the CERN data acquisition, networking and control systems. About 1000 kilometres of cables, distributed over different machine areas, will be installed, representing an investment of about 15 MCHF. This implies an extraordinary challenge in terms of project management, including resource and activity planning, work execution and quality control. The preparation phase of this project started well before its implementation, by defining technical solutions and setting financial plans for staff recruitment and material supply. Enhanced task coordination was further implemented by deploying selected competences to form a central support team.
Three-loop corrections to the Higgs boson mass and implications for supersymmetry at the LHC.
Feng, Jonathan L; Kant, Philipp; Profumo, Stefano; Sanford, David
2013-09-27
In supersymmetric models with minimal particle content and without left-right squark mixing, the conventional wisdom is that the 125.6 GeV Higgs boson mass implies top squark masses of O(10) TeV, far beyond the reach of colliders. This conclusion is subject to significant theoretical uncertainties, however, and we provide evidence that it may be far too pessimistic. We evaluate the Higgs boson mass, including the dominant three-loop terms at O(α_t α_s^2), in currently viable models. For multi-TeV top squarks, the three-loop corrections can increase the Higgs boson mass by as much as 3 GeV and lower the required top-squark masses to 3-4 TeV, greatly improving prospects for supersymmetry discovery at the upcoming run of the LHC and its high-luminosity upgrade.
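For orientation, the well-known leading one-loop approximation that these higher-order terms refine (a textbook expression, not the authors' three-loop formula):

\[
m_h^2 \;\simeq\; m_Z^2\cos^2 2\beta \;+\; \frac{3\,G_F\,m_t^4}{\sqrt{2}\,\pi^2}\!\left[\ln\frac{M_S^2}{m_t^2} + \frac{X_t^2}{M_S^2}\left(1-\frac{X_t^2}{12\,M_S^2}\right)\right],
\]

with M_S the average stop mass and X_t the stop mixing parameter. With X_t ≈ 0 (no left-right mixing, as assumed in the abstract) the logarithm must supply essentially all of the increase above m_Z, which is why multi-TeV stops are needed and why shifts of a few GeV from higher orders move the required stop mass so strongly.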
Juchno, M.; Ambrosio, G.; Anerella, M.; ...
2016-01-26
Within the scope of the High Luminosity LHC project, the collaboration between CERN and U.S. LARP is developing new low-β quadrupoles using Nb3Sn superconducting technology for the upgrade of the LHC interaction regions. The magnet support structure of the first short model was designed and two units were fabricated and tested at CERN and at LBNL. The structure provides the preload to the collars-coils subassembly by an arrangement of outer aluminum shells pre-tensioned with water-pressurized bladders. For the mechanical qualification of the structure and the assembly procedure, the superconducting coils were replaced with solid aluminum "dummy coils", the structure was preloaded at room temperature, and then cooled down to 77 K. The mechanical behavior of the magnet structure was monitored with strain gauges installed on the aluminum shells, the dummy coils and the axial preload system. This paper reports on the outcome of the assembly and cool-down tests with dummy coils, which were performed at CERN and at LBNL, and presents the strain gauge measurements compared to 3D finite element model predictions.
Laboratory and testbeam results for thin and epitaxial planar sensors for HL-LHC
Bubna, M.; Bolla, G.; Bortoletto, D.; ...
2015-08-03
The High-Luminosity LHC (HL-LHC) upgrade of the CMS pixel detector will require the development of novel pixel sensors which can withstand the increase in instantaneous luminosity to L = 5 × 1034 cm-2 s-1 and collect ~3000 fb-1 of data. The innermost layer of the pixel detector will be exposed to fluences of about 1016 neq/cm2. Hence, new pixel sensors with improved radiation hardness need to be investigated. A variety of silicon materials (float-zone, magnetic Czochralski and epitaxially grown silicon), with thicknesses from 50 μm to 320 μm in p-type and n-type substrates, have been fabricated using single-sided processing. The effect of reducing the sensor active thickness to improve radiation hardness by using various techniques (deep diffusion, wafer thinning, or growing epitaxial silicon on a handle wafer) has been studied. Furthermore, the results for electrical characterization, charge collection efficiency, and position resolution of various n-on-p pixel sensors with different substrates and different pixel geometries (different bias dot gaps and pixel implant sizes) will be presented.
NASA Astrophysics Data System (ADS)
Affolder, Anthony; Allport, Phil; Casse, Gianluigi
2010-11-01
The planned luminosity upgrade of the Large Hadron Collider at CERN (Super-LHC) will provide a challenging environment for the tracking and vertexing detector systems. Planar, segmented silicon detectors are one of the few radiation tolerant technologies under consideration for use for the Super-LHC tracking detectors in either pixel or strip geometries. In this paper, charge collection measurements are made with planar silicon sensors with 2 different substrate materials (float zone and magnetic Czochralski) and 3 different diode configurations (p+ strip in n-bulk, n+ strip in n-bulk, and n+ strip in p-bulk). For the first time, a comparison of the charge collection of these devices will be made after irradiation up to 6 ×1014 neq cm-2 with 280 MeV charged pions, and up to 2.2 ×1016 neq cm-2 with 26 MeV protons. This study covers the expected range of final fluences for the different layers of pixel and microstrip sensors of the ATLAS and CMS experiments at the Super-LHC. These measurements have been carried out using analogue, high-speed (40 MHz) electronics and a Strontium-90 beta source.
Test Result of the Short Models MQXFS3 and MQXFS5 for the HL-LHC Upgrade
Bajas, Hugues; Ambrosio, Giorgio; Ballarino, A.; ...
2018-02-27
In the framework of the High-Luminosity Large Hadron Collider, the installation of a new generation of quadrupole magnets is foreseen on each side of the ATLAS and CMS experiments. The new magnets are based on Nb3Sn technology and shall be able to reach an ultimate current of 17.9 kA with a peak field of 12.3 T in the coil. In 2016 and 2017, the first two short models, called MQXFS3 and MQXFS5, were tested at 4.2 and 1.9 K in the two new test benches at the European Organization for Nuclear Research. This paper presents the quench performance results of the two models: the first magnet reached the nominal current but failed to reach the ultimate current, showing detraining in one coil, while MQXFS5 reached ultimate performance without any detraining, validating the PIT conductor used for the first time in this magnet program.
ERIC Educational Resources Information Center
ABC Canada, Toronto (Ontario).
The reasons for nonparticipation in adult literacy and upgrading programs were examined in a national study during which interviewers in 12 Canadian provinces conducted in-person interviews with 44 adults who had never participated in a literacy or upgrading program. Most interviewees indicated that they had experienced transition points at which…
Large area thinned planar sensors for future high-luminosity-LHC upgrades
NASA Astrophysics Data System (ADS)
Wittig, T.; Lawerenz, A.; Röder, R.
2016-12-01
Planar hybrid silicon sensors are a well-proven technology for past and current particle tracking detectors in HEP experiments. However, the future high-luminosity upgrades of the inner trackers at the LHC experiments pose big challenges to the detectors. A first challenge is an expected radiation damage level of up to 2×1016 neq/cm2. For planar sensors, one way to counteract the charge loss and thus increase the radiation hardness is to decrease the thickness of their active area. A second challenge is the large detector area, which has to be built as cost-efficiently as possible. The CiS research institute has accomplished a proof-of-principle run with n-in-p ATLAS pixel sensors in which a cavity is etched into the sensor's back side to reduce its thickness. One advantage of this technology is that thick frames remain at the sensor edges and guarantee mechanical stability at wafer level while the sensor is left on the resulting thin membrane. For this cavity etching technique, no handling wafers are required, which represents a benefit in terms of process effort and cost savings. The membranes, with areas of up to ~4 × 4 cm2 and thicknesses of 100 and 150 μm, feature sufficiently good homogeneity across the whole wafer area. The processed pixel sensors show good electrical behaviour with an excellent yield for such a prototype run. First sensors with electroless Ni and Pt UBM have already been successfully assembled with read-out chips.
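A hedged sketch of the standard argument for thinning (a generic trapping parametrization, not taken from the paper): after heavy irradiation the collected charge is limited by trapping,

\[
Q(\Phi_{\mathrm{eq}}) \;\approx\; Q_0\,e^{-t_{\mathrm{coll}}/\tau_{\mathrm{eff}}},
\qquad \frac{1}{\tau_{\mathrm{eff}}} \;=\; \beta_{\mathrm{tr}}\,\Phi_{\mathrm{eq}},
\]

so the effective trapping time falls inversely with fluence; a thinner active region shortens the collection time and therefore loses a smaller fraction of the deposited charge, at the price of a smaller initial signal Q_0.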
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauer, Gerry; et al.
The DAQ system of the CMS experiment at CERN collects data from more than 600 custom detector Front-End Drivers (FEDs). During 2013 and 2014 the CMS DAQ system will undergo a major upgrade to address the obsolescence of current hardware and the requirements posed by the upgrade of the LHC accelerator and various detector components. For loss-less data collection from the FEDs, a new FPGA-based card implementing the TCP/IP protocol suite over 10 Gbps Ethernet has been developed. To limit the complexity of the TCP hardware implementation, the DAQ group developed a simplified and unidirectional, but RFC 793-compliant, version of the TCP protocol. This allows a PC with the standard Linux TCP/IP stack to be used as a receiver. We present the challenges and the protocol modifications made to TCP in order to simplify its FPGA implementation. We also describe the interaction between the simplified TCP and the Linux TCP/IP stack, including performance measurements.
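Since the point of the simplified, RFC 793-compliant sender is that an unmodified Linux TCP/IP stack can act as the receiver, the receiving side really is just an ordinary socket loop. The sketch below is purely illustrative (the port number, read size and byte counting are assumptions, not the CMS DAQ protocol or framing):

```python
import socket

HOST, PORT = "0.0.0.0", 10000      # assumed listening endpoint, not the real DAQ configuration
CHUNK = 64 * 1024                  # read granularity; the real FED fragment framing differs

def receive_stream():
    """Accept one unidirectional TCP stream and count the bytes it delivers."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, addr = srv.accept()
        total = 0
        with conn:
            while True:
                data = conn.recv(CHUNK)
                if not data:            # sender closed the connection
                    break
                total += len(data)      # a real receiver would parse event fragments here
        print(f"received {total} bytes from {addr}")

if __name__ == "__main__":
    receive_stream()
```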
Recent progress and tests of radiation resistant impregnation materials for Nb3Sn coils
NASA Astrophysics Data System (ADS)
Bossert, R.; Krave, S.; Ambrosio, G.; Andreev, N.; Chlachidze, G.; Nobrega, A.; Novitski, I.; Yu, M.; Zlobin, A. V.
2014-01-01
Fermilab is collaborating with Lawrence Berkeley National Laboratory (LBNL) and Brookhaven National Laboratory (BNL) within the US-LARP collaboration to develop a large-aperture Nb3Sn superconducting quadrupole for the Large Hadron Collider (LHC) luminosity upgrade. An important component of this work is the development of materials that are sufficiently radiation resistant for use in critical areas of the upgrade. This paper describes recent progress in the characterization of materials, including the baseline CTD-101K epoxy, cyanate ester blends, and Matrimid 5292, a bismaleimide-based system. Structural properties of "ten-stacks" of cable impregnated with these materials are tested at room and cryogenic temperatures and compared to the baseline CTD-101K. Experience with potting 1- and 2-meter-long coils with Matrimid 5292 is described. Test results for a single 1-m coil impregnated with Matrimid 5292 are reported and compared to similar coils impregnated with the traditional epoxy.
Upgrading the ATLAS Tile Calorimeter Electronics
NASA Astrophysics Data System (ADS)
Carrió, Fernando
2013-11-01
This work summarizes the status of the on-detector and off-detector electronics developments for the Phase 2 Upgrade of the ATLAS Tile Calorimeter at the LHC scheduled around 2022. A demonstrator prototype for a slice of the calorimeter including most of the new electronics is planned to be installed in ATLAS in the middle of 2014 during the first Long Shutdown. For the on-detector readout, three different front-end boards (FEB) alternatives are being studied: a new version of the 3-in-1 card, the QIE chip and a dedicated ASIC called FATALIC. The Main Board will provide communication and control to the FEBs and the Daughter Board will transmit the digitized data to the off-detector electronics in the counting room, where the super Read-Out Driver (sROD) will perform processing tasks on them and will be the interface to the trigger levels 0, 1 and 2.
NASA Astrophysics Data System (ADS)
Cory, Bradley S.
The reEnergize Program conducted 957 energy upgrades in Omaha, Nebraska, from July 2010 to September 30, 2013, through a government grant within the Better Buildings Neighborhood Program. Projected program savings were provided upon program completion, but it was unknown how effective the program actually was at reducing energy consumption in the homes that were upgraded. The following research report uses a PRISM analysis to remove the effect of weather and compares the actual pre- and post-upgrade utility usage to determine the actual effectiveness of the program. The housing characteristics and individual energy upgrades were analyzed to see if any patterns or trends could be identified between consumption savings and housing type or specific upgrade measure. The results of the study show that the program did induce savings, but by much less than the engineering estimates predicted. It is likely that housing characteristics and upgrade measures play a role in inducing consumption savings, but homeowner behavior is a stronger factor influencing savings.
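As a rough illustration of the weather-normalization idea behind a PRISM-style analysis (a generic degree-day regression with made-up numbers, not the tool or data used in the study):

```python
import numpy as np

def weather_normalize(monthly_kwh, monthly_hdd):
    """Fit use = base + slope * heating_degree_days by least squares and
    return the fitted parameters plus a weather-normalized annual consumption."""
    hdd = np.asarray(monthly_hdd, dtype=float)
    kwh = np.asarray(monthly_kwh, dtype=float)
    A = np.column_stack([np.ones_like(hdd), hdd])
    (base, slope), *_ = np.linalg.lstsq(A, kwh, rcond=None)
    normal_year_hdd = hdd.sum()          # in practice: a long-term average HDD total
    return base, slope, 12 * base + slope * normal_year_hdd

# Comparing pre- and post-upgrade fits evaluated on the same "normal" weather
# isolates savings attributable to the upgrade rather than to a mild winter.
pre = weather_normalize(
    [900, 850, 700, 500, 400, 350, 340, 350, 420, 550, 700, 880],
    [800, 700, 500, 250, 100, 20, 5, 10, 80, 300, 550, 750])
post = weather_normalize(
    [760, 720, 600, 450, 380, 340, 335, 345, 400, 500, 620, 750],
    [800, 700, 500, 250, 100, 20, 5, 10, 80, 300, 550, 750])
print(f"normalized annual use: pre {pre[2]:.0f} kWh, post {post[2]:.0f} kWh")
```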
Reach of the high-energy LHC for gluinos and top squarks in SUSY models with light Higgsinos
NASA Astrophysics Data System (ADS)
Baer, Howard; Barger, Vernon; Gainer, James S.; Serce, Hasan; Tata, Xerxes
2017-12-01
We examine the top squark (stop) and gluino reach of the proposed 33 TeV energy upgrade of the Large Hadron Collider (LHC33) in the Minimal Supersymmetric Standard Model (MSSM) with light Higgsinos and relatively heavy electroweak gauginos. In our analysis, we assume that stops decay to Higgsinos via t˜1 → t Z˜1, t˜1 → t Z˜2, and t˜1 → b W˜1 with branching fractions in the ratio 1:1:2 (expected if the decay occurs dominantly via the superpotential Yukawa coupling), while gluinos decay via g˜ → t t˜1 or via three-body decays to third-generation quarks plus Higgsinos. These decay patterns are motivated by models of natural supersymmetry where Higgsinos are expected to be close in mass to mZ, but gluinos may be as heavy as 5-6 TeV, and stops may have masses up to ~3 TeV. We devise cuts to optimize the signals from stop and gluino pair production at LHC33. We find that experiments at LHC33 should be able to discover stops with >5σ significance if m(t˜1) < 2.3 (2.8) [3.2] TeV for an integrated luminosity of 0.3 (1) [3] ab-1. The corresponding reach for gluinos extends to 5 (5.5) [6] TeV. These results imply that experiments at LHC33 should be able to discover at least one of the stop or gluino pair signals even with an integrated luminosity of 0.3 ab-1 for natural supersymmetry models with no worse than 3% electroweak fine-tuning, and quite likely both gluinos and stops for an integrated luminosity of 3 ab-1.
PanDA: Exascale Federation of Resources for the ATLAS Experiment at the LHC
NASA Astrophysics Data System (ADS)
Barreiro Megino, Fernando; Caballero Bejar, Jose; De, Kaushik; Hover, John; Klimentov, Alexei; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Padolski, Siarhei; Panitkin, Sergey; Petrosyan, Artem; Wenaus, Torre
2016-02-01
After a scheduled maintenance and upgrade period, the world's largest and most powerful machine - the Large Hadron Collider (LHC) - is about to enter its second run at unprecedented energies. In order to exploit the scientific potential of the machine, the experiments at the LHC face computational challenges with enormous data volumes that need to be analysed by thousands of physics users and compared to simulated data. Given diverse funding constraints, the computational resources for the LHC have been deployed in a worldwide mesh of data centres, connected to each other through Grid technologies. The PanDA (Production and Distributed Analysis) system was developed in 2005 for the ATLAS experiment on top of this heterogeneous infrastructure to seamlessly integrate the computational resources and give the users the feeling of a unique system. Since its origins, PanDA has evolved together with upcoming computing paradigms in and outside HEP, such as changes in the networking model, Cloud Computing and HPC. It currently runs steadily on up to 200 thousand simultaneous cores (limited by the available resources for ATLAS), handles up to two million aggregated jobs per day and processes over an exabyte of data per year. The success of PanDA in ATLAS is triggering widespread adoption and testing by other experiments. In this contribution we will give an overview of the PanDA components and focus on the new features and upcoming challenges that are relevant to the next decade of distributed computing workload management using PanDA.
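A back-of-envelope consistency check of the quoted scale, using only the numbers in the abstract:

\[
\frac{2\times10^{5}\ \text{cores}\times 86\,400\ \text{s/day}}{2\times10^{6}\ \text{jobs/day}} \;\approx\; 8.6\times10^{3}\ \text{core-seconds} \;\approx\; 2.4\ \text{core-hours per job on average}.
\]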
NASA Astrophysics Data System (ADS)
Ohene-Kwofie, Daniel; Otoo, Ekow
2015-10-01
The ATLAS detector, operated at the Large Hadron Collider (LHC), records proton-proton collisions at CERN every 50 ns, resulting in a sustained data flow of up to PB/s. The upgraded Tile Calorimeter of the ATLAS experiment will sustain about 5 PB/s of digital throughput. These massive data rates require extremely fast data capture and processing. Although there has been a steady increase in the processing speed of CPU/GPGPU systems assembled for high performance computing, the rate of data input and output, even under parallel I/O, has not kept up with the general increase in computing speeds. The problem then is whether one can implement an I/O subsystem infrastructure capable of meeting the computational speeds of the advanced computing systems at the petascale and exascale level. We propose a system architecture that leverages the Partitioned Global Address Space (PGAS) model of computing to maintain an in-memory data store for the Processing Unit (PU) of the upgraded electronics of the Tile Calorimeter, which is proposed to be used as a high-throughput general-purpose co-processor to the sROD of the upgraded Tile Calorimeter. The physical memory of the PUs is aggregated into a large global logical address space using RDMA-capable interconnects such as PCI-Express to enhance data processing throughput.
Evaluative Assessment for NASA/GSFC Equal Opportunity Programs Office Sponsored Programs
NASA Technical Reports Server (NTRS)
Jarrell, H. Judith
1995-01-01
The purpose of PREP (Pre-College Minority Engineering Program) is to upgrade skills of minority students who have shown an interest in pursuing academic degrees in electrical engineering. The goal is to upgrade skills needed for successful completion of the rigorous curriculum leading to a Bachelor of Science degree in engineering through a comprehensive upgrade of academic, study and interpersonal skills.
Muon g-2 and dark matter suggest nonuniversal gaugino masses: SU(5)×A4 case study at the LHC
NASA Astrophysics Data System (ADS)
Belyaev, Alexander S.; King, Steve F.; Schaefers, Patrick B.
2018-06-01
We argue that in order to account for the muon anomalous magnetic moment g-2, dark matter and LHC data, nonuniversal gaugino masses Mi at the high scale are required in the framework of the minimal supersymmetric standard model. We also need a right-handed smuon μ˜R with a mass around 100 GeV, evading LHC searches due to the proximity of a neutralino χ˜10 several GeV lighter, which allows successful dark matter. We discuss such a scenario in the framework of an SU(5) grand unified theory (GUT) combined with A4 family symmetry, where the three 5-bar representations form a single triplet of A4 with a unified soft mass mF, while the three 10 representations are singlets of A4 with independent soft masses mT1, mT2, mT3. Although mT2 (and hence μ˜R) may be light, the muon g-2 and relic density also require a light M1 ≃ 250 GeV, which is incompatible with universal gaugino masses due to LHC constraints on M2 and M3 arising from gaugino searches. After showing that universal gaugino masses M1/2 at the GUT scale are excluded by gluino searches, we provide a series of benchmarks which show that while M1=M2≪M3 is in tension with 8 and 13 TeV LHC data, M1
Tesla: An application for real-time data analysis in High Energy Physics
NASA Astrophysics Data System (ADS)
Aaij, R.; Amato, S.; Anderlini, L.; Benson, S.; Cattaneo, M.; Clemencic, M.; Couturier, B.; Frank, M.; Gligorov, V. V.; Head, T.; Jones, C.; Komarov, I.; Lupton, O.; Matev, R.; Raven, G.; Sciascia, B.; Skwarnicki, T.; Spradlin, P.; Stahl, S.; Storaci, B.; Vesterinen, M.
2016-11-01
Upgrades to the LHCb computing infrastructure in the first long shutdown of the LHC have allowed for high quality decay information to be calculated by the software trigger making a separate offline event reconstruction unnecessary. Furthermore, the storage space of the triggered candidate is an order of magnitude smaller than the entire raw event that would otherwise need to be persisted. Tesla is an application designed to process the information calculated by the trigger, with the resulting output used to directly perform physics measurements.
MQXFS1 Quadrupole Fabrication Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosio, G.; Anerella, M.; Bossert, R.
This report presents the fabrication and QC data of MQXFS1, the first short model of the low-beta quadrupoles (MQXF) for the LHC High Luminosity Upgrade. It describes the conductor, the coils, and the structure that make up the MQXFS1 magnet. Qualification tests and non-conformities are also presented and discussed. The fabrication of MQXFS1 was started before the finalization of the conductor and coil design for the MQXF magnets. Two strand designs were used (RRP 108/127 and RRP 132/169), and the cable and coil cross-sections were "first generation".
High Energy Physics Research with the CMS Experiment at CERN - Energy Frontier Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanson, Gail G.
The Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) near Geneva, Switzerland, is now the highest energy accelerator in the world, colliding protons with protons. On July 4, 2012, the two general-purpose experiments, ATLAS and the Compact Muon Solenoid (CMS) experiment, announced the observation of a particle consistent with the world's most sought-after particle, the Higgs boson, at a mass of about 125 GeV (approximately 125 times the mass of the proton). The Higgs boson is the final missing ingredient of the standard model, in which it is needed to allow most other particles to acquire mass through the mechanism of electroweak symmetry breaking. We are members of the team in the CMS experiment that found evidence for the Higgs boson through its decay to two photons, the most sensitive channel at the LHC. We are proposing to carry out studies to determine whether the new particle has the properties expected for the standard model Higgs boson or whether it is something else. The new particle can still carry out its role in electroweak symmetry breaking but have other properties as well. Most theorists think that a single standard model Higgs boson cannot be the complete solution: other particles are needed to answer some of the remaining questions, such as the hierarchy problem. The particle that has been observed could be one of several Higgs bosons, for example, or it could be composite. One model of physics beyond the standard model is supersymmetry, in which every ordinary particle has a superpartner with opposite spin properties. In supersymmetric models, there must be at least five Higgs bosons. In the most popular versions of supersymmetry, the lightest supersymmetric particle does not decay and is a candidate for dark matter. This proposal covers the period from June 1, 2013, to March 31, 2016. During this period the LHC will finally reach its design energy, almost twice the energy at which it now runs. We will be able to study the Higgs boson at the current LHC energy using about three times as much data as were used to make the observation. In 2013 the LHC will shut down to make preparations to run at its design energy in 2015. During the shutdown period, we will be preparing upgrades of the detector to be able to run at the higher rates of proton-proton collisions that will also be possible once the LHC is running at design energy. The upgrade on which we are working, the inner silicon pixel tracker, will be installed in late 2016. Definitive tests of whether the new particle satisfies the properties of the standard model Higgs boson will almost certainly require both the higher energy and the larger amounts of data that can be accumulated using the higher rates. Meanwhile we will use the data taken during 2012 and the higher energy data starting in 2015 to continue to search for beyond-the-standard-model physics such as supersymmetry and heavy neutrinos. We have already made such searches using data since the LHC started running. We are discussing with theorists how a 125-GeV Higgs modifies such models. Finding such particles will probably also require the higher energy and larger amounts of data beginning in 2015. The period of this proposal promises to be very exciting, leading to new knowledge of the matter in the Universe.
The test facility for the short prototypes of the LHC superconducting magnets
NASA Astrophysics Data System (ADS)
Delsolaro, W. Venturini; Arn, A.; Bottura, L.; Giloux, C.; Mompo, R.; Siemko, A.; Walckiers, L.
2002-05-01
The LHC development program relies on cryogenic tests of prototype and model magnets. This vigorous program is pursued in a dedicated test facility based on several vertical cryostats working at superfluid helium temperatures. The performance of the facility is detailed. Goals and test equipment for currently performed studies are reviewed: quench analysis and magnet protection studies, measurement of the field quality, test of ancillary electrical equipment like diodes and busbars. The paper covers the equipment available for tests of prototypes and some special series of LHC magnets to come.
Upgrading in an Industrial Setting. Final Report.
ERIC Educational Resources Information Center
Russell, Wendell
The project objectives were: (1) to assess existing industrial upgrading practices in an Atomic Energy Commission contractor organization, (2) to design new alternative upgrading methods, (3) to experiment with new upgrading methods, (4) to plan for utilization of proven upgrading programs, and (5) to document and disseminate activities. A twelve…
Quench protection study of the updated MQXF for the LHC luminosity upgrade (HiLumi LHC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marinozzi, Vittorio; Ambrosio, Giorgio; Ferracin, Paolo
In 2023, the LHC luminosity will be increased, aiming at reaching 3000 fb-1 integrated over ten years. To obtain this target, new Nb3Sn low-β quadrupoles (MQXF) have been designed for the interaction regions. These magnets have a very large aperture (150 mm, to be compared with the 70 mm of the present NbTi quadrupoles) and a very large stored energy density (120 MJ/m3). For these reasons, quench protection is one of the most challenging aspects of the design of these magnets. In fact, protection studies of a previous design showed that the simulated hot spot temperature was very close to the maximum allowed limit of 350 K; this challenge motivated improvements in the current-discharge modeling, taking into account the so-called dynamic effects on the apparent magnet inductance. Moreover, the quench heater design has been studied in more detail. In this work, a protection study of the updated MQXF is presented, benefiting from the experience gained by studying the previous design. A study of the voltages between turns in the magnet during both normal operation and the most important failure scenarios is also presented.
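For readers unfamiliar with the hot-spot criterion, a hedged sketch of the adiabatic balance typically used in such protection studies (a generic form, not the authors' exact model):

\[
\int_{t_{\mathrm{quench}}}^{\infty} J^{2}(t)\,dt \;=\; \int_{T_{0}}^{T_{\mathrm{hot\ spot}}} \frac{\gamma\,C_{p}(T)}{\rho(T)}\,dT,
\]

where J is the overall current density in the cable, γ C_p(T) the volumetric heat capacity and ρ(T) the resistivity of the normal zone. Faster detection, efficient quench heaters and the dynamic-inductance effects mentioned above all act to reduce the left-hand side, keeping the hot-spot temperature below the 350 K limit.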
Controlled longitudinal emittance blow-up using band-limited phase noise in CERN PSB
NASA Astrophysics Data System (ADS)
Quartullo, D.; Shaposhnikova, E.; Timko, H.
2017-07-01
Controlled longitudinal emittance blow-up (from 1 eVs to 1.4 eVs) for LHC beams in the CERN PS Booster is currently achieved using sinusoidal phase modulation of a dedicated high-harmonic RF system. In 2021, after the LHC injectors upgrade, 3 eVs should be extracted to the PS. Even if the current method may satisfy the new requirements, it relies on low-power level RF improvements. In this paper another blow-up method was considered, namely the injection of band-limited phase noise in the main RF system (h=1), never tried in the PSB but already used in the CERN SPS and LHC under different conditions (longer cycles). This technique, which lowers the peak line density and therefore the impact of intensity effects in the PSB and the PS, can also be complementary to the present method. The longitudinal space charge, dominant in the PSB, causes significant synchrotron frequency shifts with intensity, and its effect should be taken into account. Another complication arises from the interaction of the phase loop with the injected noise, since both act on the RF phase. All these elements were studied in simulations of the PSB cycle with the BLonD code, and the required blow-up was achieved.
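A minimal sketch of how band-limited phase noise of the kind described can be generated numerically (the revolution frequency, band edges and amplitude below are illustrative assumptions, and this is not the BLonD implementation):

```python
import numpy as np

def band_limited_phase_noise(n_turns, f_rev, f_low, f_high, rms_rad, seed=0):
    """White noise filtered in the frequency domain to a band around the
    synchrotron frequency, then rescaled to the requested RMS phase (rad)."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n_turns)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n_turns, d=1.0 / f_rev)      # one sample per turn
    spectrum[(freqs < f_low) | (freqs > f_high)] = 0.0    # keep only the chosen band
    noise = np.fft.irfft(spectrum, n=n_turns)
    return noise * rms_rad / noise.std()

# Illustrative numbers only: ~1 MHz revolution frequency and a band roughly
# spanning the spread of synchrotron frequencies of the bunch.
phase_noise = band_limited_phase_noise(n_turns=200_000, f_rev=1.0e6,
                                       f_low=600.0, f_high=1200.0, rms_rad=0.05)
print(phase_noise.shape, phase_noise.std())
```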
The ATLAS Data Acquisition System in LHC Run 2
NASA Astrophysics Data System (ADS)
Panduro Vazquez, William; ATLAS Collaboration
2017-10-01
The LHC has been providing pp collisions with record luminosity and energy since the start of Run 2 in 2015. The Trigger and Data Acquisition system of the ATLAS experiment has been upgraded to deal with the increased performance required by this new operational mode. The dataflow system and associated network infrastructure have been reshaped in order to benefit from technological progress and to maximize the flexibility and efficiency of the data selection process. The new design is radically different from the previous implementation both in terms of architecture and performance, with the previous two-level structure merged into a single processing farm, performing incremental data collection and analysis. In addition, logical farm slicing, with each slice managed by a dedicated supervisor, has been dropped in favour of global management by a single farm master operating at 100 kHz. This farm master has also been integrated with a new software-based Region of Interest builder, replacing the previous VMEbus-based system. Finally, the Readout system has been completely refitted with new higher performance, lower footprint server machines housing a new custom front-end interface card. Here we will cover the overall design of the system, along with performance results from the start-up phase of LHC Run 2.
NASA Astrophysics Data System (ADS)
Lacour, D.
2018-02-01
The expected increase of the particle flux at the high luminosity phase of the LHC (HL-LHC), with instantaneous luminosities up to 7.5×1034 cm-2s-1, will have a severe impact on the ATLAS detector performance. The pile-up is expected to increase on average to 200 interactions per bunch crossing. The reconstruction performance for electrons, photons as well as jets and transverse missing energy will be severely degraded in the end-cap and forward region. A High Granularity Timing Detector (HGTD) is proposed in front of the liquid argon end-cap and forward calorimeters for pile-up mitigation. This device should cover the pseudo-rapidity range of 2.4 to about 4.0. Low Gain Avalanche Detector (LGAD) technology has been chosen as it provides an internal gain large enough to reach the large signal-over-noise ratio needed for excellent time resolution. The requirements and overall specifications of the High Granularity Timing Detector at the HL-LHC will be presented, as well as the conceptual design of its mechanics and electronics. Beam test results and measurements of irradiated LGAD silicon sensors, such as gain and timing resolution, will be shown.
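For orientation, the usual decomposition of the time resolution that motivates the internal gain (a generic expression, with the terms assumed rather than quoted from this contribution):

\[
\sigma_t^{2} \;\simeq\; \left(\frac{N}{dV/dt}\right)^{2} + \sigma_{\mathrm{Landau}}^{2} + \sigma_{\mathrm{TDC}}^{2},
\]

where the first (jitter) term is electronic noise divided by the signal slew rate: the LGAD gain increases dV/dt in a thin sensor and thus suppresses the jitter, leaving Landau charge-deposition fluctuations and digitization as the residual contributions.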
Radiation hardness studies of AMS HV-CMOS 350 nm prototype chip HVStripV1
Kanisauskas, K.; Affolder, A.; Arndt, K.; ...
2017-02-15
CMOS active pixel sensors are being investigated for their potential use in the ATLAS inner tracker upgrade at the HL-LHC. The new inner tracker will have to handle a significant increase in luminosity while maintaining a sufficient signal-to-noise ratio and pulse shaping times. This paper focuses on the characterization of the prototype chip "HVStripV1" (manufactured in the AMS HV-CMOS 350 nm process) before and after irradiation up to the fluence levels expected for the strip region in the HL-LHC environment. The results indicate an increase of the depletion region after irradiation, for the same bias voltage, by a factor of ≈2.4 and ≈2.8 for two active pixels on the test chip. There was also a notable increase in noise levels, from 85 e– to 386 e– and from 75 e– to 277 e– for the corresponding pixels.
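The reported growth of the depletion region at fixed bias can be read through the standard one-sided junction relation (a textbook approximation, not the authors' analysis):

\[
w \;\approx\; \sqrt{\frac{2\,\varepsilon_{\mathrm{Si}}\,V_{\mathrm{bias}}}{q\,\lvert N_{\mathrm{eff}}\rvert}},
\]

so an increase of w by a factor of ≈2.4-2.8 at the same bias corresponds to a drop of roughly a factor of six to eight in the effective doping concentration |N_eff| of the substrate, consistent with acceptor-removal effects commonly observed in irradiated HV-CMOS substrates.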
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perin, A.; Casas-Cubillos, J.; Pezzetti, M.
2014-01-29
The 600 A and 120 A circuits of the inner triplet magnets of the Large Hadron Collider are powered by resistive gas-cooled current leads. The current solution for controlling the gas flow of these leads has shown severe operability limitations. In order to allow a more precise and more reliable control of the cooling gas flow, new flowmeters will be installed during the first long shutdown of the LHC. Because of the high level of radiation in the area next to the current leads, the flowmeters will be installed in shielded areas located up to 50 m away from the current leads. Since the control valves are located next to the current leads, this configuration leads to long piping between the valves and the flowmeters. In order to determine its dynamic behaviour, the proposed system was simulated with a numerical model and validated with experimental measurements performed on a dedicated test bench.
CMS distributed data analysis with CRAB3
NASA Astrophysics Data System (ADS)
Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.
2015-12-01
The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.
NASA Astrophysics Data System (ADS)
Pernegger, H.; Bates, R.; Buttar, C.; Dalla, M.; van Hoorne, J. W.; Kugathasan, T.; Maneuski, D.; Musa, L.; Riedler, P.; Riegel, C.; Sbarra, C.; Schaefer, D.; Schioppa, E. J.; Snoeys, W.
2017-06-01
The upgrade of the ATLAS [1] tracking detector for the High-Luminosity Large Hadron Collider (LHC) at CERN requires novel radiation-hard silicon sensor technologies. Significant effort has been put into the development of monolithic CMOS sensors, but it has been a challenge to combine a low capacitance of the sensing node with full depletion of the sensitive layer. Low capacitance brings low analog power. Depletion of the sensitive layer causes the signal charge to be collected by drift, sufficiently fast to separate hits from consecutive bunch crossings (25 ns at the LHC) and to avoid losing the charge by trapping. This paper focuses on the characterization of the charge collection properties and detection efficiency of prototype sensors originally designed in the framework of the ALICE Inner Tracking System (ITS) upgrade [2]. The prototypes are fabricated both in the standard TowerJazz 180 nm CMOS imager process [3] and in an innovative modification of this process developed in collaboration with the foundry, aimed at fully depleting the sensitive epitaxial layer and enhancing the tolerance to non-ionizing energy loss. Sensors fabricated in the standard and modified process variants were characterized using radioactive sources, a focused X-ray beam and test beams, before and after irradiation. Contrary to sensors manufactured in the standard process, sensors from the modified process remain fully functional even after a dose of 1015 neq/cm2, which is the expected NIEL radiation fluence for the outer pixel layers in the future ATLAS Inner Tracker (ITk) [4].
The PCIe-based readout system for the LHCb experiment
NASA Astrophysics Data System (ADS)
Cachemiche, J. P.; Duval, P. Y.; Hachon, F.; Le Gac, R.; Réthoré, F.
2016-02-01
The LHCb experiment is designed to study differences between particles and anti-particles as well as very rare decays in the beauty and charm sector at the LHC. The detector will be upgraded in 2019 in order to significantly increase its efficiency by removing the first-level hardware trigger. The upgraded experiment will implement a trigger-less readout system in which all the data from every LHC bunch crossing are transported to the computing farm over 12000 optical links without hardware filtering. The event building and event selection are carried out entirely in the farm. Another original feature of the system is that the data transmitted through these fibres arrive directly in computers through a specially designed PCIe card called PCIe40. The same board handles the data acquisition flow and the distribution of fast and slow controls to the detector front-end electronics. It embeds one of the most powerful FPGAs currently available on the market, with 1.2 million logic cells. The board has a bandwidth of 480 Gbit/s in both input and output over the optical links and 100 Gbit/s over the PCI Express bus to the CPU. We will present how data circulate through the board and through the PC server to achieve the event building. We will focus on specific issues regarding the design of such a board with a very large FPGA, in particular in terms of power supply dimensioning and thermal simulations. The features of the board will be detailed and the first performance measurements will be presented.
Tractor Trailer Driver's Training Programs. Performance Report.
ERIC Educational Resources Information Center
New Hampshire Vocational Technical Coll., Nashua.
This document describes a project to develop a 320-hour tractor trailer driver training program and a 20-hour commercial driver licensing upgrade training program. Of 34 graduates from the training program, 28 secured employment in the trucking industry. From August 1989 to June 1990, 725 students were trained in the upgrade training program with…
Characterization of the Outer Barrel modules for the upgrade of the ALICE Inner Tracking System
NASA Astrophysics Data System (ADS)
Di Ruzza, B.
2017-09-01
ALICE is one of the four large detectors at the CERN LHC collider, designed to address the physics of strongly interacting matter, and in particular the properties of the Quark-Gluon Plasma, using proton-proton, proton-nucleus, and nucleus-nucleus collisions. Despite the success already achieved in reaching these physics goals, several measurements are still to be finalized, like high-precision measurements of rare probes (D meson, Lambda baryon and B meson decays) over a broad range of transverse momenta. In order to achieve these new physics goals, a wide upgrade plan was approved which, combined with a significant increase of luminosity, will enhance the ALICE physics capabilities enormously and allow these fundamental measurements. The Inner Tracking System (ITS) upgrade of the ALICE detector is one of the major improvements of the experimental set-up that will take place in 2019-2020, when the whole ITS sub-detector will be replaced with one based on an innovative monolithic active pixel silicon sensor called ALPIDE. The upgraded ITS will be realized using more than twenty-four thousand ALPIDE chips organized in seven cylindrical layers, for a total surface of about ten square meters. The main features of the new ITS are a low material budget, high granularity and low power consumption. All these capabilities will allow for full reconstruction of rare heavy-flavour decays and the achievement of the physics goals. In this paper, after an overview of the whole ITS upgrade project, the construction procedure of the basic building block of the detector, namely the module, and its characterization in the laboratory will be presented.
PHENIX: Beyond 15 years of discovery
Morrison, David; Nagle, James L.
2015-01-12
The PHENIX experiment at BNL's Relativistic Heavy Ion Collider (RHIC) was designed to uncover properties of the quark-gluon plasma (QGP) via rare penetrating probes. Over the past 15 years, the collaboration has delivered on its promised measurements, often with exciting results beyond those originally foreseen. That the QGP behaves as a nearly perfect fluid and that non-photonic electrons are substantially suppressed has led to the use of heavy quarks as probes of the medium. The PHENIX silicon vertex detectors are opening a new arena for QGP studies, and the MPC-EX, a novel forward calorimeter with silicon readout, accesses low-x physics via direct photons with unprecedented precision. PHENIX has proposed sPHENIX, a major upgrade using the recently acquired BaBar solenoid, with full calorimetric coverage and high rate capabilities. sPHENIX will reconstruct jets and extend observables to higher transverse momentum, where comparisons to results from the Large Hadron Collider (LHC) heavy-ion program will be most insightful. Following the RHIC program, the nuclear physics community has identified an electron-ion collider (EIC) as crucial to the next generation of QCD investigations. The BaBar magnet and sPHENIX calorimetry will be an excellent foundation for a new collaborative pursuit of discovery.
NASA Astrophysics Data System (ADS)
Mori, R.; Allport, P. P.; Baca, M.; Broughton, J.; Chisholm, A.; Nikolopoulos, K.; Pyatt, S.; Thomas, J. P.; Wilson, J. A.; Kierstead, J.; Kuczewski, P.; Lynn, D.; Arratia-Munoz, M. I.; Hommels, L. B. A.; Ullan, M.; Fleta, C.; Fernandez-Tejero, J.; Bloch, I.; Gregor, I. M.; Lohwasser, K.; Poley, L.; Tackmann, K.; Trofimov, A.; Yildirim, E.; Hauser, M.; Jakobs, K.; Kuehn, S.; Mahboubi, K.; Parzefall, U.; Clark, A.; Ferrere, D.; Sevilla, S. Gonzalez; Ashby, J.; Blue, A.; Bates, R.; Buttar, C.; Doherty, F.; McMullen, T.; McEwan, F.; O'Shea, V.; Kamada, S.; Yamamura, K.; Ikegami, Y.; Nakamura, K.; Takubo, Y.; Unno, Y.; Takashima, R.; Chilingarov, A.; Fox, H.; Affolder, A. A.; Casse, G.; Dervan, P.; Forshaw, D.; Greenall, A.; Wonsak, S.; Wormald, M.; Cindro, V.; Kramberger, G.; Mandić, I.; Mikuž, M.; Gorelov, I.; Hoeferkamp, M.; Palni, P.; Seidel, S.; Taylor, A.; Toms, K.; Wang, R.; Hessey, N. P.; Valencic, N.; Hanagaki, K.; Dolezal, Z.; Kodys, P.; Bohm, J.; Stastny, J.; Mikestikova, M.; Bevan, A.; Beck, G.; Milke, C.; Domingo, M.; Fadeyev, V.; Galloway, Z.; Hibbard-Lubow, D.; Liang, Z.; Sadrozinski, H. F.-W.; Seiden, A.; To, K.; French, R.; Hodgson, P.; Marin-Reyes, H.; Parker, K.; Jinnouchi, O.; Hara, K.; Sato, K.; Sato, K.; Hagihara, M.; Iwabuchi, S.; Bernabeu, J.; Civera, J. V.; Garcia, C.; Lacasta, C.; Garcia, S. Marti i.; Rodriguez, D.; Santoyo, D.; Solaz, C.; Soldevila, U.
2016-09-01
The upgrade to the High-Luminosity LHC foreseen in about ten years represents a great challenge for the ATLAS inner tracker and for the silicon strip sensors in the forward region. Several strip sensor designs were developed by the ATLAS collaboration and fabricated by Hamamatsu in order to maintain sufficient performance in terms of charge collection efficiency and its uniformity throughout the active region. Of particular interest, in the case of a stereo-strip sensor, is the area near the sensor edge where shorter strips were ganged to the complete ones. In this work the electrical and charge collection test results on irradiated miniature sensors with forward geometry are presented. Charge collection efficiency measurements show that at the maximum expected fluence the collected charge is roughly halved with respect to the one obtained prior to irradiation. Laser measurements show good signal uniformity over the sensor. Ganged strips have a similar efficiency to standard strips.
NASA Astrophysics Data System (ADS)
Ratza, Viktor; Ball, Markus; Liebtrau, M.; Ketzer, Bernhard
2018-02-01
In the context of the upgrade of the LHC during the second long shutdown, the interaction rate of the ALICE experiment will be increased up to 50 kHz for Pb-Pb collisions. As a consequence, a continuous read-out of the Time Projection Chamber (TPC) will be required. To keep the space-charge distortions at a manageable size, the ion backflow of the charge amplification system has to be significantly reduced. At the same time, an excellent detector performance and stability of the system have to be maintained. A solution with four Gaseous Electron Multipliers (GEMs) has been adopted as the baseline solution for the upgraded chambers. As an alternative approach, a hybrid GEM-Micromegas detector consisting of one Micromegas (MM) and two GEMs has been investigated. The recent results of the study of the hybrid GEM-Micromegas detector will be presented and compared to measurements with four GEM foils.
Test beam performance measurements for the Phase I upgrade of the CMS pixel detector
NASA Astrophysics Data System (ADS)
Dragicevic, M.; Friedl, M.; Hrubec, J.; Steininger, H.; Gädda, A.; Härkönen, J.; Lampén, T.; Luukka, P.; Peltola, T.; Tuominen, E.; Tuovinen, E.; Winkler, A.; Eerola, P.; Tuuva, T.; Baulieu, G.; Boudoul, G.; Caponetto, L.; Combaret, C.; Contardo, D.; Dupasquier, T.; Gallbit, G.; Lumb, N.; Mirabito, L.; Perries, S.; Vander Donckt, M.; Viret, S.; Bonnin, C.; Charles, L.; Gross, L.; Hosselet, J.; Tromson, D.; Feld, L.; Karpinski, W.; Klein, K.; Lipinski, M.; Pierschel, G.; Preuten, M.; Rauch, M.; Wlochal, M.; Aldaya, M.; Asawatangtrakuldee, C.; Beernaert, K.; Bertsche, D.; Contreras-Campana, C.; Eckerlin, G.; Eckstein, D.; Eichhorn, T.; Gallo, E.; Garay Garcia, J.; Hansen, K.; Haranko, M.; Harb, A.; Hauk, J.; Keaveney, J.; Kalogeropoulos, A.; Kleinwort, C.; Lohmann, W.; Mankel, R.; Maser, H.; Mittag, G.; Muhl, C.; Mussgiller, A.; Pitzl, D.; Reichelt, O.; Savitskyi, M.; Schütze, P.; Sola, V.; Spannagel, S.; Walsh, R.; Zuber, A.; Biskop, H.; Buhmann, P.; Centis-Vignali, M.; Garutti, E.; Haller, J.; Hoffmann, M.; Klanner, R.; Lapsien, T.; Matysek, M.; Perieanu, A.; Scharf, Ch.; Schleper, P.; Schmidt, A.; Schwandt, J.; Sonneveld, J.; Steinbrück, G.; Vormwald, B.; Wellhausen, J.; Abbas, M.; Amstutz, C.; Barvich, T.; Barth, Ch.; Boegelspacher, F.; De Boer, W.; Butz, E.; Casele, M.; Colombo, F.; Dierlamm, A.; Freund, B.; Hartmann, F.; Heindl, S.; Husemann, U.; Kornmeyer, A.; Kudella, S.; Muller, Th.; Simonis, H. J.; Steck, P.; Weber, M.; Weiler, Th.; Kiss, T.; Siklér, F.; Tölyhi, T.; Veszprémi, V.; Cariola, P.; Creanza, D.; De Palma, M.; De Robertis, G.; Fiore, L.; Franco, M.; Loddo, F.; Sala, G.; Silvestris, L.; Maggi, G.; My, S.; Selvaggi, G.; Albergo, S.; Cappello, G.; Costa, S.; Di Mattia, A.; Giordano, F.; Potenza, R.; Saizu, M. A.; Tricomi, A.; Tuve, C.; Focardi, E.; Dinardo, M. E.; Fiorendi, S.; Gennai, S.; Malvezzi, S.; Manzoni, R. A.; Menasce, D.; Moroni, L.; Pedrini, D.; Azzi, P.; Bacchetta, N.; Bisello, D.; Dall'Osso, M.; Pozzobon, N.; Tosi, M.; Alunni Solestizi, L.; Biasini, M.; Bilei, G. M.; Cecchi, C.; Checcucci, B.; Ciangottini, D.; Fanò, L.; Gentsos, C.; Ionica, M.; Leonardi, R.; Manoni, E.; Mantovani, G.; Marconi, S.; Mariani, V.; Menichelli, M.; Modak, A.; Morozzi, A.; Moscatelli, F.; Passeri, D.; Placidi, P.; Postolache, V.; Rossi, A.; Saha, A.; Santocchia, A.; Storchi, L.; Spiga, D.; Androsov, K.; Azzurri, P.; Bagliesi, G.; Basti, A.; Boccali, T.; Borrello, L.; Bosi, F.; Castaldi, R.; Ceccanti, M.; Ciocci, M. A.; Dell'Orso, R.; Donato, S.; Fedi, G.; Giassi, A.; Grippo, M. T.; Ligabue, F.; Magazzu, G.; Mammini, P.; Mariani, F.; Mazzoni, E.; Messineo, A.; Moggi, A.; Morsani, F.; Palla, F.; Palmonari, F.; Profeti, A.; Raffaelli, F.; Ragonesi, A.; Rizzi, A.; Soldani, A.; Spagnolo, P.; Tenchini, R.; Tonelli, G.; Venturi, A.; Verdini, P. 
G.; Abbaneo, D.; Ahmed, I.; Albert, E.; Auzinger, G.; Berruti, G.; Bonnaud, J.; Daguin, J.; D'Auria, A.; Detraz, S.; Dondelewski, O.; Engegaard, B.; Faccio, F.; Frank, N.; Gill, K.; Honma, A.; Kornmayer, A.; Labaza, A.; Manolescu, F.; McGill, I.; Mersi, S.; Michelis, S.; Onnela, A.; Ostrega, M.; Pavis, S.; Peisert, A.; Pernot, J.-F.; Petagna, P.; Postema, H.; Rapacz, K.; Sigaud, C.; Tropea, P.; Troska, J.; Tsirou, A.; Vasey, F.; Verlaat, B.; Vichoudis, P.; Zwalinski, L.; Bachmair, F.; Becker, R.; di Calafiori, D.; Casal, B.; Berger, P.; Djambazov, L.; Donega, M.; Grab, C.; Hits, D.; Hoss, J.; Kasieczka, G.; Lustermann, W.; Mangano, B.; Marionneau, M.; Martinez Ruiz del Arbol, P.; Masciovecchio, M.; Meinhard, M.; Perozzi, L.; Roeser, U.; Starodumov, A.; Tavolaro, V.; Wallny, R.; Zhu, D.; Amsler, C.; Bösiger, K.; Caminada, L.; Canelli, F.; Chiochia, V.; de Cosa, A.; Galloni, C.; Hreus, T.; Kilminster, B.; Lange, C.; Maier, R.; Ngadiuba, J.; Pinna, D.; Robmann, P.; Taroni, S.; Yang, Y.; Bertl, W.; Deiters, K.; Erdmann, W.; Horisberger, R.; Kaestli, H.-C.; Kotlinski, D.; Langenegger, U.; Meier, B.; Rohe, T.; Streuli, S.; Chen, P.-H.; Dietz, C.; Fiori, F.; Grundler, U.; Hou, W.-S.; Lu, R.-S.; Moya, M.; Tsai, J.-F.; Tzeng, Y. M.; Cussans, D.; Goldstein, J.; Grimes, M.; Newbold, D.; Hobson, P.; Reid, I. D.; Auzinger, G.; Bainbridge, R.; Dauncey, P.; Hall, G.; James, T.; Magnan, A.-M.; Pesaresi, M.; Raymond, D. M.; Uchida, K.; Durkin, T.; Harder, K.; Shepherd-Themistocleous, C.; Chertok, M.; Conway, J.; Conway, R.; Flores, C.; Lander, R.; Pellett, D.; Ricci-Tam, F.; Squires, M.; Thomson, J.; Yohay, R.; Burt, K.; Ellison, J.; Hanson, G.; Olmedo, M.; Si, W.; Yates, B. R.; Dominguez, A.; Bartek, R.; Bentele, B.; Cumalat, J. P.; Ford, W. T.; Jensen, F.; Johnson, A.; Krohn, M.; Leontsinis, S.; Mulholland, T.; Stenson, K.; Wagner, S. R.; Apresyan, A.; Bolla, G.; Burkett, K.; Butler, J. N.; Canepa, A.; Cheung, H. W. K.; Christian, D.; Cooper, W. E.; Deptuch, G.; Derylo, G.; Gingu, C.; Grünendahl, S.; Hasegawa, S.; Hoff, J.; Howell, J.; Hrycyk, M.; Jindariani, S.; Johnson, M.; Kahlid, F.; Kwan, S.; Lei, C. M.; Lipton, R.; Lopes De Sá, R.; Liu, T.; Los, S.; Matulik, M.; Merkel, P.; Nahn, S.; Prosser, A.; Rivera, R.; Schneider, B.; Sellberg, G.; Shenai, A.; Siehl, K.; Spiegel, L.; Tran, N.; Uplegger, L.; Voirin, E.; Berry, D. R.; Chen, X.; Ennesser, L.; Evdokimov, A.; Gerber, C. E.; Makauda, S.; Mills, C.; Sandoval Gonzalez, I. D.; Alimena, J.; Antonelli, L. J.; Francis, B.; Hart, A.; Hill, C. S.; Parashar, N.; Stupak, J.; Bortoletto, D.; Bubna, M.; Hinton, N.; Jones, M.; Miller, D. H.; Shi, X.; Baringer, P.; Bean, A.; Khalil, S.; Kropivnitskaya, A.; Majumder, D.; Schmitz, E.; Wilson, G.; Ivanov, A.; Mendis, R.; Mitchell, T.; Skhirtladze, N.; Taylor, R.; Anderson, I.; Fehling, D.; Gritsan, A.; Maksimovic, P.; Martin, C.; Nash, K.; Osherson, M.; Swartz, M.; Xiao, M.; Acosta, J. G.; Cremaldi, L. M.; Oliveros, S.; Perera, L.; Summers, D.; Bloom, K.; Claes, D. R.; Fangmeier, C.; Gonzalez Suarez, R.; Monroy, J.; Siado, J.; Bartz, E.; Gershtein, Y.; Halkiadakis, E.; Kyriacou, S.; Lath, A.; Nash, K.; Osherson, M.; Schnetzer, S.; Stone, R.; Walker, M.; Malik, S.; Norberg, S.; Ramirez Vargas, J. 
E.; Alyari, M.; Dolen, J.; Godshalk, A.; Harrington, C.; Iashvili, I.; Kharchilava, A.; Nguyen, D.; Parker, A.; Rappoccio, S.; Roozbahani, B.; Alexander, J.; Chaves, J.; Chu, J.; Dittmer, S.; McDermott, K.; Mirman, N.; Rinkevicius, A.; Ryd, A.; Salvati, E.; Skinnari, L.; Soffi, L.; Tao, Z.; Thom, J.; Tucker, J.; Zientek, M.; Akgün, B.; Ecklund, K. M.; Kilpatrick, M.; Nussbaum, T.; Zabel, J.; D'Angelo, P.; Johns, W.; Rose, K.; Choudhury, S.; Korol, I.; Seitz, C.; Vargas Trevino, A.; Dolinska, G.
2017-05-01
A new pixel detector for the CMS experiment was built in order to cope with the instantaneous luminosities anticipated for the Phase I Upgrade of the LHC. The new CMS pixel detector provides four-hit tracking with a reduced material budget as well as new cooling and powering schemes. A new front-end readout chip mitigates buffering and bandwidth limitations, and allows operation at low comparator thresholds. In this paper, comprehensive test beam studies are presented, which have been conducted to verify the design and to quantify the performance of the new detector assemblies in terms of tracking efficiency and spatial resolution. Under optimal conditions, the tracking efficiency is 99.95 ± 0.05%, while the intrinsic spatial resolutions are 4.80 ± 0.25 μm and 7.99 ± 0.21 μm along the 100 μm and 150 μm pixel pitch, respectively. The findings are compared to a detailed Monte Carlo simulation of the pixel detector and good agreement is found.
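For orientation, the "intrinsic" resolution quoted above is conventionally obtained by subtracting the pointing resolution of the beam telescope in quadrature from the measured residual width; the exact unfolding procedure used in this paper is not restated here, so the expression below is only the generic convention:

\sigma_{\mathrm{intrinsic}} = \sqrt{\sigma_{\mathrm{residual}}^{2} - \sigma_{\mathrm{telescope}}^{2}}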
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schambach, Joachim; Rossewij, M. J.; Sielewicz, K. M.
The ALICE Collaboration is preparing a major detector upgrade for the LHC Run 3, which includes the construction of a new silicon pixel based Inner Tracking System (ITS). The ITS readout system consists of 192 readout boards to control the sensors and their power system, receive triggers, and deliver sensor data to the DAQ. To prototype various aspects of this readout system, an FPGA based carrier board and an associated FMC daughter card containing the CERN Gigabit Transceiver (GBT) chipset have been developed. Furthermore, this contribution describes laboratory and radiation testing results with this prototype board set.
Schambach, Joachim; Rossewij, M. J.; Sielewicz, K. M.; ...
2016-12-28
The ALICE Collaboration is preparing a major detector upgrade for the LHC Run 3, which includes the construction of a new silicon pixel based Inner Tracking System (ITS). The ITS readout system consists of 192 readout boards to control the sensors and their power system, receive triggers, and deliver sensor data to the DAQ. To prototype various aspects of this readout system, an FPGA based carrier board and an associated FMC daughter card containing the CERN Gigabit Transceiver (GBT) chipset have been developed. Furthermore, this contribution describes laboratory and radiation testing results with this prototype board set.
Design of an AdvancedTCA board management controller (IPMC)
NASA Astrophysics Data System (ADS)
Mendez, J.; Bobillier, V.; Haas, S.; Joos, M.; Mico, S.; Vasey, F.
2017-03-01
The AdvancedTCA (ATCA) standard has been selected as the hardware platform for the upgrade of the back-end electronics of the CMS and ATLAS experiments at the Large Hadron Collider (LHC). In this context, the Electronic Systems for Experiments group at CERN is running a project to evaluate, specify, design and support xTCA equipment. As part of this project, an Intelligent Platform Management Controller (IPMC) for ATCA blades, based on a commercial solution, has been designed to be used on existing and future ATCA blades. This paper reports on the status of this project, presenting the hardware and software developments.
NASA Astrophysics Data System (ADS)
Robertis, G. De; Fanizzi, G.; Loddo, F.; Manzari, V.; Rizzi, M.
2018-02-01
In this work the MOSAIC ("MOdular System for Acquisition, Interface and Control") board, designed for the readout and testing of the pixel modules for the silicon tracker upgrade of the ALICE (A Large Ion Collider Experiment) experiment at the CERN LHC, is described. It is based on an Artix-7 Field Programmable Gate Array device by Xilinx and is compliant with the six-unit "Versa Modular Eurocard" standard (6U-VME) for easy housing in a standard VMEbus crate, from which it takes only power supplies and cooling.
A TTC upgrade proposal using bidirectional 10G-PON FTTH technology
NASA Astrophysics Data System (ADS)
Kolotouros, D. M.; Baron, S.; Soos, C.; Vasey, F.
2015-04-01
A new generation FPGA-based Timing-Trigger and Control (TTC) system based on emerging Passive Optical Network (PON) technology is being proposed to replace the existing off-detector TTC system used by the LHC experiments. High split ratio, dynamic software partitioning, low and deterministic latency, as well as low jitter are required. Exploiting the latest available technologies allows delivering higher capacity together with bidirectionality, a feature absent from the legacy TTC system. This article focuses on the features and capabilities of the latest TTC-PON prototype based on 10G-PON FTTH components along with some metrics characterizing its performance.
NASA Astrophysics Data System (ADS)
Schambach, J.; Rossewij, M. J.; Sielewicz, K. M.; Aglieri Rinella, G.; Bonora, M.; Ferencei, J.; Giubilato, P.; Vanat, T.
2016-12-01
The ALICE Collaboration is preparing a major detector upgrade for the LHC Run 3, which includes the construction of a new silicon pixel based Inner Tracking System (ITS). The ITS readout system consists of 192 readout boards to control the sensors and their power system, receive triggers, and deliver sensor data to the DAQ. To prototype various aspects of this readout system, an FPGA based carrier board and an associated FMC daughter card containing the CERN Gigabit Transceiver (GBT) chipset have been developed. This contribution describes laboratory and radiation testing results with this prototype board set.
NASA Astrophysics Data System (ADS)
Poley, L.; Bloch, I.; Edwards, S.; Friedrich, C.; Gregor, I.-M.; Jones, T.; Lacker, H.; Pyatt, S.; Rehnisch, L.; Sperlich, D.; Wilson, J.
2016-05-01
The Phase-II upgrade of the ATLAS detector for the High Luminosity Large Hadron Collider (HL-LHC) includes the replacement of the current Inner Detector with an all-silicon tracker consisting of pixel and strip detectors. The current Phase-II detector layout requires the construction of 20,000 strip detector modules consisting of sensors, circuit boards and readout chips, which are connected mechanically using adhesives. The adhesive initially used between readout chips and circuit boards is a silver-loaded epoxy glue, as in the current ATLAS SemiConductor Tracker (SCT). However, this glue has several disadvantages, which motivated the search for an alternative. This paper presents a study of six ultra-violet (UV) cure glues and a glue pad for possible use in the assembly of silicon strip detector modules for the ATLAS upgrade. Trials were carried out to determine ease of use, thermal conduction and shear strength. Samples were thermally cycled, and radiation hardness and corrosion resistance were also determined. These investigations led to the exclusion of three UV cure glues as well as the glue pad. The remaining three UV cure glues were found to be possible better alternatives to the silver-loaded glue. Results from electrical tests of first prototype modules constructed using these glues are presented.
The LHCb software and computing upgrade for Run 3: opportunities and challenges
NASA Astrophysics Data System (ADS)
Bozzi, C.; Roiser, S.; LHCb Collaboration
2017-10-01
The LHCb detector will be upgraded for the LHC Run 3 and will be read out at 30 MHz, corresponding to the full inelastic collision rate, with major implications for the full software trigger and offline computing. If the current computing model and software framework are kept, the data storage capacity and computing power required to process data at this rate, and to generate and reconstruct equivalent samples of simulated events, will exceed the current capacity by at least one order of magnitude. A redesign of the software framework, including scheduling, the event model, the detector description and the conditions database, is needed to fully exploit the computing power of multi- and many-core architectures and coprocessors. Data processing and the analysis model will also change towards an early streaming of different data types, in order to limit storage resources, with further implications for the data analysis workflows. Fast simulation options will allow a reasonable parameterization of the detector response to be obtained in considerably less computing time. Finally, the upgrade of LHCb will be a good opportunity to review and implement changes in the domains of software design, testing and review, and analysis workflow and preservation. In this contribution, activities and recent results in all the above areas are presented.
ALPIDE: the Monolithic Active Pixel Sensor for the ALICE ITS upgrade
NASA Astrophysics Data System (ADS)
Šuljić, M.
2016-11-01
The upgrade of the ALICE vertex detector, the Inner Tracking System (ITS), is scheduled to be installed during the next long shutdown period (2019-2020) of the CERN Large Hadron Collider (LHC). The current ITS will be replaced by seven concentric layers of Monolithic Active Pixel Sensors (MAPS) with a total active surface of ~10 m², thus making ALICE the first LHC experiment implementing MAPS detector technology on a large scale. The ALPIDE chip, based on the TowerJazz 180 nm CMOS Imaging Process, is being developed for this purpose. A particular process feature, the deep p-well, is exploited so that the full CMOS logic can be implemented over the active sensor area without impinging on the collection of the deposited charge. ALPIDE is implemented on silicon wafers with a high-resistivity epitaxial layer. A single chip measures 15 mm by 30 mm and contains half a million pixels distributed in 512 rows and 1024 columns. In-pixel circuitry features amplification, shaping, discrimination and multi-event buffering. The readout is hit-driven, i.e. only the addresses of hit pixels are sent to the periphery. The upgrade of the ITS presents two different sets of requirements for the sensors of the inner and of the outer layers, due to the significantly different track density, radiation level and active detector surface. The ALPIDE chip fulfils the stringent requirements in both cases. The detection efficiency is higher than 99%, the fake-hit probability is orders of magnitude lower than the required 10⁻⁶, and the spatial resolution is within the required 5 μm. This performance is to be maintained even after a total ionising dose (TID) of 2.7 Mrad and a non-ionising energy loss (NIEL) fluence of 1.7 × 10¹³ 1 MeV neq/cm², which is above what is expected during the detector lifetime. A readout rate of 100 kHz is provided and the power density of ALPIDE is less than 40 mW/cm². This contribution provides a summary of the ALPIDE features and main test results.
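As a purely illustrative sketch of the hit-driven (zero-suppressed) readout principle described above, the following Python snippet encodes a binary pixel frame into a list of hit addresses. The 512 × 1024 matrix size matches the numbers quoted in the abstract, but the encoding logic is invented for the example and is not the actual ALPIDE priority-encoder circuitry.

import numpy as np

N_ROWS, N_COLS = 512, 1024  # ALPIDE pixel matrix size quoted in the abstract

def encode_hits(frame):
    """Return only the (row, column) addresses of fired pixels.

    frame: boolean array of shape (N_ROWS, N_COLS), True = pixel above threshold.
    Keeping only the addresses mimics a zero-suppressed, hit-driven readout.
    """
    rows, cols = np.nonzero(frame)
    return list(zip(rows.tolist(), cols.tolist()))

# Toy usage: a frame with three fired pixels.
frame = np.zeros((N_ROWS, N_COLS), dtype=bool)
frame[10, 200] = frame[10, 201] = frame[400, 17] = True
print(encode_hits(frame))   # [(10, 200), (10, 201), (400, 17)]

For a mostly empty matrix this address list is far smaller than the full frame, which is the point of a hit-driven scheme.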
Theoretical & Experimental Research in Weak, Electromagnetic & Strong Interactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nandi, Satyanarayan; Babu, Kaladi; Rizatdinova, Flera
The conducted research spans a wide range of topics in the theoretical, experimental and phenomenological aspects of elementary particle interactions. Theory projects involve topics at both the energy frontier and the intensity frontier. The experimental research involves the energy frontier with the ATLAS Collaboration at the Large Hadron Collider (LHC). In theoretical research, novel ideas going beyond the Standard Model with strong theoretical motivations were proposed, and their experimental tests at the LHC and forthcoming neutrino facilities were outlined. These efforts fall into the following broad categories: (i) TeV scale new physics models for LHC Run 2, including left-right symmetry and trinification symmetry, (ii) unification of elementary particles and forces, including the unification of gauge and Yukawa interactions, (iii) supersymmetry and mechanisms of supersymmetry breaking, (iv) superworld without supersymmetry, (v) general models of extra dimensions, (vi) comparing signals of extra dimensions with those of supersymmetry, (vii) models with mirror quarks and mirror leptons at the TeV scale, (viii) models with singlet quarks and singlet Higgs and their implications for Higgs physics at the LHC, (ix) new models for the dark matter of the universe, (x) lepton flavor violation in Higgs decays, (xi) leptogenesis in radiative models of neutrino masses, (xii) light mediator models of non-standard neutrino interactions, (xiii) anomalous muon decay and short baseline neutrino anomalies, (xiv) baryogenesis linked to nucleon decay, and (xv) a new model for the recently observed diboson resonance at the LHC and its other phenomenological implications. The experimental High Energy Physics group has been, and continues to be, a successful and productive contributor to the ATLAS experiment at the LHC. Members of the group performed searches for gluinos decaying to stop and top quarks, new heavy gauge bosons decaying to top and bottom quarks, and vector-like quarks produced in pairs and decaying to light quarks. Members of the OSU group played a leading role in the detailed optimization studies for the future ATLAS Inner Tracker (ITk), which will be installed during the Phase-II upgrade, replacing the current tracking system. The proposed studies aim to enhance the ATLAS discovery potential in the high-luminosity LHC era. The group members have contributed to the calibration of algorithms for identifying boosted vector bosons and b-jets, which will help expand the ATLAS reach in many searches for new physics.
An Experimental Review on Heavy-Flavor v2 in Heavy-Ion Collisions
Nasim, Md.; Esha, Roli; Huang, Huan Zhong
2016-01-01
For over a decade now, the primary purpose of relativistic heavy-ion collisions at the Relativistic Heavy-Ion Collider (RHIC) and the Large Hadron Collider (LHC) has been to study the properties of QCD matter under extreme conditions of high temperature and high density. The heavy-ion experiments at both RHIC and LHC have recorded a wealth of data in p+p, p+Pb, d+Au, Cu+Cu, Cu+Au, Au+Au, Pb+Pb, and U+U collisions at energies ranging from √sNN = 7.7 GeV to 7 TeV. Heavy quarks are considered good probes of the QCD matter created in relativistic collisions due to their very large mass and other unique properties. A precise measurement of various properties of heavy-flavor hadrons provides insight into the fundamental properties of the hot and dense medium created in these nucleus-nucleus collisions, such as the transport coefficients and the thermalization and hadronization mechanisms. The main focus of this paper is to present a review of the measurements of azimuthal anisotropy of heavy-flavor hadrons and to outline the scientific opportunities in this sector due to future detector upgrades. We mainly discuss the elliptic flow of open charm mesons (D mesons), J/ψ, and leptons from heavy-flavor decays at RHIC and LHC energies.
Grewe, Sabrina; Ballottari, Matteo; Alcocer, Marcelo; D’Andrea, Cosimo; Blifernez-Klassen, Olga; Hankamer, Ben; Mussgnug, Jan H.; Bassi, Roberto; Kruse, Olaf
2014-01-01
Photosynthetic organisms developed multiple strategies for balancing light-harvesting versus intracellular energy utilization to survive ever-changing environmental conditions. The light-harvesting complex (LHC) protein family is of paramount importance for this function and can form light-harvesting pigment protein complexes. In this work, we describe detailed analyses of the photosystem II (PSII) LHC protein LHCBM9 of the microalga Chlamydomonas reinhardtii in terms of expression kinetics, localization, and function. In contrast to most LHC members described before, LHCBM9 expression was determined to be very low during standard cell cultivation but strongly increased as a response to specific stress conditions, e.g., when nutrient availability was limited. LHCBM9 was localized as part of PSII supercomplexes but was not found in association with photosystem I complexes. Knockdown cell lines with 50 to 70% reduced amounts of LHCBM9 showed reduced photosynthetic activity upon illumination and severe perturbation of hydrogen production activity. Functional analysis, performed on isolated PSII supercomplexes and recombinant LHCBM9 proteins, demonstrated that presence of LHCBM9 resulted in faster chlorophyll fluorescence decay and reduced production of singlet oxygen, indicating upgraded photoprotection. We conclude that LHCBM9 has a special role within the family of LHCII proteins and serves an important protective function during stress conditions by promoting efficient light energy dissipation and stabilizing PSII supercomplexes. PMID:24706511
The case for future hadron colliders from B → K(*)μ+μ− decays
NASA Astrophysics Data System (ADS)
Allanach, B. C.; Gripaios, Ben; You, Tevong
2018-03-01
Recent measurements in B → K(*)μ+μ− decays are somewhat discrepant with Standard Model predictions. They may be harbingers of new physics at an energy scale potentially accessible to direct discovery. We estimate the sensitivity of future hadron colliders to the possible new particles that may be responsible for the anomalies at tree level: leptoquarks or Z′ bosons. We consider luminosity upgrades for a 14 TeV LHC, a 33 TeV LHC, and a 100 TeV pp collider such as the FCC-hh. In the most conservative and pessimistic models, for narrow particles with perturbative couplings, Z′ masses up to 20 TeV and leptoquark masses up to 41 TeV may in principle explain the anomalies. Coverage of Z′ models is excellent: a 33 TeV 1 ab⁻¹ LHC is expected to cover most of the parameter space up to 8 TeV in mass, whereas the 100 TeV FCC-hh with 10 ab⁻¹ will cover all of it. A smaller portion of the leptoquark parameter space is covered by future colliders: for example, in a μ+μ−jj di-leptoquark search, a 100 TeV 10 ab⁻¹ collider has a projected sensitivity up to leptoquark masses of 12 TeV (extendable to 21 TeV with a strong coupling for single leptoquark production).
Methods for Upgrading an Intramural-Recreational Sports Program: An Agency Report.
ERIC Educational Resources Information Center
Newman, Richard E.; Miller, Michael T.
This study assessed the state of intramural-recreational (IR) programs at Peru State College (Nebraska) and offered suggestions for the improvement of existing IR programs. The existing IR sports program is directed by a part-time adjunct staff member with the aid of student assistants and receives limited support. Upgrading the directorship of…
Evaluation of GPUs as a level-1 track trigger for the High-Luminosity LHC
NASA Astrophysics Data System (ADS)
Mohr, H.; Dritschler, T.; Ardila, L. E.; Balzer, M.; Caselle, M.; Chilingaryan, S.; Kopmann, A.; Rota, L.; Schuh, T.; Vogelgesang, M.; Weber, M.
2017-04-01
In this work, we investigate the use of GPUs as a way of realizing a low-latency, high-throughput track trigger, using CMS as a showcase example. The CMS detector at the Large Hadron Collider (LHC) will undergo a major upgrade after the long shutdown from 2024 to 2026, when it will enter the high-luminosity era. During this upgrade, the silicon tracker will have to be completely replaced. In the High Luminosity operation mode, luminosities of 5-7 × 10³⁴ cm⁻²s⁻¹ and pileup averaging 140 events, with a maximum of up to 200 events, will be reached. These changes will require a major update of the triggering system. The systems demonstrated so far rely on dedicated hardware such as associative-memory ASICs and FPGAs. We investigate the use of GPUs as an alternative way of realizing the requirements of the L1 track trigger. To this end we implemented a Hough transformation track finding step on GPUs and established a low-latency RDMA connection using the PCIe bus. To showcase the benefits of floating point operations, made possible by the use of GPUs, we present a modified algorithm. It uses hexagonal bins for the parameter space and leads to a more faithful representation of the possible track parameters of the individual hits in Hough space. This leads to fewer duplicate candidates and reduces fake track candidates compared to the regular approach. With data-transfer latencies of 2 μs and processing times for the Hough transformation as low as 3.6 μs, we can show that latencies are not as critical as expected. However, computing throughput proves to be challenging due to hardware limitations.
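To make the Hough-transform step concrete, the sketch below bins hits from a straight-line toy geometry into a (slope, intercept) accumulator and picks peaks as track candidates. It is a plain NumPy illustration of the principle only: the system described above runs on GPUs, works in a track parameterization suited to curved tracks in a magnetic field, and uses hexagonal rather than rectangular bins, none of which is reproduced here.

import numpy as np

def hough_track_finding(hits, n_bins=(64, 64), m_range=(-1.0, 1.0),
                        c_range=(-5.0, 5.0), threshold=4):
    """Toy Hough transform for straight tracks y = m*x + c.

    For each hit, every candidate slope bin votes for the intercept c = y - m*x.
    Accumulator cells with at least `threshold` votes are returned as candidates.
    """
    n_m, n_c = n_bins
    m_values = np.linspace(m_range[0], m_range[1], n_m)
    c_edges = np.linspace(c_range[0], c_range[1], n_c + 1)
    acc = np.zeros((n_m, n_c), dtype=int)
    for x, y in hits:
        c_candidates = y - m_values * x               # one intercept per slope bin
        c_idx = np.digitize(c_candidates, c_edges) - 1
        ok = (c_idx >= 0) & (c_idx < n_c)
        acc[np.arange(n_m)[ok], c_idx[ok]] += 1       # fill the accumulator
    peaks = np.argwhere(acc >= threshold)
    return [(m_values[i], 0.5 * (c_edges[j] + c_edges[j + 1])) for i, j in peaks]

# Toy usage: six hits on the line y = 0.5*x + 1 plus one noise hit.
hits = [(x, 0.5 * x + 1.0) for x in range(6)] + [(2.0, -3.0)]
print(hough_track_finding(hits))

Neighbouring accumulator cells along the true track typically also pass the threshold, which is the duplicate-candidate effect that the hexagonal binning discussed in the abstract is meant to reduce.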
Performance verification of the CMS Phase-1 Upgrade Pixel detector
NASA Astrophysics Data System (ADS)
Veszpremi, V.
2017-12-01
The CMS tracker consists of two tracking systems utilizing semiconductor technology: the inner pixel and the outer strip detectors. The tracker detectors occupy the volume around the beam interaction region between 3 cm and 110 cm in radius and up to 280 cm along the beam axis. The pixel detector consists of 124 million pixels, corresponding to about 2 m² total area. It plays a vital role in the seeding of the track reconstruction algorithms and in the reconstruction of primary interactions and secondary decay vertices. It is surrounded by the strip tracker with 10 million read-out channels, corresponding to 200 m² total area. The tracker is operated in the high-occupancy and high-radiation environment established by particle collisions in the LHC. The current strip detector continues to perform very well. The pixel detector that was used in Run 1 and in the first half of Run 2 was, however, replaced with the so-called Phase-1 Upgrade detector. The new system is better suited to match the increased instantaneous luminosity the LHC would reach before 2023. It was built to operate at an instantaneous luminosity of around 2 × 10³⁴ cm⁻²s⁻¹. The detector's new layout has an additional inner layer with respect to the previous one; it allows for more efficient tracking with a smaller fake rate at higher event pile-up. The paper focuses on the first results obtained during the commissioning of the new detector. It also includes challenges faced during the first data taking to reach the optimal measurement efficiency. Details will be given on the performance at high occupancy with respect to observables such as data rate, hit reconstruction efficiency, and resolution.
Final Report of DOE Grant No. DE-FG02-04ER41306
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nandi, Satyanarayan; Babu, Kaladi S; Rizatdinova, Flera
2013-12-10
Project: Theoretical and Experimental Research in Weak, Electromagnetic and Strong Interactions. Investigators: S. Nandi, K.S. Babu, F. Rizatdinova. Institution: Oklahoma State University, Stillwater, OK 74078. This completed project focused on cutting-edge research in theoretical and experimental high energy physics. In theoretical high energy physics, the two investigators (Nandi and Babu) worked on a variety of topics in model-building and phenomenological aspects of elementary particle physics. This includes unification of particles and forces, neutrino physics, Higgs boson physics, proton decay, supersymmetry, and collider physics. Novel physics ideas beyond the Standard Model with testable consequences at the LHC have been proposed. These ideas have stimulated the experimental community to look for new signals. The contributions of the experimental high energy physics group have been to the D0 experiment at the Fermilab Tevatron and the ATLAS experiment at the Large Hadron Collider. At the D0 experiment, the main focus was the search for the Higgs boson in the WH channel, where improved limits were obtained. At the LHC, the OSU group has made significant contributions to top quark physics and to the calibration of the b-tagging algorithms. The group is also involved in the pixel detector upgrade. This DOE-supported grant has resulted in 5 PhD degrees during the past three years. Three postdoctoral fellows were supported as well. In theoretical research, over 40 refereed publications have resulted in the past three years, with several involving graduate students and postdoctoral fellows. It also resulted in over 30 conference presentations in the same time period. We are also involved in outreach activities through the QuarkNet program, where we engage Oklahoma school teachers and students in our research.
Large-area hexagonal silicon detectors for the CMS High Granularity Calorimeter
NASA Astrophysics Data System (ADS)
Pree, E.
2018-02-01
During the so-called Phase-2 Upgrade, the CMS experiment at CERN will undergo significant improvements to cope with the 10-fold luminosity increase of the High Luminosity LHC (HL-LHC) era. The forward calorimetry in particular will suffer from very high radiation levels and intensified pileup in the detectors. For this reason, the CMS collaboration is designing a High Granularity Calorimeter (HGCAL) to replace the existing endcap calorimeters. It features unprecedented transverse and longitudinal segmentation for both the electromagnetic (CE-E) and hadronic (CE-H) compartments. The CE-E and a large fraction of the CE-H will consist of a sandwich structure with silicon as the active detector material. This paper presents an overview of the ongoing sensor development for the HGCAL and highlights important design features and measurement techniques. The design and layout of an 8-inch silicon sensor prototype is shown. The hexagonal sensors consist of 235 pads, each with an area of about 1 cm². Furthermore, Synopsys TCAD simulations regarding the high-voltage stability of the sensors for different geometric parameters are performed. Finally, two different IV characterisation methods are compared on the same sensor.
The design of the new LHC connection cryostats
NASA Astrophysics Data System (ADS)
Vande Craen, A.; Barlow, G.; Eymin, C.; Moretti, M.; Parma, V.; Ramos, D.
2017-12-01
In the framework of the High Luminosity upgrade of the LHC, improved collimation schemes are needed to cope with superconducting magnet quench limitations due to the increasing beam intensities and the particle debris produced at the collision points. Two new TCLD collimators have to be installed on either side of the ALICE experiment to intercept heavy-ion particle debris. Beam optics solutions were found to place these collimators in the continuous cryostat of the machine, in the locations where connection cryostats, bridging a gap of about 13 m between adjacent magnets, are already present. It is therefore planned to replace these connection cryostats with two new shorter ones separated by a bypass cryostat, allowing the collimators to be placed close to the beam pipes. The connection cryostats, of a new design compared to the existing ones, will still have to ensure the continuity of the technical systems of the machine cryostat (i.e. beam lines, cryogenic and electrical circuits, insulation vacuum). This paper describes the functionalities and the design solutions implemented, as well as the plans for their construction.
Charge collection properties in an irradiated pixel sensor built in a thick-film HV-SOI process
NASA Astrophysics Data System (ADS)
Hiti, B.; Cindro, V.; Gorišek, A.; Hemperek, T.; Kishishita, T.; Kramberger, G.; Krüger, H.; Mandić, I.; Mikuž, M.; Wermes, N.; Zavrtanik, M.
2017-10-01
Investigation of HV-CMOS sensors for use as a tracking detector in the ATLAS experiment at the upgraded LHC (HL-LHC) has recently been an active field of research. A potential candidate for a pixel detector built in Silicon-On-Insulator (SOI) technology has already been characterized in terms of radiation hardness to TID (Total Ionizing Dose) and charge collection after a moderate neutron irradiation. In this article we present results of an extensive irradiation hardness study with neutrons up to a fluence of 1 × 10¹⁶ neq/cm². Charge collection in a passive pixelated structure was measured by the Edge Transient Current Technique (E-TCT). The evolution of the effective space charge concentration was found to be compliant with the acceptor removal model, with the minimum of the space charge concentration being reached after 5 × 10¹⁴ neq/cm². An investigation of the in-pixel uniformity of the detector response revealed parasitic charge collection by the epitaxial silicon layer, which is characteristic of the SOI design. The results were backed by a numerical simulation of charge collection in an equivalent detector layout.
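For reference, the acceptor removal behaviour referred to above is commonly parameterized with an exponential removal of the initial effective doping plus a term for radiation-introduced deep acceptors; the specific functional form and fitted parameters of this study are not reproduced here, so the expression below is only the generic model, with N_eff,0 the initial concentration, c the removal constant and g_c the introduction rate:

N_{\mathrm{eff}}(\Phi) \simeq N_{\mathrm{eff},0}\, e^{-c\,\Phi} + g_{c}\,\Phi

In such a parameterization, the measured minimum of the space charge concentration corresponds to the fluence at which the exponential removal term has largely saturated and the linear introduction term begins to dominate.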
File-based data flow in the CMS Filter Farm
NASA Astrophysics Data System (ADS)
Andre, J.-M.; Andronidis, A.; Bawej, T.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.
2015-12-01
During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small "documents" using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These "files" can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.
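As a purely illustrative sketch of the "small JSON documents" bookkeeping idea described above, a monitoring process might emit per-lumisection summaries like the one below, writing atomically so that readers never see a partial file. The file name and field names here are invented for the example and are not the actual CMS DAQ schema.

import json, os, tempfile

def write_json_document(path, document):
    """Write a small bookkeeping document atomically: temp file, then rename."""
    directory = os.path.dirname(path) or "."
    fd, tmp_path = tempfile.mkstemp(dir=directory, suffix=".tmp")
    with os.fdopen(fd, "w") as f:
        json.dump(document, f)
    os.replace(tmp_path, path)  # atomic replacement on POSIX filesystems

# Hypothetical per-lumisection summary produced by an HLT monitoring process.
doc = {
    "run": 123456,                 # illustrative run number
    "lumisection": 42,
    "events_processed": 18750,
    "events_accepted": 310,
    "output_files": ["stream_physics_ls0042.dat"],
}
write_json_document("run123456_ls0042_hlt.json", doc)
print(open("run123456_ls0042_hlt.json").read())

A downstream aggregator can then merge such documents per lumisection without ever talking to the HLT processes directly, which is the decoupling the file-based approach aims for.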
NASA Astrophysics Data System (ADS)
Ferraro, R.; Danzeca, S.; Brucoli, M.; Masi, A.; Brugger, M.; Dilillo, L.
2017-04-01
The need to upgrade the Total Ionizing Dose (TID) measurement resolution of the current version of the Radiation Monitoring system for the LHC complex has driven the search for new TID sensors. The sensors being developed nowadays can be defined as Systems On Chip (SOC), with both analog and digital circuitry embedded in the same silicon. A radiation-tolerant TID Monitoring System (TIDMon) has been designed to allow the placement of the entire dosimeter readout electronics in very harsh environments, such as calibration rooms, and even in mixed radiation fields such as that of the LHC complex. The objective of the TIDMon is to measure the effect of the TID on the new prototype of Floating Gate Dosimeter (FGDOS) without using long cables and with a reliable measurement system. This work introduces the architecture of the TIDMon, the radiation tolerance techniques applied to the controlling electronics, and the design choices adopted for the system. Finally, results of several tests of the TIDMon under different radiation environments, such as gamma rays or the mixed radiation field at CHARM, are presented.
Heavy-flavour and quarkonium production in the LHC era: from proton-proton to heavy-ion collisions
Andronic, A.; Arleo, F.; Arnaldi, R.; ...
2016-02-29
This report reviews the study of open heavy-flavour and quarkonium production in high-energy hadronic collisions, as tools to investigate fundamental aspects of Quantum Chromodynamics, from the proton and nucleus structure at high energy to deconfinement and the properties of the Quark-Gluon Plasma. Emphasis is given to the lessons learnt from LHC Run 1 results, which are reviewed in a global picture with the results from SPS and RHIC at lower energies, as well as to the questions to be addressed in the future. The report covers heavy flavour and quarkonium production in proton-proton, proton-nucleus and nucleus-nucleus collisions. This includes discussion of the effects of hot and cold strongly interacting matter, quarkonium photo-production in nucleus-nucleus collisions and perspectives on the study of heavy flavour and quarkonium with upgrades of existing experiments and new experiments. The report results from the activity of the SaporeGravis network of the I3 Hadron Physics programme of the European Union 7th Framework Programme.
Construction and test of new precision drift-tube chambers for the ATLAS muon spectrometer
NASA Astrophysics Data System (ADS)
Kroha, H.; Kortner, O.; Schmidt-Sommerfeld, K.; Takasugi, E.
2017-02-01
ATLAS muon detector upgrades aim for increased acceptance for muon triggering and precision tracking and for improved rate capability of the muon chambers in the high-background regions of the detector with increasing LHC luminosity. The small-diameter Muon Drift Tube (sMDT) chambers have been developed for these purposes. With half of the drift-tube diameter of the MDT chambers and otherwise unchanged operating parameters, sMDT chambers share the advantages of the MDTs, but have an order of magnitude higher rate capability and can be installed in detector regions where MDT chambers do not fit in. The chamber assembly methods have been optimized for mass production, minimizing construction time and personnel. Sense wire positioning accuracies of 5 μm have been achieved in serial production for large-size chambers comprising several hundred drift tubes. The construction of new sMDT chambers for installation in the 2016/17 winter shutdown of the LHC and the design of sMDT chambers in combination with new RPC trigger chambers for replacement of the inner layer of the barrel muon spectrometer are in progress.
CMS distributed data analysis with CRAB3
Mascheroni, M.; Balcas, J.; Belforte, S.; ...
2015-12-23
The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week, executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include: 1) a lightweight client; 2) a central primary server which communicates with the clients through a REST interface; 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system; and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. Furthermore, the new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.
Design and Test of a 65nm CMOS Front-End with Zero Dead Time for Next Generation Pixel Detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaioni, L.; Braga, D.; Christian, D.
This work is concerned with the experimental characterization of a synchronous analog processor with zero dead time developed in a 65 nm CMOS technology, conceived for pixel detectors at the HL-LHC experiment upgrades. It includes a low-noise, fast charge sensitive amplifier with a detector leakage compensation circuit, and a compact, single-ended comparator able to correctly process hits belonging to two consecutive bunch crossing periods. A 2-bit Flash ADC is exploited for digital conversion immediately after the preamplifier. A description of the circuits integrated in the front-end processor and the initial characterization results are provided.
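As a toy illustration of the 2-bit flash conversion mentioned above, three comparators fire in parallel and the resulting thermometer code maps directly to a 2-bit value. The thresholds and charge units below are invented for the example and are not taken from the actual front-end design.

def flash_adc_2bit(charge, thresholds=(1.0, 2.0, 3.0)):
    """Toy 2-bit flash ADC: count how many of the three comparator
    thresholds the input charge exceeds; the count (0-3) is the output code."""
    return sum(charge > t for t in thresholds)

# Toy usage: four input charges spanning the full range of codes.
for q in (0.3, 1.5, 2.7, 5.0):
    print(q, "->", flash_adc_2bit(q))   # prints codes 0, 1, 2, 3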
Belle II grid computing: An overview of the distributed data management system.
NASA Astrophysics Data System (ADS)
Bansal, Vikas; Schram, Malachi; Belle II Collaboration
2017-01-01
The Belle II experiment at the SuperKEKB collider in Tsukuba, Japan, will start physics data taking in 2018 and will accumulate 50 ab⁻¹ of e⁺e⁻ collision data, about 50 times larger than the data set of the Belle experiment. The computing requirements of Belle II are comparable to those of a Run I LHC experiment. Computing at this scale requires efficient use of the compute grids in North America, Asia and Europe, and will take advantage of upgrades to the high-speed global network. We present the architecture of data flow and data handling as a part of the Belle II computing infrastructure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kashikhin, V. V.; Novitski, I.; Zlobin, A. V.
2017-05-01
High field accelerator magnets with operating fields of 15-16 T based on the Nb3Sn superconductor are being considered for the LHC energy upgrade or a future Very High Energy pp Collider. Magnet design studies are being conducted in the U.S., Europe and Asia to explore the limits of the Nb3Sn accelerator magnet technology while optimizing the magnet design and performance parameters and reducing magnet cost. The first results of these studies, performed at Fermilab in the framework of the US-MDP, are reported in this paper.
NASA Astrophysics Data System (ADS)
Bruzzi, Mara; Cartiglia, Nicolo; Pace, Emanuele; Talamonti, Cinzia
2015-10-01
The 10th edition of the International Conference on Radiation Effects on Semiconductor Materials, Detectors and Devices (RESMDD) was held in Florence, at the Dipartimento di Fisica ed Astronomia, on October 8-10, 2014. It was aimed at discussing frontier research activities in several application fields such as nuclear and particle physics, astrophysics, medical and solid-state physics. The main topics discussed at this conference concern the performance of heavily irradiated silicon detectors, developments required for the luminosity upgrade of the Large Hadron Collider (HL-LHC), ultra-fast silicon detector design and manufacturing, high-band-gap semiconductor detectors, novel semiconductor-based devices for medical applications, and radiation damage issues in semiconductors and related radiation-hardening technologies.
Automotive Stirling Engine Development Program
NASA Technical Reports Server (NTRS)
Nightingale, N.; Ernst, W.; Richey, A.; Simetkosky, M.; Smith, G.; Rohdenburg, C.; Antonelli, M. (Editor)
1983-01-01
Program status and plans are discussed for component and technology development; reference engine system design, the upgraded Mod 1 engine; industry test and evaluation; and product assurance. Four current Mod 1 engines reached a total of 2523 operational hours, while two upgraded engines accumulated 166 hours.
The new front-end electronics for the ATLAS Tile Calorimeter Phase 2 Upgrade
NASA Astrophysics Data System (ADS)
Gomes, A.
2016-02-01
We present the plans, design, and performance results to date for the new front-end electronics being developed for the Phase 2 Upgrade of the ATLAS Tile Calorimeter. The front-end electronics will be replaced to address the increased luminosity at the HL-LHC around 2025, as well as to upgrade to faster, more modern components with higher radiation tolerance. The new electronics will operate without dead time, pushing full data sets from each beam crossing to the data acquisition system that resides off-detector. The new on-detector electronics contains five main parts: the front-end boards that connect directly to the photomultiplier tubes; the Main Boards that digitize the data; the Daughter Boards that collect the data streams and contain the high-speed optical communication links for writing data to the data acquisition system; a programmable high-voltage control system; and a new low-voltage power supply. There are different options for implementing these subcomponents, which are described. Compared to the current version, the new system contains several new features: power-system redundancy, data-collection redundancy, data-transmission redundancy with two QSFP optical transceivers, and Kintex-7 FPGAs with a firmware-enhanced scheme for single-event-upset mitigation. To date, we have built a Demonstrator, a fully functional prototype of the new system. Performance results and plans are presented.
An evaluation of upgraded boron fibers in epoxy-matrix composites
NASA Technical Reports Server (NTRS)
Rhodes, T. C.; Fleck, J. N.; Meiners, K. E.
1973-01-01
An initial evaluation of upgraded boron fibers in an epoxy matrix is performed. Data generated in the program show that fiber strength does increase as a consequence of the upgrading treatment. However, the interlaminar shear strength of upgraded fiber composites is lower than that for an untreated fiber composite. In the limited tests performed, the increased fiber strength failed to translate into the composite.
Test beam performance measurements for the Phase I upgrade of the CMS pixel detector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dragicevic, M.; Friedl, M.; Hrubec, J.
A new pixel detector for the CMS experiment was built in order to cope with the instantaneous luminosities anticipated for the Phase I Upgrade of the LHC. The new CMS pixel detector provides four-hit tracking with a reduced material budget as well as new cooling and powering schemes. A new front-end readout chip mitigates buffering and bandwidth limitations, and allows operation at low comparator thresholds. In this paper, comprehensive test beam studies are presented, which have been conducted to verify the design and to quantify the performance of the new detector assemblies in terms of tracking efficiency and spatial resolution. Under optimal conditions, the tracking efficiency is 99.95 ± 0.05%, while the intrinsic spatial resolutions are 4.80 ± 0.25 μm and 7.99 ± 0.21 μm along the 100 μm and 150 μm pixel pitch, respectively. The findings are compared to a detailed Monte Carlo simulation of the pixel detector and good agreement is found.
NASA Astrophysics Data System (ADS)
Malik, S.; Shipsey, I.; Cavanaugh, R.; Bloom, K.; Chan, Kai-Feng; D'Hondt, J.; Klima, B.; Narain, M.; Palla, F.; Rolandi, G.; Schörner-Sadenius, T.
2014-06-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS schools around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
Test beam performance measurements for the Phase I upgrade of the CMS pixel detector
Dragicevic, M.; Friedl, M.; Hrubec, J.; ...
2017-05-30
A new pixel detector for the CMS experiment was built in order to cope with the instantaneous luminosities anticipated for the Phase I Upgrade of the LHC. The new CMS pixel detector provides four-hit tracking with a reduced material budget as well as new cooling and powering schemes. A new front-end readout chip mitigates buffering and bandwidth limitations, and allows operation at low comparator thresholds. In this paper, comprehensive test beam studies are presented, which have been conducted to verify the design and to quantify the performance of the new detector assemblies in terms of tracking efficiency and spatial resolution. Under optimal conditions, the tracking efficiency is 99.95 ± 0.05%, while the intrinsic spatial resolutions are 4.80 ± 0.25 μm and 7.99 ± 0.21 μm along the 100 μm and 150 μm pixel pitch, respectively. The findings are compared to a detailed Monte Carlo simulation of the pixel detector and good agreement is found.
The Phase-2 electronics upgrade of the ATLAS liquid argon calorimeter system
NASA Astrophysics Data System (ADS)
Vachon, B.
2018-03-01
The LHC high-luminosity upgrade in 2024-2026 requires the associated detectors to operate at luminosities about 5-7 times larger than assumed in their original design. The pile-up is expected to increase to up to 200 events per proton bunch-crossing. The current readout of the ATLAS liquid argon calorimeters does not provide sufficient buffering and bandwidth capabilities to accommodate the hardware trigger requirements imposed by these harsh conditions. Furthermore, the expected total radiation doses are beyond the qualification range of the current front-end electronics. For these reasons an almost complete replacement of the front-end and off-detector readout system is foreseen for the 182,468 readout channels. The new readout system will be based on a free-running architecture, where calorimeter signals are amplified, shaped and digitized by on-detector electronics, then sent at 40 MHz to the off-detector electronics for further processing. Results from the design studies on the performance of the components of the readout system are presented, as well as the results of the tests of the first prototypes.
Mu2e upgrade physics reach optimization studies for the PIP-II era
Pronskikh, Vitaly S.; Glenzinski, Douglas; Mokhov, Nikolai; ...
2016-11-29
The Mu2e experiment at Fermilab is being designed to study the coherent neutrino-less conversion of a negative muon into an electron in the field of a nucleus. This process has an extremely low probability in the Standard Model, and its observation would provide unambiguous evidence for BSM physics. The Mu2e design aims to reach a single-event sensitivity of about 2.5 × 10⁻¹⁷ and will probe effective new-physics mass scales in the 10³-10⁴ TeV range, well beyond the reach of the LHC. This work examines the maximum beam power that can be tolerated for beam energies in the 0.5-8 GeV range, exploring variations in the geometry of the region of the production target using the MARS15 code. This has implications for how the sensitivity might be further improved with a second-generation experiment using an upgraded proton beam from the PIP-II project, which will be capable of providing MW beams to Fermilab experiments later in the next decade.
NASA Astrophysics Data System (ADS)
Meng, X. T.; Levin, D. S.; Chapman, J. W.; Li, D. C.; Yao, Z. E.; Zhou, B.
2017-02-01
The High Performance Time to Digital Converter (HPTDC), a multi-channel ASIC designed by the CERN Microelectronics group, has been proposed for the digitization of the thin Resistive Plate Chambers (tRPC) in the ATLAS Muon Spectrometer Phase-1 upgrade project. These chambers, to be staged for higher-luminosity LHC operation, will increase the trigger acceptance and reduce or eliminate fake muon trigger rates in the barrel-endcap transition region, corresponding to the pseudo-rapidity range 1<|η|<1.3. Low-level trigger candidates must be flagged within a maximum latency of 1075 ns, thus imposing stringent signal processing time requirements on the readout system in general, and on the digitization electronics in particular. This paper investigates the HPTDC signal latency performance, based on a specially designed evaluation board coupled with an external FPGA evaluation board, when operated in triggerless mode and under the hit rate conditions expected in Phase-1. This hardware-based study confirms previous simulations and demonstrates that the HPTDC in triggerless operation satisfies the digitization timing requirements in both leading-edge and pair modes.
The Belle II imaging Time-of-Propagation (iTOP) detector
NASA Astrophysics Data System (ADS)
Fast, J.; Belle II Barrel Particle Identification Group
2017-12-01
High-precision flavor physics measurements are an essential complement to the direct searches for new physics at the LHC ATLAS and CMS experiments. Such measurements will be performed using the upgraded Belle II detector that will take data at the SuperKEKB accelerator. With 40 times the luminosity of KEKB, the detector systems must operate efficiently at much higher rates than the original Belle detector. A central element of the upgrade is the barrel particle identification system. Belle II has built and installed an imaging Time-of-Propagation (iTOP) detector. The iTOP uses quartz optics as Cherenkov radiators. The photons are transported down the quartz bars via total internal reflection, with a spherical mirror at the forward end to reflect photons to the backward end, where they are imaged onto an array of segmented Micro-Channel Plate Photo-Multiplier Tubes (MCP-PMTs). The system is read out using giga-sample-per-second waveform-sampling Application-Specific Integrated Circuits (ASICs). The combined timing and spatial distribution of the photons for each event are used to determine the particle species. This paper provides an overview of the iTOP system.
Testing sTGC with small angle wire edges for the ATLAS new small wheel muon detector upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roth, Itamar; Klier, Amit; Duchovni, Ehud
The LHC upgrade scheduled for 2018 is expected to significantly increase the accelerator's luminosity, and as a result the radiation background rates in the ATLAS Muon Spectrometer will increase too. Some of its components will have to be replaced in order to cope with these high rates. Newly designed small-strip Thin Gap Chambers (sTGC) will replace them in the small wheel region. One of the differences between the sTGC and the currently used TGC is the alignment of the wires along the azimuthal direction. As a result, the outermost wires approach the detector's edge at a small angle. Such a configuration may be a cause for various problems. Two small dedicated chambers were built and tested in order to study possible edge effects that may arise from the new configuration. The sTGC appears to be stable and no sparks have been observed, yet some differences in the detector response near the edge are seen and further studies should be carried out. (authors)
NASA Astrophysics Data System (ADS)
Senyukov, S.; Baudot, J.; Besson, A.; Claus, G.; Cousin, L.; Dorokhov, A.; Dulinski, W.; Goffe, M.; Hu-Guo, C.; Winter, M.
2013-12-01
The apparatus of the ALICE experiment at CERN will be upgraded in 2017/18 during the second long shutdown of the LHC (LS2). A major motivation for this upgrade is to extend the physics reach for charmed and beauty particles down to low transverse momenta. This requires a substantial improvement of the spatial resolution and the data rate capability of the ALICE Inner Tracking System (ITS). To achieve this goal, the new ITS will be equipped with 50 μm thin CMOS Pixel Sensors (CPS) covering either the three innermost layers or all 7 layers of the detector. The CPS being developed for the ITS upgrade at IPHC (Strasbourg) is derived from the MIMOSA 28 sensor realised for the STAR-PXL at RHIC in a 0.35 μm CMOS process. In order to satisfy the ITS upgrade requirements in terms of readout speed and radiation tolerance, a CMOS process with a reduced feature size and a high resistivity epitaxial layer should be exploited. In this respect, the charged particle detection performance and radiation hardness of the TowerJazz 0.18 μm CMOS process were studied with the help of the first prototype chip, MIMOSA 32. The beam tests performed with negative pions of 120 GeV/c at the CERN-SPS allowed a measurement of the signal-to-noise ratio (SNR) for the non-irradiated chip in the range between 22 and 32, depending on the pixel design. The chip irradiated with a combined dose of 1 MRad and 10¹³ neq/cm² was observed to yield an SNR ranging between 11 and 23 for coolant temperatures varying from 15 °C to 30 °C. These SNR values were measured to result in particle detection efficiencies above 99.5% and 98% before and after irradiation, respectively. These satisfactory results validate the TowerJazz 0.18 μm CMOS process for the ALICE ITS upgrade.
From many body wee partons dynamics to perfect fluid: a standard model for heavy ion collisions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venugopalan, R.
2010-07-22
We discuss a standard model of heavy ion collisions that has emerged both from experimental results of the RHIC program and associated theoretical developments. We comment briefly on the impact of early results of the LHC program on this picture. We consider how this standard model of heavy ion collisions could be solidified or falsified in future experiments at RHIC, the LHC and a future Electron-Ion Collider.
H-1 Upgrades (4BW/4BN) (H-1 Upgrades)
2015-12-01
automatic blade fold of the new composite rotor blades, new performance-matched transmissions, a new four-bladed tail rotor and drive system, upgraded... attack helicopter is to provide rotary wing close air support, anti-armor, armed escort, armed/visual reconnaissance and fire support coordination
Cooperative Demonstration Program for High Technology Training. Performance Report.
ERIC Educational Resources Information Center
Indian Hills Community Coll., Ottumwa, IA.
A program at Indian Hills Community College (Ottumwa, Iowa) consisted of a sex equity component aimed to prepare women to enter nontraditional occupations and a building trades component to enable electrical workers to upgrade their skills. Both of the targeted groups underwent assessment and upgrading coordinated through the college's SUCCESS…
Reconstruction of Micropattern Detector Signals using Convolutional Neural Networks
NASA Astrophysics Data System (ADS)
Flekova, L.; Schott, M.
2017-10-01
Micropattern gaseous detector (MPGD) technologies, such as GEMs or MicroMegas, are particularly suitable for precision tracking and triggering in high rate environments. Given their relatively low production costs, MPGDs are an exemplary candidate for the next generation of particle detectors. Having acknowledged these advantages, both the ATLAS and CMS collaborations at the LHC are exploiting these new technologies for their detector upgrade programs in the coming years. When MPGDs are utilized for triggering purposes, the measured signals need to be precisely reconstructed within less than 200 ns, which can be achieved by the use of FPGAs. In this work, we present a novel approach to identify reconstructed signals, their timing and the corresponding spatial position on the detector. In particular, we study the effect of noise and dead readout strips on the reconstruction performance. Our approach leverages the potential of convolutional neural networks (CNNs), which have recently demonstrated outstanding performance in a range of modeling tasks. The proposed CNN architecture is kept simple enough that it can be implemented directly on an FPGA and thus provide precise information on reconstructed signals already at trigger level.
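The abstract does not specify the network layout; the following is only a minimal sketch of the kind of small 1D CNN that could classify strip-charge patterns from an MPGD readout. The 128-strip window, layer sizes, timing classes and the use of PyTorch are all illustrative assumptions, not details taken from the paper.

# Minimal illustrative sketch (not the authors' architecture): a small 1D CNN that
# maps a window of strip charges to (hit present?, strip position, coarse time bin).
import torch
import torch.nn as nn

N_STRIPS = 128      # assumed readout window
N_TIME_BINS = 8     # assumed number of coarse timing classes

class StripCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=5, padding=2),   # local charge-sharing pattern
            nn.ReLU(),
            nn.Conv1d(8, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(16),
        )
        self.head = nn.Linear(16 * 16, 1 + N_STRIPS + N_TIME_BINS)

    def forward(self, x):                      # x: (batch, 1, N_STRIPS) strip charges
        h = self.features(x).flatten(1)
        out = self.head(h)
        hit_logit = out[:, :1]                 # signal vs. noise / dead-strip pattern
        pos_logits = out[:, 1:1 + N_STRIPS]    # which strip the hit is centred on
        time_logits = out[:, 1 + N_STRIPS:]    # coarse arrival-time class
        return hit_logit, pos_logits, time_logits

# Example forward pass on random charges
model = StripCNN()
hit, pos, t = model(torch.randn(4, 1, N_STRIPS))

A network this small is the kind of model that could plausibly be ported to FPGA fixed-point logic, which is the motivation stated in the abstract.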
The STAR Detector Upgrades and Electromagnetic Probes in Beam Energy Scan Phase II
NASA Astrophysics Data System (ADS)
Yang, Chi
The Beam Energy Scan Phase II at RHIC, BES-II, is scheduled from 2019 to 2020 and will explore the high baryon density region of the QCD phase diagram with high precision. The program will focus on the interesting energy region determined from the results of BES-I. Some of the key measurements anticipated are chiral symmetry restoration and QGP thermal radiation in the dilepton and direct photon channels. The measurements will be possible with an order of magnitude better statistics provided by the electron cooling upgrade of RHIC and with the detector upgrades planned to extend STAR's experimental reach. The upgrades are: the inner Time Projection Chamber sectors (iTPC), the Event Plane Detector (EPD), and the end-cap Time of Flight (eTOF). We present the BES-II program details and the physics opportunities in the dilepton and direct photon channels enabled by the upgrades.
On the LHC sensitivity for non-thermalised hidden sectors
NASA Astrophysics Data System (ADS)
Kahlhoefer, Felix
2018-04-01
We show under rather general assumptions that hidden sectors that never reach thermal equilibrium in the early Universe are also inaccessible for the LHC. In other words, any particle that can be produced at the LHC must either have been in thermal equilibrium with the Standard Model at some point or must be produced via the decays of another hidden sector particle that has been in thermal equilibrium. To reach this conclusion, we parametrise the cross section connecting the Standard Model to the hidden sector in a very general way and use methods from linear programming to calculate the largest possible number of LHC events compatible with the requirement of non-thermalisation. We find that even the HL-LHC cannot possibly produce more than a few events with energy above 10 GeV involving states from a non-thermalised hidden sector.
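The parametrisation used in the paper is not reproduced in the abstract; purely as a toy illustration of the linear-programming step, one can maximise the expected number of LHC events, a linear function of non-negative cross-section coefficients, subject to a linear bound that stands in for the non-thermalisation condition. All numbers and the scipy formulation below are assumptions for illustration only.

# Toy linear programme (illustrative numbers only): maximise expected LHC events
# N = L * sum_i s_i * c_i over coefficients c_i >= 0, subject to a single
# early-universe production-rate bound sum_i r_i * c_i <= H standing in for
# "the hidden sector never thermalises".
import numpy as np
from scipy.optimize import linprog

L = 3000.0                      # assumed integrated luminosity [fb^-1]
s = np.array([5.0, 1.0, 0.2])   # assumed LHC sensitivities per coefficient [fb]
r = np.array([2.0, 0.5, 0.1])   # assumed early-universe production rates (arb. units)
H = 1.0                         # assumed Hubble-rate bound (arb. units)

# linprog minimises, so minimise -N to maximise N
res = linprog(c=-L * s, A_ub=[r], b_ub=[H], bounds=[(0, None)] * 3)
print("max LHC events compatible with non-thermalisation:", -res.fun)

The real analysis constrains a much more general parametrisation, but the structure, a linear objective maximised under linear non-thermalisation constraints, is the same.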
Stepping outside the neighborhood of T at LHC
NASA Astrophysics Data System (ADS)
Wiedemann, Urs Achim
2009-11-01
“ As you are well aware, many in the RHIC community are interested in the LHC heavy-ion program, but have several questions: What can we learn at the LHC that is qualitatively new? Are collisions at LHC similar to RHIC ones, just with a somewhat hotter/denser initial state? If not, why not? These questions are asked in good faith, and this talk is an opportunity to answer them directly to much of the RHIC community.” With these words, the organizers of Quark Matter 2009 in Knoxville invited me to discuss the physics opportunities for heavy ion collisions at the LHC without recalling the standard arguments, which are mainly based on the extended kinematic reach of the machine. In response, I emphasize here that lattice QCD indicates characteristic qualitative differences between thermal physics in the neighborhood of the critical temperature (T
The PDF4LHC report on PDFs and LHC data: Results from Run I and preparation for Run II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rojo, Juan; Accardi, Alberto; Ball, Richard D.
2015-09-16
The accurate determination of Parton Distribution Functions (PDFs) of the proton is an essential ingredient of the Large Hadron Collider (LHC) program. PDF uncertainties impact a wide range of processes, from Higgs boson characterization and precision Standard Model measurements to New Physics searches. A major recent development in modern PDF analyses has been to exploit the wealth of new information contained in precision measurements from the LHC Run I, as well as progress in tools and methods to include these data in PDF fits. In this report we summarize the information that PDF-sensitive measurements at the LHC have provided so far, and review the prospects for further constraining PDFs with data from the recently started Run II. As a result, this document aims to provide useful input to the LHC collaborations to prioritize their PDF-sensitive measurements at Run II, as well as a comprehensive reference for the PDF-fitting collaborations.
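As a usage note rather than part of the report, PDF sets released by the fitting collaborations are commonly accessed through the LHAPDF library. A minimal sketch, assuming the LHAPDF Python bindings are available and that the named set (chosen here only as an example) is installed locally:

# Minimal sketch assuming the LHAPDF Python bindings; the set name is an example.
import lhapdf

pdfset = lhapdf.getPDFSet("NNPDF30_nlo_as_0118")
members = pdfset.mkPDFs()                     # all error members of the set

x, Q = 0.01, 100.0                            # momentum fraction and scale [GeV]
gluon = [p.xfxQ(21, x, Q) for p in members]   # PDG id 21 = gluon; returns x*g(x,Q)

unc = pdfset.uncertainty(gluon)               # combine members into central value + errors
print(f"x*g(x,Q) = {unc.central:.4f} +{unc.errplus:.4f} -{unc.errminus:.4f}")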
Puerto Rico Nursing Career Cooperative Demonstration Program. Final Performance Report.
ERIC Educational Resources Information Center
Puerto Rico State Dept. of Education, Hato Rey.
The Puerto Rico Nursing Career Cooperative Demonstration Project and Associate Nursing Program provided education and an onsite occupational training laboratory to upgrade the education of 20 licensed practical nurses (LPNs) from 1989-90. The nurses were upgraded to associate nurses in an 18-month period at the Technological Institute of Puerto Rico.…
A Survey of CETA Upgrading and Retraining Programs.
ERIC Educational Resources Information Center
Cambridge Office of Manpower Affairs, MA.
In 1979, Comprehensive Employment and Training Act (CETA) legislation was changed to remove income from the eligibility criteria for Title IIC upgrading and retraining programs. In order to assess the impact of this change upon CETA prime sponsor and private industry council (PIC) activities across the country, a telephone survey was made to all…
Design, prototyping, and testing of a compact superconducting double quarter wave crab cavity
NASA Astrophysics Data System (ADS)
Xiao, Binping; Alberty, Luis; Belomestnykh, Sergey; Ben-Zvi, Ilan; Calaga, Rama; Cullen, Chris; Capatina, Ofelia; Hammons, Lee; Li, Zenghai; Marques, Carlos; Skaritka, John; Verdu-Andres, Silvia; Wu, Qiong
2015-04-01
We proposed a novel design for a compact superconducting crab cavity with a double quarter wave (DQWCC) shape. After fabrication and surface treatments, this niobium proof-of-principle cavity was tested cryogenically in a vertical cryostat. The cavity is extremely compact yet has a low frequency of 400 MHz, an essential property for service in the Large Hadron Collider luminosity upgrade. The cavity's electromagnetic properties are well suited for this demanding task. The demonstrated deflecting voltage of 4.6 MV is well above the required 3.34 MV for a crab cavity in the future High Luminosity LHC. In this paper, we present the design, prototyping, and results from testing the DQWCC.
ALICE results from Run-1 and Run-2 and perspectives for Run-3 and Run-4
NASA Astrophysics Data System (ADS)
Noferini, Francesco; ALICE Collaboration
2018-05-01
A review of ALICE is presented focusing on its physics programme and results from the Run-1 and Run-2 data taking periods. Among the four major LHC experiments, ALICE is devoted to the study of the Quark-Gluon Plasma produced in ultra-relativistic heavy-ion collisions (Pb–Pb), but it is also collecting data in smaller systems (pp and p–Pb). This review focuses on the main results collected so far, including the characterization of the QGP via soft and hard probes, and the production rate of light nuclei and hypernuclei. Finally, the perspectives after the detector upgrades to be performed during 2019-2020 are presented.
NASA Astrophysics Data System (ADS)
Kim, D.; Aglieri Rinella, G.; Cavicchioli, C.; Chanlek, N.; Collu, A.; Degerli, Y.; Dorokhov, A.; Flouzat, C.; Gajanana, D.; Gao, C.; Guilloux, F.; Hillemanns, H.; Hristozkov, S.; Junique, A.; Keil, M.; Kofarago, M.; Kugathasan, T.; Kwon, Y.; Lattuca, A.; Mager, M.; Sielewicz, K. M.; Marin Tobon, C. A.; Marras, D.; Martinengo, P.; Mazza, G.; Mugnier, H.; Musa, L.; Pham, T. H.; Puggioni, C.; Reidt, F.; Riedler, P.; Rousset, J.; Siddhanta, S.; Snoeys, W.; Song, M.; Usai, G.; Van Hoorne, J. W.; Yang, P.
2016-02-01
ALICE plans to replace its Inner Tracking System during the second long shutdown of the LHC in 2019 with a new 10 m² tracker constructed entirely with monolithic active pixel sensors. The TowerJazz 180 nm CMOS imaging sensor process has been selected to produce the sensor as it offers a deep p-well allowing full CMOS in-pixel circuitry and different starting materials. First full-scale prototypes have been fabricated and tested. Radiation tolerance has also been verified. In this paper the development of the charge sensitive front end and in particular its optimization for uniformity of charge threshold and time response will be presented.
Magnetic Measurements of the First Nb3Sn Model Quadrupole (MQXFS) for the High-Luminosity LHC
DiMarco, J.; Ambrosio, G.; Chlachidze, G.; ...
2016-12-12
The US LHC Accelerator Research Program (LARP) and CERN are developing high-gradient Nb3Sn magnets for the High Luminosity LHC interaction regions. Magnetic measurements of the first 1.5 m long, 150 mm aperture model quadrupole, MQXFS1, were performed during magnet assembly at LBNL, as well as during cryogenic testing at Fermilab’s Vertical Magnet Test Facility. This paper reports on the results of these magnetic characterization measurements, as well as on the performance of new probes developed for the tests.
Mechanical performance of short models for MQXF, the Nb3Sn low-β quadrupole for the Hi-Lumi LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vallone, Giorgio; Ambrosio, Giorgio; Anderssen, Eric
In the framework of the Hi-Lumi LHC Project, CERN and U.S. LARP are jointly developing MQXF, a 150-mm aperture high-field Nb3Sn quadrupole for the upgrade of the inner triplet of the low-beta interaction regions. The magnet is supported by a shell-based structure, providing the preload by means of bladder-key technology and differential thermal contraction of the various components. Two short models have been produced using the same cross section currently considered for the final magnet. The structures were preliminarily tested replacing the superconducting coils with blocks of aluminum. This procedure allows for model validation and calibration, and also sets performance goals for the real magnet. Strain gauges were used to monitor the behavior of the structure during assembly, cool down and also excitation in the case of the magnets. The various structures differ in the shell partitioning strategies adopted and in the presence of thick or thin laminations. This study presents the results obtained and discusses the mechanical performance of all the short models produced up to now.
First LHCb measurement with data from the LHC Run 2
NASA Astrophysics Data System (ADS)
Anderlini, L.; Amerio, S.
2017-01-01
LHCb has recently introduced a novel real-time detector alignment and calibration strategy for Run 2. Data collected at the start of each LHC fill are processed in a few minutes and used to update the alignment. On the other hand, the calibration constants will be evaluated for each run of data taking. An increase in the CPU and disk capacity of the event filter farm, combined with improvements to the reconstruction software, allows for efficient, exclusive selections already in the first stage of the High Level Trigger (HLT1), while the second stage, HLT2, performs complete, offline-quality event reconstruction. In Run 2, LHCb will collect the largest data sample of charm mesons ever recorded. Novel data processing and analysis techniques are required to maximise the physics potential of this data sample with the available computing resources, taking into account data preservation constraints. In this write-up, we describe the full analysis chain used to obtain important results from the analysis of the data collected in proton-proton collisions in 2015, such as the J/ψ and open charm production cross-sections, and consider the further steps required to obtain real-time results after the LHCb upgrade.
36th International Conference on High Energy Physics
NASA Astrophysics Data System (ADS)
The Australian particle physics community was honoured to host the 36th ICHEP conference in 2012 in Melbourne. This conference has long been the reference event for our international community. The announcement of the discovery of the Higgs boson at the LHC was a major highlight, with huge international press coverage. ICHEP2012 was described by CERN Director-General, Professor Rolf Heuer, as a landmark conference for our field. In addition to the Higgs announcement, important results from neutrino physics, from flavour physics, and from physics beyond the standard model also provided great interest. There were also updates on key accelerator developments such as the new B-factories, plans for the LHC upgrade, neutrino facilities and associated detector developments. ICHEP2012 exceeded the promise expected of the key conference for our field, and really did provide a reference point for the future. Many thanks to the contribution reviewers: Andy Bakich, Csaba Balazs, Nicole Bell, Catherine Buchanan, Will Crump, Cameron Cuthbert, Ben Farmer, Sudhir Gupta, Elliot Hutchison, Paul Jackson, Geng-Yuan Jeng, Archil Kobakhidze, Doyoun Kim, Tong Li, Antonio Limosani (Head Editor), Kristian McDonald, Nikhul Patel, Aldo Saavedra, Mark Scarcella, Geoff Taylor, Ian Watson, Graham White, Tony Williams and Bruce Yabsley.
A new data acquisition system for the CMS Phase 1 pixel detector
NASA Astrophysics Data System (ADS)
Kornmayer, A.
2016-12-01
A new pixel detector will be installed in the CMS experiment during the extended technical stop of the LHC at the beginning of 2017. The new pixel detector, built from four layers in the barrel region and three layers on each end of the forward region, is equipped with upgraded front-end readout electronics, specifically designed to handle the high particle hit rates created in the LHC environment. The DAQ back-end was entirely redesigned to handle the increased number of readout channels, the higher data rates per channel and the new digital data format. Based entirely on the microTCA standard, new front-end controller (FEC) and front-end driver (FED) cards have been developed, prototyped and produced with custom optical link mezzanines mounted on the FC7 AMC and custom firmware. At the same time as the new detector is being assembled, the DAQ system is set up and its integration into the CMS central DAQ system tested by running the pilot blade detector already installed in CMS. This work describes the DAQ system, integration tests and gives an outline for the activities up to commissioning the final system at CMS in 2017.
A hardware fast tracker for the ATLAS trigger
NASA Astrophysics Data System (ADS)
Asbah, Nedaa
2016-09-01
The trigger system of the ATLAS experiment is designed to reduce the event rate from the LHC nominal bunch crossing rate of 40 MHz to about 1 kHz, at the design luminosity of 10³⁴ cm⁻² s⁻¹. After a successful period of data taking from 2010 to early 2013, the LHC has already restarted with much higher instantaneous luminosity. This will increase the load on the High Level Trigger system, the second stage of the selection, which is based on software algorithms. More sophisticated algorithms will be needed to achieve higher background rejection while maintaining good efficiency for interesting physics signals. The Fast TracKer (FTK) is part of the ATLAS trigger upgrade project. It is a hardware processor that will provide, for every Level-1 accepted event (100 kHz) and within 100 microseconds, full tracking information for tracks with momentum as low as 1 GeV. Providing fast, extensive access to tracking information, with resolution comparable to the offline reconstruction, FTK will help in the precise detection of primary and secondary vertices to ensure robust selections and improve the trigger performance. FTK exploits hardware technologies with massive parallelism, combining Associative Memory ASICs, FPGAs and high-speed communication links.
Task Management in the New ATLAS Production System
NASA Astrophysics Data System (ADS)
De, K.; Golubkov, D.; Klimentov, A.; Potekhin, M.; Vaniachine, A.; Atlas Collaboration
2014-06-01
This document describes the design of the new Production System of the ATLAS experiment at the LHC [1]. The Production System is the top level workflow manager which translates physicists' needs for production level processing and analysis into actual workflows executed across over a hundred Grid sites used globally by ATLAS. As the production workload increased in volume and complexity in recent years (the ATLAS production tasks count is above one million, with each task containing hundreds or thousands of jobs) there is a need to upgrade the Production System to meet the challenging requirements of the next LHC run while minimizing the operating costs. In the new design, the main subsystems are the Database Engine for Tasks (DEFT) and the Job Execution and Definition Interface (JEDI). Based on users' requests, DEFT manages inter-dependent groups of tasks (Meta-Tasks) and generates corresponding data processing workflows. The JEDI component then dynamically translates the task definitions from DEFT into actual workload jobs executed in the PanDA Workload Management System [2]. We present the requirements, design parameters, basics of the object model and concrete solutions utilized in building the new Production System and its components.
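The object model is only summarised above; purely to illustrate the kind of Meta-Task description that DEFT could expand into JEDI workload jobs, a hypothetical sketch follows. The field names and transform names are illustrative and do not reflect the actual DEFT/JEDI schema.

# Hypothetical Meta-Task description; field names are illustrative only.
import json

meta_task = {
    "request_id": 123456,                     # placeholder identifier
    "tasks": [
        {"name": "evgen", "transform": "generate", "output": "EVNT", "depends_on": []},
        {"name": "simul", "transform": "simulate", "output": "HITS", "depends_on": ["evgen"]},
    ],
}

def expand_to_jobs(task, n_jobs):
    """DEFT/JEDI-like step: turn one task definition into concrete workload jobs."""
    return [{"task": task["name"], "job_index": i} for i in range(n_jobs)]

jobs = [job for t in meta_task["tasks"] for job in expand_to_jobs(t, n_jobs=3)]
print(json.dumps(jobs, indent=2))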
File-Based Data Flow in the CMS Filter Farm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andre, J.M.; et al.
2015-12-23
During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.
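The schema of these JSON documents is not given in the abstract; as a hedged illustration of the bookkeeping idea only, with entirely hypothetical field names, a per-lumisection rate document written by an HLT process might resemble the following.

# Illustrative only: the real CMS DAQ JSON schema is not reproduced here.
import json

rate_doc = {
    "run": 999999,                 # placeholder run number
    "lumisection": 42,
    "process_id": 12345,
    "events_in": 20000,
    "events_accepted": 350,
    "output_files": ["stream_A_ls0042.dat"],   # hypothetical output file name
}

# Documents like this can stay memory-resident or be flushed to disk so that a
# downstream merger or watchdog process can aggregate them.
with open("rates_ls0042.jsn", "w") as f:
    json.dump(rate_doc, f)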
NASA Astrophysics Data System (ADS)
Jain, S.
2017-03-01
The High Granularity Calorimeter (HGCAL) is the technology choice of the CMS collaboration for the endcap calorimetry upgrade planned to cope with the harsh radiation and pileup environment at the High Luminosity LHC. The HGCAL is realized as a sampling calorimeter, including an electromagnetic compartment comprising 28 layers of silicon pad detectors with pad areas of 0.5-1.0 cm² interspersed with absorbers made from tungsten and copper to form a highly compact and granular device. Prototype modules, based on hexagonal silicon pad sensors with 128 channels, have been constructed and tested in beams at FNAL and at CERN. The modules include many of the features required for this challenging detector, including a PCB glued directly to the sensor, using through-hole wire-bonding for signal readout, and 5 mm spacing between layers, including the front-end electronics and all services. Tests in 2016 used an existing front-end chip, Skiroc2 (designed for the CALICE experiment at the ILC). We present results from first tests of these modules both in the laboratory and with beams of electrons, pions and protons, including noise performance and calibration with MIPs and electron signals.
Big Data over a 100G network at Fermilab
Garzoglio, Gabriele; Mhashilkar, Parag; Kim, Hyunwoo; ...
2014-06-11
As the need for Big Data in science becomes ever more relevant, networks around the world are upgrading their infrastructure to support high-speed interconnections. To support its mission, the high-energy physics community, as a pioneer in Big Data, has always relied on the Fermi National Accelerator Laboratory to be at the forefront of storage and data movement. This need was reiterated in recent years with the data-taking rate of the major LHC experiments reaching tens of petabytes per year. At Fermilab, this resulted regularly in peaks of data movement on the Wide Area Network (WAN) in and out of the laboratory of about 30 Gbit/s, and on the Local Area Network (LAN) between storage and computational farms of 160 Gbit/s. To address these ever increasing needs, as of this year Fermilab is connected to the Energy Sciences Network (ESnet) through a 100 Gb/s link. To understand the optimal system- and application-level configuration to interface computational systems with the new high-speed interconnect, Fermilab has deployed a Network Research & Development facility connected to the ESnet 100G Testbed. For the past two years, the High Throughput Data Program (HTDP) has been using the Testbed to identify gaps in data movement middleware [5] when transferring data at these high speeds. The program has published evaluations of technologies typically used in High Energy Physics, such as GridFTP [4], XrootD [9], and Squid [8]. Furthermore, this work presents the new R&D facility and the continuation of the evaluation program.
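For scale (an illustrative calculation, not a figure from the report), a fully utilised 100 Gb/s link moves

\[
100\ \mathrm{Gb/s} \times 86400\ \mathrm{s/day} = 8.64\times 10^{15}\ \mathrm{bits/day} \approx 1\ \mathrm{PB/day},
\]

so the tens-of-petabytes-per-year data volumes quoted above fit comfortably within the new link, whereas the earlier 30 Gbit/s WAN peaks left far less headroom.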
Upgrade of the TOTEM DAQ using the Scalable Readout System (SRS)
NASA Astrophysics Data System (ADS)
Quinto, M.; Cafagna, F.; Fiergolski, A.; Radicioni, E.
2013-11-01
The main goals of the TOTEM Experiment at the LHC are the measurements of the elastic and total p-p cross sections and the studies of the diffractive dissociation processes. At the LHC, collisions are produced at a rate of 40 MHz, imposing strong requirements on the Data Acquisition System (DAQ) in terms of trigger rate and data throughput. The TOTEM DAQ adopts a modular approach that, in standalone mode, is based on the VME bus system. The VME based Front End Driver (FED) modules host mezzanines that receive data through optical fibres directly from the detectors. After data checks and formatting are applied in the mezzanine, data are retransmitted to the VME interface and to another mezzanine card plugged into the FED module. The maximum bandwidth of the VME bus limits the first level trigger (L1A) rate to 1 kHz. In order to remove the VME bottleneck and improve scalability and the overall capabilities of the DAQ, a new system was designed and constructed based on the Scalable Readout System (SRS), developed in the framework of the RD51 Collaboration. The project aims to increase the efficiency of the current readout system by providing higher bandwidth and increasing data filtering, implementing a second-level trigger event selection based on hardware pattern recognition algorithms. This goal is to be achieved while preserving maximum backward compatibility with the LHC Timing, Trigger and Control (TTC) system as well as with the CMS DAQ. The obtained results and the perspectives of the project are reported. In particular, we describe the system architecture and the new Opto-FEC adapter card developed to connect the SRS with the FED mezzanine modules. A first test bench was built and validated during the last TOTEM data taking period (February 2013). Readout of a set of 3 TOTEM Roman Pot silicon detectors was carried out to verify performance in the real LHC environment. In addition, the test allowed a check of data consistency and quality.
ERIC Educational Resources Information Center
Curtis, John A.
The 5 year report on Project SETT-UP (Special Education via Telecommunications--Teacher Upgrade), a two way telecommunications inservice training project in southeastern Virginia, is presented. The report is organized into three sections concerned with personnel involved in the program, achievement of program objectives, and special program…
NASA Astrophysics Data System (ADS)
Hara, K.; Allport, P. P.; Baca, M.; Broughton, J.; Chisholm, A.; Nikolopoulos, K.; Pyatt, S.; Thomas, J. P.; Wilson, J. A.; Kierstead, J.; Kuczewski, P.; Lynn, D.; Arratia, M.; Hommels, L. B. A.; Ullan, M.; Bloch, I.; Gregor, I. M.; Tackmann, K.; Trofimov, A.; Yildirim, E.; Hauser, M.; Jakobs, K.; Kuehn, S.; Mahboubi, K.; Mori, R.; Parzefall, U.; Clark, A.; Ferrere, D.; Gonzalez Sevilla, S.; Ashby, J.; Blue, A.; Bates, R.; Buttar, C.; Doherty, F.; McMullen, T.; McEwan, F.; O'Shea, V.; Kamada, S.; Yamamura, K.; Ikegami, Y.; Nakamura, K.; Takubo, Y.; Unno, Y.; Takashima, R.; Chilingarov, A.; Fox, H.; Affolder, A. A.; Casse, G.; Dervan, P.; Forshaw, D.; Greenall, A.; Wonsak, S.; Wormald, M.; Cindro, V.; Kramberger, G.; Mandić, I.; Mikuž, M.; Gorelov, I.; Hoeferkamp, M.; Palni, P.; Seidel, S.; Taylor, A.; Toms, K.; Wang, R.; Hessey, N. P.; Valencic, N.; Hanagaki, K.; Dolezal, Z.; Kodys, P.; Bohm, J.; Mikestikova, M.; Bevan, A.; Beck, G.; Milke, C.; Domingo, M.; Fadeyev, V.; Galloway, Z.; Hibbard-Lubow, D.; Liang, Z.; Sadrozinski, H. F.-W.; Seiden, A.; To, K.; French, R.; Hodgson, P.; Marin-Reyes, H.; Parker, K.; Jinnouchi, O.; Hara, K.; Sato, K.; Sato, K.; Hagihara, M.; Iwabuchi, S.; Bernabeu, J.; Civera, J. V.; Garcia, C.; Lacasta, C.; Marti i. Garcia, S.; Rodriguez, D.; Santoyo, D.; Solaz, C.; Soldevila, U.
2016-09-01
The ATLAS group has evaluated the charge collection in silicon microstrip sensors irradiated up to a fluence of 1 × 10¹⁶ neq/cm², exceeding the maximum of 1.6 × 10¹⁵ neq/cm² expected for the strip tracker during the high luminosity LHC (HL-LHC) period including a safety factor of 2. The ATLAS12, an n⁺-on-p type sensor, which is fabricated by Hamamatsu Photonics (HPK) on float zone (FZ) substrates, is the latest barrel sensor prototype. The charge collection from the irradiated 1 × 1 cm² barrel test sensors has been evaluated systematically using penetrating β-rays and an Alibava readout system. The data obtained at different measurement sites are compared with each other and with the results obtained from the previous ATLAS07 design. The results are very consistent, in particular when the deposited charge is normalized by the sensor's active thickness derived from edge transient current technique (edge-TCT) measurements. The measurements obtained using β-rays are verified to be consistent with the measurements using an electron beam. The edge-TCT is also effective for evaluating the field profiles across the depth. The differences between the irradiated ATLAS07 and ATLAS12 samples have been examined along with the differences among the samples irradiated with different radiation sources: neutrons, protons, and pions. The studies of the bulk properties of the devices show that the devices can yield a sufficiently large signal for the expected fluence range in the HL-LHC, thereby acting as precision tracking sensors.
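The thickness normalisation mentioned above can be written schematically (a sketch of the idea only; the linear scaling is an assumption, not a formula quoted in the abstract) as

\[
q \;=\; \frac{Q_{\mathrm{meas}}}{d_{\mathrm{active}}},
\]

i.e. the collected charge per unit active depth, with d_active taken from the edge-TCT measurement, which is what makes sensors of different effective thickness directly comparable.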
NASA Astrophysics Data System (ADS)
Sugano, Michinaka; Ballarino, Amalia; Bartova, Barbora; Bjoerstad, Roger; Gerardin, Alexandre; Scheuerlein, Christian
2016-02-01
MgB2 wire is a promising superconductor for the superconducting links for the high-luminosity upgrade of the Large Hadron Collider at CERN. The mechanical properties of MgB2 must be fully quantified for the cable design, and in this study we evaluate the Young’s modulus of MgB2 filaments in wires with a practical level of critical current. The Young’s moduli of MgB2 filaments produced by two different processes, in situ and ex situ, were compared. Two different evaluation methods were applied to an in situ MgB2 wire: a single-fiber tensile test and a tensile test after removing the Monel. In addition, the Young’s modulus of the few-micron-thick Nb-Ni reaction layer in an ex situ processed wire was evaluated using a nanoindentation testing technique to improve the accuracy of an analysis based on the rule of mixtures. The Young’s moduli of the in situ and ex situ MgB2 wires were in the range of 76-97 GPa, and no distinct dependence on the fabrication process was found.
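The rule-of-mixtures analysis referred to above amounts to the standard relation for a composite wire under axial load (shown here as a generic sketch; the volume fractions are whatever applies to the measured wire, not values from the abstract):

\[
E_{\mathrm{wire}} \;=\; \sum_i V_i E_i \;=\; V_{\mathrm{MgB_2}} E_{\mathrm{MgB_2}} + V_{\mathrm{matrix}} E_{\mathrm{matrix}} + V_{\mathrm{Nb\text{-}Ni}} E_{\mathrm{Nb\text{-}Ni}},
\]

so the filament modulus follows once the wire modulus, the other constituent moduli (e.g. the Nb-Ni layer value from nanoindentation) and the volume fractions are known.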
NASA Astrophysics Data System (ADS)
Wang, T.; Barbero, M.; Berdalovic, I.; Bespin, C.; Bhat, S.; Breugnon, P.; Caicedo, I.; Cardella, R.; Chen, Z.; Degerli, Y.; Egidos, N.; Godiot, S.; Guilloux, F.; Hemperek, T.; Hirono, T.; Krüger, H.; Kugathasan, T.; Hügging, F.; Marin Tobon, C. A.; Moustakas, K.; Pangaud, P.; Schwemling, P.; Pernegger, H.; Pohl, D.-L.; Rozanov, A.; Rymaszewski, P.; Snoeys, W.; Wermes, N.
2018-03-01
Depleted monolithic active pixel sensors (DMAPS), which exploit high voltage and/or high resistivity add-ons of modern CMOS technologies to achieve substantial depletion in the sensing volume, have proven to have high radiation tolerance towards the requirements of ATLAS in the high-luminosity LHC era. DMAPS integrating fast readout architectures are currently being developed as promising candidates for the outer pixel layers of the future ATLAS Inner Tracker, which will be installed during the phase II upgrade of ATLAS around 2025. In this work, two DMAPS prototype designs, named LF-Monopix and TJ-Monopix, are presented. LF-Monopix was fabricated in the LFoundry 150 nm CMOS technology, and TJ-Monopix has been designed in the TowerJazz 180 nm CMOS technology. Both chips employ the same readout architecture, i.e. the column drain architecture, whereas different sensor implementation concepts are pursued. The paper gives a joint description of the two prototypes, so that their technical differences and challenges can be addressed in direct comparison. First measurement results for LF-Monopix will also be shown, demonstrating for the first time a fully functional fast readout DMAPS prototype implemented in the LFoundry technology.
The Belle II imaging Time-of-Propagation (iTOP) detector
Fast, J.
2017-02-16
High precision flavor physics measurements are an essential complement to the direct searches for new physics at the LHC ATLAS and CMS experiments. We will perform these measurements using the upgraded Belle II detector that will take data at the SuperKEKB accelerator. With 40x the luminosity of KEKB, the detector systems must operate efficiently at much higher rates than the original Belle detector. A central element of the upgrade is the barrel particle identification system. Belle II has built and installed an imaging-Time-of-Propagation (iTOP) detector. The iTOP uses quartz optics as Cherenkov radiators. The photons are transported down the quartz bars via total internal reflection, with a spherical mirror at the forward end to reflect photons to the backward end where they are imaged onto an array of segmented Micro-Channel Plate Photo-Multiplier Tubes (MCP-PMTs). The system is read out using giga-samples per second waveform sampling Application-Specific Integrated Circuits (ASICs). The combined timing and spatial distribution of the photons for each event is used to determine the particle species. This paper provides an overview of the iTOP system.
The CMS Level-1 Calorimeter Trigger for LHC Run II
NASA Astrophysics Data System (ADS)
Sinthuprasith, Tutanon
2017-01-01
The phase-1 upgrades of the CMS Level-1 calorimeter trigger have been completed. The Level-1 trigger has been fully commissioned and will be used by CMS to collect data starting from the 2016 data run. The new trigger has been designed to improve the performance at high luminosity and a large number of simultaneous inelastic collisions per crossing (pile-up). For this purpose it uses a novel design, the Time Multiplexed Trigger (TMT) design, which enables the data from an event to be processed by a single trigger processor at full granularity over several bunch crossings. The TMT design is a modular design based on the uTCA standard. The architecture is flexible and the number of trigger processors can be expanded according to the physics needs of CMS. Intelligent, more complex, and innovative algorithms are now the core of the first decision layer of CMS: the upgraded trigger system implements pattern recognition and MVA (Boosted Decision Tree) regression techniques in the trigger processors for pT assignment, pile-up subtraction, and isolation requirements for electrons and taus. The performance of the TMT design, the latency measurements, and the algorithm performance measured using data are also presented here.
Data Quality Monitoring System for New GEM Muon Detectors for the CMS Experiment Upgrade
NASA Astrophysics Data System (ADS)
King, Robert; CMS Muon Group Team
2017-01-01
The Gas Electron Multiplier (GEM) detectors are novel detectors designed to improve the muon trigger and tracking performance in the CMS experiment for the high luminosity upgrade of the LHC. Partial installation of GEM detectors is planned during the 2016-2017 technical stop. Before the GEM system is installed underground, its data acquisition (DAQ) electronics must be thoroughly tested. The DAQ system includes several commercial and custom-built electronic boards running custom firmware. The front-end electronics are radiation-hard and communicate via optical fibers. The data quality monitoring (DQM) software framework has been designed to provide online verification of the integrity of the data produced by the detector electronics, and to promptly identify potential hardware or firmware malfunctions in the system. Local hit reconstruction and clustering algorithms allow quality control of the data produced by each GEM chamber. Once the new detectors are installed, the DQM will monitor the stability and performance of the system during normal data-taking operations. We discuss the design of the DQM system, the software being developed to read out and process the detector data, and the methods used to identify and report hardware and firmware malfunctions of the system.
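The local clustering step can be illustrated with a minimal sketch; the grouping rule and data layout below are assumptions for illustration, not the actual CMS GEM DQM code. Adjacent fired strips are merged into clusters whose size and centre the DQM can then histogram per chamber.

# Illustrative strip-clustering sketch; not the actual GEM DQM implementation.
def cluster_strips(fired_strips):
    """Group strip numbers into clusters of adjacent strips."""
    clusters, current = [], []
    for s in sorted(fired_strips):
        if current and s != current[-1] + 1:   # gap found: close the current cluster
            clusters.append(current)
            current = []
        current.append(s)
    if current:
        clusters.append(current)
    return clusters

# Example: strips 10-12 and 40 fired in one readout frame
for c in cluster_strips([12, 10, 11, 40]):
    print(f"cluster: size={len(c)}, centre strip={sum(c) / len(c)}")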
NASA Astrophysics Data System (ADS)
Meng, X. T.; Levin, D. S.; Chapman, J. W.; Zhou, B.
2016-09-01
The ATLAS Muon Spectrometer endcap thin-Resistive Plate Chamber trigger project complements the New Small Wheel endcap Phase-1 upgrade for higher luminosity LHC operation. These new trigger chambers, located in a high rate region of ATLAS, will improve overall trigger acceptance and reduce the fake muon trigger incidence. These chambers must generate a low level muon trigger to be delivered to a remote high level processor within a stringent latency requirement of 43 bunch crossings (1075 ns). To help meet this requirement the High Performance Time to Digital Converter (HPTDC), a multi-channel ASIC designed by the CERN Microelectronics group, has been proposed for the digitization of the fast front end detector signals. This paper investigates the HPTDC performance in the context of the overall muon trigger latency, employing detailed behavioral Verilog simulations in which the latency in triggerless mode is measured for a range of configurations and under realistic hit rate conditions. The simulation results show that various HPTDC operational configurations, including leading edge and pair measurement modes, can provide high efficiency (>98%) to capture and digitize hits within a time interval satisfying the Phase-1 latency tolerance.
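The latency figure quoted above is simply the bunch-crossing budget multiplied by the LHC bunch spacing,

\[
43\ \text{bunch crossings} \times 25\ \mathrm{ns} = 1075\ \mathrm{ns},
\]

of which the HPTDC digitisation can consume only a fraction once cable delays and downstream trigger logic are budgeted (the abstract does not give that breakdown).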
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malik, S.; Shipsey, I.; Cavanaugh, R.
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data taking, the nature of this training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of a myriad of talents in the development of physics, service, upgrade, education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
Research of TREETOPS Structural Dynamics Controls Simulation Upgrade
NASA Technical Reports Server (NTRS)
Yates, Rose M.
1996-01-01
Under the provisions of contract number NAS8-40194, which was entitled 'TREETOPS Structural Dynamics and Controls Simulation System Upgrade', Oakwood College contracted to produce an upgrade to the existing TREETOPS suite of analysis tools. This suite includes the main simulation program, TREETOPS, two interactive preprocessors, TREESET and TREEFLX, an interactive post processor, TREEPLOT, and an adjunct program, TREESEL. A 'Software Design Document', which provides descriptions of the argument lists and internal variables for each subroutine in the TREETOPS suite, was established. Additionally, installation guides for both DOS and UNIX platforms were developed. Finally, updated User's Manuals, as well as a Theory Manual, were generated.
Generic particulate-monitoring system for retrofit to Hanford exhaust stacks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camman, J.W.; Carbaugh, E.H.
1982-11-01
Evaluations of 72 sampling and monitoring systems were performed at Hanford as the initial phase of a program to upgrade such systems. Each evaluation included determination of theoretical sampling efficiencies for particle sizes ranging from 0.5 to 10 micrometers aerodynamic equivalent diameter, addressing anisokinetic bias, sample transport line losses, and collector device efficiency. Upgrades needed to meet current Department of Energy guidance for effluent sampling and monitoring were identified, and a cost for each upgrade was estimated. A relative priority for each system's upgrade was then established based on evaluation results, current operational status, and future plans for the facility being exhausted. Common system upgrade requirements led to the development of a generic design for common components of an exhaust stack sampling and monitoring system for airborne radioactive particulates. The generic design consists of commercially available off-the-shelf components to the extent practical and will simplify future stack sampling and monitoring system design, fabrication, and installation efforts. Evaluation results and their significance to system upgrades are emphasized. A brief discussion of the analytical models used and experience to date with the upgrade program is included. Development of the generic stack sampling and monitoring system design is outlined. Generic system design features and limitations are presented. Requirements for generic system retrofitting to existing exhaust stacks are defined and benefits derived from generic system application are discussed.
The QuarkNet CMS masterclass: bringing the LHC to students
NASA Astrophysics Data System (ADS)
Cecire, Kenneth; McCauley, Thomas
2016-04-01
QuarkNet is an educational program which brings high school teachers and their students into the particle physics research community. The program supports research experiences and professional development workshops and provides inquiry-oriented investigations, some using real experimental data. The CMS experiment at the LHC has released several thousand proton-proton collision events for use in education and outreach. QuarkNet, in collaboration with CMS, has developed a physics masterclass and e-Lab based on this data. A masterclass is a day-long educational workshop where high school students travel to nearby universities and research laboratories. There they learn from LHC physicists about the basics of particle physics and detectors. They then perform a simple measurement using LHC data, and share their results with other students around the world via videoconference. Since 2011 thousands of students from over 25 countries have participated in the CMS masterclass as organized by QuarkNet and the International Particle Physics Outreach Group (IPPOG). We describe here the masterclass exercise: the physics, the online event display and database preparation behind it, the measurement the students undertake, their results and experiences, and future plans for the exercise.
FOREWORD: International Conference on Heavy Ion Collisions in the LHC Era
NASA Astrophysics Data System (ADS)
Arleo, Francois; Salgado, Carlos A.; Tran Thanh Van, Jean
2013-03-01
The International Conference on Heavy Ion Collisions in the LHC Era was held in Quy Nhon, Vietnam, on 16-20 July 2012. The series Rencontres du Vietnam, created by Jean Tran Thanh Van in 1993, consists of international meetings aimed to stimulate the development of advanced research in Vietnam and more generally in South East Asia, and to establish collaborative research networks with Western scientific communities. This conference, as the whole series, also supports the International Center for Interdisciplinary Science Education being built in Quy Nhon. The articles published in this volume present the latest results from the heavy-ion collision programs of RHIC and LHC as well as the corresponding theoretical interpretation and future perspectives. Lower energy nuclear programs were also reviewed, providing a rather complete picture of the state-of-the-art in the field. We wish to thank the sponsors of the Conference on Heavy Ion Collisions in the LHC Era: the European Research Council; Xunta de Galicia (Spain); EMMI (Germany); and the Agence Nationale de la Recherche (France). François Arleo (Laboratoire d'Annecy-le-Vieux de Physique Théorique, France), Carlos A. Salgado and Jean Tran Thanh Van
Experiments and Cycling at the LHC Prototype Half-Cell
NASA Astrophysics Data System (ADS)
Saban, R.; Casas-Cubillos, J.; Coull, L.; Cruikshank, P.; Dahlerup-Petersen, K.; Hilbert, B.; Krainz, G.; Kos, N.; Lebrun, P.; Momal, F.; Misiaen, D.; Parma, V.; Poncet, A.; Riddone, G.; Rijllart, A.; Rodriguez-Mateos, F.; Schmidt, R.; Serio, L.; Wallen, E.; van Weelderen, R.; Williams, L. R.
1997-05-01
The first version of the LHC prototype half-cell has been in operation since February 1995. It consists of one quadrupole and three 10-m twin aperture dipole magnets which operate at 1.8 K. This experimental set-up has been used to observe and study phenomena which appear when the systems are assembled in one unit and influence one another. The 18-month long experimental program has validated the cryogenic system and yielded a number of results on cryogenic instrumentation, magnet protection and vacuum, in particular under non-standard operating conditions. The program was recently complemented by the cycling experiment: it consisted of powering the magnets following the ramp rates which will be experienced by the magnets during an LHC injection. In order to simulate 10 years of routine operation of the LHC, more than 2000 1-hour cycles were performed, interleaved with provoked quenches. The objective of this experiment was to reveal possible flaws in the design of components. The prototype half-cell performed to expectations, showing no sign of failure or fatigue of components for more than 2000 cycles, until one of the dipoles started exhibiting erratic quench behavior.
A software upgrade method for micro-electronics medical implants.
Cao, Yang; Hao, Hongwei; Xue, Lin; Li, Luming; Ma, Bozhi
2006-01-01
A software upgrade method for micro-electronics medical implants is designed to enhance the devices' function or renew the software if bugs are found, the software needs updating, or some memory units become disabled. The implants need not be replaced by surgery if the faults can be corrected through reprogramming, which reduces the patients' pain and effectively improves safety. This paper introduces the software upgrade method using in-application programming (IAP) and emphasizes how to ensure the reliability and stability of the system, especially the implanted part, while upgrading.
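The abstract does not spell out the update protocol; as a conceptual sketch of the safety idea only (written in Python purely for readability, with every step an assumption rather than the paper's implementation), an IAP-style upgrade typically writes the new image into an inactive memory bank, verifies it, and only then switches, so that a failed transfer never leaves the implant without runnable firmware.

# Conceptual sketch only (not the paper's method and not real firmware): receive
# the image into a spare bank, verify a checksum, then mark the bank active.
import zlib

BANK_A, BANK_B = "A", "B"

def perform_upgrade(active_bank, image_chunks, expected_crc):
    spare = BANK_B if active_bank == BANK_A else BANK_A
    written = bytearray()
    for chunk in image_chunks:      # telemetry link delivers the image in chunks
        written.extend(chunk)       # stand-in for "program a flash page in the spare bank"
    if zlib.crc32(bytes(written)) != expected_crc:
        return active_bank          # verification failed: keep running the old image
    return spare                    # verified: boot from the new bank at next reset

# Example
image = [b"new firmware ", b"version 2"]
print("active bank after upgrade:", perform_upgrade(BANK_A, image, zlib.crc32(b"".join(image))))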
NASA STI Program Coordinating Council Eleventh Meeting: NASA STI Modernization Plan
NASA Technical Reports Server (NTRS)
1993-01-01
The theme of this NASA Scientific and Technical Information Program Coordinating Council Meeting was the modernization of the STI Program. Topics covered included the activities of the Engineering Review Board in the creation of the Infrastructure Upgrade Plan, the progress of the RECON Replacement Project, the use and status of Electronic SCAN (Selected Current Aerospace Notices), the Machine Translation Project, multimedia, electronic document interchange, the NASA Access Mechanism, computer network upgrades, and standards in the architectural effort.
Hey, Matthias; Hocke, Thomas; Mauger, Stefan; Müller-Deile, Joachim
2016-11-01
Individual speech intelligibility was measured in quiet and noise for cochlear implant recipients upgrading from the Freedom to the CP900 series sound processor. The postlingually deafened participants (n = 23) used either a Nucleus CI24RE or a CI512 cochlear implant, and currently wore a Freedom sound processor. A significant group mean improvement in speech intelligibility was found in quiet (Freiburg monosyllabic words at 50 dB SPL) and in noise (adaptive Oldenburger sentences in noise) for the two CP900 series SmartSound programs compared to the Freedom program. Further analysis was carried out on individuals' speech intelligibility outcomes in quiet and in noise. Results showed a significant improvement or decrement for some recipients when upgrading to the new programs. To further increase speech intelligibility outcomes when upgrading, an enhanced upgrade procedure is proposed that includes additional testing with different signal-processing schemes. Implications of this research are that future automated scene analysis and switching technologies could provide additional performance improvements by introducing individualized scene-dependent settings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lombardo, v.; Barzi, E.; Turrioni, D.
Superconducting magnets with magnetic fields above 20 T will be needed for a Muon Collider and a possible LHC energy upgrade. This field level exceeds the possibilities of traditional Low Temperature Superconductors (LTS) such as Nb3Sn and Nb3Al. Presently the use of high field high temperature superconductors (HTS) is the only option available for achieving such field levels. Commercially available YBCO comes in tapes and shows noticeable anisotropy with respect to field orientation, which needs to be accounted for during magnet design. In the present work, critical current test results are presented for YBCO tape manufactured by Bruker. Short sample measurement results are presented up to 14 T, assessing the level of anisotropy as a function of field, field orientation and operating temperature.
A 10 Gb/s laser driver in 130 nm CMOS technology for high energy physics applications
Zhang, T.; Tavernier, F.; Moreira, P.; ...
2015-02-19
The GigaBit Laser Driver (GBLD) is a key on-detector component of the GigaBit Transceiver (GBT) system at the transmitter side. We have developed a 10 Gb/s GBLD (GBLD10) in a 130 nm CMOS technology, as part of the design efforts towards the upgrade of the electrical components of the LHC experiments. The GBLD10 is based on the distributed-amplifier (DA) architecture and achieves data rates up to 10 Gb/s. It is capable of driving VCSELs with modulation currents up to 12 mA. Furthermore, a pre-emphasis function has been included in the proposed laser driver in order to compensate for the capacitive load and channel losses.
Design, prototyping, and testing of a compact superconducting double quarter wave crab cavity
Xiao, Binping; Alberty, Luis; Belomestnykh, Sergey; ...
2015-04-01
We proposed a novel design for a compact superconducting crab cavity with a double quarter wave (DQWCC) shape. After fabrication and surface treatments, this niobium proof-of-principle cavity was tested cryogenically in a vertical cryostat. The cavity is extremely compact yet has a low frequency of 400 MHz, an essential property for service in the Large Hadron Collider luminosity upgrade. The cavity’s electromagnetic properties are well suited for this demanding task. The demonstrated deflecting voltage of 4.6 MV is well above the required 3.34 MV for a crab cavity in the future High Luminosity LHC. In this paper, we present the design, prototyping, and results from testing the DQWCC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henriques, A.
TileCal is the Hadronic calorimeter covering the most central region of the ATLAS experiment at the LHC. It uses iron plates as absorber and plastic scintillating tiles as the active material. Scintillation light produced in the tiles is transmitted by wavelength shifting fibres to photomultiplier tubes (PMTs). The resulting electronic signals from the approximately 10000 PMTs are measured and digitised every 25 ns before being transferred to off-detector data-acquisition systems. This contribution will review in a first part the performance of the calorimeter during Run 1, obtained from calibration data and from studies of the response of particles from collisions. In a second part it will present the solutions being investigated for the ongoing and future upgrades of the calorimeter electronics. (authors)
Future hadron colliders: From physics perspectives to technology R&D
NASA Astrophysics Data System (ADS)
Barletta, William; Battaglia, Marco; Klute, Markus; Mangano, Michelangelo; Prestemon, Soren; Rossi, Lucio; Skands, Peter
2014-11-01
High energy hadron colliders have been instrumental to discoveries in particle physics at the energy frontier and their role as discovery machines will remain unchallenged for the foreseeable future. The full exploitation of the LHC is now the highest priority of the energy frontier collider program. This includes the high luminosity LHC project, which is made possible by a successful technology-readiness program for Nb3Sn superconductor and magnet engineering based on long-term high-field magnet R&D programs. These programs open the path towards collisions with a luminosity of 5×1034 cm-2 s-1 and represent the foundation for considering future proton colliders of higher energies. This paper discusses physics requirements, experimental conditions, technological aspects and design challenges for the development towards proton colliders of increasing energy and luminosity.
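For orientation, the luminosity figure quoted above can be related to the machine parameters through the standard round-beam expression below; the grouping of symbols is the textbook one and the values are not taken from this abstract.

```latex
% Standard round-beam luminosity estimate (symbols as usually defined):
% n_b colliding bunch pairs, N protons per bunch, f_rev revolution frequency,
% \gamma Lorentz factor, \epsilon_n normalised emittance, \beta^* beta
% function at the interaction point, R geometric reduction factor from the
% crossing angle.
\[
\mathcal{L} \;=\; \frac{n_b\, N^{2}\, f_{\mathrm{rev}}\, \gamma}
                       {4\pi\, \epsilon_n\, \beta^{*}}\; R
\]
```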
The Nova Upgrade Facility for ICF ignition and gain
NASA Astrophysics Data System (ADS)
Lowdermilk, W. H.; Campbell, E. M.; Hunt, J. T.; Murray, J. R.; Storm, E.; Tobin, M. T.; Trenholme, J. B.
1992-01-01
Research on Inertial Confinement Fusion (ICF) is motivated by its potential defense and civilian applications, including ultimately the generation of electric power. The U.S. ICF Program was reviewed recently by the National Academy of Science (NAS) and the Fusion Policy Advisory Committee (FPAC). Both committees issued final reports in 1991 which recommended that first priority in the ICF program be placed on demonstrating fusion ignition and modest gain (G less than 10). The U.S. Department of Energy and Lawrence Livermore National Laboratory (LLNL) have proposed an upgrade of the existing Nova Laser Facility at LLNL to accomplish these goals. Both the NAS and FPAC have endorsed the upgrade of Nova as the optimal path to achieving ignition and gain. Results from Nova Upgrade Experiments will be used to define requirements for driver and target technology both for future high-yield military applications, such as the Laboratory Microfusion Facility (LMF) proposed by the Department of Energy, and for high-gain energy applications leading to an ICF engineering test facility. The central role and modifications which Nova Upgrade would play in the national ICF strategy are described.
Hu, Guijie; Yi, Yanhua
2016-01-01
Rural health professionals in township health centers (THCs) tend to have less advanced educational degrees. This study aimed to ascertain the perceived feasibility of a decentralized continuing medical education (CME) program to upgrade their educational levels. A cross-sectional survey of THC health professionals was conducted using a self-administered, structured questionnaire in Guangxi Zhuang Autonomous Region, China. The health professionals in the THCs were overwhelmingly young with low education levels. They had a strong desire to upgrade their educational degrees. The decentralized CME program was perceived as feasible by health workers with positive attitudes about the benefit for license examination, and by those who intended to improve their clinical diagnosis and treatment skills. The target groups of such a program were those who expected to undertake a bachelor's degree and who rated themselves as "partially capable" in clinical competency. They reported that 160-400 USD annually would be an affordable fee for the program. A decentralized CME program was perceived as feasible for upgrading rural health workers' education level to a bachelor's degree and improving their clinical competency.
Highlights from the STAR experiment at RHIC
NASA Astrophysics Data System (ADS)
Schmah, Alexander; STAR Collaboration
2017-11-01
We present an overview of results presented by the STAR collaboration at the Quark Matter 2017 conference. We report the precision measurement of elliptic flow of D0 mesons and the first measurement of Λc baryons in Au+Au collisions at √sNN = 200 GeV, which suggest thermalization of charm quarks. B-hadron production is also measured via the decay daughters, J/ψ, D0 and electrons, in Au+Au collisions, exhibiting less suppression than charm mesons at high transverse momenta (pT) for the D0 and electron decay channels. J/ψ and ϒ measurements provide new insights into in-medium regeneration and dissociation. Di-electron production shows evidence for coherent photo-production at very low pT. The first RHIC measurement of direct photon-jet coincidences is reported, and compared to distributions with π0 and charged hadron triggers. The jet shared momentum fraction zg measurement shows no modification in A+A collisions, in contrast to LHC measurements. First results from the fixed-target program at STAR show good agreement with published results. New analysis methods of the chiral magnetic effect are discussed, together with the first observation of a non-zero global Λ polarization in A+A collisions. Finally, we give an outlook to detector upgrades for the Beam Energy Scan phase II.
Development of n-in-p pixel modules for the ATLAS upgrade at HL-LHC
NASA Astrophysics Data System (ADS)
Macchiolo, A.; Nisius, R.; Savic, N.; Terzo, S.
2016-09-01
Thin planar pixel modules are promising candidates to instrument the inner layers of the new ATLAS pixel detector for HL-LHC, thanks to their reduced contribution to the material budget and their high charge collection efficiency after irradiation. 100-200 μm thick sensors, interconnected to FE-I4 read-out chips, have been characterized with radioactive sources and beam tests at the CERN-SPS and DESY. The results of these measurements are reported for devices before and after irradiation up to a fluence of 14×1015 neq/cm2. The charge collection and tracking efficiency of the different sensor thicknesses are compared. The outlook for future planar pixel sensor production is discussed, with a focus on sensor designs with the pixel pitches (50×50 and 25×100 μm2) foreseen for the RD53 Collaboration read-out chip in 65 nm CMOS technology. An optimization of the biasing structures in the pixel cells is required to avoid the hit efficiency loss presently observed in the punch-through region after irradiation. For this purpose the performance of different layouts has been compared in FE-I4 compatible sensors at various fluence levels by using beam test data. Highly segmented sensors will represent a challenge for the tracking in the forward region of the pixel system at HL-LHC. In order to reproduce the performance of 50×50 μm2 pixels at high pseudo-rapidity values, FE-I4 compatible planar pixel sensors have been studied before and after irradiation in beam tests at high incidence angle (80°) with respect to the short pixel direction. Results on cluster shapes, charge collection and hit efficiency will be shown.
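A rough geometric estimate, not taken from the abstract but using the thickness, pitch and angle quoted above purely for illustration, shows why such inclined tracks produce long clusters.

```latex
% Track crossing a sensor of thickness t at angle \theta to the normal:
\[
\ell \simeq t\,\tan\theta, \qquad
N_{\mathrm{pix}} \simeq \frac{t\,\tan\theta}{p} + 1 .
\]
% For an assumed t = 100~\mu\mathrm{m}, pitch p = 50~\mu\mathrm{m} and
% \theta = 80^\circ: \ell \approx 100~\mu\mathrm{m}\times 5.67 \approx
% 570~\mu\mathrm{m}, i.e. clusters of roughly a dozen pixels along the
% short-pitch direction.
```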
NASA Technical Reports Server (NTRS)
Monroe, Joseph; Kelkar, Ajit
2003-01-01
The NASA PAIR program incorporated NASA-sponsored research into the undergraduate environment at North Carolina Agricultural and Technical State University. This program is designed to significantly improve undergraduate education in the areas of mathematics, science, engineering, and technology (MSET) by directly benefiting from the experiences of NASA field centers, affiliated industrial partners and academic institutions. The three basic goals of the program were enhancing core courses in the MSET curriculum, upgrading core engineering laboratories to complement the upgraded MSET curriculum, and conducting research training for undergraduates in MSET disciplines through a sophomore shadow program and through Research Experience for Undergraduates (REU) programs. Since the inception of the program, nine courses have been modified to include NASA-related topics and research. These courses have impacted over 900 students in the first three years of the program. The Electrical Engineering circuits lab has been completely re-equipped with computer-controlled data acquisition equipment. The Physics lab has been upgraded to implement better sensory data acquisition to enhance students' understanding of course concepts. In addition, a new instrumentation laboratory was developed in the Department of Mechanical Engineering. Research training for A&T students was conducted through four different programs: the Apprentice program, the Developers program, the Sophomore Shadow program and the Independent Research program. These programs provided opportunities for an average of forty students per semester.
NASA Astrophysics Data System (ADS)
Dienes, Keith R.; Su, Shufang; Thomas, Brooks
2015-03-01
In this paper, we examine the strategies and prospects for distinguishing between traditional dark-matter models and models with nonminimal dark sectors—including models of Dynamical Dark Matter—at hadron colliders. For concreteness, we focus on events with two hadronic jets and large missing transverse energy at the Large Hadron Collider (LHC). As we discuss, simple "bump-hunting" searches are not sufficient; probing nonminimal dark sectors typically requires an analysis of the actual shapes of the distributions of relevant kinematic variables. We therefore begin by identifying those kinematic variables whose distributions are particularly suited to this task. However, as we demonstrate, this then leads to a number of additional subtleties, since cuts imposed on the data for the purpose of background reduction can at the same time have the unintended consequence of distorting these distributions in unexpected ways, thereby obscuring signals of new physics. We therefore proceed to study the correlations between several of the most popular relevant kinematic variables currently on the market, and investigate how imposing cuts on one or more of these variables can impact the distributions of others. Finally, we combine our results in order to assess the prospects for distinguishing nonminimal dark sectors in this channel at the upgraded LHC.
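A minimal toy Monte Carlo, with invented variable names and an assumed correlation model, illustrates the effect described above: a background-reduction cut on one kinematic variable reshapes the distribution of a correlated one.

```python
# Toy illustration (not from the paper): how a cut on one kinematic
# variable can reshape the distribution of a correlated one.
import numpy as np

rng = np.random.default_rng(42)
n_events = 100_000

# Toy "signal": missing transverse energy (MET) and leading-jet pT drawn
# with a positive correlation, mimicking recoil against the jet (assumed model).
met = rng.exponential(scale=200.0, size=n_events)                 # GeV
jet_pt = 0.7 * met + rng.exponential(scale=80.0, size=n_events)   # GeV

before_mean = jet_pt.mean()          # jet pT shape before any selection

# Background-reduction cut on MET (threshold is an arbitrary choice here)
mask = met > 250.0
after_mean = jet_pt[mask].mean()

print(f"mean leading-jet pT before MET cut: {before_mean:6.1f} GeV")
print(f"mean leading-jet pT after  MET cut: {after_mean:6.1f} GeV")
# The shift in the shape (already visible in the mean) is the kind of
# cut-induced distortion that has to be disentangled from new physics.
```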
NASA Astrophysics Data System (ADS)
Bianco, M.; Martoiu, S.; Sidiropoulou, O.; Zibell, A.
2015-12-01
A Micromegas (MM) quadruplet prototype with an active area of 0.5 m2, which adopts the general design foreseen for the upgrade of the innermost forward muon tracking systems (Small Wheels) of the ATLAS detector in 2018-2019, has been built at CERN and is going to be tested in the ATLAS cavern environment during the LHC Run 2 period 2015-2017. The integration of this prototype detector into the ATLAS data acquisition system using custom ATCA equipment is presented. An ATLAS compatible Read Out Driver (ROD) based on the Scalable Readout System (SRS), the Scalable Readout Unit (SRU), will be used in order to transmit the data after generating valid event fragments to the high-level Read Out System (ROS). The SRU will be synchronized with the LHC bunch crossing clock (40.08 MHz) and will receive the Level-1 trigger signals from the Central Trigger Processor (CTP) through the TTCrx receiver ASIC. The configuration of the system will be driven directly from the ATLAS Run Control System. By using the ATLAS TDAQ Software, a dedicated Micromegas segment has been implemented, in order to include the detector inside the main ATLAS DAQ partition. A full set of tests, on the hardware and software aspects, is presented.
Electrical properties study under radiation of the 3D-open-shell-electrode detector
NASA Astrophysics Data System (ADS)
Liu, Manwen; Li, Zheng
2018-05-01
Since the 3D-Open-Shell-Electrode Detector (3DOSED) has been proposed and its structure optimized, it is important to study the 3DOSED's electrical properties to determine the detector's working performance, especially in heavy radiation environments such as the Large Hadron Collider (LHC) and its upgrade, the High Luminosity LHC (HL-LHC), at CERN. In this work, full 3D technology computer-aided design (TCAD) simulations have been performed on this novel silicon detector structure. Simulated detector properties include the electric field distribution, the electric potential distribution, current-voltage (I-V) characteristics, capacitance-voltage (C-V) characteristics, charge collection properties, and the full depletion voltage. Through the analysis of calculations and simulation results, we find that the 3DOSED's electric field and potential distributions are very uniform, even in the tiny region near the shell openings, with only little perturbation. The novel detector fulfils its design purpose of collecting charges generated by particles/light, with a well defined funnel-shaped electric potential distribution that makes these charges drift towards the center collection electrode. Furthermore, by analyzing the I-V, C-V, charge collection properties and full depletion voltage, we can expect that the novel detector will perform well, even in heavy radiation environments.
ALICE HLT Run 2 performance overview.
NASA Astrophysics Data System (ADS)
Krzewicki, Mikolaj; Lindenstruth, Volker;
2017-10-01
For the LHC Run 2 the ALICE HLT architecture was consolidated to comply with the upgraded ALICE detector readout technology. The software framework was optimized and extended to cope with the increased data load. Online calibration of the TPC using the online tracking capabilities of the ALICE HLT was deployed. Offline calibration code was adapted to run both online and offline, and the HLT framework was extended to support this. The performance of this scheme is important for Run 3 related developments. An additional data transport approach was developed using the ZeroMQ library, forming at the same time a test bed for the new data flow model of the O2 system, where further development of this concept is ongoing. This messaging technology was used to implement the calibration feedback loop augmenting the existing, graph oriented HLT transport framework. Utilising the online reconstruction of many detectors, a new asynchronous monitoring scheme was developed to allow real-time monitoring of the physics performance of the ALICE detector, on top of the new messaging scheme for both internal and external communication. Spare computing resources comprising the production and development clusters are run as a Tier-2 GRID site using an OpenStack-based setup. The development cluster is running continuously, while the production cluster contributes resources opportunistically during periods of LHC inactivity.
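As a hedged sketch of the messaging pattern mentioned above (ZeroMQ publish/subscribe), the following example shows a publisher emitting monitoring payloads and a subscriber filtering on a topic; the endpoint, topic name and payload fields are illustrative assumptions, not the actual ALICE HLT configuration.

```python
# Minimal ZeroMQ publish/subscribe sketch (pyzmq); values are invented.
import json
import threading
import time

import zmq

ENDPOINT = "tcp://127.0.0.1:5556"   # assumed endpoint for this example

def publisher(ctx):
    pub = ctx.socket(zmq.PUB)
    pub.bind(ENDPOINT)
    time.sleep(0.2)                 # give the subscriber time to connect
    for i in range(3):
        payload = {"run": 244918, "histogram": "tpc_dEdx", "entries": 1000 + i}
        pub.send_string("MONITOR " + json.dumps(payload))
        time.sleep(0.05)
    pub.close()

def subscriber(ctx):
    sub = ctx.socket(zmq.SUB)
    sub.connect(ENDPOINT)
    sub.setsockopt_string(zmq.SUBSCRIBE, "MONITOR")
    sub.setsockopt(zmq.RCVTIMEO, 2000)   # do not block forever
    while True:
        try:
            topic, _, body = sub.recv_string().partition(" ")
        except zmq.Again:
            break                        # no more messages within the timeout
        print(topic, json.loads(body))
    sub.close()

ctx = zmq.Context()
t = threading.Thread(target=subscriber, args=(ctx,))
t.start()
publisher(ctx)
t.join()
ctx.term()
```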
Contractor Sales Training: Providing the Skills Necessary to Sell Comprehensive Home Energy Upgrades
DOE Office of Scientific and Technical Information (OSTI.GOV)
Billingsley, Megan; Stuart, Elizabeth
2011-08-17
Many comprehensive home energy efficiency programs rely on contractors as the customer-facing ‘front line’ to sell energy improvements. Adding sales skills to contractors’ existing technical expertise is key to converting more assessments into comprehensive home energy upgrades. Leading programs recognize the need to support contractors with sales and business training to help them succeed and to support the growth of the home performance industry for the long term. A number of contractor sales training efforts are emerging, including some programs that are seeing encouraging early results.
Basis of Ionospheric Modification by High-Frequency Waves
2007-06-01
for conducting ionospheric heating experiments in Gakona, Alaska, as part of the High Frequency Active Auroral Research Program (HAARP) [5], is being...upgraded. The upgraded HAARP HF transmitting system will be a phased-array antenna of 180 elements. Each element is a cross dipole, which radiates a...supported by the High Frequency Active Auroral Research Program (HAARP), the Air Force Research Laboratory at Hanscom Air Force Base, MA, and by the Office
Solid Earth and Natural Hazards (SENH) Research and Applications Program and Internation
NASA Technical Reports Server (NTRS)
2001-01-01
This is a final report for grant NAG5-8627 entitled 'Joint UNAVCO and JPL proposal to NASA for support of the Solid Earth and Natural Hazards Research and Applications Program and Internation'. This report consists of the following sections: (1) new installations (with site visits); (2) upgrades (with site visits); (3) upcoming upgrades (with site visits); and (4) data management and archive efforts during the performance period.
The ALICE Experiment at CERN Lhc:. Status and First Results
NASA Astrophysics Data System (ADS)
Vercellin, Ermanno
The ALICE experiment is aimed at studying the properties of the hot and dense matter produced in heavy-ion collisions at LHC energies. In the first years of LHC operation the ALICE physics program will be focused on Pb-Pb and p-p collisions. The latter, on top of their intrinsic interest, will provide the necessary baseline for heavy-ion data. After its installation and a long commissioning with cosmic rays, in late fall 2009 ALICE participated (very successfully) in the first LHC run, by collecting data in p-p collisions at c.m. energy 900 GeV. After a short stop during winter, LHC operations have been resumed; the machine is now able to accelerate proton beams up to 3.5 TeV and ALICE has undertaken the data taking campaign at 7 TeV c.m. energy. After an overview of the ALICE physics goals and a short description of the detector layout, the ALICE performance in p-p collisions will be presented. The main physics results achieved so far will be highlighted as well as the main aspects of the ongoing data analysis.
2016-01-01
Purpose: Rural health professionals in township health centers (THCs) tend to have less advanced educational degrees. This study aimed to ascertain the perceived feasibility of a decentralized continuing medical education (CME) program to upgrade their educational levels. Methods: A cross-sectional survey of THC health professionals was conducted using a self-administered, structured questionnaire in Guangxi Zhuang Autonomous Region, China. Results: The health professionals in the THCs were overwhelmingly young with low education levels. They had a strong desire to upgrade their educational degrees. The decentralized CME program was perceived as feasible by health workers with positive attitudes about the benefit for license examination, and by those who intended to improve their clinical diagnosis and treatment skills. The target groups of such a program were those who expected to undertake a bachelor’s degree and who rated themselves as “partially capable” in clinical competency. They reported that 160-400 USD annually would be an affordable fee for the program. Conclusion: A decentralized CME program was perceived feasible to upgrade rural health workers’ education level to a bachelor’s degree and improve their clinical competency. PMID:27134005
Prompt radiation, shielding and induced radioactivity in a high-power 160 MeV proton linac
NASA Astrophysics Data System (ADS)
Magistris, Matteo; Silari, Marco
2006-06-01
CERN is designing a 160 MeV proton linear accelerator, both for a future intensity upgrade of the LHC and as a possible first stage of a 2.2 GeV superconducting proton linac. A first estimate of the required shielding was obtained by means of a simple analytical model. The source terms and the attenuation lengths used in the present study were calculated with the Monte Carlo cascade code FLUKA. Detailed FLUKA simulations were performed to investigate the contribution of neutron skyshine and backscattering to the expected dose rate in the areas around the linac tunnel. An estimate of the induced radioactivity in the magnets, vacuum chamber, the cooling system and the concrete shield was performed. A preliminary thermal study of the beam dump is also discussed.
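For context, simple analytical shielding estimates of the kind referred to above are typically of the Moyer type shown below; the form is quoted generically, and the actual source terms and attenuation lengths used in the study are not reproduced here.

```latex
% Generic Moyer-type line-of-sight estimate: H_0(\theta,E_p) source term,
% \lambda(\theta) attenuation length in the shield, d shield thickness along
% the line of sight, r distance from the beam-loss point.
\[
H(r, d, \theta) \;=\; \frac{H_0(\theta, E_p)}{r^{2}}\,
\exp\!\left(-\frac{d}{\lambda(\theta)}\right)
\]
```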
Using MaxCompiler for the high level synthesis of trigger algorithms
NASA Astrophysics Data System (ADS)
Summers, S.; Rose, A.; Sanders, P.
2017-02-01
Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.
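To make the algorithmic content concrete, the sketch below shows one predict/update iteration of a Kalman filter track fit in floating-point Python; it illustrates only the mathematics of an iteration, not the MaxCompiler dataflow implementation, and the straight-line track model, layer spacing and resolutions are assumptions made for the example.

```python
# Illustrative Kalman-filter track-fit iteration (floating point, toy model).
import numpy as np

def kf_step(x, P, z, dz, sigma_hit):
    """One iteration: propagate the state (position, slope) by dz to the
    next layer, then update with the measured hit position z."""
    F = np.array([[1.0, dz],
                  [0.0, 1.0]])          # straight-line propagation
    H = np.array([[1.0, 0.0]])          # only the position is measured
    R = np.array([[sigma_hit ** 2]])    # hit resolution

    # Predict
    x = F @ x
    P = F @ P @ F.T

    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Toy usage: fit hits from a track with true slope 0.1 over 4 layers, 10 units apart
rng = np.random.default_rng(0)
true_slope, sigma = 0.1, 0.05
hits = [true_slope * layer * 10.0 + rng.normal(0.0, sigma) for layer in range(1, 5)]

x = np.array([0.0, 0.0])                # initial (position, slope) guess
P = np.diag([1.0, 1.0])                 # loose initial covariance
for z in hits:
    x, P = kf_step(x, P, z, dz=10.0, sigma_hit=sigma)
print("fitted slope:", round(x[1], 3))
```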
Experimental Overview on Heavy Flavor Production in Heavy Ion Collisions
Da Silva, Cesar Luis
2018-01-26
The use of probes containing heavy quarks is one of the pillars of the study of the medium formed in high energy nuclear collisions. The conceptual ideas formulated more than two decades ago, such as the quark-mass hierarchy of the energy that the probes lose in the medium and the color screening of bound heavy quarkonium states, have been challenged by the measurements performed at RHIC and the LHC. A summary of the most recent experimental observations involving charm and bottom quarks in pp, pA, and AA collisions at collision energies from √sNN = 200 GeV to 8 TeV is presented. Finally, this manuscript also discusses possibilities for new measurements which can be within reach with increased statistics and detector upgrades.
Quench Modeling in High-field Nb3Sn Accelerator Magnets
NASA Astrophysics Data System (ADS)
Bermudez, S. Izquierdo; Bajas, H.; Bottura, L.
The development of high-field magnets is ongoing in the framework of the LHC luminosity upgrade. The resulting peak field, in the range of 12 T to 13 T, requires the use of Nb3Sn as the superconductor. Due to the high stored energy density (compact winding for cost reduction) and the low stabilizer fraction (to achieve the desired margins), quench protection becomes a challenging problem. Accurate simulation of quench transients in these magnets is hence crucial to the design choices, to the definition of priority R&D, and to proving that the magnets are fit for operation. In this paper we focus on the modelling of quench initiation and propagation, we describe approaches that are suitable for magnet simulation, and we compare numerical results with available experimental data.
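As background to the protection problem described above, a commonly used adiabatic hot-spot balance is sketched below; it is quoted as a generic estimate that usually accompanies detailed quench simulations, not as the model of the paper.

```latex
% Adiabatic hot-spot balance: Joule heating integrated over the current
% decay equals the enthalpy absorbed locally by the conductor,
\[
\int_{0}^{t_{q}} J^{2}(t)\, \mathrm{d}t \;=\;
\int_{T_{0}}^{T_{\max}} \frac{C_{v}(T)}{\rho(T)}\, \mathrm{d}T ,
\]
% with J the overall current density, C_v the volumetric heat capacity,
% \rho the effective resistivity and t_q the current-extraction time; the
% protection scheme must keep T_max below the allowed hot-spot temperature.
```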
Evolution of Database Replication Technologies for WLCG
NASA Astrophysics Data System (ADS)
Baranowski, Zbigniew; Lobato Pardavila, Lorena; Blaszczyk, Marcin; Dimitrov, Gancho; Canali, Luca
2015-12-01
In this article we summarize several years of experience on database replication technologies used at WLCG and we provide a short review of the available Oracle technologies and their key characteristics. One of the notable changes and improvement in this area in recent past has been the introduction of Oracle GoldenGate as a replacement of Oracle Streams. We report in this article on the preparation and later upgrades for remote replication done in collaboration with ATLAS and Tier 1 database administrators, including the experience from running Oracle GoldenGate in production. Moreover, we report on another key technology in this area: Oracle Active Data Guard which has been adopted in several of the mission critical use cases for database replication between online and offline databases for the LHC experiments.
AthenaMT: upgrading the ATLAS software framework for the many-core world with multi-threading
NASA Astrophysics Data System (ADS)
Leggett, Charles; Baines, John; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; van Gemmeren, Peter; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; ATLAS Collaboration
2017-10-01
ATLAS’s current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognized for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. After concluding a rigorous requirements phase, where many design components were examined in detail, ATLAS has begun the migration to a new data-flow driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread safe Algorithms. In this paper we report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying handling of features such as event and time dependent data, asynchronous callbacks, metadata, integration with the online High Level Trigger for partial processing in certain regions of interest, concurrent I/O, as well as ensuring thread safety of core services. We also report on upgrading the framework to handle Algorithms that are fully re-entrant.
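The following sketch (in Python, with invented names, not the Gaudi/Athena C++ API) illustrates the re-entrant-algorithm idea described above: an algorithm object holding only immutable configuration processes several events concurrently, each carrying its own event context.

```python
# Conceptual sketch of a re-entrant, multi-threaded event loop.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass(frozen=True)
class EventContext:
    event_number: int
    slot: int                      # which in-flight "event slot" this event uses

class ReentrantAlgorithm:
    """Thread-safe by construction: configuration is read-only and all
    per-event data lives with the event, not in the algorithm."""
    def __init__(self, threshold: float):
        self._threshold = threshold    # immutable configuration

    def execute(self, ctx: EventContext, energies: list) -> int:
        # purely functional on its inputs -> safe to call from many threads
        return sum(1 for e in energies if e > self._threshold)

alg = ReentrantAlgorithm(threshold=25.0)
events = {i: [10.0 * j for j in range(i % 7)] for i in range(8)}

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {
        i: pool.submit(alg.execute, EventContext(i, slot=i % 4), data)
        for i, data in events.items()
    }
    for i, fut in futures.items():
        print(f"event {i}: {fut.result()} clusters above threshold")
```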
Evaporative CO2 microchannel cooling for the LHCb VELO pixel upgrade
NASA Astrophysics Data System (ADS)
de Aguiar Francisco, O. A.; Buytaert, J.; Collins, P.; Dumps, R.; John, M.; Mapelli, A.; Romagnoli, G.
2015-05-01
The LHCb Vertex Detector (VELO) will be upgraded in 2018 to a lightweight pixel detector capable of 40 MHz readout and operation in very close proximity to the LHC beams. The thermal management of the system will be provided by evaporative CO2 circulating in microchannels embedded within thin silicon plates. This solution has been selected due to its excellent thermal efficiency, the absence of thermal expansion mismatch with silicon ASICs and sensors, the radiation hardness of CO2, and its very low contribution to the material budget. Although microchannel cooling is gaining considerable attention for applications related to microelectronics, it is still a novel technology for particle physics experiments, in particular when combined with evaporative CO2 cooling. The R&D effort for LHCb is focused on the design and layout of the channels together with a fluidic connector and its attachment, which must withstand pressures up to 170 bar. Even distribution of the coolant is ensured by means of restrictions implemented before the entrance to a racetrack-like layout of the main cooling channels. The coolant flow and pressure drop have been simulated, as well as the thermal performance of the device. This proceeding describes the design and optimization of the cooling system for LHCb and the latest prototyping results.
Electromagnetic dipole moments of charged baryons with bent crystals at the LHC
NASA Astrophysics Data System (ADS)
Bagli, E.; Bandiera, L.; Cavoto, G.; Guidi, V.; Henry, L.; Marangotto, D.; Martinez Vidal, F.; Mazzolari, A.; Merli, A.; Neri, N.; Ruiz Vidal, J.
2017-12-01
We propose a unique program of measurements of electric and magnetic dipole moments of charm, beauty and strange charged baryons at the LHC, based on the phenomenon of spin precession of channeled particles in bent crystals. Studies of crystal channeling and spin precession of positively- and negatively-charged particles are presented, along with feasibility studies and expected sensitivities for the proposed experiment using a layout based on the LHCb detector.
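For context, the bent-crystal technique rests on the standard spin-precession relation below for an ultra-relativistic channeled particle; it is quoted in its usual approximate form and is not taken from the abstract.

```latex
% Spin rotation with respect to the momentum after traversing a crystal bent
% by an angle \theta_C (usual approximation, \gamma \gg 1):
\[
\Phi \;\simeq\; \gamma\,\frac{g-2}{2}\,\theta_C ,
\]
% so a sizeable precession is accumulated over a small bending angle thanks
% to the Lorentz boost; a nonzero EDM would add a small precession component
% out of the bending plane.
```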
2016-03-23
NASA’s upgraded crawler-transporter 2 (CT-2) begins its trek from the Vehicle Assembly Building (VAB) at the agency’s Kennedy Space Center in Florida to Launch Pad 39B to test recently completed upgrades and modifications for NASA’s journey to Mars. The Ground Systems Development and Operations Program at Kennedy oversaw upgrades to the crawler in the VAB. The crawler will carry the mobile launcher with Orion atop the Space Launch System rocket to Pad 39B for Exploration Mission-1, scheduled for 2018.
2016-03-23
NASA’s upgraded crawler-transporter 2 (CT-2) has exited the Vehicle Assembly Building (VAB) at the agency’s Kennedy Space Center in Florida for its trek along the crawlerway to Launch Pad 39B to test recently completed upgrades and modifications for NASA’s journey to Mars. The Ground Systems Development and Operations Program at Kennedy oversaw upgrades to the crawler in the VAB. The crawler will carry the mobile launcher with Orion atop the Space Launch System rocket to Pad 39B for Exploration Mission-1, scheduled for 2018.
2016-03-23
NASA’s upgraded crawler-transporter 2 (CT-2) travels along the crawlerway from the Vehicle Assembly Building (VAB) at the agency’s Kennedy Space Center in Florida on its trek to Launch Pad 39B to test recently completed upgrades and modifications for NASA’s journey to Mars. The Ground Systems Development and Operations Program at Kennedy oversaw upgrades to the crawler in the VAB. The crawler will carry the mobile launcher with Orion atop the Space Launch System rocket to Pad 39B for Exploration Mission-1, scheduled for 2018.
2016-03-23
NASA’s upgraded crawler-transporter 2 (CT-2) travels along the crawlerway during its trek to Launch Pad 39B at the agency’s Kennedy Space Center in Florida, to test recently completed upgrades and modifications for NASA’s journey to Mars. The Ground Systems Development and Operations Program at Kennedy oversaw upgrades to the crawler in the Vehicle Assembly Building. The crawler will carry the mobile launcher with Orion atop the Space Launch System rocket to Pad 39B for Exploration Mission-1, scheduled for 2018.
Studies of QCD structure in high-energy collisions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nadolsky, Pavel M.
2016-06-26
"Studies of QCD structure in high-energy collisions" is a research project in theoretical particle physics at Southern Methodist University funded by US DOE Award DE-SC0013681. The award furnished bridge funding for one year (2015/04/15-2016/03/31) between the periods funded by Nadolsky's DOE Early Career Research Award DE-SC0003870 (in 2010-2015) and a DOE grant DE-SC0010129 for the SMU Department of Physics (starting in April 2016). The primary objective of the research is to provide theoretical predictions for Run 2 of the CERN Large Hadron Collider (LHC). The LHC physics program relies on state-of-the-art predictions in the field of quantum chromodynamics. The main effort of our group went into the global analysis of parton distribution functions (PDFs) employed by the bulk of LHC computations. Parton distributions describe the internal structure of protons in ultrarelativistic collisions. A new generation of CTEQ parton distribution functions (PDFs), CT14, was released in summer 2015 and quickly adopted by the HEP community. The new CT14 parametrizations of PDFs were obtained using benchmarked NNLO calculations and the latest data from LHC and Tevatron experiments. The group developed advanced methods for the PDF analysis and the estimation of uncertainties in LHC predictions associated with the PDFs. We invented and refined a new 'meta-parametrization' technique that streamlines the usage of PDFs in Higgs boson production and other numerous LHC processes, by combining PDFs from various groups using multivariate stochastic sampling. In 2015, the PDF4LHC working group recommended that LHC experimental collaborations use 'meta-parametrizations' as a standard technique for computing PDF uncertainties. Finally, to include new QCD processes into the global fits, our group worked on several (N)NNLO calculations.
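A toy numerical sketch of the stochastic-sampling idea mentioned above is given below; the x grid, uncertainties and Gaussian toy model are invented for illustration and do not reproduce the CT14 or PDF4LHC procedure.

```python
# Toy illustration of combining PDF sets by stochastic sampling: draw random
# replicas from each group's (toy) uncertainty model, pool them, and read the
# combined central value and uncertainty off the pooled ensemble.
import numpy as np

rng = np.random.default_rng(1)
x_grid = np.array([1e-3, 1e-2, 1e-1, 0.3])     # toy x points at fixed Q

# Toy "gluon PDF" central values and uncertainties for three groups (invented)
groups = {
    "setA": (np.array([8.0, 3.0, 0.90, 0.10]), np.array([0.40, 0.15, 0.05, 0.010])),
    "setB": (np.array([8.3, 3.1, 0.88, 0.11]), np.array([0.35, 0.12, 0.06, 0.012])),
    "setC": (np.array([7.8, 2.9, 0.92, 0.09]), np.array([0.50, 0.18, 0.04, 0.009])),
}

n_rep_per_set = 300
replicas = []
for central, err in groups.values():
    # multivariate sampling (diagonal covariance kept here for simplicity)
    replicas.append(rng.normal(central, err, size=(n_rep_per_set, len(x_grid))))
pooled = np.vstack(replicas)                   # combined Monte Carlo ensemble

combined_central = pooled.mean(axis=0)
combined_unc = pooled.std(axis=0)
for xi, c, u in zip(x_grid, combined_central, combined_unc):
    print(f"x = {xi:7.3f}:  xg(x,Q) = {c:5.2f} +- {u:4.2f}  (toy numbers)")
```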
Retrofit California Overview and Final Reports
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choy, Howard; Rosales, Ana
Energy efficiency retrofits (also called upgrades) are widely recognized as a critical component to achieving energy savings in the building sector to help lower greenhouse gas (GHG) emissions. To date, however, upgrades have accounted for only a small percentage of aggregate energy savings in building stock, both in California and nationally. Although the measures and technologies to retrofit a building to become energy efficient are readily deployed, establishing this model as a standard practice remains elusive. Retrofit California sought to develop and test new program models to increase participation in the energy upgrade market in California. The Program encompassed 24 pilot projects, conducted between 2010 and mid-2013 and funded through a $30 million American Recovery and Reinvestment Act (ARRA) grant from the U.S. Department of Energy’s (DOE) Better Buildings Neighborhood Program (BBNP). The broad scope of the Program can be seen in the involvement of the following regionally based Grant Partners: Los Angeles County (as prime grantee); the Association of Bay Area Governments (ABAG), consisting of StopWaste.org for Alameda County, the Regional Climate Protection Authority (RCPA) for Sonoma County, SF Environment for the City and County of San Francisco, and the City of San Jose; the California Center for Sustainable Energy (CCSE) for the San Diego region; and the Sacramento Municipal Utilities District (SMUD). Within these jurisdictions, nine different types of pilots were tested with the common goal of identifying, informing, and educating the people most likely to undertake energy upgrades (both homeowners and contractors), and providing them with incentives and resources to facilitate the process. Despite its limited duration, Retrofit California undoubtedly succeeded in increasing awareness and education among home and property owners, as well as contractors, realtors, and community leaders. However, program results indicate that a longer timeframe will be needed to transform the market and establish energy retrofits as the new paradigm. Innovations such as Flex Path, which came about because of barriers encountered during the Program, have already shown promise and are enabling increased participation. Together, the pilots represent an unprecedented effort to identify and address market barriers to energy efficiency upgrades and to provide lessons learned to shape future program planning and implementation. The statistics reflect the scope of the marketing and outreach campaigns, which tested a variety of approaches to increase understanding of the benefits of energy upgrades to drive participation in the Program. More traditional methods such as TV and radio advertisements were complemented by innovative community-based social marketing campaigns that sought to leverage the trusted status of neighborhood organizations and leaders in order to motivate their constituents to undertake retrofits. The remainder of this report provides an overview of Retrofit California including brief summaries of the pilots’ main components and highlights, followed by the major findings or takeaway lessons from the approaches that were tested. Eleven of the pilots will be continued, with modifications, under the ratepayer-funded Regional Energy Networks. Involvement in the RENs by many of the Retrofit California partners will ensure that early lessons learned are carried forward to guide future programs for energy upgrades in California.
ERIC Educational Resources Information Center
Miller, Richard E.
The Association for the Advancement of Health Education (AAHE) and Academic Programs for Health Science, George Mason University (Virginia), have collaborated in upgrading AAHE's Health Resources Information System. The process involved updating the health resources information on file. This information, which represents addresses and telephone…
Losing a Job to Position Upgrading: The Case of Rhonda.
ERIC Educational Resources Information Center
Papalia, Anthony S.
1990-01-01
Presents case of 34-year-old White female employee relations assistant who lost her job when it was upgraded to personnel administrator for manufacturing, a position for which she was unqualified. Presents employment options available to this client who sought career counseling through the corporation's Employee Assistance Program. (NB)
40 CFR 256.24 - Recommendations for closing or upgrading open dumps.
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) SOLID WASTES GUIDELINES FOR DEVELOPMENT AND IMPLEMENTATION OF STATE SOLID WASTE MANAGEMENT PLANS Solid Waste Disposal Programs § 256.24 Recommendations for closing or upgrading open dumps. (a) All... feasibility of resource recovery or resource conservation to reduce the solid waste volume entering a facility...
40 CFR 256.24 - Recommendations for closing or upgrading open dumps.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) SOLID WASTES GUIDELINES FOR DEVELOPMENT AND IMPLEMENTATION OF STATE SOLID WASTE MANAGEMENT PLANS Solid Waste Disposal Programs § 256.24 Recommendations for closing or upgrading open dumps. (a) All... feasibility of resource recovery or resource conservation to reduce the solid waste volume entering a facility...
NASA Astrophysics Data System (ADS)
Mattiazzo, S.; Aimo, I.; Baudot, J.; Bedda, C.; La Rocca, P.; Perez, A.; Riggi, F.; Spiriti, E.
2015-10-01
The ALICE experiment at CERN will undergo a major upgrade in the second Long LHC Shutdown in the years 2018-2019; this upgrade includes the full replacement of the Inner Tracking System (ITS), deploying seven layers of Monolithic Active Pixel Sensors (MAPS). For the development of the new ALICE ITS, the Tower-Jazz 0.18 μm CMOS imaging sensor process has been chosen, as it allows the use of full CMOS in the pixel and of different silicon wafers (including high resistivity epitaxial layers). A large test campaign has been carried out on several small prototype chips, designed to optimize the pixel sensor layout and the front-end electronics. Results match the target requirements both in terms of performance and of radiation hardness. Following this development, the first full scale chips have been designed, submitted and are currently under test, with promising results. A telescope composed of 4 planes of Mimosa-28 and 2 planes of Mimosa-18 chips is under development at the DAFNE Beam Test Facility (BTF) at the INFN Laboratori Nazionali di Frascati (LNF) in Italy, with the final goal of performing a comparative test of the full scale prototypes. The telescope has recently been used to test a Mimosa-22THRb chip (a monolithic pixel sensor built in the 0.18 μm Tower-Jazz process) and we foresee performing tests on the full scale chips for the ALICE ITS upgrade at the beginning of 2015. In this contribution we will describe some first measurements of spatial resolution, fake hit rate and detection efficiency of the Mimosa-22THRb chip obtained at the BTF facility in June 2014 with an electron beam of 500 MeV.
On the Feasibility of a Pulsed 14 TeV C.M.E. Muon Collider in the LHC Tunnel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiltsev, Vladimir; Neuffer, D.
We discuss the technical feasibility, key machine parameters and major challenges of a 14 TeV c.m.e. muon-muon collider in the LHC tunnel [1]. The luminosity of the collider is evaluated for three alternative muon sources: the PS synchrotron, one of the type developed by the US Muon Accelerator Program (MAP), and a low-emittance option based on resonant μ-pair production.
Functional and performance requirements of the next NOAA-Kansas City computer system
NASA Technical Reports Server (NTRS)
Mosher, F. R.
1985-01-01
The development of the Advanced Weather Interactive Processing System for the 1990's (AWIPS-90) will result in more timely and accurate forecasts with improved cost effectiveness. As part of the AWIPS-90 initiative, the National Meteorological Center (NMC), the National Severe Storms Forecast Center (NSSFC), and the National Hurricane Center (NHC) are to receive upgrades of interactive processing systems. This National Center Upgrade program will support the specialized inter-center communications, data acquisition, and processing needs of these centers. The missions, current capabilities and general functional requirements for the upgrade to the NSSFC are addressed. System capabilities are discussed along with the requirements for the upgraded system.
The TOTEM DAQ based on the Scalable Readout System (SRS)
NASA Astrophysics Data System (ADS)
Quinto, Michele; Cafagna, Francesco S.; Fiergolski, Adrian; Radicioni, Emilio
2018-02-01
The TOTEM (TOTal cross section, Elastic scattering and diffraction dissociation Measurement at the LHC) experiment at the LHC has been designed to measure the total proton-proton cross-section and study elastic and diffractive scattering at LHC energies. In order to cope with the increased machine luminosity and the higher statistics required by the extension of the TOTEM physics program, approved for the LHC's Run 2 phase, the previous VME-based data acquisition system has been replaced with a new one based on the Scalable Readout System. The system features an aggregated data throughput of 2 GB/s towards the online storage system. This makes it possible to sustain a maximum trigger rate of ~24 kHz, to be compared with the 1 kHz rate of the previous system. The trigger rate is further improved by implementing zero-suppression and second-level hardware algorithms in the Scalable Readout System. The new system fulfils the requirements for increased efficiency, providing higher bandwidth and increasing the purity of the recorded data. Moreover, full compatibility has been guaranteed with the legacy front-end hardware, as well as with the DAQ interface of the CMS experiment and with the LHC's Timing, Trigger and Control distribution system. In this contribution we describe in detail the architecture of the full system and its performance measured during the commissioning phase at the LHC Interaction Point.
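A quick consistency check of the quoted figures (illustrative arithmetic only, assuming the full bandwidth is available to event data):

```latex
\[
\langle S_{\mathrm{event}} \rangle \;\lesssim\;
\frac{2\ \mathrm{GB/s}}{24\ \mathrm{kHz}} \;\approx\; 83\ \mathrm{kB},
\]
% i.e. the 2 GB/s throughput can sustain the ~24 kHz trigger rate as long as
% the average event size stays below roughly 83 kB.
```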
On the search for the electric dipole moment of strange and charm baryons at LHC
NASA Astrophysics Data System (ADS)
Botella, F. J.; Garcia Martin, L. M.; Marangotto, D.; Martinez Vidal, F.; Merli, A.; Neri, N.; Oyanguren, A.; Ruiz Vidal, J.
2017-03-01
Permanent electric dipole moments (EDMs) of fundamental particles provide powerful probes for physics beyond the Standard Model. We propose to search for the EDM of strange and charm baryons at the LHC, extending the ongoing experimental program on the neutron, muon, atoms, molecules and light nuclei. The EDM of strange Λ baryons, selected from weak decays of charm baryons produced in pp collisions at the LHC, can be determined by studying the spin precession in the magnetic field of the detector tracking system. A test of CPT symmetry can be performed by measuring the magnetic dipole moments of Λ and Λ̄ baryons. For the short-lived Λc+ and Ξc+ baryons, to be produced in a fixed-target experiment using the 7 TeV LHC beam and channeled in a bent crystal, the spin precession is induced by the intense electromagnetic field between crystal atomic planes. The experimental layout based on the LHCb detector and the expected sensitivities in the coming years are discussed.
LHC searches for dark sector showers
NASA Astrophysics Data System (ADS)
Cohen, Timothy; Lisanti, Mariangela; Lou, Hou Keong; Mishra-Sharma, Siddharth
2017-11-01
This paper proposes a new search program for dark sector parton showers at the Large Hadron Collider (LHC). These signatures arise in theories characterized by strong dynamics in a hidden sector, such as Hidden Valley models. A dark parton shower can be composed of both invisible dark matter particles as well as dark sector states that decay to Standard Model particles via a portal. The focus here is on the specific case of 'semi-visible jets', jet-like collider objects where the visible states in the shower are Standard Model hadrons. We present a Simplified Model-like parametrization for the LHC observables and propose targeted search strategies for regions of parameter space that are not covered by existing analyses. Following the 'mono-X' literature, the portal is modeled using either an effective field theoretic contact operator approach or with one of two ultraviolet completions; sensitivity projections are provided for all three cases. We additionally highlight that the LHC has a unique advantage over direct detection experiments in the search for this class of dark matter theories.
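The handle most often used in this literature to parametrise such signatures is the invisible fraction of the dark shower, written here in an assumed notation rather than copied from the paper:

```latex
\[
r_{\mathrm{inv}} \;\equiv\;
\left\langle \frac{N_{\mathrm{stable\ dark}}}
                  {N_{\mathrm{stable\ dark}} + N_{\mathrm{visible}}} \right\rangle ,
\]
% interpolating between QCD-like jets (r_inv -> 0) and a mono-jet-like
% topology (r_inv -> 1); in the intermediate, "semi-visible" regime the
% missing momentum tends to point along a reconstructed jet.
```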
NASA Astrophysics Data System (ADS)
Bonacorsi, D.; Gutsche, O.
The Worldwide LHC Computing Grid (WLCG) project decided in March 2009 to perform scale tests of parts of its overall Grid infrastructure before the start of the LHC data taking. The "Scale Test for the Experiment Program" (STEP'09) was performed mainly in June 2009, with more selected tests in September-October 2009, and emphasized the simultaneous test of the computing systems of all 4 LHC experiments. CMS tested its Tier-0 tape writing and processing capabilities. The Tier-1 tape systems were stress tested using the complete range of Tier-1 work-flows: transfer from Tier-0 and custody of data on tape, processing and subsequent archival, redistribution of datasets amongst all Tier-1 sites as well as burst transfers of datasets to Tier-2 sites. The Tier-2 analysis capacity was tested using bulk analysis job submissions to backfill normal user activity. In this talk, we will report on the different performed tests and present their post-mortem analysis.
Main improvements of LHC Cryogenics Operation during Run 2 (2015-2018)
NASA Astrophysics Data System (ADS)
Delprat, L.; Bradu, B.; Brodzinski, K.; Ferlin, G.; Hafi, K.; Herblin, L.; Rogez, E.; Suraci, A.
2017-12-01
After the successful Run 1 (2010-2012), the LHC entered its first Long Shutdown period (LS1, 2013-2014). During LS1 the LHC cryogenic system underwent a complete maintenance and consolidation program. The LHC resumed operation in 2015 with the beam energy increased from 4 TeV to 6.5 TeV. Prior to the new physics Run 2 (2015-2018), the LHC was progressively cooled down from ambient temperature to the 1.9 K operating temperature, and beam operation resumed in April 2015. Operational margins on the cryogenic capacity were reduced compared to Run 1, mainly due to the observed higher than expected electron-cloud heat load coming from the increased beam energy and intensity. Maintaining and improving the cryogenic availability level required the implementation of a series of actions in order to deal with the observed heat loads. This paper describes the results from the process optimization and update of the control system, allowing the adjustment of the non-isothermal heat load at 4.5-20 K and the optimized dynamic behaviour of the cryogenic system versus the electron-cloud thermal load. Effects from the new regulation settings applied for operation on the electrical distribution feed-boxes and inner triplets will be discussed. The efficiency of the preventive and corrective maintenance, as well as the benefits and issues of the present cryogenic system configuration for the Run 2 operational scenario, will be described. Finally, the overall availability results and helium management of the LHC cryogenic system during the 2015-2016 operational period will be presented.
The other Higgses, at resonance, in the Lee-Wick extension of the Standard Model
NASA Astrophysics Data System (ADS)
Figy, Terrance; Zwicky, Roman
2011-10-01
Within the framework of the Lee-Wick Standard Model (LWSM) we investigate Higgs pair production gg → h0h0, gg → h0p̃0 and top pair production gg → tt̄ at the Large Hadron Collider (LHC), where the neutral particles from the Higgs sector (h0, h̃0 and p̃0) appear as possible resonant intermediate states. Depending on whether the LW Higgs state is below or above the top pair threshold, either the hh or the tt̄ channel is dominant and therefore of main interest. We investigate the signal gg → h0h0 → bb̄γγ and we find that the LW Higgs, depending on its mass range, can be seen not long after the LHC upgrade in 2012. In gg → tt̄ the LW states, due to the wrong-sign propagator and negative width, lead to a dip-peak structure instead of the usual peak-dip structure, which gives a characteristic signal especially for low-lying LW Higgs states. We comment on the LWSM and the forward-backward asymmetry in view of the measurement at the Tevatron. Furthermore, we present a technique which reduces the hyperbolic diagonalization to standard diagonalization methods. We clarify issues of spurious phases in the Yukawa sector.
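The dip-peak argument can be made schematic as follows; only the signs are shown, couplings and interference terms are omitted, and the forms are quoted as a generic illustration rather than the paper's own expressions.

```latex
% Ordinary resonance vs. Lee-Wick partner in the amplitude:
\[
\frac{1}{s - m^{2} + i\, m\, \Gamma}
\qquad \text{vs.} \qquad
-\,\frac{1}{s - \tilde m^{2} - i\, \tilde m\, \tilde\Gamma},
\]
% so the interference with the continuum flips sign for the LW state,
% turning the familiar peak-dip pattern into a dip-peak pattern.
```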
A new ATLAS muon CSC readout system with system on chip technology on ATCA platform
Claus, R.
2015-10-23
The ATLAS muon Cathode Strip Chamber (CSC) back-end readout system has been upgraded during the LHC 2013-2015 shutdown to be able to handle the higher Level-1 trigger rate of 100 kHz and the higher occupancy at Run 2 luminosity. The readout design is based on the Reconfiguration Cluster Element (RCE) concept for high bandwidth generic DAQ implemented on the ATCA platform. The RCE design is based on the new System on Chip Xilinx Zynq series with a processor-centric architecture with ARM processor embedded in FPGA fabric and high speed I/O resources together with auxiliary memories to form a versatile DAQ building block that can host applications tapping into both software and firmware resources. The Cluster on Board (COB) ATCA carrier hosts RCE mezzanines and an embedded Fulcrum network switch to form an online DAQ processing cluster. More compact firmware solutions on the Zynq for G-link, S-link and TTC allowed the full system of 320 G-links from the 32 chambers to be processed by 6 COBs in one ATCA shelf through software waveform feature extraction to output 32 S-links. Furthermore, the full system was installed in Sept. 2014. We will present the RCE/COB design concept, the firmware and software processing architecture, and the experience from the intense commissioning towards LHC Run 2.
A new ATLAS muon CSC readout system with system on chip technology on ATCA platform
NASA Astrophysics Data System (ADS)
Claus, R.; ATLAS Collaboration
2016-07-01
The ATLAS muon Cathode Strip Chamber (CSC) back-end readout system has been upgraded during the LHC 2013-2015 shutdown to be able to handle the higher Level-1 trigger rate of 100 kHz and the higher occupancy at Run 2 luminosity. The readout design is based on the Reconfiguration Cluster Element (RCE) concept for high bandwidth generic DAQ implemented on the ATCA platform. The RCE design is based on the new System on Chip Xilinx Zynq series with a processor-centric architecture with ARM processor embedded in FPGA fabric and high speed I/O resources together with auxiliary memories to form a versatile DAQ building block that can host applications tapping into both software and firmware resources. The Cluster on Board (COB) ATCA carrier hosts RCE mezzanines and an embedded Fulcrum network switch to form an online DAQ processing cluster. More compact firmware solutions on the Zynq for G-link, S-link and TTC allowed the full system of 320 G-links from the 32 chambers to be processed by 6 COBs in one ATCA shelf through software waveform feature extraction to output 32 S-links. The full system was installed in Sept. 2014. We will present the RCE/COB design concept, the firmware and software processing architecture, and the experience from the intense commissioning towards LHC Run 2.
A new ATLAS muon CSC readout system with system on chip technology on ATCA platform
NASA Astrophysics Data System (ADS)
Bartoldus, R.; Claus, R.; Garelli, N.; Herbst, R. T.; Huffer, M.; Iakovidis, G.; Iordanidou, K.; Kwan, K.; Kocian, M.; Lankford, A. J.; Moschovakos, P.; Nelson, A.; Ntekas, K.; Ruckman, L.; Russell, J.; Schernau, M.; Schlenker, S.; Su, D.; Valderanis, C.; Wittgen, M.; Yildiz, S. C.
2016-01-01
The ATLAS muon Cathode Strip Chamber (CSC) backend readout system has been upgraded during the LHC 2013-2015 shutdown to be able to handle the higher Level-1 trigger rate of 100 kHz and the higher occupancy at Run-2 luminosity. The readout design is based on the Reconfigurable Cluster Element (RCE) concept for high bandwidth generic DAQ implemented on the Advanced Telecommunication Computing Architecture (ATCA) platform. The RCE design is based on the new System on Chip XILINX ZYNQ series with a processor-centric architecture with ARM processor embedded in FPGA fabric and high speed I/O resources. Together with auxiliary memories, all these components form a versatile DAQ building block that can host applications tapping into both software and firmware resources. The Cluster on Board (COB) ATCA carrier hosts RCE mezzanines and an embedded Fulcrum network switch to form an online DAQ processing cluster. More compact firmware solutions on the ZYNQ for high speed input and output fiberoptic links and TTC allowed the full system of 320 input links from the 32 chambers to be processed by 6 COBs in one ATCA shelf. The full system was installed in September 2014. We will present the RCE/COB design concept, the firmware and software processing architecture, and the experience from the intense commissioning for LHC Run 2.
A new ATLAS muon CSC readout system with system on chip technology on ATCA platform
Bartoldus, R.; Claus, R.; Garelli, N.; ...
2016-01-25
The ATLAS muon Cathode Strip Chamber (CSC) backend readout system has been upgraded during the LHC 2013-2015 shutdown to be able to handle the higher Level-1 trigger rate of 100 kHz and the higher occupancy at Run-2 luminosity. The readout design is based on the Reconfigurable Cluster Element (RCE) concept for high bandwidth generic DAQ implemented on the Advanced Telecommunication Computing Architecture (ATCA) platform. The RCE design is based on the new System on Chip XILINX ZYNQ series with a processor-centric architecture with ARM processor embedded in FPGA fabric and high speed I/O resources. Together with auxiliary memories, all of these components form a versatile DAQ building block that can host applications tapping into both software and firmware resources. The Cluster on Board (COB) ATCA carrier hosts RCE mezzanines and an embedded Fulcrum network switch to form an online DAQ processing cluster. More compact firmware solutions on the ZYNQ for high speed input and output fiberoptic links and TTC allowed the full system of 320 input links from the 32 chambers to be processed by 6 COBs in one ATCA shelf. The full system was installed in September 2014. In conclusion, we will present the RCE/COB design concept, the firmware and software processing architecture, and the experience from the intense commissioning for LHC Run 2.
NASA Astrophysics Data System (ADS)
Mitsui, S.; Unno, Y.; Ikegami, Y.; Takubo, Y.; Terada, S.; Hara, K.; Takahashi, Y.; Jinnouchi, O.; Nagai, R.; Kishida, T.; Yorita, K.; Hanagaki, K.; Takashima, R.; Kamada, S.; Yamamura, K.
2013-01-01
Planar geometry silicon pixel and strip sensors for the high luminosity upgrade of the LHC (HL-LHC) require a high bias voltage of 1000 V in order to withstand the radiation damage caused by particle fluences of 1×10^16 1 MeV n_eq/cm^2 and 1×10^15 1 MeV n_eq/cm^2 for pixel and strip detectors, respectively. To minimize the inactive edge space while still withstanding a bias voltage of 1000 V, the edge regions susceptible to microdischarge (MD) should be carefully optimized. We fabricated diodes with various edge distances (slim-edge diodes) and with one to three guard rings (multi-guard diodes). The AC coupling insulators of strip sensors are vulnerable to sudden heavy charge deposition, such as an accidental beam splash, which may destroy the readout AC capacitors. Thus various types of punch-through-protection (PTP) structures were implemented in order to find the most effective structure to protect against heavy charge deposition. These samples were irradiated with 70 MeV protons at fluences of 5×10^12-1×10^16 1 MeV n_eq/cm^2. Their performance was evaluated before and after irradiation in terms of the onset voltage of the MD, the turn-on voltage of the PTP, and the PTP saturation resistance.
Achieving High Performance With TCP Over 40 GbE on NUMA Architectures for CMS Data Acquisition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bawej, Tomasz; et al.
2014-01-01
TCP and the socket abstraction have barely changed over the last two decades, but at the network layer there has been a giant leap from a few megabits to 100 gigabits in bandwidth. At the same time, CPU architectures have evolved into the multicore era and applications are expected to make full use of all available resources. Applications in the data acquisition domain based on the standard socket library running on a Non-Uniform Memory Access (NUMA) architecture are unable to reach full efficiency and scalability unless the software is adequately aware of the IRQ (interrupt request), CPU and memory affinities. During the first long shutdown of the LHC, the CMS DAQ system is going to be upgraded for operation from 2015 onwards, and a new software component has been designed and developed in the CMS online framework for transferring data with sockets. This software attempts to wrap the low-level socket library to ease higher-level programming with an API based on an asynchronous event-driven model similar to the DAT uDAPL API. It is an event-based application with NUMA optimizations that allows for a high throughput of data across a large distributed system. This paper describes the architecture, the technologies involved and the performance measurements of the software in the context of the CMS distributed event building.
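The affinity point above can be made concrete with a short, hedged sketch: the snippet below (plain Python on Linux, not the CMS online framework or its API) pins the receiving process to the CPUs local to the NIC's NUMA node, read from standard sysfs paths, before servicing a TCP socket; the interface name and port are placeholders.

```python
# Illustrative sketch (not the CMS online framework): pin a socket-reading
# process to the CPUs local to the NIC's NUMA node before receiving data.
# Assumes Linux sysfs paths and os.sched_setaffinity; the interface name
# and port below are hypothetical.
import os
import socket

def cpus_of_numa_node(node: int) -> set[int]:
    """Parse the kernel's cpulist (e.g. '0-7,16-23') for a NUMA node."""
    cpus = set()
    with open(f"/sys/devices/system/node/node{node}/cpulist") as f:
        for part in f.read().strip().split(","):
            lo, _, hi = part.partition("-")
            cpus.update(range(int(lo), int(hi or lo) + 1))
    return cpus

def pin_to_nic_node(iface: str) -> None:
    """Restrict this process to CPUs on the NUMA node hosting the NIC."""
    with open(f"/sys/class/net/{iface}/device/numa_node") as f:
        node = int(f.read())
    if node >= 0:                      # -1 means no NUMA locality reported
        os.sched_setaffinity(0, cpus_of_numa_node(node))

if __name__ == "__main__":
    pin_to_nic_node("eth0")            # hypothetical 40 GbE interface name
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", 12345))
    srv.listen(1)
    conn, _ = srv.accept()             # waits for a sender to connect
    while conn.recv(1 << 20):          # drain data on the NUMA-local CPUs
        pass
```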
NASA Technical Reports Server (NTRS)
Justus, C. G.; Johnson, Dale
1990-01-01
The Global Reference Atmospheric Model (GRAM) is currently available in the 'GRAM-88' version (Justus, et al., 1986; 1988), which includes relatively minor upgrades and changes from the 'MOD-3' version (Justus, et al., 1980). Currently a project is underway to use large amounts of data, mostly collected under the Middle Atmosphere Program (MAP) to produce a major upgrade of the program planned for release as the GRAM-90 version. The new data and program revisions will particularly affect the 25-90 km height range. Sources of data and preliminary results are described here in the form of cross-sectional plots.
14 CFR 121.419 - Pilots and flight engineers: Initial, transition, and upgrade ground training.
Code of Federal Regulations, 2012 CFR
2012-01-01
...; (ix) Flight planning; (x) Each normal and emergency procedure; and (xi) The approved Airplane Flight...
14 CFR 121.419 - Pilots and flight engineers: Initial, transition, and upgrade ground training.
Code of Federal Regulations, 2011 CFR
2011-01-01
...; (ix) Flight planning; (x) Each normal and emergency procedure; and (xi) The approved Airplane Flight...
14 CFR 121.419 - Pilots and flight engineers: Initial, transition, and upgrade ground training.
Code of Federal Regulations, 2014 CFR
2014-01-01
...; (ix) Flight planning; (x) Each normal and emergency procedure; and (xi) The approved Airplane Flight...
14 CFR 121.419 - Pilots and flight engineers: Initial, transition, and upgrade ground training.
Code of Federal Regulations, 2010 CFR
2010-01-01
...; (ix) Flight planning; (x) Each normal and emergency procedure; and (xi) The approved Airplane Flight...
14 CFR 121.419 - Pilots and flight engineers: Initial, transition, and upgrade ground training.
Code of Federal Regulations, 2013 CFR
2013-01-01
...; (ix) Flight planning; (x) Each normal and emergency procedure; and (xi) The approved Airplane Flight...
Development Status of Ion Source at J-PARC Linac Test Stand
NASA Astrophysics Data System (ADS)
Yamazaki, S.; Takagi, A.; Ikegami, K.; Ohkoshi, K.; Ueno, A.; Koizumi, I.; Oguri, H.
The Japan Proton Accelerator Research Complex (J-PARC) linac power upgrade program is now in progress in parallel with user operation. To realize a nominal performance of 1 MW at the 3 GeV Rapid Cycling Synchrotron and 0.75 MW at the Main Ring synchrotron, we need to upgrade the peak beam current of the linac (currently 50 mA). For the upgrade program, we are testing a new front-end system, which comprises a cesiated RF-driven H- ion source and a new radio-frequency quadrupole linac (RFQ). The H- ion source was developed to satisfy the J-PARC upgrade requirements of an H- ion-beam current of 60 mA and a lifetime of more than 50 days. On February 6, 2014, the first 50 mA H- beams were accelerated by the RFQ during a beam test. To demonstrate the performance of the ion source before its installation in the summer of 2014, we tested the long-term stability through continuous beam operation, which included estimating the lifetime of the RF antenna and evaluating the cesium consumption.
2016-03-23
An American flag flutters in the breeze as NASA’s upgraded crawler-transporter 2 (CT-2) travels along the crawlerway during its trek to Launch Pad 39B at the agency’s Kennedy Space Center in Florida, to test recently completed upgrades and modifications for NASA’s journey to Mars. The Ground Systems Development and Operations Program at Kennedy oversaw upgrades to the crawler in the Vehicle Assembly Building. The crawler will carry the mobile launcher with Orion atop the Space Launch System rocket to Pad 39B for Exploration Mission-1, scheduled for 2018.
2016-03-23
Technicians walk alongside NASA’s upgraded crawler-transporter 2 (CT-2) as it continues the trek on the crawlerway from the Vehicle Assembly Building (VAB) at the agency’s Kennedy Space Center in Florida to Launch Pad 39B to test recently completed upgrades and modifications for NASA’s journey to Mars. The Ground Systems Development and Operations Program at Kennedy oversaw upgrades to the crawler in the VAB. The crawler will carry the mobile launcher with Orion atop the Space Launch System rocket to Pad 39B for Exploration Mission-1, scheduled for 2018.
2016-03-23
Technicians walk alongside NASA’s upgraded crawler-transporter 2 (CT-2) as it continues the trek along the crawlerway from the Vehicle Assembly Building (VAB) at the agency’s Kennedy Space Center in Florida to Launch Pad 39B to test recently completed upgrades and modifications for NASA’s journey to Mars. The Ground Systems Development and Operations Program at Kennedy oversaw upgrades to the crawler in the VAB. The crawler will carry the mobile launcher with Orion atop the Space Launch System rocket to Pad 39B for Exploration Mission-1, scheduled for 2018.
2016-03-23
A truck sprays water in front of NASA’s upgraded crawler-transporter 2 (CT-2) to control dust as it begins the trek from the Vehicle Assembly Building (VAB) at the agency’s Kennedy Space Center in Florida to Launch Pad 39B to test recently completed upgrades and modifications for NASA’s journey to Mars. The Ground Systems Development and Operations Program at Kennedy oversaw upgrades to the crawler in the VAB. The crawler will carry the mobile launcher with Orion atop the Space Launch System rocket to Pad 39B for Exploration Mission-1, scheduled for 2018.
Radiation hard programmable delay line for LHCb calorimeter upgrade
NASA Astrophysics Data System (ADS)
Mauricio, J.; Gascón, D.; Vilasís, X.; Picatoste, E.; Machefert, F.; Lefrancois, J.; Duarte, O.; Beigbeder, C.
2014-01-01
This paper describes the implementation of an SPI-programmable clock delay chip based on a Delay Locked Loop (DLL) that shifts the phase of the LHC clock (25 ns) in steps of 1 ns, with less than 5 ps jitter and 23 ps of DNL. The delay lines will be integrated in the near future into ICECAL, the LHCb calorimeter front-end analog signal processing ASIC. The stringent noise requirements on the ASIC imply minimizing the noise contribution of digital components, which is accomplished by implementing the DLL in differential mode. To achieve the required radiation tolerance several techniques are applied: double guard rings between PMOS and NMOS transistors, as well as glitch suppressors and TMR registers. This 5.7 mm^2 chip has been implemented in 0.35 μm CMOS technology.
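For orientation, the simple arithmetic implied by the figures quoted above (not additional results from the paper): a DLL locking one full LHC clock period across its delay chain needs

\[
N_{\text{taps}} = \frac{T_{\text{clk}}}{\Delta t} = \frac{25\ \text{ns}}{1\ \text{ns}} = 25,
\]

so each tap nominally contributes 1 ns, and the quoted 23 ps DNL and <5 ps jitter amount to roughly 2% and 0.5% of a single step.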
NASA Astrophysics Data System (ADS)
Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.
2018-05-01
The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures, optimized for processing and buffering very high particle rates, and secondly how it has been reused for the functional verification of the first large scale demonstrator chip designed by the collaboration, which has recently been submitted.
Measurements of Dynamic Effects in FNAL 11 T Nb3Sn Dipole Models
Velev, Gueorgui; Strauss, Thomas; Barzi, Emanuela; ...
2018-01-17
Fermilab, in collaboration with CERN, has developed a twin-aperture 11 T Nb3Sn dipole suitable for the high-luminosity LHC upgrade. During 2012-2014, a 2-m long single-aperture dipole demonstrator and three 1-m long single-aperture dipole models were fabricated by FNAL and tested at its Vertical Magnet Test Facility. Collared coils from two of the 1-m long models were then used to assemble the first twin-aperture dipole demonstrator. This magnet had extensive testing in 2015-2016, including quench performance, quench protection, and field quality studies. This paper reports the results of measurements of persistent current effects in the single-aperture and twin-aperture 11 T Nb3Sn dipoles and compares them with similar measurements in previous NbTi magnets.
MWPC prototyping and testing for STAR inner TPC upgrade
NASA Astrophysics Data System (ADS)
Shen, F.; Wang, S.; Yang, C.; Xu, Q.
2017-06-01
The STAR experiment at the Relativistic Heavy Ion Collider (RHIC) is upgrading the inner sectors of the Time Projection Chamber (iTPC). The iTPC upgrade project will increase the segmentation on the inner pad plane from 13 to 40 pad rows and renew the inner sector wire chambers. The upgrade will expand the TPC's acceptance from |η| <= 1.0 to |η| <= 1.5. Furthermore, the detector will have better acceptance for tracks with low momentum, as well as better resolution in both momentum and dE/dx for tracks of all momenta. The enhanced measurement capabilities of the STAR iTPC upgrade are crucial to the physics program of Phase II of the Beam Energy Scan (BES-II) at RHIC during 2019-2020, in particular the QCD phase transition study. In this proceedings, I will discuss the iTPC MWPC module fabrication and testing results from the first full size iTPC MWPC pre-prototype made at Shandong University.
Upgrading the Household Worker. Final Report (January 1967-September 1968).
ERIC Educational Resources Information Center
Willmart Services, Inc., Washington, DC.
A total of 108 women from low-income families who were underemployed and unemployed and who needed to develop marketable skills were selected for a 9-week training course offered by an agency which was established to upgrade the economic and social status of the household worker. The experimental program combined attitudinal training with training…
Upgrade to the Cryogenic Hydrogen Gas Target Monitoring System
NASA Astrophysics Data System (ADS)
Slater, Michael; Tribble, Robert
2013-10-01
The cryogenic hydrogen gas target at Texas A&M is a vital component for creating a secondary radioactive beam that is then used in experiments in the Momentum Achromat Recoil Spectrometer (MARS). A stable beam from the K500 superconducting cyclotron enters the gas cell and some incident particles are transmuted by a nuclear reaction into a radioactive beam, which is separated from the primary beam and used in MARS experiments. The pressure in the target chamber is monitored so that a predictable isotope production rate can be assured. A "black box" received the analog pressure data and sent RS232 serial data through an outdated serial connection to an outdated Visual Basic 6 (VB6) program, which plotted the chamber pressure continuously. The black box has been upgraded to an Arduino UNO microcontroller [Atmel Inc.], which can receive the pressure data and output it via USB to a computer. It has also been programmed to accept temperature data for a future upgrade. A new computer program, with updated capabilities, has been written in Python. The software can send email alerts, create audible alarms through the Arduino, and plot pressure and temperature. The program has been designed to better fit the needs of the users. Funded by DOE and the NSF-REU Program.
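A minimal sketch of the kind of readout loop described above, assuming the Arduino streams one pressure value per line over USB serial; it uses the third-party pyserial package, and the port name, baud rate, alarm threshold, and e-mail addresses are placeholders rather than details taken from the MARS setup.

```python
# Minimal sketch, not the actual MARS monitoring program.
# Assumes the Arduino prints one pressure reading per line over USB serial.
# Requires the third-party pyserial package (pip install pyserial).
import smtplib
from email.message import EmailMessage

import serial  # pyserial

PORT = "/dev/ttyACM0"        # placeholder device node for the Arduino UNO
BAUD = 9600                  # placeholder baud rate
PRESSURE_ALARM = 1.5         # placeholder alarm threshold

def send_alert(value: float) -> None:
    """E-mail a warning; SMTP host and addresses are placeholders."""
    msg = EmailMessage()
    msg["Subject"] = f"Gas target pressure alarm: {value:.3f}"
    msg["From"] = "target-monitor@example.org"
    msg["To"] = "oncall@example.org"
    msg.set_content("Pressure exceeded the configured threshold.")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

def monitor() -> None:
    with serial.Serial(PORT, BAUD, timeout=2) as link:
        while True:
            line = link.readline().decode(errors="ignore").strip()
            if not line:
                continue                 # timeout with no data
            try:
                pressure = float(line)
            except ValueError:
                continue                 # skip malformed lines
            print(f"pressure = {pressure:.3f}")
            if pressure > PRESSURE_ALARM:
                send_alert(pressure)

if __name__ == "__main__":
    monitor()
```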
Robotic acquisition programs: technical and performance challenges
NASA Astrophysics Data System (ADS)
Thibadoux, Steven A.
2002-07-01
The Unmanned Ground Vehicles/ Systems Joint Project Office (UGV/S JPO) is developing and fielding a variety of tactical robotic systems for the Army and Marine Corps. The Standardized Robotic System (SRS) provides a family of common components that can be installed in existing military vehicles, to allow unmanned operation of the vehicle and its payloads. The Robotic Combat Support System (RCSS) will be a medium sized unmanned system with interchangeable attachments, allowing a remote operator to perform a variety of engineering tasks. The Gladiator Program is a USMC initiative for a small to medium sized, highly mobile UGV to conduct scout/ surveillance missions and to carry various lethal and non-lethal payloads. Acquisition plans for these programs require preplanned evolutionary block upgrades to add operational capability, as new technology becomes available. This paper discusses technical and performance issues that must be resolved and the enabling technologies needed for near term block upgrades of these first generation robotic systems. Additionally, two Joint Robotics Program (JRP) initiatives, Robotic Acquisition through Virtual Environments and Networked Simulations (RAVENS) and Joint Architecture for Unmanned Ground Systems (JAUGS), will be discussed. RAVENS and JAUGS will be used to efficiently evaluate and integrate new technologies to be incorporated in system upgrades.
Status of PLS-II Upgrade Program
NASA Astrophysics Data System (ADS)
Kim, Kyung-Ryul; Wiedemann, Helmut; Park, Sung-Ju; Kim, Dong-Eon; Park, Chong-Do; Park, Sung-Soo; Kim, Seong-Hwan; Kim, Bongsoo; Namkung, Won; Nam, Sanghoon; Ree, Moonhor
2010-06-01
The Pohang Light Source (PLS) at the Pohang Accelerator Laboratory first operated at 2.0 GeV in 1995 and was later upgraded to 2.5 GeV. During this time, 6 insertion devices such as undulators and multipole wigglers have been put into operation to produce special photon beams, with a total of 27 beamlines installed and 3 beamlines under construction. Recently, the Korean synchrotron user community has been demanding higher beam stability, higher photon energies and more straight sections for insertion devices in the PLS. To meet these user requirements, the PLS-II upgrade program was launched in January 2009, incorporating a modified chromatic version of the Double Bend Achromat (DBA) lattice to achieve almost twice as many straight sections as the current PLS, with a design goal of a relatively low emittance, ɛ, of 5.9 nm·rad. In PLS-II, top-up injection using the full-energy linac is also planned for much more stable beams, and the production of hard x-ray undulator radiation of 8 to 13 keV is anticipated to enable a successful research program, namely protein crystallography. The PLS-II machine components of the storage ring, linear accelerator and photon beamlines will be partly dismantled and reinstalled in a 6-month shutdown beginning in January 2011, and initial commissioning of PLS-II with a 100 mA beam current will start in July 2011.
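As background for the emittance goal quoted above (a textbook scaling, not a number taken from the PLS-II design), the natural emittance of a DBA-type lattice scales with beam energy and with the bending angle per dipole as

\[
\varepsilon_x \;\simeq\; F_{\text{lattice}}\, C_q\, \frac{\gamma^2 \theta^3}{J_x},
\qquad C_q \approx 3.84\times 10^{-13}\ \text{m},
\]

where F_lattice depends on the cell type and J_x is the horizontal damping partition number; doubling the number of cells halves θ and so reduces the emittance by nearly a factor of eight at fixed energy, which is how additional straight sections and a lower emittance can be obtained together.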
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nadolsky, Pavel M.
2015-08-31
The report summarizes research activities of the project "Integrated analysis of particle interactions" at Southern Methodist University, funded by 2010 DOE Early Career Research Award DE-SC0003870. The goal of the project is to provide state-of-the-art predictions in quantum chromodynamics in order to achieve objectives of the LHC program for studies of electroweak symmetry breaking and new physics searches. We published 19 journal papers focusing on in-depth studies of proton structure and integration of advanced calculations from different areas of particle phenomenology: multi-loop calculations, accurate long-distance hadronic functions, and precise numerical programs. Methods for factorization of QCD cross sections were advanced in order to develop new generations of CTEQ parton distribution functions (PDFs), CT10 and CT14. These distributions provide the core theoretical input for multi-loop perturbative calculations by LHC experimental collaborations. A novel "PDF meta-analysis" technique was invented to streamline applications of PDFs in numerous LHC simulations and to combine PDFs from various groups using multivariate stochastic sampling of PDF parameters. The meta-analysis will help to bring the LHC perturbative calculations to the new level of accuracy, while reducing computational efforts. The work on parton distributions was complemented by development of advanced perturbative techniques to predict observables dependent on several momentum scales, including production of massive quarks and transverse momentum resummation at the next-to-next-to-leading order in QCD.
SIPT: a seismic refraction inverse modeling program for timeshare terminal computer systems
Scott, James Henry
1977-01-01
SIPT is an interactive Fortran computer program that was developed for use with a timeshare computer system, with program control information submitted from a remote terminal and output data displayed on the terminal or printed on a line printer. The program is an upgraded version of FSIPI (Scott, Tibbetts, and Burdick, 1972) with several major improvements in addition to its adaptation to timeshare operation. The most significant improvement was made in the procedure for handling data from in-line offset shotpoints beyond the end shotpoints of the geophone spread. The changes and improvements are described, user's instructions are outlined, examples of input and output data for a test problem are presented, and the Fortran program is listed in this report. An upgraded batch-mode program, SIPB, is available for users who do not have a timeshare computer system available (Scott, 1977).
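For readers unfamiliar with what such a refraction inversion computes, the simplest (two-layer, flat-interface) relation that programs of this family build on is the standard slope-intercept formula, quoted here from textbook refraction seismology rather than from the report itself:

\[
z \;=\; \frac{t_i}{2}\,\frac{v_1 v_2}{\sqrt{v_2^{2}-v_1^{2}}},
\]

where v_1 and v_2 are the apparent velocities of the direct and refracted arrivals (the inverse slopes of the travel-time branches), t_i is the intercept time of the refracted branch, and z is the depth to the refractor; multi-layer, dipping-interface cases and the offset-shotpoint corrections mentioned above generalize this basic relation.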
SIPB: a seismic refraction inverse modeling program for batch computer systems
Scott, James Henry
1977-01-01
SIPB is an interactive Fortran computer program that was developed for use with a timeshare computer system, with program control information submitted from a remote terminal and output data displayed on the terminal or printed on a line printer. The program is an upgraded version of FSIPI (Scott, Tibbetts, and Burdick, 1972) with several major improvements in addition to its adaptation to timeshare operation. The most significant improvement was made in the procedure for handling data from in-line offset shotpoints beyond the end shotpoints of the geophone spread. The changes and improvements are described, user's instructions are outlined, examples of input and output data for a test problem are presented, and the Fortran program is listed in this report. An upgraded batch-mode program, SIPB, is available for users who do not have a timeshare computer system available (Scott, 1977).
Automotive Stirling engine systems development
NASA Technical Reports Server (NTRS)
Richey, A. E.
1984-01-01
The objective of the Automotive Stirling Engine (ASE) program is to develop a Stirling engine for automotive use that provides a 30 percent improvement in fuel economy relative to a comparable internal-combustion engine while meeting emissions goals. This paper traces the engine systems' development efforts, focusing on: (1) a summary of engine system performance for all Mod I engines; (2) the development program conducted for the upgraded Mod I; and (3) vehicle systems work conducted to enhance vehicle fuel economy. Problems encountered during the upgraded Mod I test program are discussed. The importance of the EPA driving cycle cold-start penalty and the measures taken to minimize that penalty with the Mod II are also addressed.
NASA Astrophysics Data System (ADS)
Mdhluli, J. E.; Jivan, H.; Erasmus, R.; Davydov, Yu I.; Baranov, V.; Mthembu, S.; Mellado, B.; Sideras-Haddad, E.; Solovyanov, O.; Sandrock, C.; Peter, G.; Tlou, S.; Khanye, N.; Tjale, B.
2017-07-01
With the prediction that the plastic scintillators in the gap region of the Tile Calorimeter will sustain a significantly large amount of radiation damage during the HL-LHC run time, the current plastic scintillators will need to be replaced during the phase 2 upgrade in 2018. The scintillators in the gap region were exposed to a radiation environment of up to 10 kGy/year during the first run of data taking, and with the luminosity being increased by a factor of 10, the radiation environment will be extremely harsh. We report on the radiation damage to the optical properties of plastic scintillators following irradiation using a neutron beam of the IBR-2 pulsed reactor at the Joint Institute for Nuclear Research (JINR), Dubna. A comparison is drawn between the polyvinyl toluene based commercial scintillators EJ200, EJ208 and EJ260 as well as a polystyrene based scintillator from Kharkov. The samples were subjected to irradiation with high energy neutrons at a flux density range of 1 × 10^6-7.7 × 10^6. Light transmission, Raman spectroscopy, fluorescence spectroscopy and light yield testing were performed to characterize the damage induced in the samples. Preliminary results from the tests indicate a minute change in the optical properties of the scintillators, with further studies underway to gain a better understanding of the interaction between neutrons and plastic scintillators.
Photodetectors and front-end electronics for the LHCb RICH upgrade
NASA Astrophysics Data System (ADS)
Cassina, L.; LHCb RICH
2017-12-01
The RICH detectors of the LHCb experiment provide identification of hadrons produced in high energy proton-proton collisions in the LHC at CERN over a wide momentum range (2-100 GeV/c). Cherenkov light is collected on photon detector planes sensitive to single photons. The RICH will be upgraded (in 2019) to read out every bunch crossing, at a rate of 40 MHz. The current hybrid photon detectors (HPD) will be replaced with multi-anode photomultiplier tubes (customisations of the Hamamatsu R11265 and the H12699 MaPMTs). These 8×8 pixel devices meet the experimental requirements thanks to their small pixel size, high gain, negligible dark count rate (∼50 Hz/cm2) and moderate cross-talk. The measured performance of several tubes is reported, together with their long-term stability. A new 8-channel front-end chip, named CLARO, has been designed in 0.35 μm CMOS AMS technology for the MaPMT readout. The CLARO chip operates in binary mode and combines low power consumption (∼1 mW/Ch), wide bandwidth (baseline restored in ⩽ 25 ns) and radiation hardness. A 12-bit digital register permits the optimisation of the dynamic range and the threshold level for each channel and provides tools for the on-site calibration. The design choices and the characterization of the electronics are presented.
LHCb RICH Upgrade: an overview of the photon detector and electronic system
NASA Astrophysics Data System (ADS)
Cassina, L.
2016-01-01
The LHCb experiment is one of the four large detectors operating at the LHC at CERN and it is mainly devoted to CP violation measurements and to the search for new physics in rare decays of beauty and charm hadrons. The data from the two Ring Imaging Cherenkov (RICH-1 and RICH-2) detectors are essential to identify particles in a wide momentum range. From 2019 onwards, 14 TeV collisions with luminosities reaching up to 2 × 10^33 cm^-2 s^-1 with 25 ns bunch spacing are planned, with the goal of collecting 5 fb^-1 of data per year. In order to avoid degradation of the PID performance at such a high rate (40 MHz), the RICH detector has to be upgraded. New photodetectors (multi-anode photomultiplier tubes, MaPMTs) have been chosen and will be read out using an 8-channel chip, named CLARO, designed to sustain a photon counting rate up to 40 MHz, while minimizing the power consumption and the cross-talk. A 128-bit digital register allows selection of thresholds and attenuation values and provides features useful for testing and debugging. Photosensors and electronics are arranged in basic units, the first prototypes of which were tested in charged particle beams in autumn 2014. An overview of the CLARO features and of the readout electronics is presented.
Integration of the ATLAS FE-I4 Pixel Chip in the Mini Time Projection Chamber
NASA Astrophysics Data System (ADS)
Lopez-Thibodeaux, Mayra; Garcia-Sciveres, Maurice; Kadyk, John; Oliver-Mallory, Kelsey
2013-04-01
This project deals with the development of readout for a Time Projection Chamber (TPC) prototype. This is a type of detector proposed for direct detection of dark matter (WIMPs) with direction information. The TPC is a gaseous charged particle tracking detector composed of a field cage and a gas avalanche detector. The latter is made of two Gas Electron Multipliers in series, illuminating a pixel readout integrated circuit, which measures the distribution in position and time of the output charge. We are testing the TPC prototype, filled with ArCO2 gas, using an Fe-55 x-ray source and cosmic rays. The present prototype uses an FE-I3 chip for readout. This chip was developed about 10 years ago and is presently in use within the ATLAS pixel detector at the LHC. The aim of this work is to upgrade the TPC prototype to use an FE-I4 chip. The FE-I4 has an active area of 336 mm^2 and 26880 pixels, over nine times the number of pixels of the FE-I3 chip and about six times its active area. The FE-I4 chip represents the state of the art of pixel detector readout, and is presently being used to build an upgrade of the ATLAS pixel detector.
An overview of autonomous rendezvous and docking system technology development
NASA Astrophysics Data System (ADS)
Nelson, Kurt D.
The Centaur upper stage was selected for an airborne avionics modernization program. The parts used in the existing avionics units were obsolete. Continued use of existing hardware would require substantial redesign, yet would result in the use of outdated hardware. Out of date processes, with very expensive and labor intensive technologies, were being used for manufacturing. The Atlas/Centaur avionics were to be procured at a fairly high rate that demanded the use of modern components. The new avionics also reduce size, weight, power, and parts count with a dramatic improvement in reliability. Finally, the cost leverage derived from upgrading the avionics as opposed to any other subsystem for the existing Atlas/Centaur was a very large consideration in the upgrade decision. The upgrade program is a multiyear effort that began in 1989. It includes telemetry, guidance and navigation, control electronics, thrust vector control, and redundancy levels.
The new CMS DAQ system for run-2 of the LHC
Bawej, Tomasz; Behrens, Ulf; Branson, James; ...
2015-05-21
The data acquisition (DAQ) system of the CMS experiment at the CERN Large Hadron Collider assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high level trigger (HLT) farm. The HLT farm selects interesting events for storage and offline analysis at a rate of around 1 kHz. The DAQ system has been redesigned during the accelerator shutdown in 2013/14. The motivation is twofold: firstly, the current compute nodes, networking, and storage infrastructure will have reached the end of their lifetime by the time the LHC restarts; secondly, in order to handle higher LHC luminosities and event pileup, a number of sub-detectors will be upgraded, increasing the number of readout channels and replacing the off-detector readout electronics with a μTCA implementation. The new DAQ architecture will take advantage of the latest developments in the computing industry. For data concentration, 10/40 Gb/s Ethernet technologies will be used, as well as an implementation of a reduced TCP/IP in FPGA for a reliable transport between custom electronics and commercial computing hardware. A Clos network based on 56 Gb/s FDR Infiniband has been chosen for the event builder with a throughput of ~4 Tb/s. The HLT processing is entirely file based. This allows the DAQ and HLT systems to be independent, and to use the HLT software in the same way as for the offline processing. The fully built events are sent to the HLT with 1/10/40 Gb/s Ethernet via network file systems. Hierarchical collection of HLT accepted events and monitoring meta-data are stored into a global file system. This paper presents the requirements, technical choices, and performance of the new system.
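The file-based decoupling between DAQ and HLT described above can be illustrated with a generic pattern (a sketch of the idea only, not CMS code): the builder writes each completed chunk to a temporary name and makes it visible atomically with a rename, so an independent consumer polling the directory only ever sees complete files.

```python
# Generic illustration of file-based hand-off between two independent
# processes -- not CMS software.  The writer publishes complete files via
# an atomic rename; the reader polls the directory and never sees partial
# data.  The spool path and file naming are placeholder choices.
import os
import time
from pathlib import Path

SPOOL = Path("/tmp/eventspool")          # placeholder spool directory

def publish(run: int, lumi: int, payload: bytes) -> Path:
    """Write payload to a temporary file, then expose it atomically."""
    SPOOL.mkdir(parents=True, exist_ok=True)
    final = SPOOL / f"run{run:06d}_ls{lumi:04d}.raw"
    tmp = final.with_suffix(".raw.tmp")
    tmp.write_bytes(payload)
    os.replace(tmp, final)               # atomic rename on the same filesystem
    return final

def consume_forever(handler) -> None:
    """Poll for complete files, process them, then delete them."""
    while True:
        for path in sorted(SPOOL.glob("*.raw")):
            handler(path.read_bytes())
            path.unlink()
        time.sleep(0.5)

if __name__ == "__main__":
    publish(1, 1, b"\x00" * 1024)
    consume_forever(lambda blob: print(f"processed {len(blob)} bytes"))
```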
Fartoukh, Stéphane; Valishev, Alexander; Papaphilippou, Yannis; ...
2015-12-01
Colliding bunch trains in a circular collider demands a certain crossing angle in order to separate the two beams transversely after the collision. The magnitude of this crossing angle is a complicated function of the bunch charge, the number of long-range beam-beam interactions, of β* and the type of optics (flat or round), and possible compensation or additive effects between several low-β insertions in the ring depending on the orientation of the crossing plane at each interaction point. About 15 years ago, the use of current-bearing wires was proposed at CERN in order to mitigate the long-range beam-beam effects [J.P. Koutchouk, CERN Report No. LHC-Project-Note 223, 2000], therefore offering the possibility to minimize the crossing angle with all the beneficial effects this might have: on the luminosity performance by reducing the need for crab cavities or lowering their voltage, on the required aperture of the final focus magnets, on the strength of the orbit correctors involved in the crossing bumps, and finally on the heat load and radiation dose deposited in the final focus quadrupoles. In this paper, a semianalytical approach is developed for the compensation of the long-range beam-beam interactions with current wires. This reveals the possibility of achieving optimal correction through a careful adjustment of the aspect ratio of the β functions at the wire position. We consider the baseline luminosity upgrade plan of the Large Hadron Collider (HL-LHC project), and compare it to alternative scenarios, or so-called "configurations," where modifications are applied to optics, crossing angle, or orientation of the crossing plane in the two low-β insertions of the ring. Furthermore, for all these configurations, the beneficial impact of beam-beam compensation devices is then demonstrated on the tune footprint, the dynamical aperture, and/or the frequency map analysis of the nonlinear beam dynamics as the main figures of merit.
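A back-of-envelope matching, standard in the wire-compensation literature and not a result reproduced from this paper, shows why a modest current-carrying wire can cancel many long-range encounters: both kicks fall off as 1/r at large separation,

\[
\Delta x'_{\text{bb,LR}} \;\simeq\; \frac{2\, n_{\text{LR}}\, N_b\, r_p}{\gamma\, r},
\qquad
\Delta x'_{\text{wire}} \;=\; \frac{\mu_0\, (IL)}{2\pi\, (B\rho)\, r},
\]

and equating them (with \(r_p = e^2/4\pi\varepsilon_0 m_p c^2\) and \(B\rho \simeq \gamma m_p c/e\)) gives an integrated wire strength \(IL \simeq n_{\text{LR}}\, N_b\, e\, c\), of order 100 A·m for HL-LHC-like bunch charges and a few tens of encounters per side.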
The DAQ needle in the big-data haystack
NASA Astrophysics Data System (ADS)
Meschi, E.
2015-12-01
In the last three decades, HEP experiments have faced the challenge of manipulating larger and larger masses of data from increasingly complex, heterogeneous detectors with millions and then tens of millions of electronic channels. LHC experiments abandoned the monolithic architectures of the nineties in favor of a distributed approach, leveraging the appearance of high speed switched networks developed for digital telecommunication and the internet, and the corresponding increase of memory bandwidth available in off-the-shelf consumer equipment. This led to a generation of experiments where custom electronics triggers, analysing coarser-granularity “fast” data, are confined to the first phase of selection, where predictable latency and real time processing for a modest initial rate reduction are “a necessary evil”. Ever more sophisticated algorithms are projected for use in HL-LHC upgrades, using tracker data in the low-level selection in high multiplicity environments, and requiring extremely complex data interconnects. These systems are quickly obsolete and inflexible but must nonetheless survive and be maintained across the extremely long life span of current detectors. New high-bandwidth bidirectional links could make high-speed low-power full readout at the crossing rate a possibility already in the next decade. At the same time, massively parallel and distributed analysis of unstructured data produced by loosely connected, “intelligent” sources has become ubiquitous in commercial applications, while the mass of persistent data produced by e.g. the LHC experiments has made multiple pass, systematic, end-to-end offline processing increasingly burdensome. A possible evolution of DAQ and trigger architectures could lead to detectors with extremely deep asynchronous or even virtual pipelines, where data streams from the various detector channels are analysed and indexed in situ in quasi-real-time using intelligent, pattern-driven data organization, and the final selection is operated as a distributed “search for interesting event parts”. A holistic approach is required to study the potential impact of these different developments on the design of detector readout, trigger and data acquisition systems in the next decades.
MLP-1 on Crawler Transporter 2 (CT-2)
2017-03-22
NASA's upgraded crawler-transporter 2 (CT-2), carrying mobile launcher platform 1, moves slowly along the crawlerway at the agency's Kennedy Space Center in Florida. The crawler's upgrades and modifications will be monitored and tested under loaded conditions during its travel to the crawlerway Pad A/B split and back to the crawler yard to confirm it is ready to support the load of the mobile launcher carrying the Space Launch System with Orion atop for the first test flight, Exploration Mission 1. The Ground Systems Development and Operations Program at Kennedy is managing upgrades to the crawler.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haase, M.; Hine, C.; Robertson, C.
1996-12-31
Approximately five years ago, the Safe, Secure Dismantlement program was started between the US and countries of the Former Soviet Union (FSU). The purpose of the program is to accelerate progress toward reducing the risk of nuclear weapons proliferation, including such threats as theft, diversion, and unauthorized possession of nuclear materials. This would be accomplished by strengthening the material protection, control, and accounting systems within the FSU countries. Under the US Department of Energy's program of providing cooperative assistance to the FSU countries in the areas of Material Protection, Control, and Accounting (MPC and A), the Latvian Academy of Sciences Nuclear Research Center (LNRC) near Riga, Latvia, was identified as a candidate site for a cooperative MPC and A project. The LNRC is the site of a 5-megawatt IRT-C pool-type research reactor. This paper describes: the process involved, from initial contracting to project completion, for the physical protection upgrades now in place at the LNRC; the intervening activities; and a brief overview of the technical aspects of the upgrades.
H4DAQ: a modern and versatile data-acquisition package for calorimeter prototypes test-beams
NASA Astrophysics Data System (ADS)
Marini, A. C.
2018-02-01
The upgrade of the particle detectors for the HL-LHC or for future colliders requires an extensive program of tests to qualify different detector prototypes with dedicated test beams. A common data-acquisition system, H4DAQ, was developed for the H4 test beam line at the North Area of the CERN SPS in 2014 and it has since been adopted in various applications for the CMS experiment and AIDA project. Several calorimeter prototypes and precision timing detectors have used our system from 2014 to 2017. H4DAQ has proven to be a versatile application and has been ported to many other beam test environments. H4DAQ is fast, simple, modular and can be configured to support various kinds of setup. The functionalities of the DAQ core software are split into three configurable finite state machines: data readout, run control, and event builder. The distribution of information and data between the various computers is performed using ZEROMQ (0MQ) sockets. Plugins are available to read different types of hardware, including VME crates with many types of boards, PADE boards, custom front-end boards and beam instrumentation devices. The raw data are saved as ROOT files, using the CERN C++ ROOT libraries. A Graphical User Interface, based on the python gtk libraries, is used to operate the H4DAQ and an integrated data quality monitoring (DQM), written in C++, allows for fast processing of the events for quick feedback to the user. As the 0MQ libraries are also available for the National Instruments LabVIEW program, this environment can easily be integrated within H4DAQ applications.
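The 0MQ-based message passing between the readout, run-control, and event-builder state machines can be illustrated with a few lines of pyzmq (a generic sketch of the socket pattern only, not H4DAQ code; the endpoint and message layout are placeholders):

```python
# Generic 0MQ fragment-forwarding sketch -- not the H4DAQ implementation.
# A readout process PUSHes raw fragments; an event-builder process PULLs
# them over TCP.  Endpoint and payload layout are placeholder choices.
import threading

import zmq  # pyzmq

ENDPOINT = "tcp://127.0.0.1:5555"   # placeholder event-builder address

def readout(n_fragments: int = 3) -> None:
    ctx = zmq.Context.instance()
    push = ctx.socket(zmq.PUSH)
    push.connect(ENDPOINT)
    for i in range(n_fragments):
        # multipart message: [spill number, raw payload]
        push.send_multipart([i.to_bytes(4, "little"), b"\xab" * 64])

def event_builder(n_expected: int = 3) -> None:
    ctx = zmq.Context.instance()
    pull = ctx.socket(zmq.PULL)
    pull.bind(ENDPOINT)
    for _ in range(n_expected):
        spill, payload = pull.recv_multipart()
        print(f"spill {int.from_bytes(spill, 'little')}: {len(payload)} bytes")

if __name__ == "__main__":
    builder = threading.Thread(target=event_builder)
    builder.start()
    readout()
    builder.join()
```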
Study of the Cooldown and Warmup for the Eight Sectors of the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Liu, L.; Riddone, G.; Tavian, L.
2004-06-01
The LHC cryogenic system is based on a five-point feed scheme with eight refrigerators serving the eight sectors of the LHC machine. The paper presents the simplified flow scheme of the eight sectors and the mathematical methods, including the program flowchart and the boundary conditions, used to simulate the cooldown and warmup of these sectors. The methods take into account the effect of the pressure drop across the valves as well as the pressure evolution in the different headers of the cryogenic distribution line. The simulated pressure and temperature profiles of the headers of an LHC sector during cooldown and warmup are given, and the temperature evolution over the entire cooldown and warmup processes is presented. In conclusion, the input-temperature functions for normal and fast cooldown and warmup, the cooldown and warmup time of each sector, and the distribution of mass flow rates in each sector are summarized. The results indicate that it is possible to cool down any of the LHC sectors within 12.7 days in normal operation and 6.8 days in the case of fast operation.
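To make the flavour of such a simulation concrete, a deliberately crude lumped-capacity sketch is shown below (a toy model with invented numbers, in no way the CERN flowchart or its boundary conditions): each step lowers the inlet temperature along a prescribed ramp and relaxes the cold-mass temperature toward it with a single time constant.

```python
# Toy cooldown integrator -- illustrative only, not the LHC sector model.
# A single lumped heat capacity is cooled by helium whose inlet temperature
# ramps down at a fixed rate; all numbers below are invented for the sketch.
T_START_K = 300.0        # initial cold-mass temperature
T_TARGET_K = 4.5         # final inlet temperature
RAMP_K_PER_H = 1.2       # prescribed inlet-temperature ramp
TAU_H = 20.0             # effective thermal time constant of the cold mass
DT_H = 0.5               # integration step in hours

def simulate_cooldown() -> float:
    """Integrate until the cold mass reaches 1 K above target; return days."""
    t_h, T_mass, T_in = 0.0, T_START_K, T_START_K
    while T_mass > T_TARGET_K + 1.0:
        T_in = max(T_TARGET_K, T_in - RAMP_K_PER_H * DT_H)   # boundary condition
        T_mass += (T_in - T_mass) / TAU_H * DT_H             # first-order relaxation
        t_h += DT_H
    return t_h / 24.0

if __name__ == "__main__":
    print(f"toy cooldown time: {simulate_cooldown():.1f} days")
```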
1996-05-01
Briefing excerpt: 17 motion bases upgraded; L8 Upgrade Flow Plan, Advance Planning Briefing. Projected schedule: FY 05. Requirement document: AMMP. Projected program size: < $40M. POC: AMC, Lt Col Letica, phone (618) 256-5696.
New PDS will predict performance of pallets made with used parts
John W. Clarke; Marshall S. White; Philip A. Araman
2001-01-01
The Pallet Design System (PDS) is a computer design program developed by Virginia Tech, the National Wooden Pallet & Container Association, and the U.S. Forest Service to quickly and accurately predict the performance of new wood pallets. PDS has been upgraded annually since its original version in 1984. All of the previous upgrades, however, have continued to...
Lewis Research Center support of Chrysler upgraded engine program
NASA Technical Reports Server (NTRS)
Warren, E. L.
1978-01-01
Running of the upgraded engine has indicated that, although the engine is mechanically sound, it is deficient in power. Recent modifications and corrective action have improved this. Testing of the engine is being done in the test cell. This simulates an automobile installation. Located in the inlet flow ducts are two turbine flow meters to measure engine air flow.
ERIC Educational Resources Information Center
New York City Board of Education, Brooklyn, NY. Office of Research, Evaluation, and Assessment.
An evaluation was done of the New York City Public Schools' Student Upgrading through Computer and Career Education Systems Services Program (Project SUCCESS). Project SUCCESS operated at 3 high schools in Brooklyn and Manhattan (Murry Bergtraum High School, Edward R. Murrow High School, and John Dewey High School). It enrolled limited English…
ERIC Educational Resources Information Center
Ross, Stan
1970-01-01
Traces the evolution of a program of vocational education in which local firms' advice and surplus equipment have upgraded the school's program through teaching aids, summer work experiences for teachers, and curriculum materials. (RH)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carvill, Anna; Bushman, Kate; Ellsworth, Amy
2014-06-17
The EnergyFit Nevada (EFN) Better Buildings Neighborhood Program (BBNP, and referred to in this document as the EFN program) currently encourages Nevada residents to make whole-house energy-efficient improvements by providing rebates, financing, and access to a network of qualified home improvement contractors. The BBNP funding, consisting of 34 Energy Efficiency Conservation Block Grants (EECBG) and seven State Energy Program (SEP) grants, was awarded for a three-year period to the State of Nevada in 2010 and used for initial program design and implementation. By the end of the first quarter of 2014, the program had achieved upgrades in 553 homes, with an average energy reduction of 32% per home. Other achievements included: 893 residential energy audits completed and upgrades installed in 0.05% of all Nevada single-family homes [1]; an overall conversion rate of 38.1% [2]; 7,089,089 kWh of modeled energy savings [3]; total annual homeowner energy savings of approximately $525,752 [3]; efficiency upgrades completed on 1,100,484 square feet of homes [3]; $139,992 granted in loans to homeowners for energy-efficiency upgrades; 29,285 hours of labor and $3,864,272 worth of work conducted by Nevada auditors and contractors [4]; 40 contractors trained in Nevada; 37 contractors with Building Performance Institute (BPI) certification in Nevada; and 19 contractors actively participating in the EFN program in Nevada. [1] Calculated using 2012 U.S. Census data reporting 1,182,870 homes in Nevada. [2] Conversion rate through March 31, 2014, for all Nevada Retrofit Initiative (NRI)-funded projects, calculated using the EFN tracking database. [3] OptiMiser energy modeling, based on current utility rates. [4] This is the sum of $3,596,561 in retrofit invoice value and $247,711 in audit invoice value.
Single top quark photoproduction at the LHC
NASA Astrophysics Data System (ADS)
de Favereau de Jeneret, J.; Ovyn, S.
2008-08-01
High-energy photon-proton interactions at the LHC offer interesting possibilities for the study of the electroweak sector up to the TeV scale and searches for processes beyond the Standard Model. An analysis of W associated single top photoproduction has been performed using the adapted MadGraph/MadEvent [F. Maltoni and T. Stelzer, JHEP 0302 (2003) 027; T. Stelzer and W.F. Long, Comput. Phys. Commun. 81 (1994) 357-371] and CalcHEP [A. Pukhov, Nucl. Inst. Meth. A 502 (2003) 596-598] programs interfaced to the Pythia [T. Sjöstrand et al., Comput. Phys. Commun. 135 (2001) 238] generator and a fast detector simulation program. Event selection and suppression of the main backgrounds have been studied. A sensitivity to |V_tb| comparable to that obtained using standard single top production in pp collisions is achieved already for 10 fb^-1 of integrated luminosity. Photoproduction at the LHC also provides an attractive framework for observation of anomalous single top production due to Flavour-Changing Neutral Currents. The sensitivity to the anomalous coupling parameters, k and k, is presented and indicates that stronger limits can be placed on the anomalous couplings after 1 fb^-1.
Implementation and evolution of a regional chronic disease self-management program.
Liddy, Clare; Johnston, Sharon; Nash, Kate; Irving, Hannah; Davidson, Rachel
2016-08-15
To establish a comprehensive, community-based program to improve and sustain self-management support for individuals with chronic diseases and complement office-based strategies to support behaviour change. Health service delivery organizations. The Champlain Local Health Integration Network (LHIN), a health district in Eastern Ontario. We created Living Healthy Champlain (LHC), a regional organization providing peer leader training and coordination for the group Stanford Chronic Disease Self-Management Program (CDSMP); skills training and mentorship in behaviour change approaches for health care providers; and support to organizations to integrate self-management support into routine practice. We used the RE-AIM framework to evaluate the overall program's impact by exploring its reach, effectiveness, adoption, implementation and maintenance. A total of 232 Stanford CDSMP sessions (63 during the pilot project and 169 post-pilot) have been held at 127 locations in 24 cities across the Champlain LHIN, reaching approximately 4,000 patients. The effectiveness of the service was established through ongoing evidence reviews, a focus group and a pre-post utilization study of the pilot. LHC trained over 300 peer volunteers to provide the Stanford CDSMP sessions, 98 of whom continue to actively host workshops. An additional 1,327 providers have been trained in other models of self-management support, such as Health Coaching and Motivational Interviewing. Over the study period, LHC grew from a small pilot project to a regional initiative with sustainable provincial funding and was adopted by the province as a model for similar service delivery across Ontario. A community-based self-management program working in partnership with primary care can be effectively and broadly implemented in support of patients living with chronic conditions.
Linear Collider Physics Resource Book Snowmass 2001
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronan
The American particle physics community can look forward to a well-conceived and vital program of experimentation for the next ten years, using both colliders and fixed target beams to study a wide variety of pressing questions. Beyond 2010, these programs will be reaching the end of their expected lives. The CERN LHC will provide an experimental program of the first importance. But beyond the LHC, the American community needs a coherent plan. The Snowmass 2001 Workshop and the deliberations of the HEPAP subpanel offer a rare opportunity to engage the full community in planning our future for the next decade or more. A major accelerator project requires a decade from the beginning of an engineering design to the receipt of the first data. So it is now time to decide whether to begin a new accelerator project that will operate in the years soon after 2010. We believe that the world high-energy physics community needs such a project. With the great promise of discovery in physics at the next energy scale, and with the opportunity for the uncovering of profound insights, we cannot allow our field to contract to a single experimental program at a single laboratory in the world. We believe that an e+e- linear collider is an excellent choice for the next major project in high-energy physics. Applying experimental techniques very different from those used at hadron colliders, an e+e- linear collider will allow us to build on the discoveries made at the Tevatron and the LHC, and to add a level of precision and clarity that will be necessary to understand the physics of the next energy scale. It is not necessary to anticipate specific results from the hadron collider programs to argue for constructing an e+e- linear collider; in any scenario that is now discussed, physics will benefit from the new information that e+e- experiments can provide. This last point merits further emphasis. If a new accelerator could be designed and built in a few years, it would make sense to wait for the results of each accelerator before planning the next one. Thus, we would wait for the results from the Tevatron before planning the LHC experiments, and wait for the LHC before planning any later stage. In reality accelerators require a long time to construct, and they require such specialized resources and human talent that delay can cripple what would be promising opportunities. In any event, we believe that the case for the linear collider is so compelling and robust that we can justify this facility on the basis of our current knowledge, even before the Tevatron and LHC experiments are done. The physics prospects for the linear collider have been studied intensively for more than a decade, and arguments for the importance of its experimental program have been developed from many different points of view. This book provides an introduction and a guide to this literature. We hope that it will allow physicists new to the consideration of linear collider physics to start from their own personal perspectives and develop their own assessments of the opportunities afforded by a linear collider.
Upgrades at the Duke Free Electron Laser Laboratory
NASA Astrophysics Data System (ADS)
Howell, Calvin R.
2004-11-01
Major upgrades to the storage-ring based photon sources at the Duke Free Electron Laser Laboratory (DFELL) are underway. The photon sources at the DFELL are well suited for research in the areas of medicine, biophysics, accelerator physics, nuclear physics and material science. These upgrades, which will be completed by the summer 2006, will significantly enhance the capabilities of the ultraviolet (UV) free-electron laser (FEL) and the high intensity gamma-ray source (HIGS). The HIGS is a relatively new research facility at the DFELL that is operated jointly by the DFELL and the Triangle Universities Nuclear Laboratory. The gamma-ray beam is produced by Compton back scattering of the UV photons inside the FEL optical cavity off the circulating electrons in the storage ring. The gamma-ray beam is 100% polarized and its energy resolution is selected by collimation. The capabilities of the upgraded facility will be described, the status of the upgrades will be summarized, and the proposed first-generation research program at HIGS will be presented.
Assessment of thermal loads in the CERN SPS crab cavities cryomodule
NASA Astrophysics Data System (ADS)
Carra, F.; Apeland, J.; Calaga, R.; Capatina, O.; Capelli, T.; Verdú-Andrés, S.; Zanoni, C.
2017-07-01
As a part of the HL-LHC upgrade, a cryomodule is designed to host two crab cavities for a first test with protons in the SPS machine. The evaluation of the cryomodule heat loads is essential to dimension the cryogenic infrastructure of the system. The current design features two cryogenic circuits. The first circuit adopts superfluid helium at 2 K to maintain the cavities in the superconducting state. The second circuit, based on helium gas at a temperature between 50 K and 70 K, is connected to the thermal screen, also serving as heat intercept for all the interfaces between the cold mass and the external environment. An overview of the heat loads to both circuits, and the combined numerical and analytical estimations, is presented. The heat load of each element is detailed for the static and dynamic scenarios, with considerations on the design choices for the thermal optimization of the most critical components.
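As a rough aid to the static-load discussion above (a textbook grey-body estimate, not a number from the cryomodule design), the radiative load per unit area from a screen at temperature T_s onto the 2 K cold mass at T_c is approximately

\[
q \;\approx\; \frac{\sigma\left(T_s^{4}-T_c^{4}\right)}{\tfrac{1}{\epsilon_s}+\tfrac{1}{\epsilon_c}-1},
\]

which is why an actively cooled 50-70 K thermal screen (typically wrapped in multilayer insulation) is used: dropping the radiating temperature from 300 K to 70 K reduces T^4 by more than two orders of magnitude before the remaining load reaches the 2 K circuit.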
The FE-I4 Pixel Readout Chip and the IBL Module
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbero, Marlon; Arutinov, David; Backhaus, Malte
2012-05-01
FE-I4 is the new ATLAS pixel readout chip for the upgraded ATLAS pixel detector. Designed in a CMOS 130 nm feature size process, the IC is able to withstand higher radiation levels compared to the present generation of ATLAS pixel Front-End FE-I3, and can also cope with a higher hit rate. It is thus suitable for intermediate radii pixel detector layers in the High Luminosity LHC environment, but also, on a shorter timescale, for the inserted layer at 3.3 cm known as the 'Insertable B-Layer' (IBL) project. In this paper, an introduction to the FE-I4 will be given, focusing on test results from the first full size FE-I4A prototype which has been available since fall 2010. The IBL project will be introduced, with particular emphasis on the FE-I4-based module concept.
Commercial Buck Converters and Custom Coil Development for the ATLAS Inner Detector Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dhawan, S.; Lanni, F.; Baker, O.
2010-04-01
A new generation of higher gain commercial buck converters built using advanced short channel CMOS processes has the potential to operate in the ATLAS Inner Detector at the Super Large Hadron Collider (sLHC). This approach would inherently be more efficient than the existing practice of locating the power conversion external to the detector. The converters must operate in a large magnetic field and be able to survive both high doses of ionizing radiation and large neutron fluences. The presence of a large magnetic field necessitates the use of an air core inductor, which is developed and discussed here. Noise measurements will be made to investigate the effect of the high frequency switching of the buck converter on the sensitive front end electronics. Radiation hardness of selected buck converters and MOSFETs will also be reported.
NASA Astrophysics Data System (ADS)
Delle Fratte, C.; Kennedy, J. A.; Kluth, S.; Mazzaferro, L.
2015-12-01
In a grid computing infrastructure, tasks such as continuous upgrades, service installations and software deployments are part of an admin's daily work. In such an environment, tools to help with the management, provisioning and monitoring of the deployed systems and services have become crucial. As experiments such as those at the LHC increase in scale, the computing infrastructure also becomes larger and more complex. Moreover, today's admins increasingly work within teams that share responsibilities and tasks. Such a scaled-up situation requires tools that not only simplify the workload on administrators but also enable them to work seamlessly in teams. In this paper we present our experience of managing the Max Planck Institute Tier-2 using Puppet and Gitolite in a cooperative way to support the system administrators in their daily work. In addition to describing the Puppet-Gitolite system, best practices and customizations are also shown.
New detectors to explore the lifetime frontier
NASA Astrophysics Data System (ADS)
Chou, John Paul; Curtin, David; Lubatti, H. J.
2017-04-01
Long-lived particles (LLPs) are a common feature in many beyond the Standard Model theories, including supersymmetry, and are generically produced in exotic Higgs decays. Unfortunately, no existing or proposed search strategy will be able to observe the decay of non-hadronic electrically neutral LLPs with masses above ∼ GeV and lifetimes near the limit set by Big Bang Nucleosynthesis (BBN), cτ ≲ 10⁷-10⁸ m. We propose the MATHUSLA surface detector concept (MAssive Timing Hodoscope for Ultra Stable neutraL pArticles), which can be implemented with existing technology and in time for the high luminosity LHC upgrade to find such ultra-long-lived particles (ULLPs), whether produced in exotic Higgs decays or more general production modes. We also advocate a dedicated LLP detector at a future 100 TeV collider, where a modestly sized underground design can discover ULLPs with lifetimes at the BBN limit produced in sub-percent level exotic Higgs decays.
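A back-of-envelope decay-probability estimate shows why such long lifetimes call for a large, dedicated detector. The exponential decay law below is standard; the boost and the detector distances are assumed round numbers, not the MATHUSLA geometry.

    # Sketch (assumed geometry and boost): probability that a long-lived particle
    # decays inside a detector volume spanning distances L1 to L2 from the IP.
    import math

    def decay_probability(ctau_m, boost_betagamma, l1_m, l2_m):
        lab_decay_length = boost_betagamma * ctau_m
        return math.exp(-l1_m / lab_decay_length) - math.exp(-l2_m / lab_decay_length)

    # Example: ctau = 1e7 m (near the BBN-motivated range), boost ~ 3,
    # detector shell between 200 m and 230 m from the interaction point.
    print(decay_probability(1e7, 3.0, 200.0, 230.0))

The per-particle probability is at the 10⁻⁶ level for these inputs, which is why the huge LLP samples of the HL-LHC (or a 100 TeV collider) are needed.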
The design of a fast Level 1 Track trigger for the ATLAS High Luminosity Upgrade
NASA Astrophysics Data System (ADS)
Miller Allbrooke, Benedict Marc; ATLAS Collaboration
2017-10-01
The ATLAS experiment at the high-luminosity LHC will face a five-fold increase in the number of interactions per collision relative to the ongoing Run 2. This will require a proportional improvement in rejection power at the earliest levels of the detector trigger system, while preserving good signal efficiency, due to the increase in the likelihood of individual trigger thresholds being passed as a result of pile-up related activity. One critical aspect of this improvement will be the implementation of precise track reconstruction, through which sharper turn-on curves, b-tagging and tau-tagging techniques can in principle be implemented. The challenge of such a project comes in the development of a fast, precise custom electronic device integrated in the hardware-based first trigger level of the experiment, with repercussions propagating as far as the detector read-out philosophy.
Better O and M Programs is Ultimate Answer to O and M Problems
ERIC Educational Resources Information Center
Davanzo, A. C.; Thompson, William B.
1978-01-01
Describes an improvement program for the operation and maintenance of municipal wastewater treatment plants in Detroit, Michigan. Improvements included expansion and upgrading of the existing city plant and a training program for plant personnel. (MA)
Overview of the Icing and Flow Quality Improvements Program for the NASA Glenn Icing Research Tunnel
NASA Technical Reports Server (NTRS)
Irvine, Thomas B.; Kevdzija, Susan L.; Sheldon, David W.; Spera, David A.
2001-01-01
Major upgrades were made in 1999 to the 6- by 9-Foot (1.8- by 2.7-m) Icing Research Tunnel (IRT) at the NASA Glenn Research Center. These included replacement of the electronic controls for the variable-speed drive motor, replacement of the heat exchanger, complete replacement and enlargement of the leg of the tunnel containing the new heat-exchanger, the addition of flow-expanding and flow-contracting turning vanes upstream and downstream of the heat exchanger, respectively, and the addition of fan outlet guide vanes (OGV's). This paper describes the rationale behind this latest program of IRT upgrades and the program's requirements and goals. An overview is given of the scope of work undertaken by the design and construction contractors, the scale-model IRT (SMIRT) design verification program, the comprehensive reactivation test program initiated upon completion of construction, and the overall management approach followed.
Audette, Jennifer Gail; Baldew, Se-Sergio; Chang, Tony C M S; de Vries, Jessica; Ho A Tham, Nancy; Janssen, Johanna; Vyt, Andre
2017-01-01
To describe how a multinational team worked together to transition a physical therapy (PT) educational program in Paramaribo, Suriname, from a Bachelor level to a Master of Science in Physical Therapy (MSPT) level. The team was made up of PT faculty from Anton De Kom Universiteit van Suriname (AdeKUS), the Flemish Interuniversity Council University Development Cooperation (VLIR-UOS) leadership, and Health Volunteers Overseas volunteers. In this case study, the process for curricular assessment, redesign, and upgrade is described retrospectively using a Plan, Do, Study, Act (PDSA) framework. PT educational programs in developing countries are eager for upgrade to meet international expectations and to better meet community health-care needs. An ongoing process which included baseline assessment of all aspects of the existing bachelor's program in PT, development of a plan for a MSPT, implementation of the master's program, and evaluation following implementation is described. Curricular assessment and upgrade in resource-limited countries requires the implementation of process-oriented methods. The PDSA process is a useful tool to explore curricular development. The international collaboration described in this paper provides an example of the diligence, consistency, and dedication required to see a project through and achieve success while providing adequate support to the host site. This project might provide valuable insights for those involved in curricular redesign in similar settings.
Detailed studies of full-size ATLAS12 sensors
NASA Astrophysics Data System (ADS)
Hommels, L. B. A.; Allport, P. P.; Baca, M.; Broughton, J.; Chisholm, A.; Nikolopoulos, K.; Pyatt, S.; Thomas, J. P.; Wilson, J. A.; Kierstead, J.; Kuczewski, P.; Lynn, D.; Arratia, M.; Klein, C. T.; Ullan, M.; Fleta, C.; Fernandez-Tejero, J.; Bloch, I.; Gregor, I. M.; Lohwasser, K.; Poley, L.; Tackmann, K.; Trofimov, A.; Yildirim, E.; Hauser, M.; Jakobs, K.; Kuehn, S.; Mahboubi, K.; Mori, R.; Parzefall, U.; Clark, A.; Ferrere, D.; Gonzalez Sevilla, S.; Ashby, J.; Blue, A.; Bates, R.; Buttar, C.; Doherty, F.; McMullen, T.; McEwan, F.; O`Shea, V.; Kamada, S.; Yamamura, K.; Ikegami, Y.; Nakamura, K.; Takubo, Y.; Unno, Y.; Takashima, R.; Chilingarov, A.; Fox, H.; Affolder, A. A.; Casse, G.; Dervan, P.; Forshaw, D.; Greenall, A.; Wonsak, S.; Wormald, M.; Cindro, V.; Kramberger, G.; Mandić, I.; Mikuž, M.; Gorelov, I.; Hoeferkamp, M.; Palni, P.; Seidel, S.; Taylor, A.; Toms, K.; Wang, R.; Hessey, N. P.; Valencic, N.; Hanagaki, K.; Dolezal, Z.; Kodys, P.; Bohm, J.; Stastny, J.; Mikestikova, M.; Bevan, A.; Beck, G.; Milke, C.; Domingo, M.; Fadeyev, V.; Galloway, Z.; Hibbard-Lubow, D.; Liang, Z.; Sadrozinski, H. F.-W.; Seiden, A.; To, K.; French, R.; Hodgson, P.; Marin-Reyes, H.; Parker, K.; Jinnouchi, O.; Hara, K.; Sato, K.; Sato, K.; Hagihara, M.; Iwabuchi, S.; Bernabeu, J.; Civera, J. V.; Garcia, C.; Lacasta, C.; Marti i Garcia, S.; Rodriguez, D.; Santoyo, D.; Solaz, C.; Soldevila, U.
2016-09-01
The "ATLAS ITk Strip Sensor Collaboration" R&D group has developed a second iteration of single-sided n+-in-p type micro-strip sensors for use in the tracker upgrade of the ATLAS experiment at the High-Luminosity (HL) LHC. The full size sensors measure approximately 97 × 97mm2 and are designed for tolerance against the 1.1 ×1015neq /cm2 fluence expected at the HL-LHC. Each sensor has 4 columns of 1280 individual 23.9 mm long channels, arranged at 74.5 μm pitch. Four batches comprising 120 sensors produced by Hamamatsu Photonics were evaluated for their mechanical, and electrical bulk and strip characteristics. Optical microscopy measurements were performed to obtain the sensor surface profile. Leakage current and bulk capacitance properties were measured for each individual sensor. For sample strips across the sensor batches, the inter-strip capacitance and resistance as well as properties of the punch-through protection structure were measured. A multi-channel probecard was used to measure leakage current, coupling capacitance and bias resistance for each individual channel of 100 sensors in three batches. The compiled results for 120 unirradiated sensors are presented in this paper, including summary results for almost 500,000 strips probed. Results on the reverse bias voltage dependence of various parameters and frequency dependence of tested capacitances are included for validation of the experimental methods used. Comparing results with specified values, almost all sensors fall well within specification.
Extending the farm on external sites: the INFN Tier-1 experience
NASA Astrophysics Data System (ADS)
Boccali, T.; Cavalli, A.; Chiarelli, L.; Chierici, A.; Cesini, D.; Ciaschini, V.; Dal Pra, S.; dell'Agnello, L.; De Girolamo, D.; Falabella, A.; Fattibene, E.; Maron, G.; Prosperini, A.; Sapunenko, V.; Virgilio, S.; Zani, S.
2017-10-01
The Tier-1 at CNAF is the main INFN computing facility, offering computing and storage resources to more than 30 different scientific collaborations, including the 4 experiments at the LHC. A large increase in computing needs is also foreseen in the coming years, driven mainly by the experiments at the LHC (especially starting with Run 3 from 2021) but also by other upcoming experiments such as CTA [1]. While we are considering an upgrade of the infrastructure of our data center, we are also evaluating the possibility of using CPU resources available in other data centers or even leased from commercial cloud providers. Hence, at the INFN Tier-1, besides participating in the EU project HNSciCloud, we have also pledged a small amount of computing resources (~2000 cores) located at the Bari ReCaS [2] data center to the WLCG experiments for 2016, and we are testing the use of resources provided by a commercial cloud provider. While the Bari ReCaS data center is directly connected to the GARR network [3], with the obvious advantage of a low-latency and high-bandwidth connection, in the case of the commercial provider we rely only on the General Purpose Network. In this paper we describe the set-up phase and the first results of these installations, started in the last quarter of 2015, focusing on the issues that we have had to cope with and discussing the measured results in terms of efficiency.
MLP-1 on Crawler Transporter 2 (CT-2)
2017-03-22
Ground support technicians walk alongside NASA's upgraded crawler-transporter 2 (CT-2), carrying mobile launcher platform 1, as it slowly travels on the crawlerway at the agency's Kennedy Space Center in Florida. The crawler's upgrades and modifications will be monitored and tested under loaded conditions during its travel to the crawlerway Pad A/B split and back to the crawler yard to confirm it is ready to support the load of the mobile launcher carrying the Space Launch System with Orion atop for the first test flight, Exploration Mission 1. The Ground Systems Development and Operations Program at Kennedy is managing upgrades to the crawler.
MLP-1 on Crawler Transporter 2 (CT-2)
2017-03-22
NASA's upgraded crawler-transporter 2 (CT-2), carrying mobile launcher platform 1, moves slowly along the crawlerway toward the Vehicle Assembly Building at the agency's Kennedy Space Center in Florida. The crawler's upgrades and modifications were monitored and tested during a loaded test to the crawlerway Pad A/B split. CT-2 will return to the crawler yard. The crawler is being tested to confirm it is ready to support the load of the mobile launcher carrying the Space Launch System with Orion atop for the first test flight, Exploration Mission 1. The Ground Systems Development and Operations Program at Kennedy is managing upgrades to the crawler.
Studies of the beam extraction system of the GTS-LHC electron cyclotron resonance ion source at CERN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toivanen, V., E-mail: ville.aleksi.toivanen@cern.ch; Küchler, D.
2016-02-15
The 14.5 GHz GTS-LHC Electron Cyclotron Resonance Ion Source (ECRIS) provides multiply charged heavy ion beams for the CERN experimental program. The GTS-LHC beam formation has been studied extensively with lead, argon, and xenon beams with varied beam extraction conditions using the ion optical code IBSimu. The simulation model predicts self-consistently the formation of triangular and hollow beam structures which are often associated with ECRIS ion beams, as well as beam loss patterns which match the observed beam induced markings in the extraction region. These studies provide a better understanding of the properties of the extracted beams and a way to diagnose the extraction system performance and limitations, which is otherwise challenging due to the lack of direct diagnostics in this region and the limited availability of the ion source for development work.
Electro-mechanical characterization of MgB2 wires for the Superconducting Link Project at CERN
NASA Astrophysics Data System (ADS)
Konstantopoulou, K.; Ballarino, A.; Gharib, A.; Stimac, A.; Garcia Gonzalez, M.; Perez Fontenla, A. T.; Sugano, M.
2016-08-01
In previous years, the R&D program between CERN and Columbus Superconductors SpA led to the development of several configurations of MgB2 wires. The aim was to achieve excellent superconducting properties in high-current MgB2 cables for the HL-LHC upgrade. In addition to good electrical performance, the superconductor shall have good mechanical strength in view of the stresses during operation (Lorentz forces and thermal contraction) and handling (tension and bending) during cabling and installation at room temperature. Thus, the study of the mechanical properties of MgB2 wires is crucial for the cable design and its functional use. In the present work we report on the electro-mechanical characterization of ex situ processed composite MgB2 wires. Tensile tests (critical current versus strain) were carried out at 4.2 K and in a 3 T external field by means of a purpose-built device to determine the irreversible strain limit of the wire. The minimum bending radius of the wire was calculated taking into account the dependence of the critical current on the strain, and it was then used to obtain the minimum twist pitch of MgB2 wires in the cable. Strands extracted from cables having different configurations were tested to quantify the critical current degradation. The Young's modulus of the composite wire was measured at room temperature. Finally, all the measured mechanical parameters will be used to optimize an 18-strand MgB2 cable configuration.
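The minimum bending radius mentioned above follows from elementary beam bending: for a round wire of diameter d bent to radius R, the peak strain at the outer fibre is ε = d/(2R). The sketch below applies this relation with placeholder values for the wire diameter and the irreversible strain limit; the measured MgB2 numbers are in the paper, not here.

    # Minimal sketch, assuming a round composite wire: the minimum bending radius
    # compatible with an irreversible strain limit eps_irr is R_min = d / (2 * eps_irr).
    def min_bending_radius_mm(wire_diameter_mm, irreversible_strain):
        return wire_diameter_mm / (2.0 * irreversible_strain)

    # Placeholder values: 1 mm wire diameter, 0.3% irreversible strain limit.
    print(min_bending_radius_mm(1.0, 0.003), "mm")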
ERIC Educational Resources Information Center
American School & University, 1996
1996-01-01
Describes innovative strategies that schools and universities are using to save money and reshape operations. Focuses on ideas in energy efficiency and facilities improvement, direct purchasing, energy management, retrofitting buildings, ceiling insulation upgrades, automation systems, electric demand programs, facilities programs, warranty…
CheckMATE 2: From the model to the limit
NASA Astrophysics Data System (ADS)
Dercks, Daniel; Desai, Nishita; Kim, Jong Soo; Rolbiecki, Krzysztof; Tattersall, Jamie; Weber, Torsten
2017-12-01
We present the latest developments to the CheckMATE program that allows models of new physics to be easily tested against recent LHC data. To achieve this goal, the core of CheckMATE now contains over 60 LHC analyses, of which 12 are from the 13 TeV run. The main new feature is that CheckMATE 2 now integrates the Monte Carlo event generation via MadGraph5_aMC@NLO and Pythia 8. This allows users to go directly from an SLHA file or UFO model to the result of whether a model is allowed or not. In addition, the integration of the event generation leads to a significant increase in the speed of the program. Many other improvements have also been made, including the possibility of combining signal regions to give a total likelihood for a model.
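The "total likelihood for a model" mentioned at the end can be illustrated with a minimal combination of independent Poisson signal regions. This is not CheckMATE's internal implementation (which also deals with efficiencies, correlations and limit setting); it is only a sketch of the underlying idea, with made-up counts.

    # Sketch: combine uncorrelated signal regions by multiplying Poisson likelihoods
    # for the observed counts given background-plus-signal expectations.
    import math

    def poisson_loglike(observed, expected):
        return observed * math.log(expected) - expected - math.lgamma(observed + 1)

    def combined_loglike(regions):
        """regions: list of (n_observed, b_expected, s_expected) per signal region."""
        return sum(poisson_loglike(n, b + s) for n, b, s in regions)

    # Hypothetical counts for three uncorrelated signal regions.
    regions = [(12, 10.0, 3.0), (5, 4.5, 1.2), (0, 0.8, 0.4)]
    print(combined_loglike(regions))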
Promon's participation in the Brasilsat program: first & second generations
NASA Astrophysics Data System (ADS)
Depaiva, Ricardo N.
This paper presents an overview of the space and ground segments of the Brasilsat program, developed by Hughes and Promon. Promon is a Brazilian engineering company that has been actively participating in the Brasilsat Satellite Telecommunications Program since its beginning. During the first generation, as a subcontractor of the Spar/Hughes/SED consortium, Promon had a significant participation in the site installation of the Ground Segment, including the antennas. During the second generation, as a partner in a consortium with Hughes, Promon participated in the upgrade of Brasilsat's Ground Segment systems: the TT&C (TCR1, TCR2, and SCC) and the COCC (Communications and Operations Control Center). This upgrade consisted of the design and development of hardware and software to support the second-generation requirements, followed by integration and tests, factory acceptance tests, transport to site, site installation, site acceptance tests and warranty support. The upgraded systems are distributed over four sites with remote access to the main ground station. The solutions adopted provide a high level of automation and easy operator interaction. The hardware and software technologies were selected to provide the flexibility to incorporate new technologies and services from the demanding satellite telecommunications market.
RBDMS, FracFocus, and Gateway Initiatives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yates, Dan
Award DE-FE-0000880 from the Department of Energy to the Ground Water Protection Council (GWPC) focuses on state and federal priorities in the areas of state RBDMS development, connectivity between state systems and FracFocus.org, and data sharing initiatives across agencies. The overarching goals of these projects are to: upgrade the Risk Based Data Management System (RBDMS) to a web-based application; upgrade the RBDMS user interface to make it more intuitive and easier to use; increase connectivity between state RBDMS systems and FracFocus; provide outreach to states and agencies through training programs, special meetings, and state-by-state outreach on all deliverables; provide greater functionality for non-IT users; update and install RBDMS modules in additional state programs (IL, CA, WY, WV); and help make data access more transparent. The primary objective is to enhance the Risk Based Data Management System (RBDMS) by adding new components relevant to current environmental topics such as hydraulic fracturing, increasing field inspection capabilities, creating linkages between FracFocus and state programs, upgrading eForm capabilities, and analyzing the potential for data sharing. The GWPC will work with state agencies to develop RBDMS module(s) that meet these needs.
Sleep and Quality of Life in Urban Poverty: The Effect of a Slum Housing Upgrading Program
Simonelli, Guido; Leanza, Yvan; Boilard, Alexandra; Hyland, Martín; Augustinavicius, Jura L.; Cardinali, Daniel P.; Vallières, Annie; Pérez-Chada, Daniel; Vigo, Daniel E.
2013-01-01
Study Objectives: To evaluate the effect of a housing transition on sleep quality and quality of life in slum dwellers, participating in a slum housing upgrading program. Design: Observational before-and-after study with a convergent-parallel mixed method design. Setting: Five slums located in the metropolitan area of Buenos Aires, Argentina. Participants: A total of 150 slum dwellers benefited by a housing program of the nonprofit organization TECHO (spanish word for “roof”). Interventions: Participants moved from their very low-quality house to a basic prefabricated 18 m2 modular house provided by TECHO. Measurements and Results: The Pittsburgh Sleep Quality Index (PSQI) and World Health Organization Quality of Life brief scale (WHOQOL-BREF) were administered before and after housing upgrading. Data about housing conditions, income, education, sleeping conditions, and cardiovascular risk were also collected. Semistructured interviews were used to expand and nuance quantitative data obtained from a poorly educated sample. Results showed that sleep quality significantly increased after the housing program (z = -6.57, P < 0.001). Overall quality of life (z = -6.85, P < 0.001), physical health domain (z = -4.35, P < 0.001), psychological well-being domain (z = -3.72, P < 0.001) and environmental domain (z = -7.10, P < 0.001) of WHOQOL-BREF were also improved. Interviews demonstrated the importance of serenity for improving quality of life. Conclusions: A minimal improvement in the quality of basic housing can significantly increase sleep quality and quality of life among slum dwellers. Understanding sleep and daily life conditions in informal urban settlements could help to define what kind of low-cost intervention may improve sleep quality, quality of life, and reduce existent sleep disparity. Citation: Simonelli G; Leanza Y; Boilard A; Hyland M; Augustinavicius JL; Cardinali DP; Vallières A; Pérez-Chada D; Vigo DE. Sleep and quality of life in urban poverty: the effect of a slum housing upgrading program. SLEEP 2013;36(11):1669-1676. PMID:24179300
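The before-and-after comparisons reported above are paired, nonparametric tests; the abstract does not name the exact procedure, so the snippet below simply illustrates one common choice, the Wilcoxon signed-rank test, on hypothetical PSQI scores.

    # Illustrative only: a paired, nonparametric before/after comparison with SciPy.
    # The scores below are made up; they are not the study's data.
    from scipy.stats import wilcoxon

    psqi_before = [9, 11, 8, 12, 10, 7, 13, 9, 10, 11]   # hypothetical PSQI scores
    psqi_after  = [6,  8, 7,  9,  7, 6, 10, 7,  8,  8]

    stat, p_value = wilcoxon(psqi_before, psqi_after)
    print(stat, p_value)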
Panoramic projection avionics displays
NASA Astrophysics Data System (ADS)
Kalmanash, Michael H.
2003-09-01
Avionics projection displays are entering production in advanced tactical aircraft. Early adopters of this technology in the avionics community used projection displays to replace or upgrade earlier units incorporating direct-view CRT or AMLCD devices. Typical motivations for these upgrades were the alleviation of performance, cost and display-device availability concerns. In these systems, the upgraded (projection) displays were one-for-one form/fit replacements for the earlier units. As projection technology has matured, this situation has begun to evolve. The Lockheed-Martin F-35 is the first program in which the cockpit has been specifically designed to take advantage of one of the more unique capabilities of rear-projection display technology, namely the ability to replace multiple small screens with a single large conformal viewing surface in the form of a panoramic display. Other programs are expected to follow, since the panoramic formats enable increased mission effectiveness, reduced cost and greater information transfer to the pilot. Some of the advantages and technical challenges associated with panoramic projection displays for avionics applications are described below.
CERN openlab: Engaging industry for innovation in the LHC Run 3-4 R&D programme
NASA Astrophysics Data System (ADS)
Girone, M.; Purcell, A.; Di Meglio, A.; Rademakers, F.; Gunne, K.; Pachou, M.; Pavlou, S.
2017-10-01
LHC Run 3 and Run 4 represent an unprecedented challenge for HEP computing in terms of both data volume and complexity. New approaches are needed for how data is collected and filtered, processed, moved, stored and analysed if these challenges are to be met with a realistic budget. To develop innovative techniques we are fostering relationships with industry leaders. CERN openlab is a unique resource for public-private partnership between CERN and leading Information and Communication Technology (ICT) companies. Its mission is to accelerate the development of cutting-edge solutions to be used by the worldwide HEP community. In 2015, CERN openlab started its phase V with a strong focus on tackling the upcoming LHC challenges. Several R&D programs are ongoing in the areas of data acquisition, networks and connectivity, data storage architectures, computing provisioning, computing platforms and code optimisation, and data analytics. This paper gives an overview of the various innovative technologies that are currently being explored by CERN openlab V and discusses the long-term strategies that are pursued by the LHC communities, with the help of industry, in closing the technological gap in processing and storage needs expected in Run 3 and Run 4.
MonALISA, an agent-based monitoring and control system for the LHC experiments
NASA Astrophysics Data System (ADS)
Balcas, J.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.
2017-10-01
MonALISA, which stands for Monitoring Agents using a Large Integrated Services Architecture, has been developed over the last fifteen years by the California Institute of Technology (Caltech) and its partners with the support of the software and computing program of the CMS and ALICE experiments at the Large Hadron Collider (LHC). The framework is based on a Dynamic Distributed Service Architecture and is able to provide complete system monitoring, performance metrics of applications, jobs or services, system control and global optimization services for complex systems. A short overview and status of MonALISA is given in this paper.
Matching next-to-leading order predictions to parton showers in supersymmetric QCD
Degrande, Céline; Fuks, Benjamin; Hirschi, Valentin; ...
2016-02-03
We present a fully automated framework based on the FeynRules and MadGraph5_aMC@NLO programs that allows for accurate simulations of supersymmetric QCD processes at the LHC. Starting directly from a model Lagrangian that features squark and gluino interactions, event generation is achieved at the next-to-leading order in QCD, matching short-distance events to parton showers and including the subsequent decay of the produced supersymmetric particles. As an application, we study the impact of higher-order corrections in gluino pair-production in a simplified benchmark scenario inspired by current gluino LHC searches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The ATLAS collaboration at the LHC has chosen the Micromegas (Micro Mesh Gaseous Structure) technology, along with the small-strip Thin Gap Chambers (sTGC), for the high-luminosity upgrade of the inner muon station in the high-rapidity region, the so-called New Small Wheel (NSW). It employs eight layers of Micromegas detectors and eight layers of sTGC. The NSW project requires fully efficient Micromegas chambers with a spatial resolution down to 100 μm in the precision coordinate for momentum reconstruction, and at the mm level in the azimuthal (second) coordinate, over a total active area of 1200 m², with a rate capability up to about 15 kHz/cm² and operation in a moderate magnetic field up to B = 0.4 T. The required tracking capability is provided by the intrinsic space resolution combined with a mechanical precision at the level of 30 μm along the precision coordinate. Together with the precise tracking capability, the Micromegas chambers should provide a trigger signal. Several tests have been performed on small (10 × 10 cm²) and large (1 × 1 m²) size single-gap chamber prototypes using high-energy hadron beams at CERN, low- and intermediate-energy (0.5-5 GeV) electron beams at Frascati and DESY, neutron beams at Demokritos (Athens) and Garching (Munich), and cosmic rays. More recently, two quadruplets with dimensions 1.2 × 0.5 m² and the same configuration and structure foreseen for the NSW upgrade have been built at CERN and tested with high-energy pion/muon beams. Results obtained in the most recent tests, in different configurations and operating conditions and as a function of the magnetic field, will be presented, along with a comparison between different read-out electronics, either based on the APV25 chips or based on a new digital front-end ASIC developed in its second version (VMM2) as a prototype of the final chip that will be employed in the NSW upgrade.
Fermilab proton accelerator complex status and improvement plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiltsev, Vladimir
2017-05-30
Fermilab carries out an extensive program of accelerator-based high energy particle physics research at the Intensity Frontier that relies on the operation of 8 GeV and 120 GeV proton beamlines for a number of fixed target experiments. Routine operation with a world-record 700 kW of average 120 GeV beam power on the neutrino target was achieved in 2017 as the result of the Proton Improvement Plan (PIP) upgrade. There are plans to further increase the power to 900-1000 kW. The next major upgrade of the FNAL accelerator complex, called PIP-II, is under development. It aims at 1.2 MW beam power on target at the start of the LBNF/DUNE experiment in the middle of the next decade and assumes replacement of the existing 40-year-old 400 MeV normal-conducting Linac with a modern 800 MeV superconducting RF linear accelerator. There are several concepts to further double the beam power to >2.4 MW after replacement of the existing 8 GeV Booster synchrotron. In this article we discuss the current performance of the Fermilab proton accelerator complex, the upgrade plans for the next two decades, and the accelerator R&D program to address cost and performance risks for these upgrades.
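The quoted beam powers translate directly into protons per cycle via P = E·N/T. The cycle times below are assumed round numbers used only to illustrate the arithmetic, not official FNAL parameters.

    # Simple arithmetic sketch: average beam power = (energy per proton in joules)
    # * (protons per cycle) / (cycle time).
    E_PROTON_J = 120e9 * 1.602e-19          # 120 GeV expressed in joules

    def protons_per_cycle(beam_power_w, cycle_time_s):
        return beam_power_w * cycle_time_s / E_PROTON_J

    print("700 kW, 1.33 s cycle (assumed):", protons_per_cycle(700e3, 1.33))
    print("1.2 MW, 1.2 s cycle (assumed) :", protons_per_cycle(1.2e6, 1.2))

For these assumed cycle times, 700 kW corresponds to roughly 5 x 10¹³ protons per 120 GeV cycle, and the PIP-II goal to a bit over 7 x 10¹³.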
Keys to the House: Unlocking Residential Savings With Program Models for Home Energy Upgrades
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grevatt, Jim; Hoffman, Ian; Hoffmeyer, Dale
After more than 40 years of effort, energy efficiency program administrators and associated contractors still find it challenging to penetrate the home retrofit market, especially at levels commensurate with state and federal goals for energy savings and emissions reductions. Further, residential retrofit programs have not coalesced around a reliably successful model. They still vary in design, implementation and performance, and they remain among the more difficult and costly options for acquiring savings in the residential sector. If programs are to contribute fully to meeting resource and policy objectives, administrators need to understand what program elements are key to acquiring residential savings as cost effectively as possible. To that end, the U.S. Department of Energy (DOE) sponsored a comprehensive review and analysis of home energy upgrade programs with proven track records, focusing on those with robustly verified savings and constituting good examples for replication. The study team reviewed evaluations for the period 2010 to 2014 for 134 programs that are funded by customers of investor-owned utilities. All are programs that promote multi-measure retrofits or major system upgrades. We paid particular attention to useful design and implementation features, costs, and savings for nearly 30 programs with rigorous evaluations of performance. This meta-analysis describes program models and implementation strategies for (1) direct install retrofits; (2) heating, ventilating and air-conditioning (HVAC) replacement and early retirement; and (3) comprehensive, whole-home retrofits. We analyze costs and impacts of these program models, in terms of both energy savings and emissions avoided. These program models can be useful guides as states consider expanding their strategies for acquiring energy savings as a resource and for emissions reductions. We also discuss the challenges of using evaluations to create program models that can be confidently applied in multiple jurisdictions.
Investing in Florida's Economy. Florida's School-to-Work Continuum. Second Edition.
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee. Div. of Vocational, Adult, and Community Education.
This booklet describes programs designed to offer a comprehensive system to improve Florida's work force. Through these programs, students and workers in Florida are prepared to enter the labor force, attend technical training programs, enroll in other postsecondary programs, or upgrade their skills on the job. The following are discussed: the…
Wang, Xiuqin; Congdon, Nathan; Ma, Yue; Hu, Min; Zhou, Yuan; Liao, Weiqi; Jin, Ling; Xiao, Baixiang; Wu, Xiaoyi; Ni, Ming; Yi, Hongmei; Huang, Yiwen; Varga, Beatrice; Zhang, Hong; Cun, Yongkang; Li, Xianshun; Yang, Luhua; Liang, Chaoguang; Huang, Wan; Rozelle, Scott; Ma, Xiaochen
2017-01-01
Offering free glasses can be important to increase children's spectacle wear. We sought to assess whether "Upgrade glasses" could avoid reduced glasses sales when offering free glasses to children in China. In this cluster-randomized, controlled trial, children with uncorrected visual acuity (VA) ≤ 6/12 in either eye correctable to >6/12 in both eyes at 138 randomly-selected primary schools in 9 counties in Guangdong and Yunnan provinces, China, were randomized by school to one of four groups: glasses prescription only (Control); Free Glasses; Free Glasses + offer of $15 Upgrade Glasses; Free Glasses + offer of $30 Upgrade Glasses. Spectacle purchase (main outcome) was assessed 6 months after randomization. Among 10,234 children screened, 882 (8.62%, mean age 10.6 years, 45.5% boys) were eligible and randomized: 257 (29.1%) at 37 schools to Control; 253 (28.7%) at 32 schools to Free Glasses; 187 (21.2%) at 31 schools to Free Glasses + $15 Upgrade; and 185 (21.0%) at 27 schools to Free Glasses + $30 Upgrade. Baseline ownership among these children needing glasses was 11.8% (104/882), and 867 (98.3%) children completed follow-up. Glasses purchase was significantly less likely when free glasses were given: Control: 59/250 = 23.6%; Free glasses: 32/252 = 12.7%, P = 0.010. Offering Upgrade Glasses eliminated this difference: Free + $15 Upgrade: 39/183 = 21.3%, multiple regression relative risk (RR) 0.90 (0.56-1.43), P = 0.65; Free + $30 Upgrade: 38/182 = 20.9%, RR 0.91 (0.59, 1.42), P = 0.69. Upgrade glasses can prevent reductions in glasses purchase when free spectacles are provided, providing important program income. ClinicalTrials.gov Identifier: NCT02231606. Registered on 31 August 2014.
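The unadjusted purchase proportions quoted above can be reproduced directly from the counts; note that the published relative risks come from a multiple regression, so the crude ratio below is only indicative.

    # Recompute the unadjusted purchase rates and a crude risk ratio from the
    # counts quoted in the abstract (illustrative arithmetic only).
    def risk(purchased, total):
        return purchased / total

    control   = risk(59, 250)    # ~23.6%
    free_only = risk(32, 252)    # ~12.7%
    print("control purchase rate :", round(control, 3))
    print("free-glasses rate     :", round(free_only, 3))
    print("crude risk ratio      :", round(free_only / control, 2))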
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meinking, Rick; Adamson, Joy M
2013-12-20
Energy efficiency is vitally important in Maine. Nearly 70% of Maine households rely on fuel oil as their primary energy source for home heating, a higher share than in any other state. Coupled with the state's long, cold winters, Maine's dependence on oil renders homeowners particularly vulnerable to fluctuating fuel costs. With $4.5 million in seed funding from the Energy Department's Better Buildings Neighborhood Program, the Governor's Energy Office (GEO), through Efficiency Maine Trust (the Trust), is spurring Maine landlords to lower their monthly energy bills and improve comfort for their tenants during the state's cold winter months and increasingly warmer summers. Maine's aging multifamily housing stock can be expensive to heat and costly to maintain. It is not unusual to find buildings with little or no insulation, drafty windows, and significant air leaks, making them ideal candidates for energy efficiency upgrades. Maine modeled its Multifamily Efficiency Program (MEP) after the state's highly successful Home Energy Savings Program (HESP) for single-family homes. HESP provided cash incentives and financing opportunities to owners of one- to four-unit structures, which resulted in thousands of energy assessments and whole-house energy upgrades in 225 communities. Maine's new MEP multifamily energy efficiency upgrade and weatherization initiative focuses on small to medium-sized (i.e., five to 20 units) apartment buildings. The program's energy efficiency upgrades will provide at least 20% energy savings for each upgraded multifamily unit. The Trust's MEP relies on a network of approved program partners who help move projects through the pipeline from assessment to upgrade. MEP has two components: benchmarking and development of an Energy Reduction Plan (ERP). Using the ENERGY STAR® Portfolio Manager benchmarking tool, MEP provides an assessment of current energy usage in the building, establishes a baseline for future energy efficiency improvements, and enables tracking and monitoring of future energy usage at the building, all at no cost to the building owner. The ERP is developed by a program partner using either the Trust's approved modeling or prescriptive tools; it provides detailed information about the current energy-related conditions in the building and recommends energy efficiency, health, and safety improvements. The Trust's delivery contractor provides quality assurance and controls throughout the process. Through this effort, MEP's goal is to establish a self-sustaining, market-driven program, demonstrating the value of energy efficiency to other building owners. The increasing value of properties across the state will help incentivize these owners to continue upgrades after the grant period has ended. Targeting urban areas in Maine with dense clusters of multifamily units, such as Portland, Lewiston-Auburn, Bangor, and Augusta, MEP engaged a variety of stakeholder groups early on to design its multifamily program. Through direct emails and its website, program officials invited lending institutions, building professionals, engineering firms, equipment distributors, and local property owners associations to attend open meetings around the state to learn about the goals of the multifamily program and to help define its parameters.
These meetings helped program administrators understand the diversity of the customer base: some owners are individuals with a single building, while other owners are groups of people or management companies with an entire portfolio of multifamily buildings. The diversity of the customer base notwithstanding, owners see MEP as an opportunity to make gains in their respective properties. Consistently high turnouts at stakeholder meetings fueled greater customer interest as awareness of the program spread through word of mouth. The program also gained traction by utilizing the program partner networks and building on the legacy of the Trust's successful HESP for single-family residences. MEP offers significant incentives for building owners to participate in the upgrade program. Whole-building benchmarking services are available to most multifamily housing buildings free of charge. The service provides the building owner with an assessment of the building's current energy efficiency as compared to other multifamily buildings on a national scale, establishes a baseline to measure future improvements, and enables owners to track monthly energy consumption using the ENERGY STAR Portfolio Manager. Once the benchmarking process is complete, the program links building owners with approved program partners (e.g., energy professionals, home performance contractors) to identify and implement specific energy-saving opportunities in the building. Program partners can also provide project quotes with estimated financing incentives and payback period calculations that enable building owners to make informed decisions. What's more, the Trust provides two financial incentives for successful completion of program milestones. The first is a per-unit incentive for completion of an approved ERP (i.e., $100 per unit if a prescriptive path is followed, and $200 per unit for a modeled ERP). Upon final inspection of the installed project scope of work, an incentive of $1,400 per unit or 50% of installed cost, whichever is less, is paid. The Trust originally established a $1 million loan-loss reserve fund (LLRF) to further enhance financing opportunities for qualified multifamily building owners. This funding mechanism was designed to connect building owners with lenders that retain the mortgages for their properties and encourages the lenders to offer financing for energy efficiency improvements. However, there has been no interest in the LLRF and therefore the LLRF has been reduced. Ultimately, MEP plans to build an online tool for building owners to assess opportunities to make upgrades in their multifamily units. The tool will include a performance rating system to provide a way for building owners to more easily understand energy use in their building, and how it could be improved with energy efficiency upgrades. Prospective tenants will also be able to use the rating system to make informed decisions about where to rent. Furthermore, the rating can be incorporated into real estate listings as a way for prospective home buyers and the real estate financial community to evaluate a home's operating costs. The Trust's MEP has identified the state's most experienced energy professionals, vendors, suppliers, and contractors that install energy efficiency equipment in the multifamily sector to be qualified program partners.
To be eligible for partnership, energy assessment professionals and contractors are required to have demonstrated experience in the multifamily sector and hold associated professional certifications, such as Building Operator Certification (BOC), Certified Energy Manager (CEM), Professional Engineer (PE), or Building Performance Institute (BPI) Multifamily Building Analyst. Widespread program interest has enabled the Trust to redirect funds that might otherwise be needed for program promotion to building capacity through contractor training. In addition to boosting professional training and certification opportunities, MEP teaches its partners how to market the multifamily program to prospective multifamily homeowners.
Programs to Upgrade Culturally Deprived Youth in New York City.
ERIC Educational Resources Information Center
Karpas, Melvin R.
The two programs reviewed are the Demonstration Guidance Project and the Higher Horizons Program. Characteristics of the Demonstration Guidance Project included extra teachers, special tutoring, group and individual guidance and counseling, clinical services, intensive courses in English, and cultural and artistic events. The project started with junior…
Union-Sponsored Retraining Programs.
ERIC Educational Resources Information Center
Hoos, Ida R.
Union-sponsored training programs were provided in the San Francisco Bay Area to upgrade the skills of marine cooks and stewards, ships' radio operators, journeyman plumbers and gasfitters, and members of the International Brotherhood of Electrical Workers (IBEW). These programs were the only cohesive union-sponsored curricula in that area. Major…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zimring, Mark; Fuller, Merrian
2011-01-24
The New York legislature passed the Green Jobs-Green New York (GJGNY) Act in 2009. Administered by the New York State Energy Research and Development Authority (NYSERDA), GJGNY programs provide New Yorkers with access to free or low-cost energy assessments,1 energy upgrade services,2 low-cost financing, and training for various 'green-collar' careers. Launched in November 2010, GJGNY's residential initiative is notable for its use of novel underwriting criteria to expand access to energy efficiency financing for households seeking to participate in New York's Home Performance with Energy Star (HPwES) program.3 The GJGNY financing program is a valuable test of whether alternatives to credit scores can be used to responsibly expand credit opportunities for households that do not qualify for traditional lending products and, in doing so, enable more households to make energy efficiency upgrades.
Quench simulations for superconducting elements in the LHC accelerator
NASA Astrophysics Data System (ADS)
Sonnemann, F.; Schmidt, R.
2000-08-01
The design of the protection system for the superconducting elements in an accelerator such as the Large Hadron Collider (LHC), now under construction at CERN, requires a detailed understanding of the thermo-hydraulic and electrodynamic processes during a quench. A numerical program (SPQR - Simulation Program for Quench Research) has been developed to evaluate temperature and voltage distributions during a quench as a function of space and time. The quench process is simulated by approximating the heat balance equation with the finite difference method in the presence of variable cooling and powering conditions. The simulation predicts quench propagation along a superconducting cable, forced quenching with heaters, the impact of eddy currents induced by a magnetic field change, and heat transfer through an insulation layer into helium, an adjacent conductor or other material. The simulation studies allowed a better understanding of experimental quench data and were used for determining the adequate dimensioning and protection of the highly stabilised superconducting cables for connecting magnets (busbars), optimising the quench heater strip layout for the main magnets, and studying quench-back by induced eddy currents in the superconductor. After the introduction of the theoretical approach, some applications of the simulation model for the LHC dipole and corrector magnets are presented and the outcome of the studies is compared with experimental data.
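The heat-balance approach described above can be illustrated with a deliberately simplified explicit finite-difference update on a 1-D cable: heat diffuses along the conductor and Joule heating is switched on wherever the local temperature exceeds an assumed current-sharing temperature. All material properties here are constants chosen for illustration; SPQR itself uses temperature-dependent properties, cooling terms and heater/eddy-current sources.

    # Simplified sketch of an explicit finite-difference quench-propagation step;
    # not the SPQR code, and all parameter values below are assumptions.
    import numpy as np

    nx, dx, dt, steps = 200, 0.01, 1e-4, 2000      # 2 m of cable, explicit stepping
    k, c_vol = 300.0, 2.0e3                        # W/m/K and J/m^3/K (assumed)
    rho_j2 = 5.0e6                                 # Joule heating density, W/m^3
    t_cs = 9.0                                     # assumed current-sharing temperature, K

    T = np.full(nx, 1.9)                           # start at an assumed 1.9 K bath
    T[nx // 2 - 2 : nx // 2 + 2] = 12.0            # small initial normal zone

    for _ in range(steps):
        lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx**2
        heating = np.where(T > t_cs, rho_j2, 0.0)  # Joule heating only where normal
        T = T + dt * (k * lap + heating) / c_vol
        T[0] = T[-1] = 1.9                         # fixed-temperature cable ends

    print("normal-zone length ~", np.sum(T > t_cs) * dx, "m")

The explicit time step is chosen below the diffusion stability limit dt < dx²·c/(2k) for these constants; a production code would also track voltages and helium cooling.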
Compendium of Instrumentation Whitepapers on Frontier Physics Needs for Snowmass 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lipton, R.
2013-01-01
Contents of collection of whitepapers include: Operation of Collider Experiments at High Luminosity; Level 1 Track Triggers at HL-LHC; Tracking and Vertex Detectors for a Muon Collider; Triggers for hadron colliders at the energy frontier; ATLAS Upgrade Instrumentation; Instrumentation for the Energy Frontier; Particle Flow Calorimetry for CMS; Noble Liquid Calorimeters; Hadronic dual-readout calorimetry for high energy colliders; Another Detector for the International Linear Collider; e+e- Linear Colliders Detector Requirements and Limitations; Electromagnetic Calorimetry in Project X Experiments; The Project X Physics Study; Intensity Frontier Instrumentation; Project X Physics Study Calorimetry Report; Project X Physics Study Tracking Report; The LHCb Upgrade; Neutrino Detectors Working Group Summary; Advanced Water Cherenkov R&D for WATCHMAN; Liquid Argon Time Projection Chamber (LArTPC); Liquid Scintillator Instrumentation for Physics Frontiers; A readout architecture for 100,000 pixel Microwave Kinetic Inductance Detector array; Instrumentation for New Measurements of the Cosmic Microwave Background polarization; Future Atmospheric and Water Cherenkov γ-ray Detectors; Dark Energy; Can Columnar Recombination Provide Directional Sensitivity in WIMP Search?; Instrumentation Needs for Detection of Ultra-high Energy Neutrinos; Low Background Materials for Direct Detection of Dark Matter; Physics Motivation for WIMP Dark Matter Directional Detection; Solid Xenon R&D at Fermilab; Ultra High Energy Neutrinos; Instrumentation Frontier: Direct Detection of WIMPs; nEXO detector R&D; Large Arrays of Air Cherenkov Detectors; and Applications of Laser Interferometry in Fundamental Physics Experiments.
Study of the VMM1 read-out chip in a neutron irradiation environment
NASA Astrophysics Data System (ADS)
Alexopoulos, T.; Fanourakis, G.; Geralis, T.; Kokkoris, M.; Kourkoumeli-Charalampidi, A.; Papageorgiou, K.; Tsipolitis, G.
2016-05-01
During 2015, the LHC operated close to the design energy of √s = 13-14 TeV, delivering instantaneous luminosities up to ℒ = 5 × 10³³ cm⁻²s⁻¹. The ATLAS Phase-I upgrade in 2018/19 will introduce the MicroMEGAS detectors in the area of the small wheel at the end caps. Accompanying new electronics are designed and built, such as the VMM front-end ASIC, which provides energy, timing and triggering information and allows fast data read-out. The first VMM version (VMM1) has been widely produced and tested in various test beams, whilst the second version (VMM2) is currently being tested. This paper focuses on the VMM1 single event upset (SEU) studies and more specifically on the response of the configuration registers under harsh radiation environments. Similar conditions are expected at Run III with ℒ = 2 × 10³⁴ cm⁻²s⁻¹ and a mean of 55 interactions per bunch crossing. Two VMM1s were exposed in a neutron irradiation environment using the TANDEM Van de Graaff accelerator at NCSR Demokritos, Athens, Greece. The results showed a rate of SEU occurrences at a measured cross section of (4.1 ± 0.8) × 10⁻¹⁴ cm²/bit for each VMM. Consequently, when extrapolating this value to the luminosity expected in Run III, the occurrence is roughly 6 SEUs/min in the full read-out system comprising the 40,000 VMMs installed during the Phase-I upgrade.
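The extrapolation from a per-bit cross section to a system-wide SEU rate is a simple product of cross section, hadron flux, configuration bits per chip and number of chips. The flux and register size below are placeholder assumptions (the paper's own inputs are not given in the abstract), so the result is an order-of-magnitude check only.

    # Order-of-magnitude SEU-rate sketch; flux and bits-per-chip are assumed values.
    sigma_cm2_per_bit = 4.1e-14      # measured cross section quoted above
    flux_hz_per_cm2 = 1.0e4          # assumed high-energy hadron flux at the chip
    bits_per_vmm = 3000              # assumed number of configuration register bits
    n_vmm = 40000                    # chips in the full read-out system (from text)

    seu_per_s = sigma_cm2_per_bit * flux_hz_per_cm2 * bits_per_vmm * n_vmm
    print(seu_per_s * 60, "SEUs per minute (order-of-magnitude only)")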
MWPC prototyping and performance test for the STAR inner TPC upgrade
NASA Astrophysics Data System (ADS)
Shen, Fuwang; Wang, Shuai; Kong, Fangang; Bai, Shiwei; Li, Changyu; Videbæk, Flemming; Xu, Zhangbu; Zhu, Chengguang; Xu, Qinghua; Yang, Chi
2018-07-01
A new prototype of the STAR inner Time Projection Chamber (iTPC) MWPC sector has been fabricated and tested in an X-ray test system. The wire chamber built at Shandong University has a wire tension precision better than 6% and a wire pitch precision better than 10 μm. The gas gain uniformity and energy resolution are measured to be better than 1% (RMS) and 20% (FWHM), respectively, using an ⁵⁵Fe X-ray source. The iTPC upgrade project will replace all 24 STAR TPC inner sectors, a crucial detector upgrade for the RHIC Beam Energy Scan Phase II program. The test results show that the constructed iTPC prototype meets all project requirements.
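For reference, an FWHM energy resolution of the kind quoted above is commonly extracted from a Gaussian fit to the ⁵⁵Fe photopeak as 2.355·σ/mean; the fit values in the snippet are hypothetical, not the measured STAR iTPC numbers.

    # Illustrative helper: fractional FWHM resolution from a Gaussian peak fit.
    def fwhm_resolution(peak_mean, peak_sigma):
        return 2.3548 * peak_sigma / peak_mean

    # Hypothetical fit result in ADC counts for the 5.9 keV 55Fe peak.
    print(fwhm_resolution(peak_mean=1000.0, peak_sigma=80.0))   # ~0.19, i.e. ~19% FWHM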
Crawler Transporter 2 (CT-2) Trek from Pad 39B to VAB
2017-03-21
Crawler-transporter 2 (CT-2) moves slowly along the crawlerway on its way back to the Vehicle Assembly Building at NASA's Kennedy Space Center in Florida. The crawler took a trip to the pad A/B split to test upgrades recently completed that will allow the giant vehicle to handle the load of the agency's Space Launch System rocket and Orion spacecraft atop the mobile launcher. The Ground Systems Development and Operations Program oversaw upgrades to the 50-year-old CT-2. New generators, gear assemblies, jacking, equalizing and leveling (JEL) hydraulic cylinders, roller bearings and brakes were installed, and other components were upgraded to prepare for Exploration Mission 1.
Crawler Transporter 2 (CT-2) Trek from Pad 39B to VAB
2017-03-21
Crawler-transporter 2 (CT-2) moves slowly along the crawlerway on its way back to the Vehicle Assembly Building at NASA's Kennedy Space Center in Florida. The crawler took a trip to the Pad A/B split to test upgrades recently completed that will allow the giant vehicle to handle the load of the agency's Space Launch System rocket and Orion spacecraft atop the mobile launcher. The Ground Systems Development and Operations Program oversaw upgrades to the 50-year-old CT-2. New generators, gear assemblies, jacking, equalizing and leveling (JEL) hydraulic cylinders, roller bearings and brakes were installed, and other components were upgraded to prepare for Exploration Mission 1.
Crawler Transporter 2 (CT-2) Trek from Pad 39B to VAB
2017-03-21
Crawler-transporter 2 (CT-2) moves slowly along the crawlerway toward the Vehicle Assembly Building (in the background) at NASA's Kennedy Space Center in Florida. The crawler took a trip to the Pad A/B split to test upgrades recently completed that will allow the giant vehicle to handle the load of the agency's Space Launch System rocket and Orion spacecraft atop the mobile launcher. The Ground Systems Development and Operations Program oversaw upgrades to the 50-year-old CT-2. New generators, gear assemblies, jacking, equalizing and leveling (JEL) hydraulic cylinders, roller bearings and brakes were installed, and other components were upgraded to prepare for Exploration Mission 1.
Assessment of DoD Job Skill Enhancement Programs.
ERIC Educational Resources Information Center
Fletcher, J. D.; And Others
In response to Congressional direction, an assessment was undertaken of programs developed by the Department of Defense (DoD) that can be made available to civilian organizations to provide immediate support and assistance to upgrade skills for better civilian employment opportunities. The assessment focuses on interactive courseware programs and…