Sample records for us-lhc accelerator research

  1. Future HEP Accelerators: The US Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhat, Pushpalatha; Shiltsev, Vladimir

    2015-11-02

    Accelerator technology has advanced tremendously since the introduction of accelerators in the 1930s, and particle accelerators have become indispensable instruments in high energy physics (HEP) research to probe Nature at smaller and smaller distances. At present, accelerator facilities can be classified into Energy Frontier colliders that enable direct discoveries and studies of high mass scale particles and Intensity Frontier accelerators for exploration of extremely rare processes, usually at relatively low energies. The near-term strategies of the global energy frontier particle physics community are centered on fully exploiting the physics potential of the Large Hadron Collider (LHC) at CERN through its high-luminosity upgrade (HL-LHC), while the intensity frontier HEP research is focused on studies of neutrinos at MW-scale beam power accelerator facilities, such as the Fermilab Main Injector with the planned PIP-II SRF linac project. A number of next-generation accelerator facilities have been proposed and are currently under consideration for the medium- and long-term future programs of accelerator-based HEP research. In this paper, we briefly review the post-LHC energy frontier options, both for lepton and hadron colliders in various regions of the world, as well as possible future intensity frontier accelerator facilities.
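    The luminosity of the collider facilities discussed in records like this one can be estimated from a handful of beam parameters. As a quick illustration, a minimal sketch of the round-beam Gaussian luminosity formula, evaluated with nominal LHC design values (the parameter values are standard published figures assumed here for illustration, not taken from this record; the crossing-angle reduction factor is neglected):

    ```python
    import math

    def luminosity(n_per_bunch, n_bunches, f_rev, sigma_x, sigma_y):
        """L = f_rev * n_b * N^2 / (4 pi sigma_x sigma_y), in m^-2 s^-1."""
        return f_rev * n_bunches * n_per_bunch**2 / (4 * math.pi * sigma_x * sigma_y)

    # Nominal LHC design values: 1.15e11 protons/bunch, 2808 bunches,
    # 11245 Hz revolution frequency, 16.7 um transverse beam size at the IP.
    L = luminosity(1.15e11, 2808, 11245.0, 16.7e-6, 16.7e-6)
    print(f"L = {L * 1e-4:.2e} cm^-2 s^-1")  # close to the 1e34 design value
    ```

    This reproduces the LHC design luminosity of about 10^34 cm^-2 s^-1, and makes clear why the HL-LHC upgrade pursues smaller beam sizes at the collision points.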

  2. Magnetic Measurements of the First Nb3Sn Model Quadrupole (MQXFS) for the High-Luminosity LHC

    DOE PAGES

    DiMarco, J.; Ambrosio, G.; Chlachidze, G.; ...

    2016-12-12

    The US LHC Accelerator Research Program (LARP) and CERN are developing high-gradient Nb3Sn magnets for the High Luminosity LHC interaction regions. Magnetic measurements of the first 1.5 m long, 150 mm aperture model quadrupole, MQXFS1, were performed during magnet assembly at LBNL, as well as during cryogenic testing at Fermilab’s Vertical Magnet Test Facility. This paper reports on the results of these magnetic characterization measurements, as well as on the performance of new probes developed for the tests.

  3. Will there be energy frontier colliders after LHC?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiltsev, Vladimir

    2016-09-15

    High energy particle colliders have been at the forefront of particle physics for more than three decades. At present, the near-term US, European and international strategies of the particle physics community are centered on full exploitation of the physics potential of the Large Hadron Collider (LHC) through its high-luminosity upgrade (HL-LHC). The future of the world-wide HEP community critically depends on the feasibility of possible post-LHC colliders. The concept of feasibility is complex and includes at least three factors: feasibility of energy, feasibility of luminosity and feasibility of cost. Here we overview all current options for post-LHC colliders from this perspective (ILC, CLIC, Muon Collider, plasma colliders, CEPC, FCC, HE-LHC) and discuss the major challenges and accelerator R&D required to demonstrate the feasibility of an energy frontier accelerator facility following the LHC. We conclude with a look at ultimate-energy-reach accelerators based on plasmas and crystals, and a discussion of the perspectives for the far future of accelerator-based particle physics.

  4. The LHC Experiments

    ScienceCinema

    Lincoln, Don

    2018-01-16

    The Large Hadron Collider or LHC is the world’s biggest particle accelerator, but it can only get particles moving very quickly. To make measurements, scientists must employ particle detectors. There are four big detectors at the LHC: ALICE, ATLAS, CMS, and LHCb. In this video, Fermilab’s Dr. Don Lincoln introduces us to these detectors and gives us an idea of each one’s capabilities.

  5. Considerations on Energy Frontier Colliders after LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiltsev, Vladimir

    2016-11-15

    Since the 1960s, particle colliders have been at the forefront of particle physics; 29 in total have been built and operated, and 7 are in operation now. At present, the near-term US, European and international strategies of the particle physics community are centered on full exploitation of the physics potential of the Large Hadron Collider (LHC) through its high-luminosity upgrade (HL-LHC). The future of the world-wide HEP community critically depends on the feasibility of possible post-LHC colliders. The concept of feasibility is complex and includes at least three factors: feasibility of energy, feasibility of luminosity and feasibility of cost. Here we overview all current options for post-LHC colliders from this perspective (ILC, CLIC, Muon Collider, plasma colliders, CEPC, FCC, HE-LHC) and discuss the major challenges and accelerator R&D required to demonstrate the feasibility of an energy frontier accelerator facility following the LHC. We conclude with a look at ultimate-energy-reach accelerators based on plasmas and crystals, and a discussion of the perspectives for the far future of accelerator-based particle physics. This paper largely follows a previous study [1] and the presentation given at the ICHEP’2016 conference in Chicago [2].

  6. Fermilab Heroes of the LHC: Joel Butler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, Joel

    2017-08-23

    Particle physics research is both international and collaborative, with large national laboratories working together to most efficiently advance science. Joel Butler, Distinguished Scientist at Fermi National Accelerator Laboratory, is the leader of the Compact Muon Solenoid experiment at the CERN laboratory in Europe. In this video, Joel tells us a bit about what it’s like.

  7. Performance of the first short model 150 mm aperture Nb3Sn Quadrupole MQXFS for the High-Luminosity LHC upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chlachidze, G.; et al.

    2016-08-30

    The US LHC Accelerator Research Program (LARP) and CERN combined their efforts in developing Nb3Sn magnets for the High-Luminosity LHC upgrade. The ultimate goal of this collaboration is to fabricate large aperture Nb3Sn quadrupoles for the LHC interaction regions (IR). These magnets will replace the present 70 mm aperture NbTi quadrupole triplets, for an expected increase of the LHC peak luminosity by a factor of 5. Over the past decade LARP successfully fabricated and tested short and long models of 90 mm and 120 mm aperture Nb3Sn quadrupoles. Recently the first short model of a 150 mm diameter quadrupole, MQXFS, was built with coils fabricated by both LARP and CERN. The magnet performance was tested at Fermilab’s vertical magnet test facility. This paper reports the test results, including the quench training at 1.9 K and ramp rate and temperature dependence studies.

  8. LARP Nb3Sn Quadrupole Magnets for the LHC Luminosity Upgrade

    NASA Astrophysics Data System (ADS)

    Ferracin, P.

    2010-04-01

    The US LHC Accelerator Research Program (LARP) is a collaboration between four US laboratories (BNL, FNAL, LBNL, and SLAC) aimed at contributing to the commissioning and operation of the LHC and conducting R&D on its luminosity upgrade. Within LARP, the Magnet Program's main goal is to demonstrate that Nb3Sn superconducting magnets are a viable option for a future upgrade of the LHC Interaction Regions. Over the past four years, LARP has successfully fabricated and tested several R&D magnets: 1) the subscale quadrupole magnet SQ, to perform technology studies with 300 mm long racetrack coils, 2) the technology quadrupole TQ, to investigate support structure behavior with 1 m long cos 2θ coils, and 3) the long racetrack magnet LR, to test 3.6 m long racetrack coils. The next milestone consists of the fabrication and test of the 3.7 m long quadrupole magnet LQ, with the goal of demonstrating that Nb3Sn technology is mature for use in high energy accelerators. After an overview of the design features and test results of the LARP magnets fabricated so far, this paper focuses on the status of the fabrication of LQ: we describe the production of the 3.4 m long cos 2θ coils and the qualification of the support structure. Finally, the status of the development of the next 1 m long model HQ, conceived to explore the stress and field limits of Nb3Sn superconducting magnets, is presented.

  9. Catching Collisions in the LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fruguiele, Claudia; Hirschauer, Jim

    Now that the Large Hadron Collider has officially turned back on for its second run, within every proton collision could emerge the next new discovery in particle physics. Learn how the detectors on the Compact Muon Solenoid, or CMS, experiment capture and track particles as they are expelled from a collision. Talking us through these collisions are Claudia Fruguiele and Jim Hirschauer of Fermi National Accelerator Laboratory, the largest U.S. institution collaborating on the LHC.

  10. Catching Collisions in the LHC

    ScienceCinema

    Fruguiele, Claudia; Hirschauer, Jim

    2018-01-16

    Now that the Large Hadron Collider has officially turned back on for its second run, within every proton collision could emerge the next new discovery in particle physics. Learn how the detectors on the Compact Muon Solenoid, or CMS, experiment capture and track particles as they are expelled from a collision. Talking us through these collisions are Claudia Fruguiele and Jim Hirschauer of Fermi National Accelerator Laboratory, the largest U.S. institution collaborating on the LHC.

  11. Revised LHC deal quiets Congress

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawler, A.

    The roughest part of the ride may be over for U.S. physicists who want to participate in the Large Hadron Collider (LHC), the $5 billion accelerator planned for CERN in Geneva. They have found themselves on a political roller coaster for the past few months. This week, U.S. and European negotiators were putting the final touches on a revamped agreement that should pave the way for the United States to help pay for construction of the accelerator and its two main detectors, and guarantee U.S. scientists a role in research on the machine. The trouble began in March, when Representative Joe Barton (R-TX) declared war on a proposed $530 million U.S. contribution to the new facility, slated for completion in 2005. Barton and many other members of Congress were still smarting from what they said was a lack of European support for the canceled Superconducting Super Collider that was being built in Barton's backyard. Representative James Sensenbrenner (R-WI), who chairs the House Science Committee, led the charge to alter a draft agreement initialed this winter by Department of Energy (DOE) and CERN officials that spelled out the details of U.S. participation. After hurried negotiations, both sides have sharpened the agreement to address the lawmakers' concerns. The new deal, says Energy Secretary Federico Pena, "has made that project even better."

  12. Summary of Test Results of MQXFS1 - The First Short Model 150 mm Aperture Nb3Sn Quadrupole for the High-Luminosity LHC Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoynev, S.; et al.

    The development of Nb3Sn quadrupole magnets for the High-Luminosity LHC upgrade is a joint venture between the US LHC Accelerator Research Program (LARP) and CERN, with the goal of fabricating large aperture quadrupoles for the LHC interaction regions (IR). The inner triplet (low-β) NbTi quadrupoles in the IR will be replaced by the stronger Nb3Sn magnets, supporting the LHC program goal of a 10-fold increase in integrated luminosity after the foreseen upgrades. Previously LARP conducted successful tests of short and long models with up to 120 mm aperture. The first short 150 mm aperture quadrupole model, MQXFS1, was assembled with coils fabricated by both CERN and LARP. The magnet demonstrated strong performance at Fermilab's vertical magnet test facility, reaching the LHC operating limits. This paper reports the latest results from MQXFS1 tests with changed pre-stress levels. The overall magnet performance, including quench training and memory, ramp rate and temperature dependence, is also summarized.

  13. LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN

    NASA Astrophysics Data System (ADS)

    Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor

    2017-12-01

    The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and since 2011 has been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.
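    The beam-dynamics workload distributed to volunteers is embarrassingly parallel: each work unit tracks particles through many turns of the machine and reports whether the motion stays bounded. The toy sketch below conveys the flavor of such a task with a purely linear one-turn map; it is a stand-in for illustration only, not SixTrack's actual physics model, and the tune value is an arbitrary assumption:

    ```python
    import math

    def one_turn_map(tune):
        """2x2 rotation matrix representing a linear machine with the given betatron tune."""
        mu = 2 * math.pi * tune
        return ((math.cos(mu), math.sin(mu)), (-math.sin(mu), math.cos(mu)))

    def track(x, xp, tune, n_turns):
        """Apply the one-turn map n_turns times to phase-space coordinates (x, xp)."""
        (m11, m12), (m21, m22) = one_turn_map(tune)
        for _ in range(n_turns):
            x, xp = m11 * x + m12 * xp, m21 * x + m22 * xp
        return x, xp

    x0, xp0 = 1e-3, 0.0
    x, xp = track(x0, xp0, tune=0.31, n_turns=100_000)
    # For a purely linear map the invariant x^2 + xp^2 (in normalized
    # coordinates) is conserved to rounding error over all turns.
    print(x**2 + xp**2, x0**2 + xp0**2)
    ```

    Real studies add nonlinear elements, which is why long-term stability must be established numerically, turn by turn, across many initial conditions, exactly the kind of job that volunteer computing absorbs well.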

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lackner, Friedrich; Ferracin, Paolo; Todesco, Ezio

    The High-Luminosity LHC upgrade target is to increase the integrated luminosity by a factor of 10, resulting in an integrated luminosity of 3000 fb^-1. One major improvement foreseen is the reduction of the beam size at the collision points. This requires the development of 150 mm single aperture quadrupoles for the interaction regions. These quadrupoles are under development in a joint collaboration between CERN and the US-LHC Accelerator Research Program (LARP). The chosen approach for achieving a nominal quadrupole field gradient of 132.6 T/m is based on Nb3Sn technology. The coils, with a length of 7281 mm, will be the longest Nb3Sn coils fabricated so far for accelerator magnets. The production of the long coils was launched in 2016, based on practice coils made from copper. This paper provides a status of the production of the first low-grade and full-performance coils and describes the production process and the applied quality control. Furthermore, an outlook for the prototype assembly is provided.
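    The numbers in the record above already explain why Nb3Sn is required rather than NbTi. In a quadrupole the field magnitude grows linearly with radius, B(r) = G·r, so a back-of-envelope sketch (using only the gradient and aperture quoted in the record) gives the field seen by the conductor at the aperture edge:

    ```python
    # Field at the aperture radius of a quadrupole: B(r) = G * r.
    # The true peak field in the coil is somewhat higher than this estimate,
    # which is offered only as an order-of-magnitude check.
    gradient = 132.6   # T/m, nominal MQXF gradient from the record
    aperture = 0.150   # m, coil aperture diameter from the record
    b_at_aperture = gradient * aperture / 2
    print(f"B at aperture radius: {b_at_aperture:.2f} T")  # ~9.9 T
    ```

    Roughly 10 T at the conductor is beyond the comfortable operating range of NbTi even at 1.9 K, which is what drives the choice of Nb3Sn for these quadrupoles.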

  15. Electron Lenses for the Large Hadron Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stancari, Giulio; Valishev, Alexander; Bruce, Roderik

    Electron lenses are pulsed, magnetically confined electron beams whose current-density profile is shaped to obtain the desired effect on the circulating beam. Electron lenses were used in the Fermilab Tevatron collider for bunch-by-bunch compensation of long-range beam-beam tune shifts, for removal of uncaptured particles in the abort gap, for preliminary experiments on head-on beam-beam compensation, and for the demonstration of halo scraping with hollow electron beams. Electron lenses for beam-beam compensation are being commissioned in RHIC at BNL. Within the US LHC Accelerator Research Program and the European HiLumi LHC Design Study, hollow electron beam collimation was studied as an option to complement the collimation system for the LHC upgrades. This project is moving towards a technical design in 2014, with the goal to build the devices in 2015-2017, after resuming LHC operations and re-assessing needs and requirements at 6.5 TeV. Because of their electric charge and the absence of materials close to the proton beam, electron lenses may also provide an alternative to wires for long-range beam-beam compensation in LHC luminosity upgrade scenarios with small crossing angles.

  16. Race for the Higgs hots up as Tevatron seeks extension

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2009-12-01

    With researchers at CERN's Large Hadron Collider (LHC) having circulated protons for the first time since last year's accident, the US Department of Energy (DOE) is requesting $25m so that the Tevatron collider at the Fermi National Accelerator Laboratory in Illinois can run for an extra year until 2011. If the additional funding is granted, it would give physicists in the US an extra 12 months to close in on discovering the elusive Higgs boson. The DOE's request will now be reviewed before being included in President Barack Obama's 2011 budget request, which will be sent to Congress in February.

  17. Run II of the LHC: The Accelerator Science

    NASA Astrophysics Data System (ADS)

    Redaelli, Stefano

    2015-04-01

    In 2015 the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) starts its Run II operation. After the successful Run I at 3.5 TeV and 4 TeV in the 2010-2013 period, a first long shutdown (LS1) was mainly dedicated to the consolidation of the LHC magnet interconnections, to allow the LHC to operate at its design beam energy of 7 TeV. Other key accelerator systems have also been improved to optimize the performance reach at higher beam energies. After a review of the LS1 activities, the status of the LHC start-up is reported, addressing in particular the LHC hardware commissioning and the training campaign of the superconducting magnets that will determine the operating beam energy in 2015. The plans for Run II operation are then reviewed in detail, covering the choice of initial machine parameters and the strategy to improve Run II performance. Future prospects of the LHC and its upgrade plans are also presented.

  18. Using Tevatron magnets for HE-LHC or new ring in LHC tunnel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piekarz, Henryk (Fermilab)

    Two injector accelerator options for the HE-LHC, for p+ - p+ collisions at 33 TeV c.m. energy, are briefly outlined. One option is based on the Super-SPS (S-SPS) accelerator in the SPS tunnel, and the other on the LER (Low-Energy-Ring) accelerator in the LHC tunnel. Performance expectations for the main arc accelerator magnets considered for the construction of the S-SPS and LER accelerators are used to tentatively devise selected properties of these accelerators as potential injectors to the HE-LHC.

  19. Explorer : des clés pour mieux comprendre la matière

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellis, Jonathan R.

    2011-02-14

    Will the LHC upset theories of the infinitely small? Physicists would like the accelerator to shake the Standard Model. This theory of elementary particles and forces leaves many gray areas. The LHC and its experiments have been designed to illuminate them.

  20. LHC Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lincoln, Don

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

  1. Explorer : des clés pour mieux comprendre la matière

    ScienceCinema

    Ellis, Jonathan R.

    2018-01-12

    Will the LHC upset theories of the infinitely small? Physicists would like the accelerator to shake the Standard Model. This theory of elementary particles and forces leaves many gray areas. The LHC and its experiments have been designed to illuminate them.

  2. On the Feasibility of a Pulsed 14 TeV C.M.E. Muon Collider in the LHC Tunnel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiltsev, Vladimir; Neuffer, D.

    We discuss the technical feasibility, key machine parameters and major challenges of a 14 TeV c.m.e. muon-muon collider in the LHC tunnel [1]. The luminosity of the collider is evaluated for three alternative muon sources: the PS synchrotron, one of a type developed by the US Muon Accelerator Program (MAP), and a low-emittance option based on resonant μ-pair production.

  3. LHC Computing

    ScienceCinema

    Lincoln, Don

    2018-01-16

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

  4. Review of EuCARD project on accelerator infrastructure in Europe

    NASA Astrophysics Data System (ADS)

    Romaniuk, Ryszard S.

    2013-01-01

    The aim of big infrastructural and research programs (like the pan-European Framework Programmes) and of the individual projects realized inside them is to structure the European Research Area (ERA) so that it is competitive with the leaders of the world. One of these projects is EuCARD (European Coordination of Accelerator Research and Development), whose aim is to structure and modernize accelerator research infrastructure, including accelerators for big free-electron laser machines. This article presents the development of EuCARD between the annual meeting in Warsaw in April 2012 and the SC meeting in Uppsala in December 2012. The background to all these efforts is the achievements of the LHC machine and its associated detectors in the race for new physics. The LHC machine works in p-p, Pb-p and Pb-Pb modes (protons and lead ions). The recent discovery by the LHC of a Higgs-like boson has started vivid debates on the further potential of this machine and on its future. The periodic EuCARD conferences, workshops and meetings concern the building of research infrastructure, including advanced photonic and electronic systems for servicing large high energy physics experiments. A few basic groups of such systems are debated: measurement and control networks of large geometrical extent, multichannel systems for the acquisition of large amounts of metrological data, and precision photonic networks for the distribution of reference time, frequency and phase. The aim of the discussion is not only to summarize the current status but to make plans and prepare practically for building new infrastructures. Accelerator science and technology is one of the key enablers of developments in particle physics and photon physics, as well as of applications in medicine and industry, and is intensely developed in all developed nations and regions of the world. The EuCARD project contains many subjects related directly and indirectly to photon physics and photonics, as well as optoelectronics, electronics and the integration of these with large research infrastructure.

  5. Diamond Pixel Luminosity Telescopes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halyo, Valerie

    2014-12-15

    In this document, Halyo summarizes her key contributions to CMS at the LHC and explains their importance and her role in each project. At the end, Halyo describes her recent research interests, which include GPU/MIC acceleration of the High Level Trigger (HLT) to extend the physics research at the LHC. A description of this work, the recent promising results she accomplished, and the deliverables are also elaborated. These contributions were only possible thanks to DOE support of junior faculty research and its clear goal of promoting research and innovation.

  6. Current Lead Design for the Accelerator Project for Upgrade of LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, Jeffrey S.; Cheban, Sergey; Feher, Sandor

    2010-01-01

    The Accelerator Project for Upgrade of LHC (APUL) is a U.S. project participating in and contributing to CERN's Large Hadron Collider (LHC) upgrade program. In collaboration with Brookhaven National Laboratory, Fermilab is developing sub-systems for an upgrade of the LHC final focus magnet systems. A concept of main and auxiliary helium flow was developed that allows the superconductor to remain cold while the lead body warms up to prevent upper section frosting. The auxiliary flow will subsequently cool the thermal shields of the feed box and the transmission line cryostats. A thermal analysis of the current lead central heat exchange section was performed using analytic and FEA techniques. A method of remote soldering was developed that allows the current leads to be field replaceable. The remote solder joint was designed to be made without flux or additional solder, and able to be remade up to ten full cycles. A method of upper section attachment was developed that allows high pressure sealing of the helium volume. Test fixtures for both remote soldering and upper section attachment for the 13 kA lead were produced. The cooling concept, thermal analyses, and test results from both remote soldering and upper section attachment fixtures are presented.
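    To give a sense of the heat loads such lead designs must manage, a minimal sizing sketch follows. It uses the textbook Wiedemann-Franz estimate for the minimum heat reaching the cold end of an optimized, purely conduction-cooled lead, Q_min = I·sqrt(L0·(Th² − Tc²)); this is a generic rule of thumb, not the APUL design calculation, and the end temperatures are assumptions (only the 13 kA current comes from the record). Vapor cooling of the kind described above reduces the leak well below this figure.

    ```python
    import math

    L0 = 2.45e-8    # W*Ohm/K^2, Lorenz number (Wiedemann-Franz law)
    I = 13_000.0    # A, lead current from the record
    Th, Tc = 300.0, 4.5   # K, assumed warm-end / cold-end temperatures

    # Minimum cold-end heat leak for an optimized conduction-cooled lead.
    q_min = I * math.sqrt(L0 * (Th**2 - Tc**2))
    print(f"Minimum conduction heat leak per lead: {q_min:.0f} W")  # ~610 W
    ```

    A heat load of hundreds of watts at liquid-helium temperature is very expensive to remove, which is why the main/auxiliary helium flow scheme and the optimized heat exchange section are central to the design.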

  7. Design Studies and Optimization of High-Field Nb$$_3$$Sn Dipole Magnets for a Future Very High Energy PP Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kashikhin, V. V.; Novitski, I.; Zlobin, A. V.

    2017-05-01

    High-field accelerator magnets with operating fields of 15-16 T based on the Nb3Sn superconductor are being considered for the LHC energy upgrade or a future Very High Energy pp Collider. Magnet design studies are being conducted in the U.S., Europe and Asia to explore the limits of Nb3Sn accelerator magnet technology while optimizing the magnet design and performance parameters and reducing magnet cost. The first results of these studies, performed at Fermilab in the framework of the US-MDP, are reported in this paper.

  8. Fabrication of First 4-m Coils for the LARP MQXFA Quadrupole and Assembly in Mirror Structure

    DOE PAGES

    Holik, Eddie Frank; Ambrosio, Giorgio; Anerella, Michael; ...

    2017-01-23

    The US LHC Accelerator Research Program is constructing prototype interaction region quadrupoles as part of the US in-kind contribution to the Hi-Lumi LHC project. The low-beta MQXFA Q1/Q3 coils have a 4-m length and a 150 mm bore. The design is first validated on short, one-meter models (MQXFS) developed as part of the longstanding Nb3Sn quadrupole R&D by LARP in collaboration with CERN. In parallel, facilities and tooling are being developed and refined at BNL, LBNL, and FNAL to enable long coil production, assembly, and cold testing. Long length scale-up is based on the experience from the LARP 90 mm aperture (TQ-LQ) and 120 mm aperture (HQ and Long HQ) programs. A 4-m long MQXF practice coil was fabricated, water jet cut and analyzed to verify procedures, parts, and tooling. In parallel, the first complete prototype coil (QXFP01a) was fabricated and assembled in a long magnetic mirror, MQXFPM1, to provide early feedback on coil design and fabrication following the successful experience of previous LARP mirror tests.

  9. Dimensional changes of Nb3Sn Rutherford cables during heat treatment

    DOE PAGES

    Rochepault, E.; Ferracin, P.; Ambrosio, G.; ...

    2016-06-01

    In high field magnet applications, Nb3Sn coils undergo a heat treatment step after winding. During this stage, coils radially expand and longitudinally contract due to the Nb3Sn phase change. In order to prevent residual strain from degrading superconducting performance, the tooling must provide adequate space for these dimensional changes. The aim of this paper is to understand the behavior of cable dimensions during heat treatment and to provide estimates of the space to be accommodated in the tooling for coil expansion and contraction. In addition, this paper summarizes measurements of dimensional changes on strands, single Rutherford cables, cable stacks, and coils performed between 2013 and 2015. These samples and coils were produced within a collaboration between CERN and the U.S. LHC Accelerator Research Program to develop Nb3Sn quadrupole magnets for the HiLumi LHC. The results are also compared with other high field magnet projects.

  10. Field Quality and Fabrication Analysis of HQ02 Reconstructed Nb3Sn Coil Cross Sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holik, Eddie Frank; Ambrosio, Giorgio; Carbonara, Andrea

    2017-01-23

    The US LHC Accelerator Research Program (LARP) quadrupole HQ02 was designed and fully tested as part of the low-beta quad development for Hi-Lumi LHC. HQ02's design is well documented, with full fabrication accounting along with full field analysis at low and high current. With this history, HQ02 is an excellent test bed for developing a methodology for measuring turn locations from magnet cross sections and comparing them with CAD models and the measured field. All four coils of HQ02 were cut at identical locations along the magnetic length corresponding to magnetic field measurement and coil metrology. A real-time camera and coordinate measuring equipment were used to plot turn corners. Measurements include systematic and random displacements of winding blocks and individual turns along the magnetic length. The range of cable shifts and the field harmonic range along the length are in agreement, although correlating turn locations and measured harmonics in each cross section is challenging.

  11. Radiation Hard Silicon Particle Detectors for Phase-II LHC Trackers

    NASA Astrophysics Data System (ADS)

    Oblakowska-Mucha, A.

    2017-02-01

    The major LHC upgrade is planned after ten years of accelerator operation. It is foreseen to significantly increase the luminosity of the current machine, up to 10^35 cm^-2 s^-1, and to operate as the upcoming High Luminosity LHC (HL-LHC). The major detector upgrade, called the Phase-II Upgrade, is also planned, a main reason being the aging processes caused by severe particle radiation. Within the RD50 Collaboration, a large research and development program has been underway to develop silicon sensors with sufficient radiation tolerance for HL-LHC trackers. In this summary, several results obtained during the testing of devices after irradiation to HL-LHC levels are presented. Among the studied structures one can find advanced sensor types such as 3D silicon detectors, High-Voltage CMOS technologies, and sensors with intrinsic gain (LGAD). Based on these results, the RD50 Collaboration gives recommendations for the silicon detectors to be used in the detector upgrade.

  12. HL-LHC and HE-LHC Upgrade Plans and Opportunities for US Participation

    NASA Astrophysics Data System (ADS)

    Apollinari, Giorgio

    2017-01-01

    The US HEP community has identified the exploitation of physics opportunities at the High Luminosity LHC (HL-LHC) as the highest near-term priority. Thanks to multi-year R&D programs, US National Laboratories and Universities have taken the leadership in the development of technical solutions to increase the LHC luminosity, enabling the HL-LHC Project and uniquely positioning the country to make critical contributions to the LHC luminosity upgrade. This talk will describe the shaping of the US Program to contribute to HL-LHC in the next decade through newly developed technologies such as Nb3Sn focusing magnets or superconducting crab cavities. The experience gained through the execution of the HL-LHC Project in the US will constitute a pool of knowledge and capabilities allowing further developments in the future. Opportunities for US participation in proposed hadron colliders, such as a possible High Energy LHC (HE-LHC), will be described as well.

  13. The operation of the LHC accelerator complex (2/2)

    ScienceCinema

    Redaelli, Stefano

    2018-05-23

    These lectures will give an overview of what happens when the LHC is in running mode. They are aimed at students working on the LHC experiments, but all those who are curious about what happens behind the scenes of the LHC are welcome. You will learn all you always wanted to know about the LHC, and never had the courage to ask! The only prerequisite is a basic, college-level knowledge of electromagnetism and of the principles used to steer charged beams. Topics covered will include, among others: the description of the injector chain, from the generation of the protons to the delivery of bunches to the LHC; the discussion of the steps required to accelerate the beams in the LHC, to bring them into collision, and to control the luminosity at the interaction points; and the description of the monitoring tools available to the LHC operators, with an explanation of the various plots and panels that can be found on the LHC web pages.

  14. The operation of the LHC accelerator complex (1/2)

    ScienceCinema

    Redaelli, Stefano

    2018-05-23

    These lectures will give an overview of what happens when the LHC is in running mode. They are aimed at students working on the LHC experiments, but all those who are curious about what happens behind the scenes of the LHC are welcome. You will learn all you always wanted to know about the LHC, and never had the courage to ask! The only prerequisite is a basic, college-level knowledge of electromagnetism and of the principles used to steer charged beams. Topics covered will include, among others: the description of the injector chain, from the generation of the protons to the delivery of bunches to the LHC; the discussion of the steps required to accelerate the beams in the LHC, to bring them into collision, and to control the luminosity at the interaction points; and the description of the monitoring tools available to the LHC operators, with an explanation of the various plots and panels that can be found on the LHC web pages.

  15. 2017 Topical Workshop on Electronics for Particle Physics

    NASA Astrophysics Data System (ADS)

    2017-09-01

    The workshop will cover all aspects of electronics for particle physics experiments, and accelerator instrumentation of general interest to users. LHC experiments (and their operational experience) will remain a focus of the meeting, but a strong emphasis on R&D for future experimentation will be maintained, such as SLHC, CLIC, ILC, and neutrino facilities, as well as other particle and astroparticle physics experiments. The purpose of the workshop is: to present results and original concepts for electronics research and development relevant to experiments as well as accelerator and beam instrumentation at future facilities; to review the status of electronics for the LHC experiments; to identify and encourage common efforts for the development of electronics; and to promote information exchange and collaboration in the relevant engineering and physics communities.

  16. Future Circular Colliders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lincoln, Don

    While the LHC is currently the highest energy particle accelerator ever built, nothing is forever. In this video, Fermilab’s Dr. Don Lincoln discusses a new particle accelerator currently under discussion. This accelerator would dwarf the LHC, at fully 60 miles around, and would accelerate protons to seven times higher energy. The project is merely in the discussion stages and it is a staggering endeavor, but it is the next natural step in our millennia-long journey to understand the universe.

  17. Thermo-magnetic instabilities in Nb3Sn superconducting accelerator magnets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bordini, Bernardo

    2006-09-01

    The advance of High Energy Physics research using circulating accelerators strongly depends on increasing the magnetic bending field which accelerator magnets provide. To achieve high fields, the most powerful present-day accelerator magnets employ NbTi superconducting technology; however, with the start-up of the Large Hadron Collider (LHC) in 2007, NbTi magnets will have reached the maximum field allowed by the intrinsic properties of this superconductor. A further increase of the field strength necessarily requires a change in superconductor material; the best candidate is Nb3Sn. Several laboratories in the US and Europe are currently working on developing Nb3Sn accelerator magnets, and although these magnets have great potential, it is suspected that their performance may be fundamentally limited by conductor thermo-magnetic instabilities: an idea first proposed by the Fermilab High Field Magnet group early in 2003. This thesis presents a study of thermo-magnetic instability in high field Nb3Sn accelerator magnets. In this chapter the following topics are described: the role of superconducting magnets in High Energy Physics; the main characteristics of superconductors for accelerator magnets; typical measurements of current capability in superconducting strands; the properties of Nb3Sn; a description of the manufacturing process of Nb3Sn strands; superconducting cables; a typical layout of superconducting accelerator magnets; the current state of the art of Nb3Sn accelerator magnets; the High Field Magnet program at Fermilab; and the scope of the thesis.

  18. Development of MQXF: The Nb3Sn low-β quadrupole for the HiLumi LHC

    DOE PAGES

    Ferracin, P.; G. Ambrosio; Anerella, M.; ...

    2015-12-18

    The High Luminosity (HiLumi) Large Hadron Collider (LHC) project has as its main objective to increase the LHC peak luminosity by a factor of five and the integrated luminosity by a factor of ten. This goal will be achieved mainly with a new interaction region layout, which will allow a stronger focusing of the colliding beams. The target will be to reduce the beam size in the interaction points by a factor of two, which requires doubling the aperture of the low-β (or inner triplet) quadrupole magnets. The use of Nb3Sn superconducting material and, as a result, the possibility of operating at magnetic field levels in the windings higher than 11 T will limit the increase in length of these quadrupoles, called MQXF, to acceptable levels. After the initial design phase, where the key parameters were chosen and the magnet's conceptual design finalized, the MQXF project, a joint effort between the U.S. LHC Accelerator Research Program and the Conseil Europeen pour la Recherche Nucleaire (CERN), has now entered the construction and test phase of the short models. Concurrently, the preparation for the development of the full-length prototypes has been initiated. This paper provides an overview of the project status, describing and reporting on the performance of the superconducting material, the lessons learnt during the fabrication of superconducting coils and support structure, and the fine tuning of the magnet design in view of the start of the prototyping phase.

  19. Fabrication and Analysis of 150-mm-Aperture Nb3Sn MQXF Coils

    DOE PAGES

    Holik, E. F.; Ambrosio, G.; Anerella, M.; ...

    2016-01-12

    The U.S. LHC Accelerator Research Program (LARP) and CERN are combining efforts for the HiLumi-LHC upgrade to design and fabricate 150-mm-aperture interaction region quadrupoles with a nominal gradient of 130 T/m using Nb3Sn. To successfully produce the necessary long MQXF triplets, the HiLumi-LHC collaboration is systematically reducing risk and design modification by relying heavily upon the experience gained from the successful 120-mm-aperture LARP HQ program. First-generation MQXF short (MQXFS) coils were predominantly a scaling up of the HQ quadrupole design, allowing comparable cable expansion during Nb3Sn formation heat treatment and increased insulation fraction for electrical robustness. A total of 13 first-generation MQXFS coils were fabricated between LARP and CERN. Systematic differences in coil size, coil alignment symmetry, and coil length contraction during heat treatment are observed, likely due to slight variances in tooling and insulation/cable systems. Analysis of coil cross sections indicates that field-shaping wedges and adjacent coil turns are systematically displaced from the nominal location and that the cable is expanding less than nominally designed. A second-generation MQXF coil design seeks to correct the expansion and displacement discrepancies by increasing insulation and adding adjustable shims at the coil pole and midplanes to correct allowed magnetic field harmonics.

  20. The HL-LHC Accelerator Physics Challenges

    NASA Astrophysics Data System (ADS)

    Fartoukh, S.; Zimmermann, F.

    The conceptual baseline of the HL-LHC project is reviewed, putting into perspective the main beam physics challenges of this new collider in comparison with the existing LHC, and the series of solutions and possible mitigation measures presently envisaged.

  1. First Test Results of the 150 mm Aperture IR Quadrupole Models for the High Luminosity LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrosio, G.; Chlachidze, G.; Wanderer, P.

    2016-10-06

    The High Luminosity upgrade of the LHC at CERN will use large-aperture (150 mm) quadrupole magnets to focus the beams at the interaction points. The high field in the coils requires Nb3Sn superconductor technology, which has been brought to maturity by the LHC Accelerator Research Program (LARP) over the last 10 years. The key design targets for the new IR quadrupoles were established in 2012, and fabrication of model magnets started in 2014. This paper discusses the results from the first single short coil test and from the first short quadrupole model test. Remaining challenges and plans to address them are also presented and discussed.

  2. High Luminosity LHC: challenges and plans

    NASA Astrophysics Data System (ADS)

    Arduini, G.; Barranco, J.; Bertarelli, A.; Biancacci, N.; Bruce, R.; Brüning, O.; Buffat, X.; Cai, Y.; Carver, L. R.; Fartoukh, S.; Giovannozzi, M.; Iadarola, G.; Li, K.; Lechner, A.; Medina Medrano, L.; Métral, E.; Nosochkov, Y.; Papaphilippou, Y.; Pellegrini, D.; Pieloni, T.; Qiang, J.; Redaelli, S.; Romano, A.; Rossi, L.; Rumolo, G.; Salvant, B.; Schenk, M.; Tambasco, C.; Tomás, R.; Valishev, S.; Van der Veken, F. F.

    2016-12-01

    The Large Hadron Collider (LHC) is one of the largest scientific instruments ever built. Since opening up a new energy frontier for exploration in 2010, it has gathered a global user community working in fundamental particle physics and the physics of hadronic matter at extreme temperature and density. To sustain and extend its discovery potential, the LHC will undergo a major upgrade in the 2020s. This will increase its rate of collisions by a factor of five beyond the original design value and the integrated luminosity by a factor of ten. The new configuration, known as High Luminosity LHC (HL-LHC), will rely on a number of key innovations that push accelerator technology beyond its present limits. Among these are cutting-edge 11-12 T superconducting magnets, including Nb3Sn-based magnets never used in accelerators before, compact superconducting cavities for longitudinal beam rotation, and new technology and physical processes for beam collimation. The dynamics of the HL-LHC beams will also be particularly challenging, and this aspect is the main focus of this paper.

  3. Instrumentation status of the low-b magnet systems at the Large Hadron Collider (LHC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darve, C.; /Fermilab; Balle, C.

    2011-05-01

    The low-β magnet systems are located in the Large Hadron Collider (LHC) insertion regions around the four interaction points. They are the key elements in the beam focusing/defocusing process allowing proton collisions at luminosities up to 10^34 cm^-2 s^-1. These systems are a contribution of the US-LHC Accelerator project. The systems are mainly composed of the quadrupole magnets (triplets), the separation dipoles, and their respective electrical feed-boxes (DFBX). The low-β magnet systems operate in an environment of extreme radiation, high-gradient magnetic field, and high heat load to the cryogenic system due to beam dynamic effects. Because of this severe environment, the robustness of the diagnostics is essential for the operation of the triplets. The hardware commissioning phase of the LHC was completed in February 2010. For the sake of safer and more user-friendly operation, several consolidations and instrumentation modifications were implemented during this commissioning phase. This paper presents the instrumentation used to optimize the engineering process and operation of the final focusing/defocusing quadrupole magnets for the first years of operation.

  4. LHC interaction region quadrupole cryostat design

    NASA Astrophysics Data System (ADS)

    Nicol, T. H.; Darve, Ch.; Huang, Y.; Page, T. M.

    2002-05-01

    The cryostat of a Large Hadron Collider (LHC) Interaction Region (IR) quadrupole magnet consists of all components of the inner triplet except the magnet assembly itself. It serves to support the magnet accurately and reliably within the vacuum vessel, to house all required cryogenic piping, and to insulate the cold mass from heat radiated and conducted from the environment. It must function reliably during storage, shipping and handling, normal magnet operation, quenches, and seismic excitations, and must be able to be manufactured at low cost. The major components of the cryostat are the vacuum vessel, thermal shield, multi-layer insulation system, cryogenic piping, and suspension system. The overall design of a cryostat for superconducting accelerator magnets requires consideration of fluid flow, proper selection of materials for their thermal and structural performance at both ambient and operating temperature, and knowledge of the environment to which the magnets will be subjected over the course of their expected operating lifetime. This paper describes the current LHC IR inner triplet quadrupole magnet cryostats being designed and manufactured at Fermilab as part of the US-LHC collaboration, and includes discussions on the structural and thermal considerations involved in the development of each of the major systems.

  5. INSTRUMENTS AND METHODS OF INVESTIGATION: Giant pulses of thermal neutrons in large accelerator beam dumps. Possibilities for experiments

    NASA Astrophysics Data System (ADS)

    Stavissky, Yurii Ya

    2006-12-01

    A short review is presented of the development in Russia of intense pulsed neutron sources for physical research: the pulsating fast reactors IBR-1, IBR-30, IBR-2 (Joint Institute for Nuclear Research, Dubna), and the neutron-radiation complex of the Moscow meson factory, the 'Troitsk Trinity' (RAS Institute for Nuclear Research, Troitsk, Moscow region). The possibility of generating giant neutron pulses in beam dumps of superhigh-energy accelerators is discussed. In particular, the possibility of producing giant pulsed thermal neutron fluxes in modified beam dumps of the Large Hadron Collider (LHC) under construction at CERN is considered. It is shown that in the case of one-turn extraction of 7-TeV protons accumulated in the LHC main rings on heavy targets with water or zirconium-hydride moderators placed in the front part of the LHC graphite beam-dump blocks, every 10 hours relatively short (~100 µs) thermal neutron pulses with a peak flux density of up to ~10^20 neutrons cm^-2 s^-1 may be produced. The possibility of applying such neutron pulses in physical research is discussed.

  6. Test results of the LARP Nb3Sn quadrupole HQ03a

    DOE PAGES

    DiMarco, J.; G. Ambrosio; Chlachidze, G.; ...

    2016-03-09

    The US LHC Accelerator Research Program (LARP) has been developing Nb3Sn quadrupoles of progressively increasing performance for the high luminosity upgrade of the Large Hadron Collider. The 120 mm aperture High-field Quadrupole (HQ) models are the last step in the R&D phase supporting the development of the new IR Quadrupoles (MQXF). Three series of HQ coils were fabricated and assembled in a shell-based support structure, progressively optimizing the design and fabrication process. The final set of coils consistently applied the optimized design solutions, and was assembled in the HQ03a model. This paper reports a summary of the HQ03a test results, including training, mechanical performance, field quality and quench studies.

  7. Contextualized Magnetism in Secondary School: Learning from the LHC (CERN)

    ERIC Educational Resources Information Center

    Cid, Ramon

    2005-01-01

    Physics teachers in secondary schools usually mention the world's largest particle physics laboratory--CERN (European Organization for Nuclear Research)--only because of the enormous size of the accelerators and detectors used there, the number of scientists involved in their activities and also the necessary international scientific…

  8. Final Report for U.S. DOE GRANT No. DEFG02-96ER41015 November 1, 2010 - April 30, 2013 entitled HIGH ENERGY ACCELERATOR AND COLLIDING BEAM USER GROUP at the UNIVERSITY of MARYLAND

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadley, Nicholas; Jawahery, Abolhassan; Eno, Sarah C

    2013-07-26

    We have finished the third year of a three-year grant cycle with the U.S. Department of Energy for which we were given a five-month extension (U.S. D.O.E. Grant No. DEFG02-96ER41015). This document is the final report for this grant and covers the period from November 1, 2010 to April 30, 2013. The Maryland program is administered as a single task with Professor Nicholas Hadley as Principal Investigator. The Maryland experimental HEP group is focused on two major research areas. We are members of the CMS experiment at the LHC at CERN working on the physics of the Energy Frontier. We are also analyzing the data from the BaBar experiment at SLAC while doing design work and R&D towards a SuperB experiment as part of the Intensity Frontier. We have recently joined the LHCb experiment at CERN. We concluded our activities on the D0 experiment at Fermilab in 2009.

  9. LARP Long Quadrupole: A "Long" Step Toward an LHC

    ScienceCinema

    Giorgio Ambrosio

    2017-12-09

    The beginning of the development of Nb3Sn magnets for particle accelerators goes back to the 1960s. But only very recently has this development begun to face the challenges of fabricating Nb3Sn magnets which can meet the requirements of modern particle accelerators. LARP (the LHC Accelerator Research Program) is leading this effort, focusing on long models of the Interaction Region quadrupoles for a possible luminosity upgrade of the Large Hadron Collider. A major milestone in this development is to test, by the end of 2009, 4-m-long quadrupole models, which will be the first Nb3Sn accelerator-type magnets approaching the length of real accelerator magnets. The Long Quadrupoles (LQ) are “Proof-of-Principle” magnets which are to demonstrate that Nb3Sn technology is sufficiently mature for use in high-energy particle accelerators. Their design is based on the LARP Technological Quadrupole (TQ) models, under development at FNAL and LBNL, which have design gradients higher than 200 T/m and an aperture of 90 mm. Several challenges must be addressed for the successful fabrication of long Nb3Sn coils and magnets. These challenges and the solutions adopted will be presented together with the main features of the LQ magnets. Several R&D lines are participating in this effort and their contributions will also be presented.

  10. Reliability and degradation of oxide VCSELs due to reaction to atmospheric water vapor

    NASA Astrophysics Data System (ADS)

    Dafinca, Alexandru; Weidberg, Anthony R.; McMahon, Steven J.; Grillo, Alexander A.; Farthouat, Philippe; Ziolkowski, Michael; Herrick, Robert W.

    2013-03-01

    850nm oxide-aperture VCSELs are susceptible to premature failure if operated while exposed to atmospheric water vapor, and not protected by hermetic packaging. The ATLAS detector in CERN's Large Hadron Collider (LHC) has had approximately 6000 channels of Parallel Optic VCSELs fielded under well-documented ambient conditions. Exact time-to-failure data has been collected on this large sample, providing for the first time actual failure data at use conditions. In addition, the same VCSELs were tested under a variety of accelerated conditions to allow us to construct a more accurate acceleration model. Failure analysis information will also be presented to show what we believe causes corrosion-related failure for such VCSELs.
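
    Accelerated testing of this kind is commonly analyzed with a humidity-temperature acceleration model such as Peck's, a standard choice for corrosion-type failures. The sketch below uses that generic form with illustrative parameters, not the authors' fitted model:

```python
# Sketch: Peck humidity-temperature acceleration factor.
# The exponent n and activation energy ea are common textbook defaults,
# not values fitted in this study.
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def peck_af(t_use_c, rh_use, t_test_c, rh_test, n=3.0, ea=0.8):
    """AF = (RH_test/RH_use)**n * exp(Ea/k * (1/T_use - 1/T_test)),
    with temperatures supplied in degrees Celsius."""
    t_use = t_use_c + 273.15
    t_test = t_test_c + 273.15
    return (rh_test / rh_use) ** n * math.exp(ea / K_B * (1.0 / t_use - 1.0 / t_test))

# Example: 85C/85%RH stress test versus a 25C/50%RH use environment
af = peck_af(25.0, 50.0, 85.0, 85.0)
```

    Dividing observed time-to-failure at test conditions by AF gives an estimate of lifetime at use conditions, which is what field-failure data like the ATLAS set can then validate.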

  11. High Luminosity LHC: Challenges and plans

    DOE PAGES

    Arduini, G.; Barranco, J.; Bertarelli, A.; ...

    2016-12-28

    The Large Hadron Collider (LHC) is one of the largest scientific instruments ever built. Since opening up a new energy frontier for exploration in 2010, it has gathered a global user community working in fundamental particle physics and the physics of hadronic matter at extreme temperature and density. To sustain and extend its discovery potential, the LHC will undergo a major upgrade in the 2020s. This will increase its rate of collisions by a factor of five beyond the original design value and the integrated luminosity by a factor of ten. The new configuration, known as High Luminosity LHC (HL-LHC), will rely on a number of key innovations that push accelerator technology beyond its present limits. Among these are cutting-edge 11-12 T superconducting magnets, including Nb3Sn-based magnets never used in accelerators before, compact superconducting cavities for longitudinal beam rotation, and new technology and physical processes for beam collimation. The dynamics of the HL-LHC beams will also be particularly challenging, and this aspect is the main focus of this paper.

  12. Progress in the Long Nb3Sn Quadrupole R&D by LARP

    DOE PAGES

    Ambrosio, G.; Andreev, N.; Anerella, M.; ...

    2011-11-14

    After the successful test of the first long Nb3Sn quadrupole (LQS01), the US LHC Accelerator Research Program (LARP, a collaboration of BNL, FNAL, LBNL and SLAC) is assessing training memory, reproducibility, and other accelerator quality features of long Nb3Sn quadrupole magnets. LQS01b (a reassembly of LQS01 with more uniform and higher pre-stress) was subjected to a full thermal cycle and reached the previous plateau of 222 T/m at 4.5 K in two quenches. A new set of four coils, made of the same type of conductor used in LQS01 (RRP 54/61 by Oxford Superconducting Technology), was assembled in the LQS01 structure and tested at 4.5 K and lower temperatures. The new magnet (LQS02) reached the target gradient (200 T/m) only at 2.6 K and lower temperatures, at intermediate ramp rates. The preliminary test analysis reported here showed a higher instability in the limiting coil than in the other coils of LQS01 and LQS02.

  13. Numerical simulations of a proposed hollow electron beam collimator for the LHC upgrade at CERN.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Previtali, V.; Stancari, G.; Valishev, A.

    2013-07-12

    In recent years the LHC collimation system has been performing beyond expectations, providing the machine with a highly efficient cleaning system [1]. Nonetheless, when trying to push existing accelerators to, and beyond, their design limits, all accelerator components are required to boost their performance. In particular, in view of the high-luminosity frontier for the LHC, the increased intensity calls for a more efficient cleaning system. In this framework innovative collimation solutions are under evaluation [2]: one option is the use of a hollow electron lens for beam halo cleaning. This work studies the applicability of a hollow electron lens to LHC collimation by evaluating the case of the existing Tevatron e-lens applied to the nominal LHC 7 TeV beam. New e-lens operation modes are proposed here to enhance the standard electron-lens halo removal effect.

  14. Smashing Protons to Smithereens

    ScienceCinema

    Pleier, Marc-André

    2018-01-05

    Pleier discusses the extraordinary research taking place at the Large Hadron Collider (LHC) — the world’s newest, biggest, and highest energy particle accelerator located at CERN. Pleier is one of hundreds of researchers from around the world working on ATLAS, a seven-story particle detector positioned at a point where the LHC’s oppositely circulating beams of protons slam into one another head-on.

  16. Radiation tolerant power converter controls

    NASA Astrophysics Data System (ADS)

    Todd, B.; Dinius, A.; King, Q.; Uznanski, S.

    2012-11-01

    The Large Hadron Collider (LHC) at the European Organisation for Nuclear Research (CERN) is the world's most powerful particle collider. The LHC has several thousand magnets, both warm and superconducting, which are supplied with current by power converters. Each converter is controlled by a purpose-built electronic module called a Function Generator Controller (FGC). The FGC allows remote control of the power converter and forms the central part of a closed-loop control system where the power converter voltage is set based on the converter output current and magnet-circuit characteristics. Some power converters and FGCs are located in areas which are exposed to beam-induced radiation. There are numerous radiation-induced effects, some of which lead to a loss of control of the power converter, having a direct impact upon the accelerator's availability. Following the first long shutdown (LS1), the LHC will be able to run with higher-intensity beams and higher beam energy. This is expected to lead to significantly increased radiation-induced effects in materials close to the accelerator, including the FGC. Recent radiation tests indicate that the current FGC would not be sufficiently reliable. A so-called FGClite is being designed to work reliably in the radiation environment in the post-LS1 era. This paper outlines the concepts of power converter controls for machines such as the LHC, introduces the risks related to radiation and a radiation-tolerant project flow. The FGClite is then described, with its key concepts and challenges: aiming for high reliability in a radiation field.
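
    The closed-loop scheme described, a controller setting the converter voltage from the measured magnet current and the magnet-circuit characteristics, can be sketched as a discrete PI loop driving a series R-L magnet model. All gains and circuit values below are illustrative, not actual FGC parameters:

```python
# Sketch: digital PI current regulation of a magnet modeled as a series
# R-L load, mimicking the voltage-source-plus-controller structure
# described in the abstract. Values are illustrative only.

def simulate(i_ref=100.0, R=1e-3, L=0.1, dt=1e-3, steps=5000, kp=5.0, ki=50.0):
    """Return the magnet current after `steps` control periods."""
    i, integ = 0.0, 0.0
    for _ in range(steps):
        err = i_ref - i            # current error seen by the controller
        integ += err * dt
        v = kp * err + ki * integ  # PI voltage demand sent to the converter
        i += (v - R * i) / L * dt  # forward-Euler step of L*di/dt = v - R*i
    return i

final_current = simulate()         # settles at the 100 A reference
```

    The integral term removes the steady-state error caused by the resistive voltage drop, which is why a pure proportional controller would regulate slightly below the reference.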

  17. Overview of LHC physics results at ICHEP

    ScienceCinema

    Mangano, Michelangelo

    2018-06-20

    This month's LHC physics day will review the physics results presented by the LHC experiments at the 2010 ICHEP in Paris. The experimental presentations will be preceded by the bi-weekly LHC accelerator status report. The meeting will be broadcast via EVO (detailed info will appear at the time of the meeting in the "Video Services" item on the left menu bar). For those attending, information on accommodation, access to CERN and laptop registration is available from http://cern.ch/lpcc/visits

  18. Overview of LHC physics results at ICHEP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2011-02-25

    This month's LHC physics day will review the physics results presented by the LHC experiments at the 2010 ICHEP in Paris. The experimental presentations will be preceded by the bi-weekly LHC accelerator status report. The meeting will be broadcast via EVO (detailed info will appear at the time of the meeting in the "Video Services" item on the left menu bar). For those attending, information on accommodation, access to CERN and laptop registration is available from http://cern.ch/lpcc/visits

  19. Quench simulations for superconducting elements in the LHC accelerator

    NASA Astrophysics Data System (ADS)

    Sonnemann, F.; Schmidt, R.

    2000-08-01

    The design of the protection system for the superconducting elements in an accelerator such as the Large Hadron Collider (LHC), now under construction at CERN, requires a detailed understanding of the thermo-hydraulic and electrodynamic processes during a quench. A numerical program (SPQR, Simulation Program for Quench Research) has been developed to evaluate temperature and voltage distributions during a quench as a function of space and time. The quench process is simulated by approximating the heat balance equation with the finite difference method in the presence of variable cooling and powering conditions. The simulation predicts quench propagation along a superconducting cable, forced quenching with heaters, the impact of eddy currents induced by a magnetic field change, and heat transfer through an insulation layer into helium, an adjacent conductor or other material. The simulation studies allowed a better understanding of experimental quench data and were used for determining the adequate dimensioning and protection of the highly stabilised superconducting cables for connecting magnets (busbars), optimising the quench heater strip layout for the main magnets, and studying quench-back by induced eddy currents in the superconductor. After the introduction of the theoretical approach, some applications of the simulation model for the LHC dipole and corrector magnets are presented and the outcome of the studies is compared with experimental data.
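
    The finite-difference heat-balance approach described can be illustrated with a minimal 1D model: conduction along the conductor plus a Joule-heating term switched on wherever the conductor is above the critical temperature. Constant material properties and all numbers below are illustrative; a real quench code like SPQR uses temperature-dependent properties, cooling terms, and current decay:

```python
# Sketch: explicit finite-difference solution of a 1D heat balance with a
# Joule source in the normal zone -- a toy version of quench propagation.
# All material values are illustrative constants.
import numpy as np

def quench_front(nx=200, dx=1e-2, dt=1e-4, steps=1500,
                 k=300.0, c_vol=2e3, q_joule=2e6, t_bath=1.9, t_c=9.2):
    """Temperature profile (K) along the conductor after `steps` time steps."""
    T = np.full(nx, t_bath)
    T[nx // 2 - 5: nx // 2 + 5] = 20.0           # local disturbance starts a quench
    for _ in range(steps):
        lap = (np.roll(T, 1) - 2.0 * T + np.roll(T, -1)) / dx ** 2
        heat = np.where(T > t_c, q_joule, 0.0)   # W/m^3 only in the normal zone
        T = T + dt * (k * lap + heat) / c_vol    # explicit heat-balance update
        T[0] = T[-1] = t_bath                    # ends held at bath temperature
    return T

T = quench_front()
normal_zone = int(np.sum(T > 9.2))               # cells driven normal so far
```

    Even this toy model reproduces the qualitative behavior the abstract mentions: a normal zone that grows along the cable at a roughly constant front velocity set by the balance between conduction and Joule heating. The explicit scheme is stable here because k*dt/(c_vol*dx^2) is well below 0.5.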

  20. Progress with High-Field Superconducting Magnets for High-Energy Colliders

    NASA Astrophysics Data System (ADS)

    Apollinari, Giorgio; Prestemon, Soren; Zlobin, Alexander V.

    2015-10-01

    One of the possible next steps for high-energy physics research relies on a high-energy hadron or muon collider. The energy of a circular collider is limited by the strength of bending dipoles, and its maximum luminosity is determined by the strength of final focus quadrupoles. For this reason, the high-energy physics and accelerator communities have shown much interest in higher-field and higher-gradient superconducting accelerator magnets. The maximum field of NbTi magnets used in all present high-energy machines, including the LHC, is limited to ~10 T at 1.9 K. Fields above 10 T became possible with the use of Nb3Sn superconductors. Nb3Sn accelerator magnets can provide operating fields up to ~15 T and can significantly increase the coil temperature margin. Accelerator magnets with operating fields above 15 T require high-temperature superconductors. This review discusses the status and main results of Nb3Sn accelerator magnet research and development and work toward 20-T magnets.

  1. Progress with high-field superconducting magnets for high-energy colliders

    DOE PAGES

    Apollinari, Giorgio; Prestemon, Soren; Zlobin, Alexander V.

    2015-10-01

    One of the possible next steps for high-energy physics research relies on a high-energy hadron or muon collider. The energy of a circular collider is limited by the strength of bending dipoles, and its maximum luminosity is determined by the strength of final focus quadrupoles. For this reason, the high-energy physics and accelerator communities have shown much interest in higher-field and higher-gradient superconducting accelerator magnets. The maximum field of NbTi magnets used in all present high-energy machines, including the LHC, is limited to ~10 T at 1.9 K. Fields above 10 T became possible with the use of Nb$$_3$$Sn superconductors. Nb$$_3$$Sn accelerator magnets can provide operating fields up to ~15 T and can significantly increase the coil temperature margin. Accelerator magnets with operating fields above 15 T require high-temperature superconductors. This review discusses the status and main results of Nb$$_3$$Sn accelerator magnet research and development and work toward 20-T magnets.

  2. Two-Layer 16 Tesla Cosθ Dipole Design for the FCC

    DOE PAGES

    Holik, Eddie Frank; Ambrosio, Giorgio; Apollinari, G.

    2018-02-13

    The Future Circular Collider (FCC) is a study aimed at exploring the possibility of reaching 100 TeV total collision energy, which would require 16 tesla dipoles. Upon the conclusion of the High Luminosity Upgrade, the US LHC Accelerator Upgrade Project in collaboration with CERN will have extensive Nb3Sn magnet fabrication experience. This experience includes robust Nb3Sn conductor and insulation schemes, 2-layer cos2θ coil fabrication, and bladder-and-key structure and assembly. By making improvements and modifications to existing technology, the feasibility of a two-layer 16 tesla dipole is investigated. Preliminary designs indicate that fields up to 16.6 tesla are feasible with conductor grading while satisfying the HE-LHC and FCC specifications. Key challenges include accommodating high-aspect-ratio conductor, narrow wedge design, Nb3Sn conductor grading, and especially quench protection of a 16 tesla device.

  3. Two-Layer 16 T Cos θ Dipole Design for the FCC

    DOE PAGES

    Holik, Eddie Frank; Ambrosio, Giorgio; Apollinari, Giorgio

    2018-02-22

    The Future Circular Collider (FCC) is a study aimed at exploring the possibility of reaching 100 TeV total collision energy, which would require 16 tesla dipoles. Upon the conclusion of the High Luminosity Upgrade, the US LHC Accelerator Upgrade Project in collaboration with CERN will have extensive Nb3Sn magnet fabrication experience. This experience includes robust Nb3Sn conductor and insulation schemes, 2-layer cos2θ coil fabrication, and bladder-and-key structure and assembly. By making improvements and modifications to existing technology, the feasibility of a two-layer 16 tesla dipole is investigated. Preliminary designs indicate that fields up to 16.6 tesla are feasible with conductor grading while satisfying the HE-LHC and FCC specifications. Key challenges include accommodating high-aspect-ratio conductor, narrow wedge design, Nb3Sn conductor grading, and especially quench protection of a 16 tesla device.

  5. Linear Collider Physics Resource Book Snowmass 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronan

    The American particle physics community can look forward to a well-conceived and vital program of experimentation for the next ten years, using both colliders and fixed-target beams to study a wide variety of pressing questions. Beyond 2010, these programs will be reaching the end of their expected lives. The CERN LHC will provide an experimental program of the first importance. But beyond the LHC, the American community needs a coherent plan. The Snowmass 2001 Workshop and the deliberations of the HEPAP subpanel offer a rare opportunity to engage the full community in planning our future for the next decade or more. A major accelerator project requires a decade from the beginning of an engineering design to the receipt of the first data. So it is now time to decide whether to begin a new accelerator project that will operate in the years soon after 2010. We believe that the world high-energy physics community needs such a project. With the great promise of discovery in physics at the next energy scale, and with the opportunity for the uncovering of profound insights, we cannot allow our field to contract to a single experimental program at a single laboratory in the world. We believe that an e+e- linear collider is an excellent choice for the next major project in high-energy physics. Applying experimental techniques very different from those used at hadron colliders, an e+e- linear collider will allow us to build on the discoveries made at the Tevatron and the LHC, and to add a level of precision and clarity that will be necessary to understand the physics of the next energy scale. It is not necessary to anticipate specific results from the hadron collider programs to argue for constructing an e+e- linear collider; in any scenario that is now discussed, physics will benefit from the new information that e+e- experiments can provide. This last point merits further emphasis. If a new accelerator could be designed and built in a few years, it would make sense to wait for the results of each accelerator before planning the next one. Thus, we would wait for the results from the Tevatron before planning the LHC experiments, and wait for the LHC before planning any later stage. In reality, accelerators require a long time to construct, and they require such specialized resources and human talent that delay can cripple what would be promising opportunities. In any event, we believe that the case for the linear collider is so compelling and robust that we can justify this facility on the basis of our current knowledge, even before the Tevatron and LHC experiments are done. The physics prospects for the linear collider have been studied intensively for more than a decade, and arguments for the importance of its experimental program have been developed from many different points of view. This book provides an introduction and a guide to this literature. We hope that it will allow physicists new to the consideration of linear collider physics to start from their own personal perspectives and develop their own assessments of the opportunities afforded by a linear collider.

  6. Fermilab | Illinois Accelerator Research Center | Illinois Accelerator

    Science.gov Websites

    The Illinois Accelerator Research Center (IARC) at Fermilab is funded by the Illinois Department of Commerce and Economic Opportunity and the U.S. Department of Energy.

  7. Studies of QCD structure in high-energy collisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nadolsky, Pavel M.

    2016-06-26

    "Studies of QCD structure in high-energy collisions" is a research project in theoretical particle physics at Southern Methodist University funded by US DOE Award DE-SC0013681. The award furnished bridge funding for one year (2015/04/15-2016/03/31) between the periods funded by Nadolsky's DOE Early Career Research Award DE-SC0003870 (2010-2015) and a DOE grant DE-SC0010129 for the SMU Department of Physics (starting in April 2016). The primary objective of the research is to provide theoretical predictions for Run 2 of the CERN Large Hadron Collider (LHC). The LHC physics program relies on state-of-the-art predictions in the field of quantum chromodynamics. The main effort of our group went into the global analysis of parton distribution functions (PDFs) employed by the bulk of LHC computations. Parton distributions describe the internal structure of protons during ultrarelativistic collisions. A new generation of CTEQ parton distribution functions, CT14, was released in summer 2015 and quickly adopted by the HEP community. The new CT14 parametrizations of PDFs were obtained using benchmarked NNLO calculations and the latest data from LHC and Tevatron experiments. The group developed advanced methods for the PDF analysis and the estimation of uncertainties in LHC predictions associated with the PDFs. We invented and refined a new 'meta-parametrization' technique that streamlines the usage of PDFs in Higgs boson production and numerous other LHC processes by combining PDFs from various groups using multivariate stochastic sampling. In 2015, the PDF4LHC working group recommended that LHC experimental collaborations use 'meta-parametrizations' as a standard technique for computing PDF uncertainties. Finally, to include new QCD processes in the global fits, our group worked on several (N)NNLO calculations.
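    The combination of PDF sets by multivariate stochastic sampling can be caricatured in a few lines: draw random replicas from each group's uncertainty band and summarize the pooled ensemble. This is a deliberately toy sketch with invented Gaussian mock sets, not the CT14 or PDF4LHC machinery:

```python
import numpy as np

# Toy "meta-combination" of PDF sets by stochastic sampling: draw equal
# numbers of random replicas from each group's (Gaussian-mock)
# uncertainty band and summarize the pooled ensemble. All numbers are
# invented; real combinations sample actual PDF error sets.
rng = np.random.default_rng(0)
x = np.linspace(0.01, 0.5, 50)          # momentum-fraction grid

def mock_pdf_set(central_scale, rel_unc, n_rep=100):
    central = central_scale * x**-0.5 * (1 - x)**3   # generic PDF-like shape
    return central * (1 + rel_unc * rng.standard_normal((n_rep, 1)))

combined = np.vstack([mock_pdf_set(1.00, 0.03),      # "group A"
                      mock_pdf_set(1.02, 0.04),      # "group B"
                      mock_pdf_set(0.99, 0.05)])     # "group C"
mean, std = combined.mean(axis=0), combined.std(axis=0)
print(f"combined relative uncertainty at x = 0.01: {std[0] / mean[0]:.3f}")
```

    The pooled band reflects both each group's internal uncertainty and the spread between the groups' central values.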

  8. High-Performance Secure Database Access Technologies for HEP Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research, where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities, a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are ongoing in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, which states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc., for numerous applications." There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture in which the secure authorization is pushed into the database engine will eliminate inefficient data-transfer bottlenecks. Furthermore, traditionally separated database and security layers provide an extra vulnerability, leaving a weak clear-text password authorization as the only protection on the database core systems. Due to the legacy limitations of the systems' security models, the allowed passwords often cannot even comply with the DOE password guideline requirements. We see an opportunity for the tight integration of the secure authorization layer with the database server engine, resulting in both improved performance and improved security. Phase I has focused on the development of a proof-of-concept prototype using Argonne National Laboratory's (ANL) Argonne Tandem Linac Accelerator System (ATLAS) project as a test scenario. By developing a grid-security-enabled version of the ATLAS project's current relational database solution, MySQL, PIOCON Technologies aims to offer a more efficient solution for secure database access.

  9. Commissioning of the cryogenics of the LHC long straight sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perin, A.; Casas-Cubillos, J.; Claudet, S.

    2010-01-01

    The LHC is made of eight circular arcs interleaved with eight Long Straight Sections (LSS). Most powering interfaces to the LHC are located in these sections, where the particle beams are focused and shaped for collision, cleaning and acceleration. The LSSs comprise several unique cryogenic devices and systems such as electrical feed-boxes, standalone superconducting magnets, superconducting links, RF cavities and final-focusing superconducting magnets. This paper presents the cryogenic commissioning and the main results obtained during the first operation of the LHC Long Straight Sections.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Syphers, M. J.; Chattopadhyay, S.

    An overview is provided of the currently envisaged landscape of charged-particle accelerators at the energy and intensity frontiers to explore particle physics beyond the standard model via 1-100 TeV-scale lepton and hadron colliders and multi-megawatt proton accelerators for short- and long-baseline neutrino experiments. The particle beam physics, associated technological challenges and progress to date for these accelerator facilities (LHC, HL-LHC, future 100 TeV p-p colliders, TeV-scale linear and circular electron-positron colliders, the high-intensity proton accelerator complex PIP-II for DUNE and its future upgrade to PIP-III) are outlined. Potential and prospects for advanced "nonlinear dynamic techniques" at the multi-MW-level intensity frontier and advanced "plasma-wakefield-based techniques" at the TeV-scale energy frontier are also described.

  11. Accelerator physics and technology challenges of very high energy hadron colliders

    NASA Astrophysics Data System (ADS)

    Shiltsev, Vladimir D.

    2015-08-01

    High energy hadron colliders have been at the forefront of particle physics for more than three decades. At present, the international particle physics community is considering several options for a 100 TeV proton-proton collider as a possible post-LHC energy frontier facility. The method of colliding beams has not fully exhausted its potential but has slowed down considerably in its progress. This paper briefly reviews the accelerator physics and technology challenges of future very high energy colliders and outlines the areas of research and development required for their technical and financial feasibility.

  12. Accelerator physics and technology challenges of very high energy hadron colliders

    DOE PAGES

    Shiltsev, Vladimir D.

    2015-08-20

    High energy hadron colliders have been at the forefront of particle physics for more than three decades. At present, the international particle physics community is considering several options for a 100 TeV proton–proton collider as a possible post-LHC energy frontier facility. The method of colliding beams has not fully exhausted its potential but has slowed down considerably in its progress. This article briefly reviews the accelerator physics and technology challenges of future very high energy colliders and outlines the areas of research and development required for their technical and financial feasibility.

  13. Conductor Specification and Validation for High-Luminosity LHC Quadrupole Magnets

    DOE PAGES

    Cooley, L. D.; Ghosh, A. K.; Dietderich, D. R.; ...

    2017-06-01

    The High Luminosity Upgrade of the Large Hadron Collider (HL-LHC) at CERN will replace the main ring inner triplet quadrupoles, identified by the acronym MQXF, adjacent to the main ring interaction regions. For the past decade, the U.S. LHC Accelerator R&D Program, LARP, has been evaluating conductors for the MQXFA prototypes, which are the outer magnets of the triplet. Recently, the requirements for MQXF magnets and cables have been published in P. Ferracin et al., IEEE Trans. Appl. Supercond., vol. 26, no. 4, 2016, Art. no. 4000207, along with the final specification for Ti-alloyed Nb3Sn conductor determined jointly by CERN and LARP. This paper describes the rationale behind the 0.85 mm diameter strand's chief parameters, which are 108 or more sub-elements, a copper fraction of not less than 52.4%, a strand critical current at 4.22 K of not less than 631 A at 12 T and 331 A at 15 T, and a residual resistance ratio of not less than 150. This paper also compares the performance of ~100 km production lots of the five most recent LARP conductors to the first 163 km of strand made according to the HL-LHC specification. Two factors emerge as significant for optimizing performance and minimizing risk: a modest increase of the sub-element diameter from 50 to 55 μm, and a Nb:Sn molar ratio of 3.6 instead of 3.4. The statistics acquired so far give confidence that the present conductor can balance competing demands in production for the HL-LHC project.
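    The quoted acceptance thresholds amount to a simple pass/fail screen on each strand lot. A sketch of such a check, with the spec values taken from the abstract and the sample measurement invented for illustration:

```python
# Pass/fail screen against the HL-LHC MQXF strand specification quoted
# in the abstract. Spec thresholds are from the text; the sample
# measurement values are invented for illustration.
SPEC = {
    "min_subelements": 108,
    "min_cu_fraction": 0.524,   # copper fraction not less than 52.4%
    "min_ic_12T_A": 631,        # critical current at 4.22 K, 12 T [A]
    "min_ic_15T_A": 331,        # critical current at 4.22 K, 15 T [A]
    "min_rrr": 150,             # residual resistance ratio
}

def strand_passes(meas):
    return (meas["subelements"] >= SPEC["min_subelements"]
            and meas["cu_fraction"] >= SPEC["min_cu_fraction"]
            and meas["ic_12T_A"] >= SPEC["min_ic_12T_A"]
            and meas["ic_15T_A"] >= SPEC["min_ic_15T_A"]
            and meas["rrr"] >= SPEC["min_rrr"])

sample = {"subelements": 108, "cu_fraction": 0.53,
          "ic_12T_A": 645, "ic_15T_A": 342, "rrr": 168}
print(strand_passes(sample))    # a hypothetical passing billet
```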

  14. [The CERN and the megascience].

    PubMed

    Aguilar Peris, José

    2006-01-01

    In this work we analyse the biggest particle accelerator in the world: the LHC (Large Hadron Collider). The ring-shaped tunnel is 27 km long and is buried over 110 meters underground, straddling the border between France and Switzerland at the CERN laboratory near Geneva. Its mission is to recreate the conditions that existed shortly after the Big Bang and to look for the hypothesised Higgs particle. The LHC will accelerate protons to near the speed of light and collide them head-on at an energy of 14 TeV (1 TeV = 10^12 eV). Keeping such high energy in the proton beams requires enormous magnetic fields, which are generated by superconducting electromagnets chilled to less than two degrees above absolute zero. The LHC is expected to be inaugurated in summer 2007.
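    How close "near the speed of light" is can be made concrete with the Lorentz factor γ = E/m (illustrative arithmetic, not from the paper):

```python
import math

# Lorentz factor and speed of a 7 TeV proton (illustrative arithmetic,
# not taken from the paper).
m_p_gev = 0.938272          # proton rest mass [GeV]
e_gev = 7000.0              # 7 TeV per beam, 14 TeV per collision

gamma = e_gev / m_p_gev
beta = math.sqrt(1.0 - 1.0 / gamma**2)
print(f"gamma = {gamma:.0f}, 1 - beta = {1 - beta:.1e}")
```

    A 7 TeV proton travels at about 3 m/s below the speed of light.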

  15. Accelerators Beyond The Tevatron?

    NASA Astrophysics Data System (ADS)

    Lach, Joseph

    2010-07-01

    Following the successful operation of the Fermilab superconducting accelerator, three new higher-energy accelerators were planned: the UNK in the Soviet Union, the LHC in Europe, and the SSC in the United States. All were expected to start producing physics about 1995. They did not. Why?

  16. Analysis of field errors for LARP Nb 3Sn HQ03 quadrupole magnet

    DOE PAGES

    Wang, Xiaorong; Ambrosio, Giorgio; Chlachidze, Guram; ...

    2016-12-01

    The U.S. LHC Accelerator Research Program, in close collaboration with CERN, has developed three generations of high-gradient quadrupole (HQ) Nb3Sn model magnets to support the development of the 150 mm aperture Nb3Sn quadrupole magnets for the High-Luminosity LHC. The latest generation, HQ03, featured coils with better uniformity of dimensions and properties than the earlier generations. We tested the HQ03 magnet at FNAL, including a field quality study. The profiles of low-order harmonics along the magnet aperture observed at 15 kA, 1.9 K can be traced back to the assembled coil pack before magnet assembly. Based on the measured harmonics in the magnet center region, the coil block positioning tolerance was analyzed and compared with the earlier HQ01 and HQ02 magnets to correlate with coil and magnet fabrication. To study the capability of correcting low-order non-allowed field errors, magnetic shims were installed in HQ03. The measured shim contribution agreed well with the calculation. For the persistent-current effect, the measured a4 can be related to a 4% higher strand magnetization in one coil with respect to the other three coils. Finally, we compare the field errors due to inter-strand coupling currents between HQ03 and HQ02.
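    Harmonics such as a4 refer to the standard multipole expansion of the transverse field, B_y + iB_x = B_ref · 10⁻⁴ · Σₙ (bₙ + i·aₙ)(z/R_ref)ⁿ⁻¹. A sketch of evaluating the field from a set of coefficients, with invented values rather than HQ03 measurements:

```python
# Field reconstruction from multipole coefficients, using the standard
# convention B_y + i*B_x = B_ref * 1e-4 * sum_n (b_n + i*a_n) *
# (z / R_ref)**(n - 1). Coefficient values are invented for
# illustration, not HQ03 measurements.
def field_at(z, b, a, B_ref=1.0, R_ref=0.05):
    """z: complex position [m]; b, a: {n: value in units of 1e-4}."""
    total = 0j
    for n in sorted(set(b) | set(a)):
        total += (b.get(n, 0.0) + 1j * a.get(n, 0.0)) * (z / R_ref) ** (n - 1)
    return B_ref * 1e-4 * total          # B_y + i*B_x in tesla

# A pure quadrupole (b2 = 1e4, i.e. the main field) plus a small a4 error:
b, a = {2: 1.0e4}, {4: 2.0}
Bq = field_at(0.025 + 0j, b, a)          # 25 mm off axis
print(f"B_y = {Bq.real:.4f} T, B_x = {Bq.imag:.2e} T")
```

    A "non-allowed" harmonic like a4 is one forbidden by the ideal quadrupole symmetry, so its presence points to fabrication asymmetries such as the coil-magnetization imbalance mentioned above.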

  17. The Ultimate Monte Carlo: Studying Cross-Sections With Cosmic Rays

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.

    2007-01-01

    The high-energy physics community has been discussing for years the need to bring together the three principal disciplines that study hadron cross-section physics: ground-based accelerators, cosmic-ray experiments in space, and air shower research. Only recently have NASA investigators begun discussing the use of space-borne cosmic-ray payloads to bridge the gap between accelerator physics and air shower work using cosmic-ray measurements. The common tool used in these three realms of high-energy hadron physics is the Monte Carlo (MC). Yet the obvious has not been considered: using a single MC for simulating the entire relativistic energy range (GeV to EeV). The task is daunting due to large uncertainties in accelerator, space, and atmospheric cascade measurements. These include inclusive versus exclusive cross-section measurements, primary composition, interaction dynamics, and possible new physics beyond the standard model. However, the discussion of a common tool or ultimate MC might be the very thing that could begin to unify these independent groups into a common purpose. The Offline ALICE concept of a Virtual MC at CERN's Large Hadron Collider (LHC) will be discussed as a rudimentary beginning of this idea, and as a possible forum for carrying it forward in the future as LHC data emerges.

  18. Opening

    ScienceCinema

    Bruning, Oliver

    2018-05-23

    Overview of the operation and upgrade plans for the machine, upgrade studies and task forces. The Chamonix 2010 discussions led to new task forces: planning for a long shutdown in 2012 for splice consolidation; long-term consolidation planning for the injector complex; an SPS upgrade task force (accelerated program for the SPS upgrade); the PSB upgrade and its implications for the PS (e.g. radiation); the LHC High Luminosity project (investigating planning for one upgrade by 2018-2020); and the launch of a dedicated study on doubling the beam energy in the LHC (HE-LHC).

  19. Challenges and Plans for the Proton Injectors

    NASA Astrophysics Data System (ADS)

    Garoby, R.

    The flexibility of the LHC injectors, combined with multiple longitudinal beam gymnastics, has significantly contributed to the excellent performance of the LHC during its first run, delivering beam with twice the ultimate brightness at 50 ns bunch spacing. To meet the requirements of the High Luminosity LHC, 25 ns bunch spacing is required, the intensity per bunch at injection has to double, and the brightness must almost triple. Extensive hardware modifications or additions are therefore necessary in all accelerators of the injector complex, as well as new beam gymnastics.

  20. U.S. Involvement in the LHC

    DOE PAGES

    Green, Dan

    2016-12-14

    The demise of the SSC in the U.S. created an upheaval in the U.S. high energy physics (HEP) community. Here, the subsequent redirection of HEP efforts to the CERN Large Hadron Collider (LHC) can perhaps be seen as informing on possible future paths for worldwide collaboration on future HEP megaprojects.

  1. Machine Protection with a 700 MJ Beam

    NASA Astrophysics Data System (ADS)

    Baer, T.; Schmidt, R.; Wenninger, J.; Wollmann, D.; Zerlauth, M.

    After the high-luminosity upgrade of the LHC, the stored energy per proton beam will increase by a factor of two compared to the nominal LHC. Therefore, many damage studies need to be revisited to ensure safe machine operation with the new beam parameters. Furthermore, new accelerator equipment like crab cavities might cause new failure modes which are not sufficiently covered by the current machine protection system of the LHC. These failure modes have to be carefully studied and mitigated by new protection systems. Finally, the ambitious goals for integrated luminosity delivered to the experiments during the HL-LHC era require an increase in machine availability without jeopardizing equipment protection.
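    The 700 MJ figure is roughly the product of bunch count, bunch population and energy per proton. A back-of-the-envelope check with assumed HL-LHC-like parameters:

```python
# Stored energy per beam = bunches * protons/bunch * energy/proton.
# Bunch parameters below are assumed HL-LHC-like values, for
# illustration only.
e_charge = 1.602176634e-19      # J per eV
n_bunches = 2748
protons_per_bunch = 2.2e11
energy_per_proton_ev = 7.0e12   # 7 TeV

stored_j = n_bunches * protons_per_bunch * energy_per_proton_ev * e_charge
print(f"stored energy per beam ≈ {stored_j / 1e6:.0f} MJ")
```

    A few hundred megajoules is comparable to the kinetic energy of a high-speed train, which is why uncontrolled beam loss is a machine-protection concern.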

  2. Assembly Tests of the First Nb 3 Sn Low-Beta Quadrupole Short Model for the Hi-Lumi LHC

    DOE PAGES

    Pan, H.; Felice, H.; Cheng, D. W.; ...

    2016-01-18

    In preparation for the high-luminosity upgrade of the Large Hadron Collider (LHC), the LHC Accelerator Research Program (LARP) in collaboration with CERN is pursuing the development of MQXF: a 150-mm-aperture high-field Nb3Sn quadrupole magnet. The development phase starts with the fabrication and test of several short models (1.2-m magnetic length) and will continue with the development of several long prototypes. All of them are mechanically supported using a shell-based support structure, which has been extensively demonstrated on several R&D models within LARP. The first short model, MQXFS-AT, has been assembled at LBNL with coils fabricated by LARP and CERN. In this paper, we summarize the assembly process and show how it relies strongly on experience acquired during the LARP 120-mm-aperture HQ magnet series. We also present a comparison between strain-gauge data and finite-element model analysis. Finally, we present the implications of the MQXFS-AT experience for the design of the long-prototype support structure.

  3. The High-Luminosity upgrade of the LHC: Physics and Technology Challenges for the Accelerator and the Experiments

    NASA Astrophysics Data System (ADS)

    Schmidt, Burkhard

    2016-04-01

    In the second phase of the LHC physics program, the accelerator will provide an additional integrated luminosity of about 2500/fb over 10 years of operation to the general purpose detectors ATLAS and CMS. This will substantially enlarge the mass reach in the search for new particles and will also greatly extend the potential to study the properties of the Higgs boson discovered at the LHC in 2012. In order to meet the experimental challenges of unprecedented pp luminosity, the experiments will need to address the aging of the present detectors and to improve the ability to isolate and precisely measure the products of the most interesting collisions. The lectures gave an overview of the physics motivation and described the conceptual designs and the expected performance of the upgrades of the four major experiments, ALICE, ATLAS, CMS and LHCb, along with the plans to develop the appropriate experimental techniques and a brief overview of the accelerator upgrade. Only some key points of the upgrade program of the four major experiments are discussed in this report; more information can be found in the references given at the end.

  5. The Standard Model from LHC to future colliders.

    PubMed

    Forte, S; Nisati, A; Passarino, G; Tenchini, R; Calame, C M Carloni; Chiesa, M; Cobal, M; Corcella, G; Degrassi, G; Ferrera, G; Magnea, L; Maltoni, F; Montagna, G; Nason, P; Nicrosini, O; Oleari, C; Piccinini, F; Riva, F; Vicini, A

    This review summarizes the results of the activities which took place in 2014 within the Standard Model Working Group of the "What Next" Workshop organized by INFN, Italy. We present a framework, general questions, and some indications of possible answers on the main issues for Standard Model physics in the LHC era and in view of possible future accelerators.

  6. Air liquide 1.8 K refrigeration units for CERN LHC project

    NASA Astrophysics Data System (ADS)

    Hilbert, Benoît; Gistau-Baguer, Guy M.; Caillaud, Aurélie

    2002-05-01

    The Large Hadron Collider (LHC) will be CERN's next research instrument for high energy physics. This 27 km long circular accelerator will make intensive use of superconducting magnets operated below 2.0 K. It will thus require high-capacity refrigeration below 2.0 K [1, 2]. Coupled to a refrigerator providing 18 kW equivalent at 4.5 K [3], these systems will be able to absorb a cryogenic power of 2.4 kW at 1.8 K in nominal conditions. Air Liquide has designed one pre-series Cold Compressor System (CCS) for CERN, preceding three more (among eight in total located around the machine). These systems, making use of cryogenic centrifugal compressors in a series arrangement coupled to room-temperature screw compressors, are presented, and key component characteristics are given.

  7. R&D for the Future

    NASA Astrophysics Data System (ADS)

    Hübner, Kurt; Treille, Daniel; Schulte, Daniel

    The following sections are included: * The LHC and Beyond * Accelerator Magnets with Ever-Higher Fields * Teasing Performance from Superconductors Old and New * RF Power for CLIC: Acceleration by Deceleration * The Next Energy Frontier e+e- Collider: Innovation in Detectors * Hadron Collider Detectors: A Bright and Energetic Future * References

  8. Machine and radiation protection challenges of high energy/intensity accelerators: the role of Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Cerutti, F.

    2017-09-01

    The role of Monte Carlo calculations in addressing machine protection and radiation protection challenges regarding accelerator design and operation is discussed, through an overview of different applications and validation examples especially referring to recent LHC measurements.

  9. Exergy Analysis of the Cryogenic Helium Distribution System for the Large Hadron Collider (LHC)

    NASA Astrophysics Data System (ADS)

    Claudet, S.; Lebrun, Ph.; Tavian, L.; Wagner, U.

    2010-04-01

    The Large Hadron Collider (LHC) at CERN features the world's largest helium cryogenic system, spreading over the 26.7 km circumference of the superconducting accelerator. With a total equivalent capacity of 145 kW at 4.5 K, including 18 kW at 1.8 K, the LHC refrigerators produce an unprecedented exergetic load, which must be distributed efficiently to the magnets in the tunnel over the 3.3 km length of each of the eight independent sectors of the machine. We recall the main features of the LHC cryogenic helium distribution system at its different temperature levels and present its exergy analysis, enabling us to quantify its second-law (exergetic) efficiency and to identify the main remaining sources of irreversibility.
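
    The scale of such an exergetic load can be sketched with the Carnot relation for refrigeration, Ex = Q·(T0/T − 1). The 300 K reference temperature and the lumping of the whole quoted capacity at 4.5 K are simplifying assumptions for illustration, not figures from the paper.

```python
def exergy_of_cooling(q_watts: float, t_cold_k: float, t_ambient_k: float = 300.0) -> float:
    """Minimum (Carnot) input power required to extract q_watts at t_cold_k."""
    return q_watts * (t_ambient_k / t_cold_k - 1.0)

# Treating the quoted 145 kW equivalent capacity as if it were all delivered
# at 4.5 K (a simplification) gives the order of magnitude of the ideal input:
ideal_input = exergy_of_cooling(145e3, 4.5)
print(f"ideal input power ~ {ideal_input / 1e6:.1f} MW")  # ~9.5 MW
```

    Real refrigerators reach only a fraction of Carnot efficiency, so the actual electrical input is several times this ideal figure; quantifying that gap is what the paper's second-law analysis does.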

  10. Analysis of the SPS Long Term Orbit Drifts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velotti, Francesco; Bracco, Chiara; Cornelis, Karel

    2016-06-01

    The Super Proton Synchrotron (SPS) is the last accelerator in the Large Hadron Collider (LHC) injector chain, and has to deliver the two high-intensity 450 GeV proton beams to the LHC. The transport from SPS to LHC is done through the two Transfer Lines (TL), TI2 and TI8, for Beam 1 (B1) and Beam 2 (B2) respectively. During the first LHC operation period, Run 1, a long term drift of the SPS orbit was observed, causing changes in the LHC injection due to the resulting changes in the TL trajectories. This translated into a longer LHC turnaround because of the necessity to periodically correct the TL trajectories in order to preserve the beam quality at injection into the LHC. Different sources for the SPS orbit drifts have been investigated: each of them can account only partially for the total orbit drift observed. In this paper, the possible sources of such drift are described, together with the simulated and measured effects they cause. Possible solutions and countermeasures are also discussed.

  11. A New Understanding of the Heat Treatment of Nb3Sn Superconducting Wires

    NASA Astrophysics Data System (ADS)

    Sanabria, Charlie

    Enhancing the beam energy of particle accelerators like the Large Hadron Collider (LHC) at CERN can increase our probability of finding new fundamental particles of matter beyond those predicted by the Standard Model. Such discoveries could improve our understanding of the birth and evolution of the universe and of long-unresolved mysteries of matter such as dark matter and dark energy. This is a very exciting field of research, and a worldwide collaboration of universities, laboratories, and industry is therefore attempting to increase the beam energy of the LHC. One of the most challenging requirements for an energy increase is the production of a magnetic field homogeneous enough and strong enough to bend the high energy particle beam and keep it inside the accelerating ring. In the current LHC design, these beam-bending magnets are made of NbTi superconductors, reaching peak fields of 8 T. In order to move to higher fields, however, future magnets will have to use different and more advanced superconducting materials. Among the most viable superconductor wire technologies for future particle accelerator magnets is Nb3Sn, a technology that has been used in high field magnets for many decades. Nb3Sn magnet fabrication faces an important challenge, however: the wire fabrication and the coil winding must be done using ductile metallic components (Nb, Sn, and Cu) before the superconducting compound (Nb3Sn) is formed inside the wires through a heat treatment. The studies presented in this thesis have found that the heat treatment schedule used on the most advanced Nb3Sn wire technology (Restacked Rod Process, or RRP®, wires) can still undergo significant improvements. These improvements have already led to a 28% increase in the figure of merit of these wires, the critical current density.

  12. LHC: The Large Hadron Collider

    ScienceCinema

    Lincoln, Don

    2018-01-16

    The Large Hadron Collider (or LHC) is the world's most powerful particle accelerator. In 2012, scientists used data taken by it to discover the Higgs boson, before pausing operations for upgrades and improvements. In the spring of 2015, the LHC will return to operations with 163% of the energy it had before and with three times as many collisions per second. It's essentially a new and improved version of itself. In this video, Fermilab's Dr. Don Lincoln explains some of the absolutely amazing scientific and engineering properties of this modern scientific wonder.

  13. Induced activation studies for the LHC upgrade to High Luminosity LHC

    NASA Astrophysics Data System (ADS)

    Adorisio, C.; Roesler, S.

    2018-06-01

    The Large Hadron Collider (LHC) will be upgraded in 2019/2020 to increase its luminosity (rate of collisions) by a factor of five beyond its design value, and the integrated luminosity by a factor of ten, in order to maintain scientific progress and exploit its full capacity. The new machine configuration, called High Luminosity LHC (HL-LHC), will consequently increase the activation level of its components. The evaluation of the radiological impact of HL-LHC operation in the Long Straight Sections of Insertion Region 1 (ATLAS) and Insertion Region 5 (CMS) is presented. Using the Monte Carlo code FLUKA, ambient dose equivalent rates have been estimated on the basis of two announced operating scenarios and the latest available machine layout. The HL-LHC project requires new technical infrastructure, with caverns and 300 m long tunnels along Insertion Regions 1 and 5. The new underground service galleries will be accessible during operation of the accelerator. The radiological risk assessment for the civil engineering work foreseen to start excavating the new galleries in the next LHC Long Shutdown, together with the radiological impact of machine operation, will be discussed.

  14. Superconducting Magnet Technology for Future High Energy Proton Colliders

    NASA Astrophysics Data System (ADS)

    Gourlay, Stephen

    2017-01-01

    Interest in high field dipoles has been given a boost by new proposals to build a high-energy proton-proton collider to follow the LHC, and programs around the world are taking on the task of answering the need. Studies aiming toward future high-energy proton-proton colliders at the 100 TeV scale are now being organized. The LHC and current cost models are based on technology close to four decades old and point to a broad optimum of operation using dipoles with fields between 5 and 12 T when site constraints, either geographical or political, are not a factor. Site geography constraints that limit the ring circumference can drive the required dipole field up to 20 T, more than a factor of two beyond the state of the art. After a brief review of current progress, the talk describes the challenges facing future development and presents a roadmap for moving high field accelerator magnet technology forward. This work was supported by the Director, Office of Science, High Energy Physics, US Department of Energy, under contract No. DE-AC02-05CH11231.
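
    The coupling between ring size and dipole field follows from the beam-rigidity relation E ≈ 0.3·B·ρ (beam energy E in TeV, dipole field B in tesla, bending radius ρ in km). The bending radii below are illustrative round figures, not values from the talk.

```python
def dipole_field_tesla(beam_energy_tev: float, bend_radius_km: float) -> float:
    """Dipole field from the beam-rigidity relation E [TeV] ~ 0.3 * B [T] * rho [km]."""
    return beam_energy_tev / (0.3 * bend_radius_km)

# Sanity check against the LHC: 7 TeV per beam on a ~2.8 km bending radius -> ~8.3 T
lhc_field = dipole_field_tesla(7.0, 2.8)

# A 50 TeV beam (100 TeV collisions) on a ~10.4 km bending radius,
# roughly what a 100 km-class tunnel allows, needs ~16 T dipoles.
fcc_like_field = dipole_field_tesla(50.0, 10.4)
print(lhc_field, fcc_like_field)
```

    This is why a geographically constrained circumference pushes the field toward 20 T: with ρ fixed, the required B scales linearly with beam energy.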

  15. Design approach for the development of a cryomodule for compact crab cavities for Hi-Lumi LHC

    NASA Astrophysics Data System (ADS)

    Pattalwar, Shrikant; Jones, Thomas; Templeton, Niklas; Goudket, Philippe; McIntosh, Peter; Wheelhouse, Alan; Burt, Graeme; Hall, Ben; Wright, Loren; Peterson, Tom

    2014-01-01

    A prototype Superconducting RF (SRF) cryomodule, comprising multiple compact crab cavities, is foreseen to realise a local crab crossing scheme for the "Hi-Lumi LHC", a project launched by CERN to increase the luminosity performance of the LHC. A cryomodule with two cavities will initially be installed and tested on the SPS accelerator at CERN to evaluate performance with high-intensity proton beams. A series of boundary conditions influence the design of the cryomodule prototype, arising from the complexity of the cavity design, the requirement for multiple RF couplers, the close proximity to the second LHC beam pipe, and the tight space constraints in the SPS and LHC tunnels. As a result, the design of the helium vessel and the cryomodule has become extremely challenging. This paper assesses some of the critical cryogenic and engineering design requirements and describes an optimised cryomodule solution for the evaluation tests on the SPS.

  16. Cryogenic studies for the proposed CERN large hadron electron collider (LHEC)

    NASA Astrophysics Data System (ADS)

    Haug, F.; LHeC Study Team, The

    2012-06-01

    The LHeC (Large Hadron electron Collider) is a proposed future colliding beam facility for lepton-nucleon scattering particle physics at CERN. A new 60 GeV electron accelerator will be added to the existing 27 km circumference 7 TeV LHC for collisions of electrons with protons and heavy ions. Two basic design options are being pursued. The first, referred to as the "Ring-Ring" version, is a circular accelerator housed in the existing LHC tunnel. Low-field normal-conducting magnets guide the particle beam, while superconducting (SC) RF cavities cooled to 2 K, installed at two opposite locations in the LHC tunnel, accelerate the beams. For this version, a 10 GeV re-circulating SC injector will also be installed. In total, four refrigerators with cooling capacities between 1.2 kW and 3 kW at 4.5 K are needed. The second option, referred to as the "Linac-Ring" version, consists of a race-track re-circulating energy-recovery machine with two 1 km long straight acceleration sections. Its 944 high-field 2 K SC cavities dissipate 30 kW in CW operation; eight 10 kW at 4.5 K refrigerators are proposed. The particle detector contains a combined SC solenoid and dipole forming the cold mass, and an independent liquid argon calorimeter. Cooling is done with two individual small cryoplants: a 4.5 K helium plant and an 87 K liquid nitrogen plant.

  17. Big data analytics for the Future Circular Collider reliability and availability studies

    NASA Astrophysics Data System (ADS)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.
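
    The reliability distribution functions mentioned above ultimately feed availability estimates built from per-subsystem failure and repair statistics. A minimal sketch of that arithmetic, with purely illustrative MTBF/MTTR numbers (not data from the study):

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability from mean time between failures and mean time to repair."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def chain_availability(subsystem_availabilities) -> float:
    """An injector chain feeding a collider is up only when every link is up."""
    total = 1.0
    for a in subsystem_availabilities:
        total *= a
    return total

# Illustrative numbers only: three links with 95%, 98% and 99% availability
links = [availability(190, 10), availability(490, 10), availability(990, 10)]
print(f"chain availability ~ {chain_availability(links):.3f}")
```

    Because availabilities multiply along the chain, even modestly unreliable links dominate the overall figure, which is why the study needs high-quality monitoring data for every subsystem, not just the collider itself.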

  18. LINEAR COLLIDER PHYSICS RESOURCE BOOK FOR SNOWMASS 2001.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ABE,T.; DAWSON,S.; HEINEMEYER,S.

    The American particle physics community can look forward to a well-conceived and vital program of experimentation for the next ten years, using both colliders and fixed target beams to study a wide variety of pressing questions. Beyond 2010, these programs will be reaching the end of their expected lives. The CERN LHC will provide an experimental program of the first importance. But beyond the LHC, the American community needs a coherent plan. The Snowmass 2001 Workshop and the deliberations of the HEPAP subpanel offer a rare opportunity to engage the full community in planning our future for the next decade or more. A major accelerator project requires a decade from the beginning of an engineering design to the receipt of the first data. So it is now time to decide whether to begin a new accelerator project that will operate in the years soon after 2010. We believe that the world high-energy physics community needs such a project. With the great promise of discovery in physics at the next energy scale, and with the opportunity for the uncovering of profound insights, we cannot allow our field to contract to a single experimental program at a single laboratory in the world. We believe that an e+e- linear collider is an excellent choice for the next major project in high-energy physics. Applying experimental techniques very different from those used at hadron colliders, an e+e- linear collider will allow us to build on the discoveries made at the Tevatron and the LHC, and to add a level of precision and clarity that will be necessary to understand the physics of the next energy scale. It is not necessary to anticipate specific results from the hadron collider programs to argue for constructing an e+e- linear collider; in any scenario that is now discussed, physics will benefit from the new information that e+e- experiments can provide.

  19. Linear Collider Physics Resource Book for Snowmass 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peskin, Michael E

    The American particle physics community can look forward to a well-conceived and vital program of experimentation for the next ten years, using both colliders and fixed target beams to study a wide variety of pressing questions. Beyond 2010, these programs will be reaching the end of their expected lives. The CERN LHC will provide an experimental program of the first importance. But beyond the LHC, the American community needs a coherent plan. The Snowmass 2001 Workshop and the deliberations of the HEPAP subpanel offer a rare opportunity to engage the full community in planning our future for the next decade or more. A major accelerator project requires a decade from the beginning of an engineering design to the receipt of the first data. So it is now time to decide whether to begin a new accelerator project that will operate in the years soon after 2010. We believe that the world high-energy physics community needs such a project. With the great promise of discovery in physics at the next energy scale, and with the opportunity for the uncovering of profound insights, we cannot allow our field to contract to a single experimental program at a single laboratory in the world. We believe that an e+e- linear collider is an excellent choice for the next major project in high-energy physics. Applying experimental techniques very different from those used at hadron colliders, an e+e- linear collider will allow us to build on the discoveries made at the Tevatron and the LHC, and to add a level of precision and clarity that will be necessary to understand the physics of the next energy scale. It is not necessary to anticipate specific results from the hadron collider programs to argue for constructing an e+e- linear collider; in any scenario that is now discussed, physics will benefit from the new information that e+e- experiments can provide.

  20. Commissioning the cryogenic system of the first LHC sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Millet, F.; Claudet, S.; Ferlin, G.

    2007-12-01

    The LHC machine, composed of eight sectors with superconducting magnets and accelerating cavities, requires a complex cryogenic system providing high cooling capacity (18 kW equivalent at 4.5 K and 2.4 kW at 1.8 K per sector, produced in large cold boxes and distributed via 3.3 km cryogenic transfer lines). After individual reception tests of the cryogenic subsystems (cryogen storage, refrigerators, cryogenic transfer lines and distribution boxes) performed since 2000, the commissioning of the cryogenic system of the first LHC sector has been under way since November 2006. After a brief introduction to the LHC cryogenic system and its specificities, the commissioning is reported, detailing the preparation phase (pressure and leak tests, circuit conditioning and flushing), the cool-down sequences including the handling of cryogenic fluids, the magnet powering phase and finally the warm-up. Preliminary conclusions on the commissioning of the first LHC sector are drawn, with a review of the critical points already solved or still pending. The last part of the paper reports on the first operational experience of the LHC cryogenic system in the perspective of the commissioning of the remaining LHC sectors and the beam injection test.

  1. Fermilab | Science | Particle Accelerators | LHC and Future Accelerators

    Science.gov Websites


  2. Properties of the superconductor in accelerator dipole magnets

    NASA Astrophysics Data System (ADS)

    Teravest, Derk

    Several aspects of the application of superconductors to high field dipole magnets for particle accelerators are discussed. The attention is focused on the 10 tesla (1 m model) magnet envisaged for the future Large Hadron Collider (LHC). The basic motivation behind the study is the intention of employing superconductors to their utmost performance. An overview of practical superconductors, their applications and their impact on high field dipole magnets for particle accelerators is presented, and the LHC reference design for the dipole magnets is outlined. Several models were used to study the influence of a number of factors on the shape of the characteristic of a multifilamentary wire, in particular the deviation due to the flux-flow state; the investigated extrinsic and intrinsic factors can be classified with respect to their effect on that shape. The optimization of the coil structure of high field dipole magnets with respect to field quality is described, and an analytical model for solid and hollow filaments, to calculate the effect of filament magnetization on the quality of the dipole field, is presented.
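
    The filament-magnetization effect at the end of the abstract is commonly estimated with the critical-state result for a round filament in a transverse field, ΔM = (4/3π)·Jc·d. The Jc and filament-diameter values below are illustrative assumptions, not numbers from the thesis.

```python
import math

def magnetization_width(jc_a_per_m2: float, d_filament_m: float) -> float:
    """Critical-state magnetization loop width for a round filament in a
    transverse field: delta_M = (4 / (3*pi)) * Jc * d, in A/m."""
    return (4.0 / (3.0 * math.pi)) * jc_a_per_m2 * d_filament_m

# Illustrative (assumed) NbTi numbers: Jc ~ 5e9 A/m^2 at low field, 6 um filaments
delta_m = magnetization_width(5e9, 6e-6)
print(f"magnetization loop width ~ {delta_m:.0f} A/m")
```

    Since ΔM scales linearly with filament diameter, fine filaments are the standard lever for reducing persistent-current distortions of the dipole field at injection.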

  3. Upgrade of the beam extraction system of the GTS-LHC electron cyclotron resonance ion source at CERN.

    PubMed

    Toivanen, V; Bellodi, G; Dimov, V; Küchler, D; Lombardi, A M; Maintrot, M

    2016-02-01

    Linac3 is the first accelerator in the heavy ion injector chain of the Large Hadron Collider (LHC), providing multiply charged heavy ion beams for the CERN experimental program. The ion beams are produced with GTS-LHC, a 14.5 GHz electron cyclotron resonance ion source, operated in afterglow mode. Improvement of the GTS-LHC beam formation and beam transport along Linac3 is part of the upgrade program of the injector chain in preparation for the future high luminosity LHC. A mismatch between the ion beam properties in the ion source extraction region and the acceptance of the following Low Energy Beam Transport (LEBT) section has been identified as one of the factors limiting the Linac3 performance. The installation of a new focusing element, an einzel lens, into the GTS-LHC extraction region is foreseen as a part of the Linac3 upgrade, as well as a redesign of the first section of the LEBT. Details of the upgrade and results of a beam dynamics study of the extraction region and LEBT modifications will be presented.

  4. Test of the wire ageing induced by radiation for the CMS barrel muon chambers

    NASA Astrophysics Data System (ADS)

    Conti, E.; Gasparini, F.

    2001-06-01

    We have carried out laboratory tests to measure the ageing of a wire tube due to pollutants outgassed by various materials. The tested materials are those used in the barrel muon drift tubes of the CMS experiment at LHC. An X-ray gun irradiated the test tube to accelerate the ageing process. No ageing effect has been measured for a period equivalent to 10 years of operation at LHC.

  5. Issues Using the Life History Calendar in Disability Research

    PubMed Central

    Scott, Tiffany N.; Harrison, Tracie

    2011-01-01

    Background Overall, there is a dearth of research reporting mixed-method data collection procedures using the LHC within disability research. Objective This report provides practical knowledge on use of the life history calendar (LHC) from the perspective of a mixed-method life history study of mobility impairment situated within a qualitative paradigm. Methods In this paper, the method-related literature on the LHC was reviewed along with its epistemological underpinnings. Further, the uses of the LHC in disability research were illustrated using preliminary data from reports of disablement in Mexican American and Non-Hispanic White women with permanent mobility impairment. Results From our perspective, the LHC was most useful when approached from an interpretive paradigm when gathering data from women of varied ethnic and socioeconomic strata. While we found the LHC the most useful tool currently available for studying disablement over the life course, there were challenges associated with its use. The LHC required extensive interviewer training. In addition, large segments of time were needed for completion, depending on the type of participant responses. Conclusions Researchers planning to conduct a disability study may find our experience using the LHC valuable for anticipating issues that may arise when the LHC is used in mixed-method research. PMID:22014674

  6. Cryogenic test facility instrumentation with fiber optic and fiber optic sensors for testing superconducting accelerator magnets

    NASA Astrophysics Data System (ADS)

    Chiuchiolo, A.; Bajas, H.; Bajko, M.; Castaldo, B.; Consales, M.; Cusano, A.; Giordano, M.; Giloux, C.; Perez, J. C.; Sansone, L.; Viret, P.

    2017-12-01

    The magnets for the next steps in accelerator physics, such as the High Luminosity upgrade of the LHC (HL-LHC) and the Future Circular Collider (FCC), require the development of new technologies for manufacturing and monitoring. To meet the new HL-LHC requirements, a large upgrade of the CERN SM18 cryogenic test facilities is ongoing, with the implementation of new cryostats and cryogenic instrumentation. The paper deals with the advances in the development and the calibration of fiber optic sensors in the range 300-4 K using a dedicated closed-cycle refrigerator system composed of a pulse tube and a cryogen-free cryostat. The calibrated fiber optic sensors (FOS) have been installed in three vertical cryostats used for testing superconducting magnets down to 1.9 K or 4.2 K and in the variable temperature test bench (100-4.2 K). Some examples of FOS measurements of cryostat temperature evolution are presented, as well as measurements of strain performed on a subscale High Temperature Superconducting magnet during its powering tests.

  7. UPR/Mayaguez High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    López, Angel M.

    2015-10-27

    For the period of sixteen years covered by this report (June 1, 1997 - July 31, 2013) the High Energy Physics Group at the University of Puerto Rico's Mayaguez Campus (UPRM) carried out an extensive research program that included major experiments at Fermi National Accelerator Laboratory (Fermilab), the Cornell Electron-positron Collider and CERN. In particular, these were E831 (FOCUS) at Fermilab, CLEOc at Cornell and the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) at CERN. The group's history is one of successful execution and growth. Beginning with one faculty researcher in 1985, it eventually included four faculty researchers, one post-doctoral research associate, two undergraduates and as many as six graduate students at one time working on one of the experiments that discovered the Higgs boson. Some of this expansion was due to the group's leveraging of funds from the Department of Energy's core grant to attract funds from National Science Foundation programs not targeted to high energy physics. Besides the group's research productivity, its other major contribution was the training of a large number of MS students who later went on to successful technical careers in industry as well as academia, including many who obtained PhD degrees at US universities. In an attempt to document this history, this final report gives a general description of the Group's work prior to June 1, 2010, the starting date for the last grant renewal period. Much more detail can, of course, be found in the annual reports submitted up to that date. The work during the last grant period is discussed in detail in a separate section. To summarize the group's scientific accomplishments, one can point to the results of the experiments. Both FOCUS and CLEOc were designed to carry out precise measurements of processes involving the heavy quarks, charm and bottom.
    Heavy quarks are particularly interesting because, due to their mass, theoretical calculations based on the Standard Model have less uncertainty than those for the light quarks. Precise heavy quark experiments can therefore yield some of the best tests of the Standard Model and of the approximations that are made in calculating measurable observables. Both FOCUS and CLEOc were highly successful, achieving significant improvement in the precision of measurements such as lifetimes and decay branching ratios. For example, FOCUS obtained a data sample that contained ten times as many heavy quark decay events as its predecessor. CMS was a big shift in the group's research. During the first decade of the century it became clear that the LHC would be the world's highest energy accelerator, offering a unique opportunity for discovery. Given the UPRM group's record of achievement, it was successful in obtaining admission to the CMS collaboration in March 2006, becoming the first institution to do so that did not have a PhD program. CMS is one of two major experiments at the LHC. Although the plans are for these experiments to run for many years with increased energy and event rates, they have already achieved one of their principal goals: the test for the existence of the Higgs boson, a particle which plays a unique role in the Standard Model but had not been observed, was answered in the affirmative in 2012. The particular contributions of the UPRM group to these experiments make up the majority of this report, although other contributions such as the training of students, outreach to the general community and the organization of scientific meetings are also discussed.

  8. The United States Particle Accelerator School: Educating the Next Generation of Accelerator Scientists and Engineers

    NASA Astrophysics Data System (ADS)

    Barletta, William A.

    2009-03-01

    Only a handful of universities in the US offer any formal training in accelerator science. The United States Particle Accelerator School (USPAS) is a national graduate educational program that has developed a highly successful educational paradigm and, over the past twenty years, has granted more university credit in accelerator/beam science and technology than any university in the world. Sessions are held twice annually, hosted by major US research universities that approve the courses, certify the USPAS faculty, and grant course credit. The USPAS paradigm is readily extensible to other rapidly developing, cross-disciplinary research areas such as high energy density physics.

  9. EDITORIAL: Metrological Aspects of Accelerator Technology and High Energy Physics Experiments

    NASA Astrophysics Data System (ADS)

    Romaniuk, Ryszard S.; Pozniak, Krzysztof T.

    2007-08-01

    The subject of this special feature in Measurement Science and Technology concerns measurement methods, devices and subsystems, in both their hardware and software aspects, applied in large experiments of high energy physics (HEP) and in superconducting RF accelerator technology (SRF). These experiments concern mainly the physics of elementary particles or the building of new machines and detectors. The papers present practical examples of applied solutions in large, contemporary, international research projects such as HERA, LHC, FLASH, XFEL, ILC and others. These machines are unique in their global scale and consist of highly specialized apparatus. The apparatus is characterized by very large dimensions, a considerable use of resources and a high level of overall technical complexity. These systems possess a large number of measurement channels (ranging from thousands to over 100 million), are characterized by fast processing of measured data and high measurement accuracy, and work in quite adverse environments. The measurement channels cooperate with a large number of different sensors: of momenta, energies and trajectories of elementary particles, of electron, proton and photon beam profiles, of accelerating fields in resonant cavities, and many others. The provision of high quality measurement systems requires the designers to use only the most up-to-date technical solutions, measurement technologies, components and devices. Research work in these demanding fields is a natural birthplace of new measurement methods, new data processing and acquisition algorithms, and complex, networked measurement system diagnostics and monitoring. These developments are taking place in both the hardware and software layers. The chief intention of this special feature is that the papers represent equally some of the most current metrology research problems in HEP and SRF.
    The accepted papers have been divided into four topical groups: superconducting cavities (4 papers), low level RF systems (8 papers), ionizing radiation (5 papers) and HEP experiments (8 papers). The editors would like to thank cordially all the authors who accepted our invitation to present their very recent results. A number of authors of the papers in this issue are active in the 6th European Framework Research Program projects CARE (Coordinated Accelerator Research in Europe) and ELAN (the European Linear Accelerator Network). Some authors are active in research programs of global extent such as the LHC, ILC and GDE, the Global Design Effort for the International Linear Collider. We would also like to thank personally, as well as on behalf of all the authors, the Editorial Board of Measurement Science and Technology for accepting this very exciting field of contemporary metrology. This field really seems to be a birthplace of a host of new metrological technologies, where the driving force is the incredibly high technical requirements that must soon be fulfilled if we dream of building new accelerators for elementary particles, new biological materials and medicine alike. Special thanks are due to Professor R S Jachowicz of Warsaw University of Technology for initiating this issue and for continuous support and advice during our work.

  10. Measurement of Beam Tunes in the Tevatron Using the BBQ System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edstrom, Dean R. (Indiana U.)

    Measuring the betatron tunes in a synchrotron is of critical importance to ensuring the stability of the beam. The Base Band Tune (BBQ) measurement system was developed by Marek Gasior of CERN and has been installed at Brookhaven and Fermilab as part of the LHC Accelerator Research Program (LARP). The BBQ was installed in the Tevatron to evaluate its effectiveness at reading proton and antiproton tunes at the flattop energy of 980 GeV. The primary objectives of this thesis are to examine the methods used to measure the tune using the BBQ tune measurement system, to incorporate the system into the Fermilab accelerator controls system, ACNET, and to compare the BBQ to existing tune measurement systems in the Tevatron.
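    The principle behind such tune measurements can be sketched numerically: the fractional betatron tune appears as the dominant peak in the spectrum of turn-by-turn transverse beam position data. A minimal illustration, not the BBQ hardware chain itself; the signal parameters below are invented for the example:

```python
import numpy as np

def measure_tune(turn_by_turn):
    """Estimate the fractional betatron tune as the dominant
    peak of the FFT of turn-by-turn position data."""
    x = np.asarray(turn_by_turn, dtype=float)
    x = x - x.mean()                         # remove closed-orbit offset
    window = np.hanning(len(x))              # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(x * window))
    freqs = np.fft.rfftfreq(len(x), d=1.0)   # tune units: oscillations per turn
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

# Synthetic example: fractional tune of 0.42 buried in noise
rng = np.random.default_rng(0)
turns = np.arange(4096)
signal = np.cos(2 * np.pi * 0.42 * turns) + 0.5 * rng.standard_normal(4096)
print(round(measure_tune(signal), 3))  # ~0.42
```

    Note that a real FFT resolves tunes only up to 0.5; fractional tunes above 0.5 appear aliased, which is why only the fractional part matters here.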

  11. Superconducting Magnets for Accelerators

    NASA Astrophysics Data System (ADS)

    Brianti, G.; Tortschanoff, T.

    1993-03-01

    This chapter describes the main features of superconducting magnets for high energy synchrotrons and colliders. It refers to magnets presently used and under development for the most advanced accelerator projects, both recently constructed and in the preparatory phase. These magnets, based mainly on NbTi conductor technology, are described from the aspects of design, materials, construction and performance. The trend toward higher performance can be gauged from the doubling of the design field in less than a decade, from about 4 T for the Tevatron to 10 T for the LHC. Special properties of superconducting accelerator magnets are described, such as their general layout, the need for extensive computational treatment, the performance limits inherent to the available conductors, and the requirements on the structural design. The contribution is completed by elaborating on persistent-current effects, quench protection and cryostat design. As examples, the main magnets for HERA and the SSC, as well as the twin-aperture magnets for the LHC, are presented.

  12. It may be possible to use Microscopic Black Holes as a Propulsion Beam

    NASA Astrophysics Data System (ADS)

    Kriske, Richard

    2017-04-01

    Several years ago, during the commissioning of the LHC, the question of whether a miniature black hole would be formed, and what to do with it if it was, came up as a legitimate topic of discussion. It was calculated at that time that although this was possible, the probability was extremely small; such a black hole would evaporate quickly and would be safely ejected into space, since its mass would cause it to simply continue along its inertial path out of the circular LHC accelerator. Recent improvements to the LHC have increased its energy to about 15 TeV. Linear accelerators such as the ILC, which collide electrons and positrons rather than protons as the LHC does, claim to be able to reach much higher energies; this author has heard remarkable numbers, such as 250 TeV with a beam current of 1 A. With this enormous increase in energy and current, one could turn the black-hole investigation around and ask how one could produce a steady stream of microscopic black holes: a black-hole machine. When the black holes evaporate, do they expand space in spacetime? Would the old theory of expanding space behind a craft warp space and enable the craft to exceed the speed of light? The warp theory was proposed before Star Trek; is it now feasible to test?

  13. Challenges in 21st Century Physics

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.

    2007-01-01

    We are truly fortunate to live in one of the great epochs of human discovery, a time when science is providing new visions and understanding about ourselves and the world in which we live. At last, we are beginning to explore the Universe itself. One particularly exciting area of advancement is high-energy physics where several existing concepts will be put to the test. A brief survey will be given of accomplishments in 20th Century physics. These include relativity and quantum physics which have produced breakthroughs in cosmology, astrophysics, and high-energy particle physics. The current situation is then assessed, combining the last 100 years of progress with new 21st Century challenges about unification and where to go next. Finally, the future is upon us. The next frontier in experimental high-energy physics, the Large Hadron Collider (LHC) at CERN in Geneva, is scheduled to begin coming online this year (2007). The potential for the LHC to address several of the significant problems in physics today will be discussed, as this great accelerator examines the predictions of the Standard Model of particle physics and even cosmology. New physics and new science will surely emerge and a better vision of the world will unfold.

  14. High energy beam impact tests on a LHC tertiary collimator at the CERN high-radiation to materials facility

    NASA Astrophysics Data System (ADS)

    Cauchi, Marija; Aberle, O.; Assmann, R. W.; Bertarelli, A.; Carra, F.; Cornelis, K.; Dallocchio, A.; Deboy, D.; Lari, L.; Redaelli, S.; Rossi, A.; Salvachua, B.; Mollicone, P.; Sammut, N.

    2014-02-01

    The correct functioning of a collimation system is crucial to safely operate highly energetic particle accelerators, such as the Large Hadron Collider (LHC). The requirements to handle high intensity beams can be demanding. In this respect, investigating the consequences of LHC particle beams hitting tertiary collimators (TCTs) in the experimental regions is a fundamental issue for machine protection. An experimental test was designed to investigate the robustness and effects of beam accidents on a fully assembled collimator, based on accident scenarios in the LHC. This experiment, carried out at the CERN High-Radiation to Materials (HiRadMat) facility, involved 440 GeV proton beam impacts of different intensities on the jaws of a horizontal TCT. This paper presents the experimental setup and the preliminary results obtained, together with some first outcomes from visual inspection and a comparison of such results with numerical simulations.

  15. Upgrade of the beam extraction system of the GTS-LHC electron cyclotron resonance ion source at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toivanen, V., E-mail: ville.aleksi.toivanen@cern.ch; Bellodi, G.; Dimov, V.

    2016-02-15

    Linac3 is the first accelerator in the heavy ion injector chain of the Large Hadron Collider (LHC), providing multiply charged heavy ion beams for the CERN experimental program. The ion beams are produced with GTS-LHC, a 14.5 GHz electron cyclotron resonance ion source, operated in afterglow mode. Improvement of the GTS-LHC beam formation and beam transport along Linac3 is part of the upgrade program of the injector chain in preparation for the future high luminosity LHC. A mismatch between the ion beam properties in the ion source extraction region and the acceptance of the following Low Energy Beam Transport (LEBT) section has been identified as one of the factors limiting the Linac3 performance. The installation of a new focusing element, an einzel lens, into the GTS-LHC extraction region is foreseen as a part of the Linac3 upgrade, as well as a redesign of the first section of the LEBT. Details of the upgrade and results of a beam dynamics study of the extraction region and LEBT modifications will be presented.

  16. The ALICE Experiment at CERN Lhc:. Status and First Results

    NASA Astrophysics Data System (ADS)

    Vercellin, Ermanno

    The ALICE experiment is aimed at studying the properties of the hot and dense matter produced in heavy-ion collisions at LHC energies. In the first years of LHC operation, the ALICE physics program will be focused on Pb-Pb and p-p collisions. The latter, on top of their intrinsic interest, will provide the necessary baseline for heavy-ion data. After its installation and a long commissioning with cosmic rays, in late fall 2009 ALICE participated (very successfully) in the first LHC run, collecting data in p-p collisions at a c.m. energy of 900 GeV. After a short stop during winter, LHC operations resumed; the machine is now able to accelerate proton beams up to 3.5 TeV, and ALICE has undertaken the data-taking campaign at 7 TeV c.m. energy. After an overview of the ALICE physics goals and a short description of the detector layout, the ALICE performance in p-p collisions will be presented. The main physics results achieved so far will be highlighted, as well as the main aspects of the ongoing data analysis.

  17. Ashra Neutrino Telescope Array (NTA): Combined Imaging Observation of Astroparticles — For Clear Identification of Cosmic Accelerators and Fundamental Physics Using Cosmic Beams —

    NASA Astrophysics Data System (ADS)

    Sasaki, Makoto; Kifune, Tadashi

    At the VHEPA (very high energy particle astronomy) 2014 workshop, held in Kashiwa, Japan and focused on the next generation of explorers of the origin of cosmic rays, reviews and discussions were presented on the status of observations of GeV-TeV photons, TeV-PeV neutrinos and EeV-ZeV hadrons, on tests of interaction models with the Large Hadron Collider (LHC), and on theoretical aspects of astrophysics. The acceleration sites of hadrons, i.e., the sources of PeV-EeV cosmic rays, should exist in the universe within the GZK horizon even in the remotest case. We also affirmed that the hadron acceleration mechanism correlates with the cosmic ray composition, so that it is important to investigate the acceleration mechanism in relation to the composition survey at PeV-EeV energies. We regard LHC results and astrophysics theories as ready to be used to probe the hadron acceleration mechanism in the universe. Recently, IceCube has reported the detection of three neutrino events with energies around 1 PeV, plus additional events at lower energies, which significantly deviate from the expected level of background events. It is necessary to observe GeV-TeV photons, EeV-ZeV hadrons and TeV-PeV neutrinos all together in order to understand the hadronic interactions of cosmic rays in the PeV-EeV energy region. A step further is required toward exploring the PeV-EeV universe with high-accuracy, high-statistics observations of both neutrinos and gamma rays simultaneously, using an instrument such as the Ashra Neutrino Telescope Array (NTA). 
A wide and fine survey of gamma rays and neutrinos, simultaneously detecting Cherenkov and fluorescence light with NTA, will guide us to a new and intriguing stage of recognizing astronomical objects and non-thermal phenomena in the ultra-high-energy region, and in addition to new aspects of the fundamental concepts of physics beyond our presently limited understanding: the longstanding problem of the origin of cosmic rays; the radiation mechanisms of gamma rays, neutrinos and cosmic rays from violent objects like blazars; the interaction of gamma rays and cosmic rays with microwave and infrared background photons; and PeV-EeV neutrinos originating from places beyond the GZK horizon.

  18. GPU/MIC Acceleration of the LHC High Level Trigger to Extend the Physics Reach at the LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halyo, Valerie; Tully, Christopher

    The quest for rare new physics phenomena leads the PI [3] to propose the evaluation of coprocessors based on Graphics Processing Units (GPUs) and the Intel Many Integrated Core (MIC) architecture for integration into the trigger system at the LHC. This will require the development of a new massively parallel implementation of the well-known Combinatorial Track Finder, which uses the Kalman filter, to accelerate the processing of data from the silicon pixel and microstrip detectors and reconstruct the trajectories of all charged particles down to momenta of 100 MeV. It is expected to run at least one order of magnitude faster than an equivalent algorithm on a quad-core CPU for extreme pileup scenarios of 100 interactions per bunch crossing. The new tracking algorithms will be developed and optimized separately on the GPU and Intel MIC and then evaluated against each other for performance and power efficiency. The results will be used to project the cost of the proposed hardware architectures for the HLT server farm, taking into account the long-term projections of the main vendors in the market (AMD, Intel, and NVIDIA) over the next 10 years. Extensive experience and familiarity of the PI with the LHC tracker and trigger requirements led to the development of a complementary tracking algorithm that is described in [arXiv:1305.4855] and [arXiv:1309.6275], with preliminary results accepted by JINST.
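    The Kalman filter at the heart of such a track finder updates a track-state estimate layer by layer from noisy hit positions. Below is a deliberately simplified 1-D straight-line sketch of the predict/update cycle; real CMS tracking uses a 5-parameter helix state with material effects, and all numbers here are illustrative:

```python
import numpy as np

def kalman_track_fit(hits, hit_sigma, dz):
    """Fit a straight-line track (position, slope) to hits on
    equally spaced detector layers with a scalar Kalman filter."""
    x = np.array([hits[0], 0.0])           # state: [position, slope]
    P = np.diag([hit_sigma**2, 1.0])       # covariance (loose slope prior)
    F = np.array([[1.0, dz], [0.0, 1.0]])  # propagate state to next layer
    H = np.array([[1.0, 0.0]])             # we measure position only
    R = hit_sigma**2                       # measurement variance
    for z_meas in hits[1:]:
        x = F @ x                          # predict
        P = F @ P @ F.T
        K = P @ H.T / (H @ P @ H.T + R)    # Kalman gain
        x = x + (K * (z_meas - H @ x)).ravel()  # update with the new hit
        P = (np.eye(2) - K @ H) @ P
    return x

# Hits from a true track x = 1.0 + 0.2*z on layers 10 cm apart
rng = np.random.default_rng(1)
dz = 10.0
hits = [1.0 + 0.2 * dz * i + rng.normal(0, 0.05) for i in range(8)]
pos, slope = kalman_track_fit(hits, hit_sigma=0.05, dz=dz)  # ~15.0, ~0.2
```

    The combinatorial part of the real algorithm lies in deciding which hits belong to which track candidate; the filter above is only the per-candidate fit that the GPU/MIC work would parallelize.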

  19. Interview of Jim Kerby about the First Beam

    ScienceCinema

    None

    2017-12-09

    Jim Kerby: Head of the US LHC Construction Project, Fermilab employee. Questions asked: 1. What does it take to start up the LHC machine? 2. What's the plan for the first injection day? 3. How do you feel about this?

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    campbell, myron

    To create a research and study abroad program that would allow U.S. undergraduate students access to the world-leading research facilities at the European Organization for Nuclear Research (CERN), the World Health Organization, various operations of the United Nations and other international organizations based in Geneva. The proposal is based on the unique opportunities currently existing in Geneva. The Large Hadron Collider (LHC) is now operational at CERN, data are being collected, and research results are already beginning to emerge. At the same time, a related reduction of activity at U.S. facilities devoted to particle physics is expected. In addition, the U.S. higher-education community has an ever-increasing focus on international organizations dealing with world health pandemics, arms control and human rights, a nexus also centered in Geneva.

  1. Research Activities at Fermilab for Big Data Movement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mhashilkar, Parag; Wu, Wenji; Kim, Hyun W

    2013-01-01

    Adaptation of 100 GE networking infrastructure is the next step towards management of big data. As the US Tier-1 center for the Large Hadron Collider's (LHC) Compact Muon Solenoid (CMS) experiment and the central data center for several other large-scale research collaborations, Fermilab has to constantly deal with the scaling and wide-area distribution challenges of big data. In this paper, we describe some of the challenges involved in the movement of big data over 100 GE infrastructure and the research activities at Fermilab to address these challenges.
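    To put 100 GE capacity in perspective, a back-of-envelope calculation (illustrative only; the efficiency figure is an assumption) of how long a dataset takes to cross such a link:

```python
def transfer_time_hours(data_tb, link_gbps, efficiency=0.8):
    """Ideal wall-clock time to move `data_tb` terabytes over a link of
    `link_gbps` gigabits/s, derated by a protocol/overhead efficiency."""
    data_bits = data_tb * 1e12 * 8  # TB -> bits (decimal units)
    seconds = data_bits / (link_gbps * 1e9 * efficiency)
    return seconds / 3600.0

# One petabyte over a fully dedicated 100 Gb/s link at 80% efficiency
print(round(transfer_time_hours(1000, 100), 1))  # ≈ 27.8 hours
```

    The same petabyte over a single 10 Gb/s link would take more than ten days, which is the scaling pressure behind the move to 100 GE.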

  2. Measurements and TCAD simulation of novel ATLAS planar pixel detector structures for the HL-LHC upgrade

    NASA Astrophysics Data System (ADS)

    Nellist, C.; Dinu, N.; Gkougkousis, E.; Lounis, A.

    2015-06-01

    The LHC accelerator complex will be upgraded between 2020 and 2022 to the High-Luminosity LHC, to considerably increase the statistics available for the various physics analyses. To operate under these challenging new conditions and maintain excellent performance in track reconstruction and vertex location, the ATLAS pixel detector must be substantially upgraded, and a full replacement is expected. Processing techniques for novel pixel designs are optimised through characterisation of test structures in a clean room and through simulations with Technology Computer-Aided Design (TCAD). A method to study non-perpendicular tracks through a pixel device is discussed. A comparison of TCAD simulations with Secondary Ion Mass Spectrometry (SIMS) measurements, to investigate the doping profiles of structures and validate the simulation process, is also presented.

  3. SLHC, the High-Luminosity Upgrade (public event)

    ScienceCinema

    None

    2017-12-09

    On the morning of June 23rd a public event is organised in CERN's Council Chamber with the aim of providing the particle physics community with up-to-date information about the strategy for the LHC luminosity upgrade and describing the current status of preparation work. The presentations will provide an overview of the various accelerator sub-projects, the LHC physics prospects and the upgrade plans of ATLAS and CMS. This event is organised in the framework of the SLHC-PP project, which receives funding from the European Commission for the preparatory phase of the LHC High Luminosity Upgrade project. Informing the public is among the objectives of this EU-funded project. A simultaneous transmission of this meeting will be broadcast, available at the following address: http://webcast.cern.ch/

  4. Single-pass beam measurements for the verification of the LHC magnetic model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calaga, R.; Giovannozzi, M.; Redaelli, S.

    2010-05-23

    During the 2009 LHC injection tests, the polarities and effects of specific quadrupole and higher-order magnetic circuits were investigated. A set of magnet circuits had been selected for detailed investigation based on a number of criteria. On- or off-momentum difference trajectories, launched via appropriate orbit correctors for varying strength settings of the magnet circuits under study (e.g. main, trim and skew quadrupoles; sextupole families and spool-piece correctors; skew sextupoles; octupoles), were compared with predictions from various optics models. These comparisons allowed the relative polarity conventions used in the optics model and the accelerator control system to be confirmed or updated, and the correct powering and assignment of magnet families to be verified. Results from measurements in several LHC sectors are presented.

  5. Summary of the Persistent Current Effect Measurements in Nb3Sn and NbTi Accelerator Magnets at Fermilab

    DOE PAGES

    Velev, G. V.; Chlachidze, G.; DiMarco, J.; ...

    2016-01-06

    In the past 10 years, Fermilab has been executing an intensive R&D program on accelerator magnets based on Nb3Sn superconductor technology. This R&D effort includes dipole and quadrupole models for different programs, such as LARP and the 11 T dipoles for the LHC high-luminosity upgrade. Before the Nb3Sn R&D program, Fermilab was involved in the production of the low-beta quadrupole magnets for the LHC, based on NbTi superconductor. Additionally, during the 2003-2005 campaign to optimize the operation of the Tevatron, a large number of Tevatron magnets were re-measured. As a result of this field analysis, a systematic study of the persistent current decay and snapback effect in these magnets was performed. This paper summarizes the results of this study and presents a comparison between Nb3Sn and NbTi dipoles and quadrupoles.

  6. Electron Cloud Effects in Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furman, M.A.

    We present a brief summary of various aspects of the electron-cloud effect (ECE) in accelerators. For further details, the reader is encouraged to refer to the proceedings of many prior workshops, either dedicated to EC or with significant EC contents, including the entire "ECLOUD" series [1-22]. In addition, the proceedings of the various flavors of Particle Accelerator Conferences [23] contain a large number of EC-related publications. The ICFA Beam Dynamics Newsletter series [24] contains one dedicated issue, and several occasional articles, on EC. An extensive reference database is the LHC website on EC [25].

  7. US LHCNet: Transatlantic Networking for the LHC and the U.S. HEP Community

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Harvey B; Barczyk, Artur J

    2013-04-05

    US LHCNet provides the transatlantic connectivity between the Tier1 computing facilities at the Fermilab and Brookhaven National Labs and the Tier0 and Tier1 facilities at CERN, as well as Tier1s elsewhere in Europe and Asia. Together with ESnet, Internet2, and other R&E networks participating in the LHCONE initiative, US LHCNet also supports transatlantic connections between the Tier2 centers (where most of the data analysis is taking place) and the Tier1s as needed. Given the key roles of the US and European Tier1 centers, as well as Tier2 centers on both continents, the largest data flows are across the Atlantic, where US LHCNet has the major role. US LHCNet manages and operates the transatlantic network infrastructure, including four Points of Presence (PoPs) and currently six transatlantic OC-192 (10 Gbps) leased links. Operating at the optical layer, the network provides a highly resilient fabric for data movement, with a target service availability level in excess of 99.95%. This level of resilience and seamless operation is achieved through careful design, including path diversity on both submarine and terrestrial segments, use of carrier-grade equipment with built-in high-availability and redundancy features, deployment of robust failover mechanisms based on SONET protection schemes, and the design of facility-diverse paths between the LHC computing sites. The US LHCNet network provides services at Layer 1 (optical), Layer 2 (Ethernet) and Layer 3 (IPv4 and IPv6). The flexible design of the network, including modular equipment, a talented and agile team, and flexible circuit lease management, allows US LHCNet to react quickly to changing requirements from the LHC community. Network capacity is provisioned just in time to meet the needs, as demonstrated in past years during the changing LHC start-up plans.
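    An availability target like 99.95% translates directly into an annual downtime budget; a quick illustrative calculation:

```python
def max_downtime_hours_per_year(availability):
    """Annual downtime budget implied by an availability target,
    using an average year of 365.25 days."""
    return (1.0 - availability) * 365.25 * 24.0

print(round(max_downtime_hours_per_year(0.9995), 1))  # ≈ 4.4 hours/year
```

    Staying within a few hours of downtime per year is what motivates the path diversity and SONET-based failover described above.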

  8. Black-hole production at LHC: Special features, problems, and expectations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savina, M. V., E-mail: savina@cern.ch

    2011-03-15

    A brief survey is given of the present-day status of the problem of multidimensional black-hole production at accelerators, according to models featuring large extra dimensions. The production cross section, the Hawking temperature and the decay rate are estimated as functions of the model parameters. Possible flaws and assumptions whose accurate inclusion can significantly reduce the probability of black-hole production at accelerators, relative to earlier optimistic estimates, are also discussed.

  9. Support Structure Design of the Nb3Sn Quadrupole for the High Luminosity LHC

    DOE PAGES

    Juchno, M.; Ambrosio, G.; Anerella, M.; ...

    2014-10-31

    New low-β quadrupole magnets are being developed within the scope of the High Luminosity LHC (HL-LHC) project, in collaboration with the US LARP program. The aim of the HL-LHC project is to study and implement the machine upgrades necessary for increasing the luminosity of the LHC. The new quadrupoles, which are based on Nb₃Sn superconducting technology, will be installed in the LHC Interaction Regions and will have to generate a gradient of 140 T/m in a coil aperture of 150 mm. In this paper, we describe the design of the short model magnet support structure and discuss the results of the detailed 3D numerical analysis performed in preparation for the first short model test.

  10. Towards future circular colliders

    NASA Astrophysics Data System (ADS)

    Benedikt, Michael; Zimmermann, Frank

    2016-09-01

    The Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) presently provides proton-proton collisions at a center-of-mass (c.m.) energy of 13 TeV. The LHC design was started more than 30 years ago, and its physics program will extend through the second half of the 2030s. The global Future Circular Collider (FCC) study is now preparing for a post-LHC project. The FCC study focuses on the design of a 100-TeV hadron collider (FCC-hh) in a new ~100 km tunnel. It also includes the design of a high-luminosity electron-positron collider (FCC-ee) as a potential intermediate step, and a lepton-hadron collider option (FCC-he). The scope of the FCC study comprises accelerators, technology, infrastructure, detectors, physics, concepts for worldwide data services, international governance models, and implementation scenarios. Among the FCC core technologies figure 16-T dipole magnets, based on Nb3Sn superconductor, for the FCC-hh hadron collider, and a highly efficient superconducting radiofrequency system for the FCC-ee lepton collider. Following the FCC concept, the Institute of High Energy Physics (IHEP) in Beijing has initiated a parallel design study for an e+e- Higgs factory in China (CEPC), which is to be succeeded by a high-energy hadron collider (SPPC). At present, a tunnel circumference of 54 km and a hadron collider c.m. energy of about 70 TeV are being considered. After a brief look at the LHC, this article reports the motivation and present status of the FCC study, some of the primary design challenges and R&D subjects, as well as the emerging global collaboration.
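    The quoted collider energies follow from the magnetic-rigidity relation p[GeV/c] ≈ 0.2998 B[T] ρ[m]. A rough cross-check of the FCC-hh numbers; the effective bending radius used here is an assumption, since the dipole fill factor of the ~100 km ring is not given in the abstract:

```python
def beam_energy_tev(b_field_t, bending_radius_km):
    """Proton beam energy from magnetic rigidity:
    p[GeV/c] = 0.2998 * B[T] * rho[m], i.e. p[TeV/c] = 0.2998 * B * rho[km],
    with E ~ pc for ultra-relativistic protons."""
    return 0.2998 * b_field_t * bending_radius_km

# LHC sanity check: 8.33 T dipoles, 2.804 km bending radius -> ~7 TeV/beam
print(round(beam_energy_tev(8.33, 2.804), 1))  # 7.0

# FCC-hh: 16 T dipoles; assume ~10.4 km effective bending radius
# (dipoles fill only part of the ~100 km circumference)
print(round(2 * beam_energy_tev(16.0, 10.4)))  # ≈ 100 TeV c.m., two beams
```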

  11. High-Luminosity Large Hadron Collider (HL-LHC) : Preliminary Design Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apollinari, G.; Béjar Alonso, I.; Brüning, O.

    2015-12-17

    The Large Hadron Collider (LHC) is one of the largest scientific instruments ever built. Since opening up a new energy frontier for exploration in 2010, it has gathered a global user community of about 7,000 scientists working in fundamental particle physics and the physics of hadronic matter at extreme temperature and density. To sustain and extend its discovery potential, the LHC will need a major upgrade in the 2020s. This will increase its luminosity (rate of collisions) by a factor of five beyond the original design value and the integrated luminosity (total collisions created) by a factor ten. The LHC is already a highly complex and exquisitely optimised machine so this upgrade must be carefully conceived and will require about ten years to implement. The new configuration, known as High Luminosity LHC (HL-LHC), will rely on a number of key innovations that push accelerator technology beyond its present limits. Among these are cutting-edge 11-12 tesla superconducting magnets, compact superconducting cavities for beam rotation with ultra-precise phase control, new technology and physical processes for beam collimation and 300 metre-long high-power superconducting links with negligible energy dissipation. The present document describes the technologies and components that will be used to realise the project and is intended to serve as the basis for the detailed engineering design of HL-LHC.

  12. Numerical simulations of energy deposition caused by 50 MeV-50 TeV proton beams in copper and graphite targets

    NASA Astrophysics Data System (ADS)

    Nie, Y.; Schmidt, R.; Chetvertkova, V.; Rosell-Tarragó, G.; Burkart, F.; Wollmann, D.

    2017-08-01

    The conceptual design of the Future Circular Collider (FCC) is being carried out actively in an international collaboration hosted by CERN, for the post-Large Hadron Collider (LHC) era. The target center-of-mass energy of proton-proton collisions for the FCC is 100 TeV, nearly an order of magnitude higher than for the LHC. The existing CERN accelerators will be used to prepare the beams for the FCC. Concerning beam-related machine protection of the whole accelerator chain, it is critical to assess the consequences of beam impact on various accelerator components in the cases of controlled and uncontrolled beam losses. In this paper, we study the energy deposition of protons in solid copper and graphite targets, since the two materials are widely used in magnets, beam screens, collimators, and beam absorbers. Nominal injection and extraction energies in the hadron accelerator complex at CERN were selected in the range of 50 MeV-50 TeV. Three beam sizes were studied for each energy, corresponding to typical values of the betatron function. Specifically for thin targets, comparisons between FLUKA simulations and analytical Bethe-equation calculations were carried out, which showed that the damage potential of a few-millimeter-thick graphite target and submillimeter-thick copper foil can be well estimated directly by the Bethe equation. The paper provides a valuable reference for the quick evaluation of potential damage to accelerator elements over a large range of beam parameters when beam loss occurs.
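    The Bethe estimate used for such thin-target comparisons can be sketched as follows. The constants follow the standard PDG form of the Bethe equation; this is a rough sketch that ignores the density and shell corrections (and radiative losses), and the copper parameters are textbook values:

```python
import math

K = 0.307075      # MeV mol^-1 cm^2, PDG coefficient
ME_C2 = 0.511     # electron rest energy, MeV
MP_C2 = 938.272   # proton rest energy, MeV

def bethe_dedx(kinetic_mev, Z, A, I_ev):
    """Mean stopping power -dE/dx in MeV cm^2/g for protons
    (Bethe equation, no density or shell corrections)."""
    gamma = 1.0 + kinetic_mev / MP_C2
    beta2 = 1.0 - 1.0 / gamma**2
    # Maximum energy transfer to an electron in a single collision
    tmax = (2 * ME_C2 * beta2 * gamma**2 /
            (1 + 2 * gamma * ME_C2 / MP_C2 + (ME_C2 / MP_C2)**2))
    I = I_ev * 1e-6  # mean excitation energy, eV -> MeV
    log_arg = 2 * ME_C2 * beta2 * gamma**2 * tmax / I**2
    return K * (Z / A) / beta2 * (0.5 * math.log(log_arg) - beta2)

# 440 GeV protons (the HiRadMat beam energy) in copper:
# Z = 29, A = 63.546 g/mol, I = 322 eV
print(bethe_dedx(440e3, 29, 63.546, 322.0))  # MeV cm^2/g
```

    Multiplying by the material density (8.96 g/cm³ for copper) gives the energy loss per unit path length, which is the quantity compared against the full FLUKA transport result.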

  13. Fabrication and Assembly Performance of the First 4.2 m MQXFA Magnet and Mechanical Model for the Hi-Lumi LHC Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Daniel W.; Ambrosio, Giorgio; Anderssen, Eric C.

    Here, the LHC Accelerator Research Program (LARP), in collaboration with CERN and under the scope of the high-luminosity upgrade of the Large Hadron Collider, is in the prototyping stage of the development of a 150 mm aperture high-field Nb3Sn quadrupole magnet called MQXF. This magnet is mechanically supported using a shell-based support structure, which has been extensively demonstrated on several R&D models within LARP, as well as in the more recent short (1.2 m magnetic length) MQXF model program. The MQXFA magnets are each of 4.2 m magnetic length, and the first mechanical long model, MQXFA1M (using aluminum surrogate coils), and the MQXFAP1 prototype magnet (the first prototype with Nb3Sn coils) have been assembled at LBNL. In this paper, we summarize the tooling and assembly processes, discuss the mechanical performance of these first two assemblies, comparing strain-gauge data with finite element model analysis, and present the near-term plans for the long MQXF magnet program.

  14. Fabrication and Assembly Performance of the First 4.2 m MQXFA Magnet and Mechanical Model for the Hi-Lumi LHC Upgrade

    DOE PAGES

    Cheng, Daniel W.; Ambrosio, Giorgio; Anderssen, Eric C.; ...

    2018-01-30

    Here, the LHC Accelerator Research Program (LARP), in collaboration with CERN and under the scope of the high-luminosity upgrade of the Large Hadron Collider, is in the prototyping stage of the development of a 150 mm aperture high-field Nb3Sn quadrupole magnet called MQXF. This magnet is mechanically supported using a shell-based support structure, which has been extensively demonstrated on several R&D models within LARP, as well as in the more recent short (1.2 m magnetic length) MQXF model program. The MQXFA magnets are each of 4.2 m magnetic length, and the first mechanical long model, MQXFA1M (using aluminum surrogate coils), and the MQXFAP1 prototype magnet (the first prototype with Nb3Sn coils) have been assembled at LBNL. In this paper, we summarize the tooling and assembly processes, discuss the mechanical performance of these first two assemblies, comparing strain-gauge data with finite element model analysis, and present the near-term plans for the long MQXF magnet program.

  15. PREFACE: International Workshop on Discovery Physics at the LHC (Kruger2012)

    NASA Astrophysics Data System (ADS)

    Cleymans, Jean

    2013-08-01

    The second conference on 'Discovery Physics at the LHC' was held on 3-7 December 2012 at the Kruger Gate Hotel in South Africa. In total there were 110 participants from Armenia, Belgium, Brazil, Canada, the Czech Republic, France, Germany, Greece, Israel, Italy, Norway, Poland, the USA, Russia, Slovakia, Spain, Sweden, the United Kingdom, Switzerland and South Africa. The latest results from the Large Hadron Collider, Brookhaven National Laboratory, Jefferson Laboratory and BABAR experiments, as well as the latest theoretical insights, were presented. Set against the backdrop of the majestic Kruger National Park, a very stimulating conference with many exchanges took place. The proceedings reflect the high standard of the conference. The financial contributions from the National Institute for Theoretical Physics (NITHeP), the SA-CERN programme, the UCT-CERN Research Centre, the University of Johannesburg, the University of the Witwatersrand and iThemba LABS (Laboratory for Accelerator Based Science) are gratefully acknowledged. Jean Cleymans, Chair of the Local Organizing Committee. Local Organizing Committee: Oana Boeriu, Jean Cleymans, Simon H Connell, Alan S Cornell, William A Horowitz, Andre Peshier, Trevor Vickey, Zeblon Z Vilakazi.

  16. Application of particle accelerators in research.

    PubMed

    Mazzitelli, Giovanni

    2011-07-01

    Since the beginning of the past century, accelerators have played a fundamental role as powerful tools to discover the world around us and how the universe has evolved since the Big Bang, and to develop fundamental instruments for everyday life. Although more than 15 000 accelerators are operating around the world, only very few of them are dedicated to fundamental research. An overview of the present high energy physics (HEP) accelerator status and prospects is presented.

  17. Numerical Analysis of Parasitic Crossing Compensation with Wires in DAΦNE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valishev, A.; Shatilov, D.; Milardi, C.

    2015-06-24

    Current-bearing wire compensators were successfully used in the 2005-2006 run of the DAΦNE collider to mitigate the detrimental effects of parasitic beam-beam interactions. A marked improvement of the positron beam lifetime was observed in machine operation with the KLOE detector. In view of the possible application of wire beam-beam compensators for the High Luminosity LHC upgrade, we revisit the DAΦNE experiments. We use an improved model of the accelerator, with the goal of validating modern simulation tools and providing valuable input for the LHC upgrade project.

  19. Introduction to the HL-LHC Project

    NASA Astrophysics Data System (ADS)

    Rossi, L.; Brüning, O.

    The Large Hadron Collider (LHC) is one of the largest scientific instruments ever built. It has been exploring the new energy frontier since 2010, gathering a global user community of 7,000 scientists. To extend its discovery potential, the LHC will need a major upgrade in the 2020s to increase its luminosity (rate of collisions) by a factor of five beyond its design value, and the integrated luminosity by a factor of ten. As a highly complex and optimized machine, such an upgrade must be carefully studied and requires about ten years to implement. The novel machine configuration, called High Luminosity LHC (HL-LHC), will rely on a number of key innovative technologies representing exceptional technological challenges, such as cutting-edge 11-12 tesla superconducting magnets, very compact superconducting cavities for beam rotation with ultra-precise phase control, new technology for beam collimation, and 300-meter-long high-power superconducting links with negligible energy dissipation. HL-LHC federates the efforts and R&D of a large community in Europe, the US and Japan, which will facilitate the implementation of the construction phase as a global project.
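    The quoted factor-of-five luminosity increase can be put in perspective with a back-of-the-envelope unit conversion. The sketch below is illustrative only; the design luminosity, leveling behavior, and effective physics time per year are assumed round numbers, not project specifications:

```python
# Rough, illustrative conversion from instantaneous to yearly integrated
# luminosity (assumed round numbers, not HL-LHC specifications).
LHC_DESIGN_LUMI = 1.0e34    # cm^-2 s^-1, nominal LHC design luminosity
HL_FACTOR = 5               # HL-LHC targets ~5x the design (leveled) value
SECONDS_OF_PHYSICS = 1.2e7  # assumed effective physics seconds per year
CM2_PER_INV_FB = 1.0e39     # 1 fb^-1 corresponds to 1e39 cm^-2

lumi = LHC_DESIGN_LUMI * HL_FACTOR
integrated_fb = lumi * SECONDS_OF_PHYSICS / CM2_PER_INV_FB
print(f"~{integrated_fb:.0f} fb^-1 per year")  # ~600 fb^-1 with these assumptions
```

    In practice luminosity leveling and machine availability reduce the yearly total well below this idealized ceiling, but the conversion shows why a factor of five in instantaneous luminosity translates into an order-of-magnitude larger integrated dataset over the HL-LHC era.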

  20. Production and installation of the LHC low-beta triplets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feher, S.; Bossert, R.; DiMarco, J.

    2005-09-01

    The LHC performance depends critically on the low-β triplets, located on either side of the four interaction points. Each triplet consists of four superconducting quadrupole magnets, which must operate reliably at gradients up to 215 T/m, sustain extremely high heat loads, and have excellent field quality. A collaboration of CERN, Fermilab and KEK was formed in 1996 to design and build the triplet systems, and after nine years of joint effort the production was completed in 2005. We retrace the main events of the project and present the design features and performance of the low-β quadrupoles, built by KEK and Fermilab, as well as of other vital elements of the triplet. The tunnel installation of the first triplet and plans for commissioning in the LHC are also presented. Apart from the excellent technical results, the construction of the LHC low-β triplets has been a highly enriching experience, combining harmoniously the different competences and approaches to engineering in a style reminiscent of high energy physics experiment collaborations, and rarely before achieved in the construction of an accelerator.

  1. Targeted Alpha Therapy: The US DOE Tri-Lab (ORNL, BNL, LANL) Research Effort to Provide Accelerator-Produced 225Ac for Radiotherapy

    NASA Astrophysics Data System (ADS)

    John, Kevin

    2017-01-01

    Targeted radiotherapy is an emerging discipline of cancer therapy that exploits the biochemical differences between normal cells and cancer cells to selectively deliver a lethal dose of radiation to cancer cells, while leaving healthy cells relatively unperturbed. A broad overview of targeted alpha therapy, including isotope production methods and associated production facility needs, will be provided. A more general overview of the US Department of Energy Isotope Program's Tri-Lab (ORNL, BNL, LANL) research effort to provide accelerator-produced 225Ac for radiotherapy will also be presented, focusing on the accelerator production of 225Ac and final-product isolation methodologies for medical applications.

  2. A large hadron electron collider at CERN

    DOE PAGES

    Abelleira Fernandez, J. L.

    2015-04-06

    This document provides a brief overview of the recently published report on the design of the Large Hadron Electron Collider (LHeC), which comprises its physics programme, accelerator physics, technology and main detector concepts. The LHeC exploits and develops challenging, though principally existing, accelerator and detector technologies. This summary is complemented by brief illustrations of some of the highlights of the physics programme, which relies on a vastly extended kinematic range, luminosity and unprecedented precision in deep inelastic scattering. Illustrations are provided regarding high-precision QCD, new physics (Higgs, SUSY) and electron-ion physics. The LHeC is designed to run synchronously with the LHC in the twenties and to achieve an integrated luminosity of O(100) fb⁻¹. It will become mankind's cleanest high-resolution microscope and will substantially extend, as well as complement, the investigation of the physics of the TeV energy scale enabled by the LHC.

  3. Thin Film Approaches to the SRF Cavity Problem: Fabrication and Characterization of Superconducting Thin Films

    NASA Astrophysics Data System (ADS)

    Beringer, Douglas B.

    Superconducting Radio Frequency (SRF) cavities are responsible for the acceleration of charged particles to relativistic velocities in most modern linear accelerators, such as those employed at high-energy research facilities like the Thomas Jefferson National Accelerator Facility's CEBAF and the LHC at CERN. Since SRF is primarily a surface phenomenon, thin films can be applied to the interior surface of SRF cavities, opening a formidable tool chest of opportunities for combining and designing materials that offer greater benefit. Thus, while improvements in radio frequency cavity design and refinements in cavity processing techniques have improved accelerator performance and efficiency (1.5 GHz bulk niobium SRF cavities have achieved accelerating gradients in excess of 35 MV/m), there exist fundamental material bounds in bulk superconductors limiting the maximally sustained accelerating gradient (approximately 45 MV/m for niobium), beyond which thermodynamic breakdown inevitably occurs. With state-of-the-art niobium-based cavity designs fast approaching these theoretical limits, novel material innovations must be sought to realize next-generation SRF cavities. One proposed method to improve SRF performance is to utilize thin-film superconducting-insulating-superconducting (SIS) multilayer structures to magnetically screen a bulk superconducting layer so that it can operate at higher field gradients before suffering critically detrimental SRF losses. This dissertation focuses on the production and characterization of superconducting thin films for such SIS layers for radio-frequency applications.
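    The ~45 MV/m niobium limit quoted above follows from the peak surface magnetic field a cavity can sustain before superconductivity breaks down. A rough estimate is sketched below, assuming a TESLA-shape cavity ratio of about 4.26 mT of peak surface field per MV/m of gradient and a critical (superheating) field for niobium of roughly 200 mT; both are representative literature values, not numbers taken from this dissertation:

```python
# Rough estimate of the maximum accelerating gradient of a bulk Nb cavity.
# Assumed representative values (not taken from the cited work):
B_CRIT_MT = 200.0        # ~critical/superheating field of niobium, in mT
B_PEAK_PER_EACC = 4.26   # mT of peak surface field per MV/m (TESLA-shape cavity)

# The gradient is limited when the peak surface field reaches the critical field.
e_acc_max = B_CRIT_MT / B_PEAK_PER_EACC  # MV/m
print(f"E_acc,max ≈ {e_acc_max:.0f} MV/m")  # ≈ 47 MV/m, consistent with ~45 MV/m
```

    This is why SIS multilayers are attractive: if thin screening layers raise the field the surface can tolerate, the same geometry supports a proportionally higher gradient.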

  4. Testing Hadronic Interactions at Ultrahigh Energies with Air Showers Measured by the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Aab, A.; Abreu, P.; Aglietta, M.; Ahn, E. J.; Al Samarai, I.; Albuquerque, I. F. M.; Allekotte, I.; Allen, J. D.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Ambrosio, M.; Anastasi, G. A.; Anchordoqui, L.; Andrada, B.; Andringa, S.; Aramo, C.; Arqueros, F.; Arsene, N.; Asorey, H.; Assis, P.; Aublin, J.; Avila, G.; Badescu, A. M.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertaina, M. E.; Bertou, X.; Biermann, P. L.; Billoir, P.; Biteau, J.; Blaess, S. G.; Blanco, A.; Blazek, J.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Borodai, N.; Botti, A. M.; Brack, J.; Brancus, I.; Bretz, T.; Bridgeman, A.; Briechle, F. L.; Buchholz, P.; Bueno, A.; Buitink, S.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, B.; Caccianiga, L.; Cancio, A.; Canfora, F.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chavez, A. G.; Chiavassa, A.; Chinellato, J. A.; Chirinos Diaz, J. C.; Chudoba, J.; Clay, R. W.; Colalillo, R.; Coleman, A.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cooper, M. J.; Coutu, S.; Covault, C. E.; Cronin, J.; Dallier, R.; D'Amico, S.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; de Jong, S. J.; De Mauro, G.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; Debatin, J.; del Peral, L.; Deligny, O.; Dhital, N.; Di Giulio, C.; Di Matteo, A.; Díaz Castro, M. L.; Diogo, F.; Dobrigkeit, C.; D'Olivo, J. C.; Dorofeev, A.; dos Anjos, R. C.; Dova, M. T.; Dundovic, A.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Falcke, H.; Fang, K.; Farrar, G. R.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fratu, O.; Freire, M. M.; Fujii, T.; Fuster, A.; Gallo, F.; García, B.; Garcia-Pinto, D.; Gate, F.; Gemmeke, H.; Gherghel-Lascu, A.; Ghia, P. 
L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Głas, D.; Glaser, C.; Glass, H.; Golup, G.; Gómez Berisso, M.; Gómez Vitale, P. F.; González, N.; Gookin, B.; Gordon, J.; Gorgi, A.; Gorham, P.; Gouffon, P.; Griffith, N.; Grillo, A. F.; Grubb, T. D.; Guarino, F.; Guedes, G. P.; Hampel, M. R.; Hansen, P.; Harari, D.; Harrison, T. A.; Harton, J. L.; Hasankiadeh, Q.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holt, E.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huege, T.; Hulsman, J.; Insolia, A.; Isar, P. G.; Jandt, I.; Jansen, S.; Jarne, C.; Johnsen, J. A.; Josebachuili, M.; Kääpä, A.; Kambeitz, O.; Kampert, K. H.; Kasper, P.; Katkov, I.; Keilhauer, B.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Kuempel, D.; Kukec Mezek, G.; Kunka, N.; Kuotb Awad, A.; LaHurd, D.; Latronico, L.; Lauscher, M.; Lautridou, P.; Lebrun, P.; Legumina, R.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; Lopes, L.; López, R.; López Casado, A.; Lucero, A.; Malacari, M.; Mallamaci, M.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, V.; Mariş, I. C.; Marsella, G.; Martello, D.; Martinez, H.; Martínez Bravo, O.; Masías Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurizio, D.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Mello, V. B. B.; Melo, D.; Menshikov, A.; Messina, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. A.; Miramonti, L.; Mitrica, B.; Molina-Bueno, L.; Mollerach, S.; Montanet, F.; Morello, C.; Mostafá, M.; Moura, C. A.; Müller, G.; Muller, M. A.; Müller, S.; Naranjo, I.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nguyen, P. H.; Niculescu-Oglinzanu, M.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nožka, H.; Núñez, L. 
A.; Ochilo, L.; Oikonomou, F.; Olinto, A.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Papenbreer, P.; Parente, G.; Parra, A.; Paul, T.; Pech, M.; Pedreira, F.; Pekala, J.; Pelayo, R.; Peña-Rodriguez, J.; Pepe, I. M.; Pereira, L. A. S.; Perrone, L.; Petermann, E.; Peters, C.; Petrera, S.; Phuntsok, J.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porowski, C.; Prado, R. R.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Risse, M.; Ristori, P.; Rizi, V.; Rodrigues de Carvalho, W.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Rogozin, D.; Rosado, J.; Roth, M.; Roulet, E.; Rovero, A. C.; Saffi, S. J.; Saftoiu, A.; Salazar, H.; Saleh, A.; Salesa Greus, F.; Salina, G.; Sanabria Gomez, J. D.; Sánchez, F.; Sanchez-Lucas, P.; Santos, E. M.; Santos, E.; Sarazin, F.; Sarkar, B.; Sarmento, R.; Sarmiento-Cano, C.; Sato, R.; Scarso, C.; Schauer, M.; Scherini, V.; Schieler, H.; Schmidt, D.; Scholten, O.; Schoorlemmer, H.; Schovánek, P.; Schröder, F. G.; Schulz, A.; Schulz, J.; Schumacher, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sigl, G.; Sima, O.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sonntag, S.; Sorokin, J.; Squartini, R.; Stanca, D.; Stanič, S.; Stapleton, J.; Stasielak, J.; Strafella, F.; Stutz, A.; Suarez, F.; Suarez Durán, M.; Sudholz, T.; Suomijärvi, T.; Supanitsky, A. D.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Taborda, O. A.; Tapia, A.; Tepe, A.; Theodoro, V. M.; Timmermans, C.; Todero Peixoto, C. J.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Torres Machado, D.; Travnicek, P.; Trini, M.; Ulrich, R.; Unger, M.; Urban, M.; Valbuena-Delgado, A.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van Bodegom, P.; van den Berg, A. M.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Varner, G.; Vázquez, J. R.; Vázquez, R. 
A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weindl, A.; Wiencke, L.; Wilczyński, H.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Wykes, S.; Yang, L.; Yapici, T.; Yelos, D.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zepeda, A.; Zimmermann, B.; Ziolkowski, M.; Zong, Z.; Zuccarello, F.; Pierre Auger Collaboration

    2016-11-01

    Ultrahigh energy cosmic ray air showers probe particle physics at energies beyond the reach of accelerators. Here we introduce a new method to test hadronic interaction models without relying on the absolute energy calibration, and apply it to events with primary energy 6-16 EeV (E_CM = 110-170 TeV), whose longitudinal development and lateral distribution were simultaneously measured by the Pierre Auger Observatory. The average hadronic shower is 1.33 ± 0.16 (1.61 ± 0.21) times larger than predicted using the leading LHC-tuned models EPOS-LHC (QGSJetII-04), with a corresponding excess of muons.
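    The quoted center-of-mass energies follow from the fixed-target relation E_CM ≈ sqrt(2 E_lab m_p c²) for a primary striking a nucleon at rest. A quick check (treating the primary as a proton hitting a stationary proton, which is a simplifying assumption):

```python
import math

M_P_EV = 0.938e9  # proton rest energy m_p c^2, in eV

def cm_energy_tev(e_lab_ev: float) -> float:
    """Fixed-target center-of-mass energy sqrt(2 * E_lab * m_p c^2), in TeV."""
    return math.sqrt(2.0 * e_lab_ev * M_P_EV) / 1e12

print(cm_energy_tev(6e18))   # ≈ 106 TeV
print(cm_energy_tev(16e18))  # ≈ 173 TeV, bracketing the quoted 110-170 TeV range
```

    These energies are roughly an order of magnitude above the LHC's reach, which is why air showers provide a complementary test of the LHC-tuned hadronic models.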

  6. High Energy Colliding Beams; What Is Their Future?

    NASA Astrophysics Data System (ADS)

    Richter, Burton

    The success of the first few years of LHC operations at CERN, and the expectation of more to come as the LHC's performance improves, are already leading to discussions of what should be next for both proton-proton and electron-positron colliders. In this discussion I see too much theoretical desperation caused by the so-far-unsuccessful hunt for what is beyond the Standard Model, and too little of the necessary interaction of the accelerator, experimenter, and theory communities necessary for a scientific and engineering success. Here, I give my impressions of the problem, its possible solution, and what is needed to have both a scientifically productive and financially viable future.

  9. FERMILAB ACCELERATOR R&D PROGRAM TOWARDS INTENSITY FRONTIER ACCELERATORS: STATUS AND PROGRESS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiltsev, Vladimir

    2016-11-15

    The 2014 P5 report identified accelerator-based neutrino and rare-decay physics research as a centerpiece of the US domestic HEP program at Fermilab. Operation, upgrade and development of the accelerators for the near-term and longer-term particle physics program at the Intensity Frontier face formidable challenges. Here we discuss key elements of the accelerator physics and technology R&D program toward future multi-MW proton accelerators and present its status and progress.

  11. Cancer nanotechnology research in the United States and China: cooperation to promote innovation.

    PubMed

    Schneider, Julie A; Grodzinski, Piotr; Liang, Xing-Jie

    2011-01-01

    The application of nanotechnology to cancer research is a promising area for US-China cooperation. Cancer is a major public health burden in both countries, and progress in cancer nanotechnology research is increasing in several fields, including imaging, biomarker detection, and targeted drug delivery. The United States and China are international leaders in nanotechnology research, and have both launched national programs to support nanotechnology efforts in the recent past. The accelerating trend of co-authorship among US and Chinese nanotechnology researchers demonstrates that individual scientists already recognize the potential for cooperation, providing a strong platform for creating additional partnerships in pre-competitive research areas. Mechanisms that could help to enhance US-China cancer nanotechnology partnerships include: developing new programs for bi-directional training and exchange; convening workshops focused on specific scientific topics of high priority to both countries; and joint support of collaborative research projects by US and Chinese funders. In addition to the accelerating scientific progress, expanded cooperation will stimulate important dialog on regulatory, policy, and technical issues needed to lay the groundwork for US and Chinese scientists to move greater numbers of cancer nanotechnology applications into the clinic. Copyright © 2011 John Wiley & Sons, Inc.

  12. LHC: The Emptiest Space in the Solar System

    ERIC Educational Resources Information Center

    Cid-Vidal, Xabier; Cid, Ramon

    2011-01-01

    Proton beams have been colliding at 7 TeV in the Large Hadron Collider (LHC) since 30 March 2010, meaning that the LHC research programme is underway. Particle physicists around the world are looking forward to using the data from these collisions, as the LHC is running at an energy three and a half times higher than previously achieved at any…

  13. Impact of 7-TeV/c large hadron collider proton beam on a copper target

    NASA Astrophysics Data System (ADS)

    Tahir, N. A.; Goddard, B.; Kain, V.; Schmidt, R.; Shutov, A.; Lomonosov, I. V.; Piriz, A. R.; Temporal, M.; Hoffmann, D. H. H.; Fortov, V. E.

    2005-04-01

    The Large Hadron Collider (LHC) will allow collisions between two 7 TeV/c proton beams, each comprising 2808 bunches with 1.1×10¹¹ protons per bunch, traveling in opposite directions. The bunch length is 0.5 ns and neighboring bunches are separated by 25 ns, so the duration of the entire beam is about 89 μs. The beam power profile in the transverse direction is Gaussian with a standard deviation of 0.2 mm. The energy stored in each beam is about 350 MJ, sufficient to melt 500 kg of copper. In case of a failure in the machine protection systems, the entire beam could impact directly onto accelerator equipment. A first estimate of the scale of damage resulting from such a failure has been assessed for a solid copper target hit by the beam, by carrying out three-dimensional energy deposition calculations and two-dimensional numerical simulations of the hydrodynamic and thermodynamic response of the target. This work has shown that the penetration depth of the LHC protons will be between 10 and 40 m in solid copper. These calculations show that the material conditions obtained in the target are similar to those planned for beam impact at dedicated accelerators designed to study the physics of high-energy-density states of matter, for example, the Facility for Antiprotons and Ion Research at the Gesellschaft für Schwerionenforschung, Darmstadt [W. F. Henning, Nucl. Instrum. Methods Phys. Res. B 214, 211 (2004)].
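    The quoted ~350 MJ per beam follows directly from the bunch parameters above, and the "500 kg of copper" comparison from standard thermal properties of copper. A quick arithmetic check, where the copper material constants are assumed handbook values rather than numbers from the paper:

```python
# Check of the quoted ~350 MJ stored beam energy from the bunch parameters.
N_BUNCHES = 2808
PROTONS_PER_BUNCH = 1.1e11
E_PROTON_EV = 7e12          # 7 TeV per proton
EV_TO_JOULE = 1.602e-19

energy_j = N_BUNCHES * PROTONS_PER_BUNCH * E_PROTON_EV * EV_TO_JOULE
print(f"stored energy ≈ {energy_j / 1e6:.0f} MJ")  # ≈ 346 MJ, i.e. about 350 MJ

# Rough mass of copper this energy could heat from room temperature and melt
# (assumed handbook values for copper, not taken from the paper):
CP_CU = 385.0        # J/(kg K), specific heat
DT = 1358.0 - 300.0  # K, room temperature up to the melting point
L_FUSION = 2.05e5    # J/kg, latent heat of fusion

mass_kg = energy_j / (CP_CU * DT + L_FUSION)
print(f"≈ {mass_kg:.0f} kg of copper")  # a few hundred kg, same order as 500 kg
```

    The melt estimate ignores heat losses and assumes uniform deposition, so it should be read only as an order-of-magnitude confirmation of the paper's comparison.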

  14. Big Science and the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Giudice, Gian Francesco

    2012-03-01

    The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.

  15. Simon van der Meer (1925-2011):. A Modest Genius of Accelerator Science

    NASA Astrophysics Data System (ADS)

    Chohan, Vinod C.

    2011-02-01

    Simon van der Meer was a brilliant scientist and a true giant of accelerator science. His seminal contributions to accelerator science remain essential to this day in our quest to satisfy the demands of modern particle physics. Whether we speak of long-baseline neutrino physics, antiproton-proton physics at Fermilab or proton-proton physics at the LHC, his techniques and inventions have been a vital part of modern-day successes. Simon van der Meer and Carlo Rubbia were the first CERN scientists to become Nobel laureates in Physics, in 1984. Van der Meer's lesser-known contributions spanned a whole range of subjects in accelerator science, from magnet design to power supply design, beam measurements, slow beam extraction, sophisticated programs and controls.

  17. Study of new FNAL-NICADD extruded scintillator as active media of large EMCal of ALICE at LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oleg A. Grachov et al.

    The current conceptual design of the proposed Large EMCal of ALICE at the LHC is based largely on the scintillating megatile/fiber technology implemented in the CDF Endplug upgrade project and in both the barrel and endcap electromagnetic calorimeters of STAR. The cost of scintillating material led us to choose an extruded polystyrene-based scintillator, which is available from the new FNAL-NICADD facility. Results of optical measurements, such as light yield and light-yield variation, show that it is possible to use this material as the active medium of the Large EMCal of ALICE at the LHC.

  18. Air Liquide's Contribution to the CERN LHC Refrigeration System

    NASA Astrophysics Data System (ADS)

    Dauguet, P.; Gistau-Baguer, G. M.; Briend, P.; Hilbert, B.; Monneret, E.; Villard, J. C.; Marot, G.; Delcayre, F.; Mantileri, C.; Hamber, F.; Courty, J. C.; Hirel, P.; Cohu, A.; Moussavi, H.

    2008-03-01

    The Large Hadron Collider (LHC) is the largest particle accelerator in the world. It is a superconducting machine over 27 km in circumference. Its magnets and cavities require helium refrigeration and liquefaction over the temperature range of 1.8 K to 300 K. This is the largest cryogenic system in the world with respect to the needed cryogenic power: 144-kW equivalent power at 4.5 K. The LHC cryogenic system is composed of 8×18 kW at 4.5 K refrigerators, 8×2.4 kW at 1.8 K systems, 5 main valve boxes, more than 27 km of helium transfer lines and around 300 service modules connecting the transfer line to the magnet and cavity strings. More than half of these components have been designed, manufactured, installed and commissioned by Air Liquide. Due to the huge size of the project, the engineering, construction and commissioning of the equipment has lasted for 8 years, from the first order of equipment in 1998 to final commissioning in 2006. Specifications, architecture and the Air Liquide design of major components of the LHC Refrigeration System are presented in this paper.

  19. CERN openlab: Engaging industry for innovation in the LHC Run 3-4 R&D programme

    NASA Astrophysics Data System (ADS)

    Girone, M.; Purcell, A.; Di Meglio, A.; Rademakers, F.; Gunne, K.; Pachou, M.; Pavlou, S.

    2017-10-01

LHC Run 3 and Run 4 represent an unprecedented challenge for HEP computing in terms of both data volume and complexity. New approaches are needed for collecting, filtering, processing, moving, storing and analysing data if these challenges are to be met within a realistic budget. To develop innovative techniques, we are fostering relationships with industry leaders. CERN openlab is a unique resource for public-private partnership between CERN and leading Information and Communication Technology (ICT) companies. Its mission is to accelerate the development of cutting-edge solutions to be used by the worldwide HEP community. In 2015, CERN openlab started its phase V with a strong focus on tackling the upcoming LHC challenges. Several R&D programs are ongoing in the areas of data acquisition, networks and connectivity, data storage architectures, compute provisioning, computing platforms and code optimisation, and data analytics. This paper gives an overview of the innovative technologies currently being explored by CERN openlab V and discusses the long-term strategies pursued by the LHC communities, with the help of industry, to close the technological gap in the processing and storage needs expected in Run 3 and Run 4.

  20. Feasibility of applying the life history calendar in a population of chronic opioid users to identify patterns of drug use and addiction treatment.

    PubMed

    Fikowski, Jill; Marchand, Kirsten; Palis, Heather; Oviedo-Joekes, Eugenia

    2014-01-01

    Uncovering patterns of drug use and treatment access is essential to improving treatment for opioid dependence. The life history calendar (LHC) could be a valuable instrument for capturing time-sensitive data on lifetime patterns of drug use and addiction treatment. This study describes the methodology applied when collecting data using the LHC in a sample of individuals with long-term opioid dependence and aims to identify specific factors that impact the feasibility of administering the LHC interview. In this study, the LHC allowed important events such as births, intimate relationships, housing, or incarcerations to become reference points for recalling details surrounding drug use and treatment access. The paper concludes that the administration of the LHC was a resource-intensive process and required special attention to interviewer training and experience with the study population. These factors should be considered and integrated into study plans by researchers using the LHC in addiction research.

  1. Testing hadronic interactions at ultrahigh energies with air showers measured by the Pierre Auger Observatory

    DOE PAGES

    Aab, A.; Abreu, P.; Aglietta, M.; ...

    2016-10-31

    Ultrahigh energy cosmic ray air showers probe particle physics at energies beyond the reach of accelerators. Here we introduce a new method to test hadronic interaction models without relying on the absolute energy calibration, and apply it to events with primary energy 6–16 EeV (E CM = 110–170 TeV), whose longitudinal development and lateral distribution were simultaneously measured by the Pierre Auger Observatory. As a result, the average hadronic shower is 1.33±0.16 (1.61±0.21) times larger than predicted using the leading LHC-tuned models EPOS-LHC (QGSJetII-04), with a corresponding excess of muons.
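As a quick cross-check of the quoted energies, the fixed-target center-of-mass energy follows from sqrt(s) ≈ sqrt(2·E_lab·m_p c²). A minimal sketch, using the PDG proton mass and the primary energies from the abstract:

```python
import math

M_P = 0.938272  # proton rest energy in GeV (PDG value)

def ecm_fixed_target(e_lab_gev):
    # sqrt(s) for an ultra-relativistic cosmic-ray primary striking a nucleon at rest
    return math.sqrt(2.0 * e_lab_gev * M_P)

for e_eev in (6.0, 16.0):
    e_gev = e_eev * 1e9  # 1 EeV = 10^9 GeV
    print(f"{e_eev:.0f} EeV -> sqrt(s) ~ {ecm_fixed_target(e_gev) / 1e3:.0f} TeV")
```

This reproduces, to rounding, the E_CM = 110-170 TeV range given in the abstract.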

  2. Test of Relativistic Gravity for Propulsion at the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Felber, Franklin

    2010-01-01

    A design is presented of a laboratory experiment that could test the suitability of relativistic gravity for propulsion of spacecraft to relativistic speeds. An exact time-dependent solution of Einstein's gravitational field equation confirms that even the weak field of a mass moving at relativistic speeds could serve as a driver to accelerate a much lighter payload from rest to a good fraction of the speed of light. The time-dependent field of ultrarelativistic particles in a collider ring is calculated. An experiment is proposed as the first test of the predictions of general relativity in the ultrarelativistic limit by measuring the repulsive gravitational field of bunches of protons in the Large Hadron Collider (LHC). The estimated `antigravity beam' signal strength at a resonant detector of each proton bunch is 3 nm/s2 for 2 ns during each revolution of the LHC. This experiment can be performed off-line, without interfering with the normal operations of the LHC.
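A back-of-envelope check of the quoted "2 ns during each revolution": with the nominal 26.66 km circumference (a published LHC parameter, not stated in the abstract), a proton bunch moving at essentially c circulates every ~89 µs, so a detector sees the 3 nm/s² pulse at roughly 11 kHz with a duty factor of order 2×10⁻⁵:

```python
C_LHC = 26_659.0    # m, LHC circumference (nominal published value; an assumption here)
C_LIGHT = 2.9979e8  # m/s; 7 TeV protons move at essentially the speed of light

t_rev = C_LHC / C_LIGHT  # revolution period, ~8.9e-5 s
f_rev = 1.0 / t_rev      # ~11 kHz repetition rate of the gravitational pulse
duty = 2e-9 / t_rev      # fraction of time the 3 nm/s^2 signal is present
```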

  3. Teaching and Research with Accelerators at Tarleton State University

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marble, Daniel K.

    2009-03-10

Tarleton State University students began performing both research and laboratory experiments using accelerators in 1998 through visitation programs at the University of North Texas, the US Army Research Laboratory, and the Naval Surface Warfare Center at Carderock. In 2003, Tarleton outfitted its new science building with a 1 MV Pelletron donated by the California Institute of Technology. The accelerator has been upgraded and supports a wide range of classes for both the Physics program and the ABET-accredited Engineering Physics program, as well as supplying undergraduate research opportunities on campus. A discussion of various laboratory activities and research projects performed by Tarleton students will be presented.

  4. Helping Research Organizations Build a Clean Energy Future | Working with Us | NREL

    Science.gov Websites

    Partner with NREL to accelerate the research and development of your ...

  5. High Gradient Accelerator Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Temkin, Richard

The goal of the MIT program of research on high-gradient acceleration is the development of advanced acceleration concepts that lead to a practical and affordable next-generation linear collider at the TeV energy level. Other, more near-term applications include accelerators for materials processing, medicine, defense, mining, security, and inspection. The specific goals of the MIT program are:
    • Pioneering theoretical research on advanced structures for high-gradient acceleration, including photonic structures and metamaterial structures, and evaluation of the wakefields in these advanced structures
    • Experimental research to demonstrate the properties of advanced structures, both in low-power microwave cold test and in high-power, high-gradient tests at megawatt power levels
    • Experimental research on microwave breakdown at high gradient, including studies of breakdown phenomena induced by RF electric fields and RF magnetic fields, and development of new diagnostics of the breakdown process
    • Theoretical research on the physics and engineering features of RF vacuum breakdown
    • Maintaining and improving the Haimson/MIT 17 GHz accelerator, the highest-frequency operational accelerator in the world and a unique facility for accelerator research
    • Providing the Haimson/MIT 17 GHz accelerator as a facility for outside users
    • Active participation in the US DOE High Gradient Collaboration, including joint work with SLAC and with Los Alamos National Laboratory, and participation of MIT students in research at the national laboratories
    • Training the next generation of Ph.D. students in the field of accelerator physics.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larbalestier, David C.; Lee, Peter J.; Tarantini, Chiara

All present circular accelerators use superconducting magnets to bend and to focus the particle beams. The most powerful of these machines is the Large Hadron Collider (LHC) at CERN. The main-ring dipole magnets of the LHC are made from Nb-Ti but, as the machine is upgraded to higher luminosity, more powerful magnets made of Nb3Sn will be required. Our work addresses how to make Nb3Sn conductors more effective and more suitable for use in the LHC. The most important property of the superconducting conductor used for an accelerator magnet is that it must have very high critical current density, the property that allows the generation of high magnetic fields in small spaces. Nb3Sn is the original high-field superconductor, the material discovered in 1960 to allow a high current density in a field of about 9 T. For the high-luminosity upgrade of the LHC, much higher current densities in fields of about 12 T will be required; the critical value of the current density is of order 2600 A/mm² at 12 T. But there are very important secondary factors that complicate the attainment of this critical current density. The first is that the effective filament diameter must be no larger than about 40 µm. The second is that the 50% of the Nb3Sn conductor cross-section that is pure copper must be protected from poisoning by any Sn leaking through the diffusion barrier that surrounds the package of niobium and tin from which the Nb3Sn is formed by a high-temperature reaction. These three somewhat conflicting requirements mean that optimization of the conductor is complex. The work described in this contract report addresses these conflicting requirements. It shows that very sophisticated characterizations can uncover the way to satisfy all three requirements, and it also suggests that the ultimate optimization of Nb3Sn is still not yet in sight.
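Putting the quoted figures together: with J_c ≈ 2600 A/mm² in the non-copper fraction and half the cross-section reserved for stabilizing copper, a strand's critical current follows directly. A sketch, assuming a 0.85 mm strand diameter (a typical HL-LHC Nb3Sn wire size, not stated in the report):

```python
import math

J_C = 2600.0       # A/mm^2, non-Cu critical current density at 12 T (from the report)
D_STRAND = 0.85    # mm, assumed strand diameter (typical HL-LHC wire; an assumption here)
CU_FRACTION = 0.5  # half the cross-section is protective copper (from the report)

area = math.pi * D_STRAND**2 / 4.0      # total strand cross-section, mm^2
i_c = J_C * area * (1.0 - CU_FRACTION)  # critical current carried by the Nb3Sn fraction, ~740 A
```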

  7. USPAS | U.S. Particle Accelerator School

    Science.gov Websites

U.S. Particle Accelerator School: Education in Beam Physics and Accelerator Technology.

  8. Volunteer Clouds and Citizen Cyberscience for LHC Physics

    NASA Astrophysics Data System (ADS)

    Aguado Sanchez, Carlos; Blomer, Jakob; Buncic, Predrag; Chen, Gang; Ellis, John; Garcia Quintas, David; Harutyunyan, Artem; Grey, Francois; Lombrana Gonzalez, Daniel; Marquina, Miguel; Mato, Pere; Rantala, Jarno; Schulz, Holger; Segal, Ben; Sharma, Archana; Skands, Peter; Weir, David; Wu, Jie; Wu, Wenjing; Yadav, Rohit

    2011-12-01

    Computing for the LHC, and for HEP more generally, is traditionally viewed as requiring specialized infrastructure and software environments, and therefore not compatible with the recent trend in "volunteer computing", where volunteers supply free processing time on ordinary PCs and laptops via standard Internet connections. In this paper, we demonstrate that with the use of virtual machine technology, at least some standard LHC computing tasks can be tackled with volunteer computing resources. Specifically, by presenting volunteer computing resources to HEP scientists as a "volunteer cloud", essentially identical to a Grid or dedicated cluster from a job submission perspective, LHC simulations can be processed effectively. This article outlines both the technical steps required for such a solution and the implications for LHC computing as well as for LHC public outreach and for participation by scientists from developing regions in LHC research.

  9. Controlled Cold Helium Spill Test in the LHC Tunnel at CERN

    NASA Astrophysics Data System (ADS)

    Koettig, T.; Casas-Cubillos, J.; Chorowski, M.; Dufay-Chanat, L.; Grabowski, M.; Jedrusyna, A.; Lindell, G.; Nonis, M.; Vauthier, N.; van Weelderen, R.; Winkler, T.; Bremer, J.

The helium-cooled magnets of the LHC particle accelerator are installed in a confined space: a 27 km circumference, 3.8 m diameter underground tunnel. The vacuum enclosures of the superconducting LHC magnets are protected by lift plates against excessive overpressure created by possible leaks from the magnet helium bath or from the helium supply headers. A three-meter-long no-stay zone, centered on these plates and based on earlier scale-model studies, has been defined to protect personnel against the consequences of a possible opening of such a lift plate. More recently, several simulation studies have modelled the propagation of the resulting helium/air mixture along the tunnel in case of a cold helium release at a rate in the range of 1 kg/s. To validate the various scale models and simulation studies, real-life mock-up tests have been performed in the LHC, releasing about 1000 liters of liquid helium under standard operational tunnel conditions. Data recorded during these tests include oxygen level, temperature and flow speed, as well as video recordings, taken upstream and downstream of the spill point (-100 m to +200 m) with respect to the ventilation direction in the LHC tunnel. The experimental set-up and measurement results are presented. Generic effects found during the tests will be discussed to allow transposition to possible cold helium release cases in similar facilities.
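The scale of such a release can be estimated from the liquid-to-gas expansion of helium: 1000 L of liquid is roughly 125 kg, or on the order of 750 m³ of warm gas. A rough sketch using approximate handbook densities (not values from the test itself):

```python
RHO_LIQUID = 125.0  # g/L, liquid helium near 4.2 K (approximate handbook value)
RHO_GAS = 0.164     # g/L, helium gas at ~293 K and 1 atm (approximate)
SPILL_LITERS = 1000.0

mass_kg = SPILL_LITERS * RHO_LIQUID / 1000.0  # ~125 kg of helium released
gas_m3 = mass_kg * 1000.0 / RHO_GAS / 1000.0  # warm-gas volume, ~760 m^3
tunnel_area_m2 = 3.1416 * (3.8 / 2.0) ** 2    # 3.8 m diameter tunnel cross-section
plug_length_m = gas_m3 / tunnel_area_m2       # tunnel length the gas alone would fill
```

The ~70 m of tunnel the undiluted gas would occupy puts the -100 m to +200 m instrumented range of the test in context.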

  10. Experimental And Theoretical High Energy Physics Research At UCLA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cousins, Robert D.

    2013-07-22

This is the final report of the UCLA High Energy Physics DOE Grant No. DE-FG02-91ER40662. This report covers the last grant project period, namely the three years beginning January 15, 2010, plus extensions through April 30, 2013. The report describes the broad range of our experimental research spanning direct dark matter detection searches using both liquid xenon (XENON) and liquid argon (DARKSIDE); present (ICARUS) and R&D for future (LBNE) neutrino physics; ultra-high-energy neutrino and cosmic ray detection (ANITA); and the highest-energy accelerator-based physics with the CMS experiment at CERN's Large Hadron Collider. For our theory group, the report describes frontier activities including particle astrophysics and cosmology; neutrino physics; LHC interaction cross section calculations now feasible due to breakthroughs in theoretical techniques; and advances in the formal theory of supergravity.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fermilab

    Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists around the world collaborate with Fermilab on research at the frontiers of discovery.

  12. Developments in the ATLAS Tracking Software ahead of LHC Run 2

    NASA Astrophysics Data System (ADS)

    Styles, Nicholas; Bellomo, Massimiliano; Salzburger, Andreas; ATLAS Collaboration

    2015-05-01

After a hugely successful first run, the Large Hadron Collider (LHC) is currently in a shut-down period, during which essential maintenance and upgrades are being performed on the accelerator. The ATLAS experiment, one of the four large LHC experiments, has also used this period for consolidation and further development of the detector and of its software framework, ahead of the new challenges that will be brought by the increased centre-of-mass energy and instantaneous luminosity in the next run period. This is of particular relevance for the ATLAS tracking software, responsible for reconstructing the trajectories of charged particles through the detector, which faces a steep increase in CPU consumption due to the additional combinatorics of the high-multiplicity environment. The steps taken to mitigate this increase and stay within the available computing resources, while maintaining the excellent performance of the tracking software in terms of the information provided to physics analyses, will be presented. Particular focus will be given to changes to the Event Data Model, replacement of the maths library, and adoption of a new persistent output format. The resulting CPU profiling results will be discussed, as well as the performance of the algorithms for physics processes under the expected conditions for the next LHC run.

  13. Experiential learning in high energy physics: a survey of students at the LHC

    NASA Astrophysics Data System (ADS)

    Camporesi, Tiziano; Catalano, Gelsomina; Florio, Massimo; Giffoni, Francesco

    2017-03-01

More than 36 000 students and post-docs will be involved until 2025 in research at the Large Hadron Collider (LHC), mainly through international collaborations. To what extent do they value the skills acquired? Do students expect that their learning experience will have an impact on their professional future? Drawing on earlier literature on experiential learning, we have designed a survey of current and former students at the LHC. To quantitatively measure the students' perceptions, we compare the salary expectations of current students with the assessments of those now employed in different jobs. Survey data are analysed with ordered logistic regression models, which allow multivariate statistical analyses with limited dependent variables. Results suggest that experiential learning at the LHC positively correlates with both current and former students' salary expectations. Those already employed clearly confirm the expectations of current students. At least two not mutually exclusive explanations underlie the results. First, the training at the LHC is perceived to provide students with valuable skills, which in turn affect salary expectations; second, the LHC research experience per se may act as a signal in the labour market. Respondents put a price tag on their learning experience: an 'LHC salary premium' ranging from 5% to 12% compared with what they would have expected for their career without such an experience at CERN.

  14. Simulations of the failure scenarios of the crab cavities for the nominal scheme of the LHC

    NASA Astrophysics Data System (ADS)

    Yee, B.; Calaga, R.; Zimmermann, F.; Lopez, R.

    2012-02-01

The crab cavity (CC) represents a possible solution to the reduction in luminosity caused by the crossing angle of two colliding beams. The CC is a superconducting radio-frequency (RF) cavity which applies a transverse kick to a bunch of particles, producing a rotation that restores head-on collisions and improves the luminosity. For this reason, the Accelerators and Beams Physics group of CERN's Beams Department (BE-ABP) has studied the implementation of the CC scheme in the LHC. It is essential to study the failure scenarios and the damage that could be done to the lattice devices. We have performed simulations of these failures for the nominal scheme.

  15. Lead Ions and Coulomb's Law at the LHC (CERN)

    ERIC Educational Resources Information Center

    Cid-Vidal, Xabier; Cid, Ramon

    2018-01-01

    Although for most of the time the Large Hadron Collider (LHC) at CERN collides protons, for around one month every year lead ions are collided, to expand the diversity of the LHC research programme. Furthermore, in an effort not originally foreseen, proton-lead collisions are also taking place, with results of high interest to the physics…

  16. Fact Sheets and Brochures | News

    Science.gov Websites


  17. Strangeness Production in the ALICE Experiment at the LHC

    NASA Astrophysics Data System (ADS)

    Johnson, Harold; Fenner, Kiara; Harton, Austin; Garcia-Solis, Edmundo; Soltz, Ron

    2015-04-01

The study of strange particle production is an important tool in understanding the properties of the hot and dense medium, the quark-gluon plasma (QGP), created in heavy-ion collisions at ultra-relativistic energies. This quark-gluon plasma is believed to have been present just after the Big Bang. The Standard Model of physics contains six types of quarks. Strange quarks are not among the valence quarks found in protons and neutrons, and strange quark production is sensitive to the extremely high temperatures of the QGP. CERN's Large Hadron Collider accelerates particles to nearly the speed of light before colliding them to create this QGP state. In the results of high-energy particle collisions, hadrons are formed out of quarks and gluons as the system cools from extremely high temperatures. A jet is a highly collimated cone of particles coming from the hadronization of a single quark or gluon, and understanding jet interactions may give us clues about the QGP. Using FastJet (a widely used jet-finding package), we extracted the strange-particle content of jets in proton-proton collisions during our research at CERN. We have identified jets with and without strange particles in proton-proton collisions, and we will present a comparison of the pT spectra in both cases. This material is based upon work supported by the National Science Foundation under grants PHY-1305280 and PHY-1407051.
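The clustering step the abstract delegates to FastJet can be illustrated with a toy version of the anti-kt distance measure. This is a pedagogical O(n³) sketch with a simplified pt-weighted recombination scheme, not the FastJet implementation:

```python
import math

def antikt_cluster(particles, R=0.4):
    """Toy anti-kt clustering of (pt, eta, phi) tuples; O(n^3), pedagogical only."""
    objs = [list(p) for p in particles]
    jets = []
    while objs:
        n = len(objs)
        # beam distance d_iB = 1/pt_i^2; start from the smallest one
        ib = min(range(n), key=lambda i: 1.0 / objs[i][0] ** 2)
        dmin, pair = 1.0 / objs[ib][0] ** 2, None
        for i in range(n):
            for j in range(i + 1, n):
                dphi = abs(objs[i][2] - objs[j][2])
                dphi = min(dphi, 2.0 * math.pi - dphi)
                dr2 = (objs[i][1] - objs[j][1]) ** 2 + dphi ** 2
                dij = min(1.0 / objs[i][0] ** 2, 1.0 / objs[j][0] ** 2) * dr2 / R ** 2
                if dij < dmin:
                    dmin, pair = dij, (i, j)
        if pair is None:
            jets.append(tuple(objs.pop(ib)))  # nothing closer than the beam: emit a jet
        else:
            i, j = pair
            a, b = objs[i], objs[j]
            pt = a[0] + b[0]
            eta = (a[0] * a[1] + b[0] * b[1]) / pt  # pt-weighted recombination (toy scheme)
            phi = (a[0] * a[2] + b[0] * b[2]) / pt
            del objs[j], objs[i]  # j > i, so delete j first
            objs.append([pt, eta, phi])
    return jets

# two nearby hard particles merge into one jet; the far soft particle stays separate
jets = antikt_cluster([(100.0, 0.0, 0.0), (50.0, 0.1, 0.1), (10.0, 3.0, 2.0)])
```

Because anti-kt prioritises small distances around hard particles, the 100 and 50 GeV particles (separated by ΔR ≈ 0.14 < R) are combined first, yielding two jets in total.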

  18. Defocusing beam line design for an irradiation facility at the TAEA SANAEM Proton Accelerator Facility

    NASA Astrophysics Data System (ADS)

    Gencer, A.; Demirköz, B.; Efthymiopoulos, I.; Yiğitoğlu, M.

    2016-07-01

Electronic components must be tested to ensure reliable performance in high-radiation environments such as the High-Luminosity LHC (HL-LHC) and space. We propose a defocusing beam line to perform proton irradiation tests in Turkey. The Turkish Atomic Energy Authority SANAEM Proton Accelerator Facility was inaugurated in May 2012 for radioisotope production. The facility also has an R&D room for research purposes. The accelerator produces protons with 30 MeV kinetic energy, and the beam current is variable between 10 μA and 1.2 mA. The beam kinetic energy is suitable for irradiation tests; however, the beam current is high, and therefore the flux must be lowered. We plan to build a defocusing beam line (DBL) in order to enlarge the beam size and reduce the flux to match the required specifications for the irradiation tests. The current design includes the beam transport and the final focusing magnets to blow up the beam. Scattering foils and a collimator are placed to reduce the beam flux. The DBL is designed to provide fluxes between 10⁷ p/cm²/s and 10⁹ p/cm²/s for performing irradiation tests over an area of 15.4 cm × 21.5 cm. The facility will be the first irradiation facility of its kind in Turkey.
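The abstract's own numbers show why the flux must be lowered: even the minimum 10 µA current delivers ~6×10¹³ protons per second, and spread uniformly over the stated 15.4 cm × 21.5 cm area that is still ~2×10¹¹ p/cm²/s, two to four orders of magnitude above the 10⁷-10⁹ target. A sketch of the arithmetic:

```python
E_CHARGE = 1.602e-19    # C, elementary charge
I_MIN = 10e-6           # A, minimum beam current quoted in the abstract
AREA_CM2 = 15.4 * 21.5  # cm^2, irradiation area from the abstract

rate = I_MIN / E_CHARGE              # protons per second, ~6.2e13
raw_flux = rate / AREA_CM2           # p/cm^2/s if spread uniformly, before any attenuation
attenuation_needed = raw_flux / 1e9  # factor to reach even the top of the 1e7-1e9 target
```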

  19. ACCELERATED EXTRACTION OF ORGANIC POLLUTANTS USING MICROWAVE ENERGY

    EPA Science Inventory

This study, part of an ongoing U.S. Environmental Protection Agency research program carried out by the National Exposure Research Laboratory, Characterization Research Division-Las Vegas (formerly the Environmental Monitoring Systems Laboratory-Las Vegas), addresses new sample pr...

  20. Detector Developments for the High Luminosity LHC Era (1/4)

    ScienceCinema

    Straessner, Arno

    2018-04-27

    Calorimetry and Muon Spectrometers - Part I : In the first part of the lecture series, the motivation for a high luminosity upgrade of the LHC will be quickly reviewed together with the challenges for the LHC detectors. In particular, the plans and ongoing research for new calorimeter detectors will be explained. The main issues in the high-luminosity era are an improved radiation tolerance, natural ageing of detector components and challenging trigger and physics requirements. The new technological solutions for calorimetry at a high-luminosity LHC will be reviewed.

  1. Thin Film Approaches to the SRF Cavity Problem Fabrication and Characterization of Superconducting Thin Films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beringer, Douglas

Superconducting Radio Frequency (SRF) cavities are responsible for the acceleration of charged particles to relativistic velocities in most modern linear accelerators, such as those employed at high-energy research facilities like Thomas Jefferson National Laboratory's CEBAF and the LHC at CERN. Recognizing SRF as primarily a surface phenomenon enables the possibility of applying thin films to the interior surface of SRF cavities, opening a formidable tool chest of opportunities by combining and designing materials that offer greater performance benefit. Thus, while improvements in radio frequency cavity design and refinements in cavity processing techniques have improved accelerator performance and efficiency – 1.5 GHz bulk niobium SRF cavities have achieved accelerating gradients in excess of 35 MV/m – there exist fundamental material bounds in bulk superconductors limiting the maximum sustainable accelerating field gradient (≈ 45 MV/m for Nb), where inevitable thermodynamic breakdown occurs. With state-of-the-art Nb-based cavity design fast approaching these theoretical limits, novel material innovations must be sought in order to realize next-generation SRF cavities. One proposed method to improve SRF performance is to utilize thin-film superconducting-insulating-superconducting (SIS) multilayer structures to magnetically screen a bulk superconducting layer so that it can operate at higher field gradients before suffering critically detrimental SRF losses. This dissertation focuses on the production and characterization of thin-film superconductors for such SIS layers for radio frequency applications. Correlated studies of the structure, surface morphology and superconducting properties of epitaxial Nb and MgB2 thin films are presented.
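The magnetic-screening idea behind SIS multilayers can be made concrete with the simplest London-model estimate: a parallel surface field decays roughly as exp(-t/λ) through a superconducting film of thickness t. This is a semi-infinite-slab sketch (a real thin film backed by an insulator screens somewhat less effectively; the exact solution is cosh-based), with λ ≈ 40 nm assumed for niobium:

```python
import math

LAMBDA_NB = 40e-9  # m, assumed London penetration depth of niobium (approximate)

def screened_fraction(thickness_m, lam=LAMBDA_NB):
    # crude London-model attenuation of a parallel field behind a superconducting film
    return math.exp(-thickness_m / lam)
```

For example, a 200 nm film (five penetration depths) passes less than 1% of the applied field to the layer behind it, which is the motivation for screening a bulk layer with thin-film multilayers.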

  2. Measurements of beam halo diffusion and population density in the Tevatron and in the Large Hadron Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stancari, Giulio

    2015-03-01

Halo dynamics influences global accelerator performance: beam lifetimes, emittance growth, dynamic aperture, and collimation efficiency. Halo monitoring and control are also critical for the operation of high-power machines. For instance, in the high-luminosity upgrade of the LHC, the energy stored in the beam tails may reach several megajoules. Fast losses can result in superconducting magnet quenches, magnet damage, or even collimator deformation. The need arises to measure the beam halo and to remove it at controllable rates. In the Tevatron and in the LHC, halo population densities and diffusivities were measured with collimator scans by observing the time evolution of losses following small inward or outward collimator steps, under different experimental conditions: with single beams and in collision, and, in the case of the Tevatron, with a hollow electron lens acting on a subset of bunches. After the LHC resumes operations, it is planned to compare measured diffusivities with the known strength of transverse damper excitations. New proposals for nondestructive halo population density measurements are also briefly discussed.

  3. Heavy-ion physics with the ALICE experiment at the CERN Large Hadron Collider.

    PubMed

    Schukraft, J

    2012-02-28

    After close to 20 years of preparation, the dedicated heavy-ion experiment A Large Ion Collider Experiment (ALICE) took first data at the CERN Large Hadron Collider (LHC) accelerator with proton collisions at the end of 2009 and with lead nuclei at the end of 2010. After a short introduction into the physics of ultra-relativistic heavy-ion collisions, this article recalls the main design choices made for the detector and summarizes the initial operation and performance of ALICE. Physics results from this first year of operation concentrate on characterizing the global properties of typical, average collisions, both in proton-proton (pp) and nucleus-nucleus reactions, in the new energy regime of the LHC. The pp results differ, to a varying degree, from most quantum chromodynamics-inspired phenomenological models and provide the input needed to fine tune their parameters. First results from Pb-Pb are broadly consistent with expectations based on lower energy data, indicating that high-density matter created at the LHC, while much hotter and larger, still behaves like a very strongly interacting, almost perfect liquid.

  4. Advanced Operating System Technologies

    NASA Astrophysics Data System (ADS)

    Cittolin, Sergio; Riccardi, Fabio; Vascotto, Sandro

In this paper we describe an R&D effort to define an OS architecture suitable for the requirements of the data acquisition and control of an LHC experiment. Large distributed computing systems are foreseen to be the core part of the DAQ and control systems of the future LHC experiments. Networks of thousands of processors, handling dataflows of several gigabytes per second with very strict timing constraints (microseconds), will become a common experience in the following years. Problems like distributed scheduling, real-time communication protocols, fault tolerance, and distributed monitoring and debugging will have to be faced. A solid software infrastructure will be required to manage this very complicated environment; at this moment neither does CERN have the necessary expertise to build it, nor does any similar commercial implementation exist. Fortunately these problems are not unique to particle and high energy physics experiments, and current research in the distributed systems field, especially in the distributed operating systems area, is trying to address many of the above-mentioned issues. The world that we are going to face in the next ten years will be quite different, and surely much more interconnected, than the one we see now. Very ambitious projects exist, planning to link towns, nations and the world in a single "data highway". Teleconferencing, video on demand, and distributed multimedia applications are just a few examples of the very demanding tasks to which the computer industry is committing itself. These projects are triggering a great research effort in the distributed, real-time, micro-kernel-based operating systems field and in the software engineering area.
The purpose of our group is to collect the outcome of these different research efforts, and to establish a working environment where the different ideas and techniques can be tested, evaluated and possibly extended, to address the requirements of a DAQ and control system suitable for the LHC. Our work started in the second half of 1994, with a research agreement between CERN and Chorus Systemes (France), a world leader in micro-kernel OS technology. The Chorus OS is targeted at distributed real-time applications, and it can very efficiently support different "OS personalities" in the same environment, such as POSIX, UNIX, and a CORBA-compliant distributed object architecture. Projects are being set up to verify the suitability of our work for LHC applications: we are building a scaled-down prototype of the DAQ system foreseen for the CMS experiment at the LHC, where we will directly test our protocols and where we will be able to make measurements and benchmarks, guiding our development and allowing us to build an analytical model of the system suitable for simulation and large-scale verification.

  5. B_s oscillation and prospects for delta m_s at the Tevatron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menzemer, Stephanie; /MIT

    2005-07-01

    Until the start of the LHC, the Tevatron is the only running accelerator which produces enough B{sub s} mesons to perform {Delta}m{sub s} measurements. The status--as it was at the time of the conference--of two different {Delta}m{sub s} analyses, performed by the CDF and D0 collaborations, will be presented.

  6. Proceedings of the 2005 International Linear Collider Workshop (LCWS05)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hewett, JoAnne; /SLAC

    2006-12-18

    Exploration of physics at the TeV scale holds the promise of addressing some of our most basic questions about the nature of matter, space, time, and energy. Discoveries of the Electroweak Symmetry Breaking mechanism, Supersymmetry, Extra Dimensions of space, Dark Matter particles, and new forces of nature are all possible. We have been waiting and planning for this exploration for over 20 years. In 2007 the Large Hadron Collider at CERN will begin its operation and will break into this new energy frontier. A new era of understanding will emerge as the LHC data maps out the Terascale. With the LHC discoveries, new compelling questions will arise. Responding to these questions will call for a new tool with greater sensitivity--the International Linear Collider. Historically, the most striking progress in the exploration of new energy frontiers has been made from combining results from hadron and electron-positron colliders. The precision measurements possible at the ILC will reveal the underlying theory which gave rise to the particles discovered at the LHC and will open the window to even higher energies. The world High Energy Physics community has reached an accord that an e+e- linear collider operating at 0.5-1.0 TeV would provide both unique and essential scientific opportunities; the community has endorsed with highest priority the construction of such a machine. A major milestone toward this goal was reached in August 2004 when the International Committee on Future Accelerators approved a recommendation for the technology of the future International Linear Collider. A global research and design effort is now underway to produce a global design report for the ILC. This endeavor is directed by Barry Barish of the California Institute of Technology. The offer, made by Jonathan Dorfan on behalf of ICFA, and acceptance of this directorship took place during the opening plenary session of this workshop.
The 2005 International Linear Collider Workshop was held at Stanford University from 18 March through 22 March, 2005. This workshop was hosted by the Stanford Linear Accelerator Center and sponsored by the World Wide Study for future e+e- linear colliders. It was the eighth in a series of International Workshops (the first was held in Saariselka, Finland in 1991) devoted to the physics and detectors associated with high energy e+e- linear colliders. 397 physicists from 24 countries participated in the workshop. These proceedings represent the presentations and discussions which took place during the workshop. The contributions are comprised of physics studies, detector specifications, and accelerator design for the ILC. These proceedings are organized in two Volumes and include contributions from both the plenary and parallel sessions.

  7. Reinventing the Accelerator for the High Energy Frontier

    ScienceCinema

    Rosenzweig, James [UCLA, Los Angeles, California, United States]

    2017-12-09

    The history of discovery in high-energy physics has been intimately connected with progress in methods of accelerating particles for the past 75 years. This remains true today, as the post-LHC era in particle physics will require significant innovation and investment in a superconducting linear collider. The choice of the linear collider as the next-generation discovery machine, and the selection of superconducting technology has rather suddenly thrown promising competing techniques -- such as very large hadron colliders, muon colliders, and high-field, high frequency linear colliders -- into the background. We discuss the state of such conventional options, and the likelihood of their eventual success. We then follow with a much longer view: a survey of a new, burgeoning frontier in high energy accelerators, where intense lasers, charged particle beams, and plasmas are all combined in a cross-disciplinary effort to reinvent the accelerator from its fundamental principles on up.

  8. High Energy Physics Research with the CMS Experiment at CERN - Energy Frontier Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, Gail G.

    The Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) near Geneva, Switzerland, is now the highest energy accelerator in the world, colliding protons with protons. On July 4, 2012, the two general-purpose experiments, ATLAS and the Compact Muon Solenoid (CMS) experiment, announced the observation of a particle consistent with the world’s most sought-after particle, the Higgs boson, at a mass of about 125 GeV (approximately 125 times the mass of the proton). The Higgs boson is the final missing ingredient of the standard model, in which it is needed to allow most other particles to acquire mass through the mechanism of electroweak symmetry breaking. We are members of the team in the CMS experiment that found evidence for the Higgs boson through its decay to two photons, the most sensitive channel at the LHC. We are proposing to carry out studies to determine whether the new particle has the properties expected for the standard model Higgs boson or whether it is something else. The new particle can still carry out its role in electroweak symmetry breaking but have other properties as well. Most theorists think that a single standard model Higgs boson cannot be the complete solution – there are other particles needed to answer some of the remaining questions, such as the hierarchy problem. The particle that has been observed could be one of several Higgs bosons, for example, or it could be composite. One model of physics beyond the standard model is supersymmetry, in which every ordinary particle has a superpartner with opposite spin properties. In supersymmetric models, there must be at least five Higgs bosons. In the most popular versions of supersymmetry, the lightest supersymmetric particle does not decay and is a candidate for dark matter. This proposal covers the period from June 1, 2013, to March 31, 2016. During this period the LHC will finally reach its design energy, almost twice the energy at which it now runs.
We will be able to study the Higgs boson at the current LHC energy using about three times as much data as were used to make the observation. In 2013 the LHC will shut down to make preparations to run at its design energy in 2015. During the shutdown period, we will be preparing upgrades of the detector to be able to run at the higher rates of proton-proton collisions that will also be possible once the LHC is running at design energy. The upgrade on which we are working, the inner silicon pixel tracker, will be installed in late 2016. Definitive tests of whether the new particle satisfies the properties of the standard model Higgs boson will almost certainly require both the higher energy and the larger amounts of data that can be accumulated using the higher rates. Meanwhile we will use the data taken during 2012 and the higher energy data starting in 2015 to continue to search for beyond-the-standard-model physics such as supersymmetry and heavy neutrinos. We have already made such searches using data since the LHC started running. We are discussing with theorists how a 125-GeV Higgs modifies such models. Finding such particles will probably also require the higher energy and larger amounts of data beginning in 2015. The period of this proposal promises to be very exciting, leading to new knowledge of the matter in the Universe.

  9. Proceedings of the 1995 Particle Accelerator Conference and international Conference on High-Energy Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1996-01-01

    Papers from the sixteenth biennial Particle Accelerator Conference, an international forum on accelerator science and technology held May 1–5, 1995, in Dallas, Texas, organized by Los Alamos National Laboratory (LANL) and Stanford Linear Accelerator Center (SLAC), jointly sponsored by the Institute of Electrical and Electronics Engineers (IEEE) Nuclear and Plasma Sciences Society (NPSS), the American Physical Society (APS) Division of Particles and Beams (DPB), and the International Union of Pure and Applied Physics (IUPAP), and conducted with support from the US Department of Energy, the National Science Foundation, and the Office of Naval Research.

  10. Experimental, Theoretical and Computational Studies of Plasma-Based Concepts for Future High Energy Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, Chan; Mori, W.

    2013-10-21

    This is the final report on DOE grant number DE-FG02-92ER40727, titled “Experimental, Theoretical and Computational Studies of Plasma-Based Concepts for Future High Energy Accelerators.” During this grant period the UCLA program on Advanced Plasma Based Accelerators, headed by Professor C. Joshi, has made many key scientific advances and trained a generation of students, many of whom have stayed in this research field and even started research programs of their own. In this final report, however, we focus on the last three years of the grant and report on the scientific progress made in each of the four tasks listed under this grant: Plasma Wakefield Accelerator Research at FACET, SLAC National Accelerator Laboratory; In-House Research at UCLA’s Neptune and 20 TW Laser Laboratories; Laser-Wakefield Acceleration (LWFA) in the Self-Guided Regime: Experiments at the Callisto Laser at LLNL; and Theory and Simulations. Major scientific results have been obtained in each of the four tasks described in this report. These have led to publications in prestigious scientific journals, graduation and continued training of high-quality Ph.D. students, and have kept the U.S. at the forefront of the plasma-based accelerator research field.

  11. Beam experiments with the Grenoble test electron cyclotron resonance ion source at iThemba LABS.

    PubMed

    Thomae, R; Conradie, J; Fourie, D; Mira, J; Nemulodi, F; Kuechler, D; Toivanen, V

    2016-02-01

    At iThemba Laboratory for Accelerator Based Sciences (iThemba LABS) an electron cyclotron resonance ion source was installed and commissioned. This source is a copy of the Grenoble Test Source (GTS) for the production of highly charged ions. The source is similar to the GTS-LHC at CERN and is named GTS2. A collaboration between the Accelerators and Beam Physics Group of CERN and the Accelerator and Engineering Department of iThemba LABS was proposed, in which the development of high-intensity argon and xenon beams is envisaged. In this paper we present beam experiments with the GTS2 at iThemba LABS, including results of continuous-wave and afterglow operation of xenon ion beams with oxygen as the supporting gas.

  12. Protection of the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Schmidt, R.; Assmann, R.; Carlier, E.; Dehning, B.; Denz, R.; Goddard, B.; Holzer, E. B.; Kain, V.; Puccio, B.; Todd, B.; Uythoven, J.; Wenninger, J.; Zerlauth, M.

    2006-11-01

    The Large Hadron Collider (LHC) at CERN will collide two counter-rotating proton beams, each with an energy of 7 TeV. The energy stored in the superconducting magnet system will exceed 10 GJ, and each beam has a stored energy of 362 MJ which could cause major damage to accelerator equipment in the case of uncontrolled beam loss. Safe operation of the LHC will therefore rely on a complex system for equipment protection. The systems for protection of the superconducting magnets in case of quench must be fully operational before powering the magnets. For safe injection of the 450 GeV beam into the LHC, beam absorbers must be in their correct positions and specific procedures must be applied. Requirements for safe operation throughout the cycle necessitate early detection of failures within the equipment, and active monitoring of the beam with fast and reliable beam instrumentation, mainly beam loss monitors (BLM). When operating with circulating beams, the time constant for beam loss after a failure extends from milliseconds to a few minutes—failures must be detected sufficiently early and transmitted to the beam interlock system that triggers a beam dump. It is essential that the beams are properly extracted on to the dump blocks at the end of a fill and in case of emergency, since the beam dump blocks are the only elements of the LHC that can withstand the impact of the full beam.
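    The 362 MJ figure quoted above can be cross-checked from the nominal LHC beam parameters. A minimal sketch, assuming the nominal design values of 2808 bunches per beam and 1.15×10¹¹ protons per bunch (these parameters are not stated in the abstract and are taken as assumptions here):

    ```python
    # Cross-check of the ~362 MJ stored energy per beam quoted above,
    # using nominal LHC design parameters (assumed, not from the abstract).
    E_PROTON_EV = 7e12           # proton energy: 7 TeV
    N_BUNCHES = 2808             # nominal bunches per beam (assumption)
    PROTONS_PER_BUNCH = 1.15e11  # nominal bunch population (assumption)
    EV_TO_JOULE = 1.602e-19      # elementary charge in coulombs

    protons_per_beam = N_BUNCHES * PROTONS_PER_BUNCH
    stored_energy_mj = protons_per_beam * E_PROTON_EV * EV_TO_JOULE / 1e6
    print(f"{stored_energy_mj:.0f} MJ")  # prints 362 MJ
    ```

    The result, about 362 MJ, matches the value in the abstract and gives a sense of scale: roughly the kinetic energy of a 400-tonne train at 150 km/h.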

  13. GPU-accelerated track reconstruction in the ALICE High Level Trigger

    NASA Astrophysics Data System (ADS)

    Rohr, David; Gorbunov, Sergey; Lindenstruth, Volker; ALICE Collaboration

    2017-10-01

    ALICE (A Large Ion Collider Experiment) is one of the four major experiments at the Large Hadron Collider (LHC) at CERN. The High Level Trigger (HLT) is an online compute farm which reconstructs events measured by the ALICE detector in real-time. The most compute-intensive part is the reconstruction of particle trajectories called tracking and the most important detector for tracking is the Time Projection Chamber (TPC). The HLT uses a GPU-accelerated algorithm for TPC tracking that is based on the Cellular Automaton principle and on the Kalman filter. The GPU tracking has been running in 24/7 operation since 2012 in LHC Run 1 and 2. In order to better leverage the potential of the GPUs, and speed up the overall HLT reconstruction, we plan to bring more reconstruction steps (e.g. the tracking for other detectors) onto the GPUs. There are several tasks running so far on the CPU that could benefit from cooperation with the tracking, which is hardly feasible at the moment due to the delay of the PCI Express transfers. Moving more steps onto the GPU, and processing them on the GPU at once, will reduce PCI Express transfers and free up CPU resources. On top of that, modern GPUs and GPU programming APIs provide new features which are not yet exploited by the TPC tracking. We present our new developments for GPU reconstruction, both with a focus on the online reconstruction on GPU for the online offline computing upgrade in ALICE during LHC Run 3, and also taking into account how the current HLT in Run 2 can profit from these improvements.

  14. Fermilab | Science | Questions for the Universe | The Birth of the Universe

    Science.gov Websites

    Fermilab website topics: Fermilab and the LHC; dark matter and dark energy; ADMX; muons; fundamental particles and forces; theory that could explain ultra-high-energy cosmic rays, dark matter and perhaps even dark energy; experiments; accelerators for science and society; Particle Physics 101: the science of matter, energy, space and time.

  15. The OSG Open Facility: an on-ramp for opportunistic scientific computing

    NASA Astrophysics Data System (ADS)

    Jayatilaka, B.; Levshina, T.; Sehgal, C.; Gardner, R.; Rynge, M.; Würthwein, F.

    2017-10-01

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  16. The OSG Open Facility: An On-Ramp for Opportunistic Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jayatilaka, B.; Levshina, T.; Sehgal, C.

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  17. A New Generation of Networks and Computing Models for High Energy Physics in the LHC Era

    NASA Astrophysics Data System (ADS)

    Newman, H.

    2011-12-01

    Wide area networks of increasing end-to-end capacity and capability are vital for every phase of high energy physicists' work. Our bandwidth usage, and the typical capacity of the major national backbones and intercontinental links used by our field have progressed by a factor of several hundred times over the past decade. With the opening of the LHC era in 2009-10 and the prospects for discoveries in the upcoming LHC run, the outlook is for a continuation or an acceleration of these trends using next generation networks over the next few years. Responding to the need to rapidly distribute and access datasets of tens to hundreds of terabytes drawn from multi-petabyte data stores, high energy physicists working with network engineers and computer scientists are learning to use long range networks effectively on an increasing scale, and aggregate flows reaching the 100 Gbps range have been observed. The progress of the LHC, and the unprecedented ability of the experiments to produce results rapidly using worldwide distributed data processing and analysis has sparked major, emerging changes in the LHC Computing Models, which are moving from the classic hierarchical model designed a decade ago to more agile peer-to-peer-like models that make more effective use of the resources at Tier2 and Tier3 sites located throughout the world. A new requirements working group has gauged the needs of Tier2 centers, and charged the LHCOPN group that runs the network interconnecting the LHC Tier1s with designing a new architecture interconnecting the Tier2s. As seen from the perspective of ICFA's Standing Committee on Inter-regional Connectivity (SCIC), the Digital Divide that separates physicists in several regions of the developing world from those in the developed world remains acute, although many countries have made major advances through the rapid installation of modern network infrastructures.
A case in point is Africa, where a new round of undersea cables promises to transform the continent.

  18. What We Do | Frederick National Laboratory for Cancer Research

    Cancer.gov

    The Frederick National Laboratory is the only U.S. national lab wholly focused on research, technology, and collaboration in the biomedical sciences, working to discover, to innovate, and to improve human health. We accelerate progress against cancer.

  19. Reaching record-low β* at the CERN Large Hadron Collider using a novel scheme of collimator settings and optics

    NASA Astrophysics Data System (ADS)

    Bruce, R.; Bracco, C.; De Maria, R.; Giovannozzi, M.; Mereghetti, A.; Mirarchi, D.; Redaelli, S.; Quaranta, E.; Salvachua, B.

    2017-03-01

    The Large Hadron Collider (LHC) at CERN is built to collide intense proton beams with an unprecedented energy of 7 TeV. The design stored energy per beam of 362 MJ makes the LHC beams highly destructive, so that any beam losses risk causing quenches of superconducting magnets or damage to accelerator components. Collimators are installed to protect the machine and they define a minimum normalized aperture, below which no other element is allowed. This imposes a limit on the achievable luminosity, since when squeezing β* (the β-function at the collision point) to smaller values for increased luminosity, the β-function in the final focusing system increases. This leads to a smaller normalized aperture that risks going below the allowed collimation aperture. In the first run of the LHC, this was the main limitation on β*, which was constrained to values above the design specification. In this article, we show through theoretical and experimental studies how tighter collimator openings and a new optics with specific phase-advance constraints allow a β* as small as 40 cm, a factor of 2 smaller than the β*=80 cm used in 2015 and significantly below the design value β*=55 cm, in spite of a lower beam energy. The proposed configuration with β*=40 cm has been successfully put into operation and has been used throughout 2016 as the LHC baseline. The decrease in β* compared to 2015 has been an essential contribution to reaching and surpassing, in 2016, the LHC design luminosity for the first time, and to accumulating a record-high integrated luminosity of around 40 fb-1 in one year, in spite of using fewer bunches than in the design.
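    The luminosity gain described above follows from the simplified scaling L ∝ 1/β* for head-on collisions of round beams with all other parameters held fixed. A minimal sketch of that scaling (a simplification: crossing-angle and hourglass reduction factors, which matter in the real machine, are deliberately ignored here):

    ```python
    # Simplified luminosity scaling with beta*: for head-on round beams with
    # fixed beam parameters, L is inversely proportional to beta*.
    # Crossing-angle and hourglass corrections are ignored in this sketch.
    def lumi_ratio(beta_old_cm: float, beta_new_cm: float) -> float:
        """Luminosity gain factor when squeezing beta* from old to new."""
        return beta_old_cm / beta_new_cm

    # Squeezing from the 80 cm used in 2015 to the 40 cm described above
    # roughly doubles the luminosity under these assumptions.
    print(lumi_ratio(80, 40))  # prints 2.0
    ```

    This is why halving β* was such a large lever on integrated luminosity, even with fewer bunches than the design called for.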

  20. Simulations and measurements of beam loss patterns at the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Bruce, R.; Assmann, R. W.; Boccone, V.; Bracco, C.; Brugger, M.; Cauchi, M.; Cerutti, F.; Deboy, D.; Ferrari, A.; Lari, L.; Marsili, A.; Mereghetti, A.; Mirarchi, D.; Quaranta, E.; Redaelli, S.; Robert-Demolaize, G.; Rossi, A.; Salvachua, B.; Skordis, E.; Tambasco, C.; Valentino, G.; Weiler, T.; Vlachoudis, V.; Wollmann, D.

    2014-08-01

    The CERN Large Hadron Collider (LHC) is designed to collide proton beams of unprecedented energy, in order to extend the frontiers of high-energy particle physics. During the first very successful running period in 2010-2013, the LHC was routinely storing protons at 3.5-4 TeV with a total beam energy of up to 146 MJ, and even higher stored energies are foreseen in the future. This puts extraordinary demands on the control of beam losses. An uncontrolled loss of even a tiny fraction of the beam could cause a superconducting magnet to undergo a transition into a normal-conducting state, or in the worst case cause material damage. Hence a multistage collimation system has been installed in order to safely intercept high-amplitude beam protons before they are lost elsewhere. To guarantee adequate protection from the collimators, a detailed theoretical understanding is needed. This article presents results of numerical simulations of the distribution of beam losses around the LHC that have leaked out of the collimation system. The studies include tracking of protons through the fields of more than 5000 magnets in the 27 km LHC ring over hundreds of revolutions, and Monte Carlo simulations of particle-matter interactions both in collimators and machine elements being hit by escaping particles. The simulation results typically agree within a factor of 2 with measurements of beam loss distributions from the previous LHC run. Considering the complex simulation, which must account for a very large number of unknown imperfections, and in view of the total losses around the ring spanning over 7 orders of magnitude, we consider this an excellent agreement. Our results give confidence in the simulation tools, which are used also for the design of future accelerators.

  1. The New Big Science at the NSLS

    NASA Astrophysics Data System (ADS)

    Crease, Robert

    2016-03-01

    The term ``New Big Science'' refers to a phase shift in the kind of large-scale science that was carried out throughout the U.S. National Laboratory system, when large-scale materials science accelerators rather than high-energy physics accelerators became marquee projects at most major basic research laboratories in the post-Cold War era, accompanied by important changes in the character and culture of the research ecosystem at these laboratories. This talk explores some aspects of this phase shift at BNL's National Synchrotron Light Source.

  2. Validation and performance of the LHC cryogenic system through commissioning of the first sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serio, L.; Bouillot, A.; Casas-Cubillos, J.

    2007-12-01

    The cryogenic system [1] for the Large Hadron Collider accelerator is presently in its final phase of commissioning at nominal operating conditions. The refrigeration capacity for the LHC is produced using eight large cryogenic plants and eight 1.8 K refrigeration units installed on five cryogenic islands. Machine cryogenic equipment is installed in a 26.7-km-circumference deep underground tunnel and is maintained at its nominal operating conditions via a distribution system consisting of transfer lines, cold interconnection boxes at each cryogenic island and a cryogenic distribution line. The functional analysis of the whole system during all operating conditions was established and validated during the first sector commissioning in order to maximize the system availability. Analysis, operating modes, main failure scenarios, results and performance of the cryogenic system are presented.

  3. The LHC magnet system and its status of development

    NASA Technical Reports Server (NTRS)

    Bona, Maurizio; Perin, Romeo; Vlogaert, Jos

    1995-01-01

    CERN is preparing for the construction of a new high energy accelerator/collider, the Large Hadron Collider (LHC). This new facility will mainly consist of two superconducting magnetic beam channels, 27 km long, to be installed in the existing LEP tunnel. The magnetic system comprises about 1200 twin-aperture dipoles, 13.145 m long, with an operational field of 8.65 T, about 600 quadrupoles, 3 m long, and a very large number of other superconducting magnetic components. A general description of the system is given together with the main features of the design of the regular lattice magnets. The paper also describes the present state of the magnet R & D program. Results from short model work, as well as from full-scale prototypes, will be presented, including the recently tested 10 m long full-scale prototype dipole manufactured in industry.

  4. Aerospace in the future

    NASA Technical Reports Server (NTRS)

    Mccarthy, J. F., Jr.

    1980-01-01

    National research and technology trends are introduced in an environment of accelerating change. NASA and the federal budget are discussed. The U.S. energy dependence on foreign oil, the increasing oil costs, and the U.S. petroleum use by class are presented. The $10 billion positive contribution of the aerospace industry to the U.S. balance of trade in 1979 is given as an indicator of the positive contribution of NASA research to industry. The research work of the NASA Lewis Research Center in the areas of space, aeronautics, and energy is discussed as a team effort of government, industry, universities, and business to maintain U.S. world leadership in advanced technology.

  5. Beam experiments with the Grenoble test electron cyclotron resonance ion source at iThemba LABS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomae, R., E-mail: rthomae@tlabs.ac.za; Conradie, J.; Fourie, D.

    2016-02-15

    At iThemba Laboratory for Accelerator Based Sciences (iThemba LABS) an electron cyclotron resonance ion source was installed and commissioned. This source is a copy of the Grenoble Test Source (GTS) for the production of highly charged ions. The source is similar to the GTS-LHC at CERN and is named GTS2. A collaboration between the Accelerators and Beam Physics Group of CERN and the Accelerator and Engineering Department of iThemba LABS was proposed, in which the development of high-intensity argon and xenon beams is envisaged. In this paper we present beam experiments with the GTS2 at iThemba LABS, including results of continuous-wave and afterglow operation of xenon ion beams with oxygen as the supporting gas.

  6. Final Technical Report for "High Energy Physics at The University of Iowa"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mallik, Usha; Meurice, Yannick; Nachtman, Jane

    2013-07-31

    Particle Physics explores the very fundamental building blocks of our universe: the nature of forces, of space and time. By exploring very energetic collisions of sub-nuclear particles with sophisticated detectors at colliding beam accelerators (as well as others), experimental particle physicists have established the current theory known as the Standard Model (SM), one of several theoretical postulates to explain our everyday world. It explains all phenomena known up to a very small fraction of a second after the Big Bang to high precision; the Higgs boson, discovered recently, was the last of the particles predicted by the SM. However, many other phenomena, such as the existence of dark energy, dark matter, the absence of anti-matter, the parameters of the SM, neutrino masses, etc., are not explained by the SM. So, in order to find out what lies beyond the SM, i.e., what conditions at the earliest fractions of the first second of the universe gave rise to the SM, we constructed the Large Hadron Collider (LHC) at CERN after the Tevatron collider at Fermi National Accelerator Laboratory. Each of these projects helped us push the boundary further with new insights as we explore a yet higher energy regime. The experiments are extremely complex, and pushing the boundaries of our existing knowledge also requires pushing the boundaries of our technical know-how. So, not only do we pursue humankind’s most basic intellectual pursuit of knowledge, but we also help develop technology that benefits today’s highly technical society. Our trained Ph.D. students become experts at fast computing, manipulation of large data volumes and databases, developing cloud computing, fast electronics, advanced detector developments, and complex interfaces in several of these areas. Many particle physics Ph.D.s build their careers at various technology and computing facilities; even financial institutions make use of their simulation and statistical skills.
Additionally, today's discoveries become tomorrow's practical tools for an improved way of life; cases in point include internet technology and fiber optics. At The University of Iowa we are involved in the LHC experiments ATLAS and CMS: building equipment, performing calibration and maintenance, supporting the infrastructure in hardware, software and analysis, and participating in various aspects of data analysis. Our theory group works on the fundamentals of field theories and on the exploration of non-accelerator high-energy neutrinos and possible dark matter searches.

  7. Performance Analysis of the Ironless Inductive Position Sensor in the Large Hadron Collider Collimators Environment

    PubMed Central

    Danisi, Alessandro; Masi, Alessandro; Losito, Roberto

    2015-01-01

    The Ironless Inductive Position Sensor (I2PS) has been introduced as a valid alternative to Linear Variable Differential Transformers (LVDTs) when external magnetic fields are present. Potential applications of this linear position sensor can be found in critical systems such as nuclear plants, tokamaks, satellites and particle accelerators. This paper analyzes the performance of the I2PS in the harsh environment of the collimators of the Large Hadron Collider (LHC), where position uncertainties of less than 20 µm are demanded in the presence of nuclear radiation and external magnetic fields. The I2PS has been targeted for installation for LHC Run 2, in order to solve the magnetic interference problem that standard LVDTs are experiencing. The paper describes in detail the chain of systems that make up the new I2PS measurement task, their impact on the sensor performance and their possible further optimization. The I2PS performance is analyzed by evaluating the position uncertainty (over 30 s), the magnetic immunity and the long-term stability (over 7 days). These three indicators are assessed from data acquired during LHC operation in 2015 and compared with those of LVDTs. PMID:26569259

  8. Thermo-electric analysis of the interconnection of the LHC main superconducting bus bars

    NASA Astrophysics Data System (ADS)

    Granieri, P. P.; Breschi, M.; Casali, M.; Bottura, L.; Siemko, A.

    2013-01-01

    Spurred by the question of the maximum allowable energy for the operation of the Large Hadron Collider (LHC), we have progressed in the understanding of the thermo-electric behavior of the 13 kA superconducting bus bars interconnecting its main magnets. Deep insight into the underlying mechanisms is required to ensure the protection of the accelerator against undesired effects of resistive transitions. This is especially important in the case of defective interconnections, which can jeopardize the operation of the whole LHC. In this paper we present a numerical model of the interconnections between the main dipole and quadrupole magnets, validated against experimental tests of an interconnection sample with a purposely built-in defect. We consider defective interconnections featuring a lack of bonding between the superconducting cables and the copper stabilizer components, such as those that could be present in the machine. We evaluate the critical defect length limiting the maximum allowable current for powering the magnets. We determine the dependence of the critical defect length on parameters such as the heat transfer towards the cooling helium bath, the quality of manufacturing, the operating conditions and the protection system parameters, and discuss the relevant mechanisms.

  9. Pena to review LHC agreement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawler, A.

    The US government plans to review its tentative agreement with Europe to help build the Large Hadron Collider (LHC), to make sure it is a good deal for this country. The review, announced last week by Energy Secretary Federico Pena, comes at the urging of Representative James Sensenbrenner (R-WI), who chairs the House Science Committee. Agency officials say they are confident that most of the lawmaker's concerns can be met with only minor changes to the proposed partnership, while European managers insist that the current agreement already addresses most of Sensenbrenner's worries.

  10. Design study of beam transport lines for BioLEIR facility at CERN

    NASA Astrophysics Data System (ADS)

    Ghithan, S.; Roy, G.; Schuh, S.

    2017-09-01

    The biomedical community has asked CERN to investigate the possibility to transform the Low Energy Ion Ring (LEIR) accelerator into a multidisciplinary, biomedical research facility (BioLEIR) that could provide ample, high-quality beams of a range of light ions suitable for clinically oriented, fundamental research on cell cultures and for radiation instrumentation development. The present LEIR machine uses fast beam extraction to the next accelerator in the chain, eventually leading to the Large Hadron Collider (LHC). To provide beam for a biomedical research facility, a new slow extraction system must be installed. Two horizontal and one vertical experimental beamlines were designed for transporting the extracted beam to three experimental end-stations. The vertical beamline (pencil beam) was designed for a maximum energy of 75 MeV/u for low-energy radiobiological research, while the two horizontal beamlines could deliver up to 440 MeV/u. One horizontal beamline shall be used preferentially for biomedical experiments and shall provide pencil beam and a homogeneous broad beam, covering an area of 5 × 5 cm2 with a beam homogeneity of ±5%. The second horizontal beamline will have pencil beam only and is intended for hardware developments in the fields of (micro-)dosimetry and detector development. The minimum full aperture of the beamlines is approximately 100 mm at all magnetic elements, to accommodate the expected beam envelopes. Seven dipoles and twenty quadrupoles are needed for a total of 65 m of beamlines to provide the specified beams. In this paper we present the optical design for the three beamlines.

  11. Lead ions and Coulomb’s Law at the LHC (CERN)

    NASA Astrophysics Data System (ADS)

    Cid-Vidal, Xabier; Cid, Ramon

    2018-03-01

    Although most of the time the Large Hadron Collider (LHC) at CERN collides protons, for around one month every year lead ions are collided instead, to expand the diversity of the LHC research programme. Furthermore, in an effort not originally foreseen, proton-lead collisions are also taking place, with results of high interest to the physics community. All the large experiments of the LHC have now joined the heavy-ion programme, including the LHCb experiment, which was not at first expected to be part of it. The aim of this article is to introduce a few simple physical calculations relating to some electrical phenomena that occur when lead-ion bunches are running in the LHC, using Coulomb’s Law, to be taken to the secondary school classroom to help students understand some important physical concepts.
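    The kind of back-of-the-envelope Coulomb's Law estimate the article has in mind can be sketched as follows. The bunch population, charge state and bunch spacing used here are assumed round numbers chosen for illustration, not figures taken from the article:

    ```python
    # Illustrative Coulomb's-Law estimate of the repulsion between two
    # neighbouring lead-ion bunches, treated as point charges.
    # Bunch population and spacing are assumed round numbers.

    K = 8.9875e9          # Coulomb constant, N m^2 / C^2
    E_CHARGE = 1.602e-19  # elementary charge, C

    ions_per_bunch = 2e8  # assumed Pb bunch population
    charge_state = 82     # fully stripped lead: Z = 82
    spacing = 30.0        # assumed bunch spacing in metres (~100 ns at c)

    q = ions_per_bunch * charge_state * E_CHARGE  # total bunch charge, C
    force = K * q * q / spacing**2                # Coulomb's Law, N

    print(f"bunch charge ~ {q:.2e} C, repulsive force ~ {force:.2e} N")
    ```

    Treating extended bunches as point charges is only a first approximation, but it is exactly the level of calculation suited to a secondary school classroom.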

  12. Design of a 0-50 mbar pressure measurement channel compatible with the LHC tunnel radiation environment

    NASA Astrophysics Data System (ADS)

    Casas, Juan; Jelen, Dorota; Trikoupis, Nikolaos

    2017-02-01

    The monitoring of cryogenic facilities often requires the measurement of pressures in the sub-5000 Pa range, used for flow-metering applications, for saturated superfluid helium, etc. The pressure measurement is based on the minute displacement of a sensing diaphragm, often read out contactlessly using capacitive or inductive methods. The LHC radiation environment forbids the use of standard commercial sensors because their embedded electronics are affected both by radiation-induced drift and by transient Single Event Effects (SEE). Passive pressure sensors from two manufacturers were investigated, and CERN-designed radiation-tolerant electronics have been developed for measuring variable-reluctance sensors. During the last maintenance stop of the LHC accelerator, four absolute pressure sensors were installed in some of the low-pressure bayonet heat exchangers, and four differential pressure sensors on the venturi flowmeters that monitor the cooling flow of the 20.5 kA current leads of the ATLAS end-cap superconducting toroids. The pressure sensors' operating range is about 1000 to 5000 Pa and the targeted uncertainty is +/- 50 Pa, which would permit measuring the equivalent saturation temperature at 1.8 K to better than 0.01 K. This paper describes the radiation-hard measuring head, which is based on an inductive bridge; its associated radiation-tolerant electronics, installed under the LHC superconducting magnets or in the ATLAS detector cavern; and the first operational experience.
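    The 50 Pa target can be related to the 0.01 K temperature resolution with a simple sensitivity estimate. The saturation pressure and local slope used below are approximate illustrative values; a real conversion would use the ITS-90 helium vapour-pressure relation:

    ```python
    # Rough sketch of why a +/-50 Pa pressure uncertainty maps to roughly
    # 0.01 K in saturation temperature near 1.8 K. The pressure and the
    # local slope dP/dT are approximate illustrative values only.

    p_sat_1p8K = 1640.0   # approx. saturated He vapour pressure at 1.8 K, Pa
    dP_dT = 5500.0        # approx. local slope of the saturation curve, Pa/K

    dp = 50.0             # targeted pressure uncertainty, Pa
    dT = dp / dP_dT       # propagated temperature uncertainty, K (~0.01 K)

    print(f"dT ~ {dT * 1000:.1f} mK")
    ```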

  13. Thermomechanical assessment of the effects of a jaw-beam angle during beam impact on Large Hadron Collider collimators

    NASA Astrophysics Data System (ADS)

    Cauchi, Marija; Assmann, R. W.; Bertarelli, A.; Carra, F.; Lari, L.; Rossi, A.; Mollicone, P.; Sammut, N.

    2015-02-01

    The correct functioning of a collimation system is crucial to safely and successfully operate high-energy particle accelerators, such as the Large Hadron Collider (LHC). However, the requirements to handle high-intensity beams can be demanding, and accident scenarios must be well studied in order to assess if the collimator design is robust against possible error scenarios. One of the catastrophic, though not very probable, accident scenarios identified within the LHC is an asynchronous beam dump. In this case, one (or more) of the 15 precharged kicker circuits fires out of time with the abort gap, spraying beam pulses onto LHC machine elements before the machine protection system can fire the remaining kicker circuits and bring the beam to the dump. If a proton bunch directly hits a collimator during such an event, severe beam-induced damage such as magnet quenches and other equipment damage might result, with consequent downtime for the machine. This study investigates a number of newly defined jaw error cases, which include angular misalignment errors of the collimator jaw. A numerical finite element method approach is presented in order to precisely evaluate the thermomechanical response of tertiary collimators to beam impact. We identify the most critical and interesting cases, and show that a tilt of the jaw can actually mitigate the effect of an asynchronous dump on the collimators. Relevant collimator damage limits are taken into account, with the aim to identify optimal operational conditions for the LHC.

  14. Characterisation of ionisation chambers for a mixed radiation field and investigation of their suitability as radiation monitors for the LHC.

    PubMed

    Theis, C; Forkel-Wirth, D; Perrin, D; Roesler, S; Vincke, H

    2005-01-01

    Monitoring of the radiation environment is one of the key tasks in operating a high-energy accelerator such as the Large Hadron Collider (LHC). The radiation fields consist of neutrons, charged hadrons as well as photons and electrons, with energy spectra extending from those of thermal neutrons up to several hundreds of GeV. The requirements for measuring the dose equivalent in such a field differ from standard applications, and it is thus necessary to investigate the response of monitoring devices thoroughly before a monitoring system can be implemented. For the LHC, it is currently foreseen to install argon- and hydrogen-filled high-pressure ionisation chambers as radiation monitors of mixed fields. So far their response to these fields was poorly understood and, therefore, further investigation was necessary to prove that they can serve their function well enough. In this study, ionisation chambers of type IG5 (Centronic Ltd) were characterised by simulating their response functions by means of detailed FLUKA calculations as well as by calibration measurements for photons and neutrons at fixed energies. The latter results were used to obtain a better understanding and validation of the FLUKA simulations. Tests were also conducted at the CERF facility at CERN in order to compare the results with simulations of the response in a mixed radiation field. It is demonstrated that these detectors can be characterised well enough to serve their function as radiation monitors for the LHC.

  15. Electromagnetic nonlinearities in a Roebel-cable-based accelerator magnet prototype: variational approach

    NASA Astrophysics Data System (ADS)

    Ruuskanen, J.; Stenvall, A.; Lahtinen, V.; Pardo, E.

    2017-02-01

    Superconducting magnets are the most expensive series of components produced for the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN). When developing such magnets beyond state-of-the-art technology, one possible option is to use high-temperature superconductors (HTS), which are capable of tolerating much higher magnetic fields than low-temperature superconductors (LTS) while simultaneously carrying high current densities. Significant cost reductions, due to decreased prototype construction needs, can be achieved by careful modelling of the magnets. Simulations are used, e.g., to design magnets fulfilling the field quality requirements in the beampipe and to ensure adequate protection by studying the losses occurring during charging and discharging. We model the hysteresis losses and the magnetic field nonlinearity in the beampipe as a function of the magnet’s current. These simulations rely on the minimum magnetic energy variation principle, with optimization algorithms provided by the open-source Interior Point OPTimizer (IPOPT) library. We utilize this methodology to investigate a research and development accelerator magnet prototype made of REBCO Roebel cable. The applicability of this approach, when the magnetic field dependence of the superconductor’s critical current density is considered, is discussed. We also scrutinize the influence of the necessary modelling decisions one needs to make with this approach. The results show that different decisions can lead to notably different results, and experiments are required to study the electromagnetic behaviour of such magnets further.

  16. Magnetic analysis of the Nb$$_3$$Sn low-beta quadrupole for the high luminosity LHC

    DOE PAGES

    Bermudez, Susana Izquierdo; Ambrosio, G.; Chlachidze, G.; ...

    2017-01-10

    As part of the Large Hadron Collider Luminosity upgrade (HiLumi-LHC) program, the US LARP collaboration and CERN are working together to design and build 150 mm aperture Nb 3Sn quadrupoles for the LHC interaction regions. A first series of 1.5 m long coils was fabricated, assembled and tested in the first short model. This paper presents the magnetic analysis, comparing magnetic field measurements with expectations and with the field quality requirements. The analysis is focused on the geometrical harmonics, the iron saturation effect and the cold-warm correlation. Three-dimensional effects, such as the variability of the field harmonics along the magnet axis and the contribution of the coil ends, are also discussed. Furthermore, we present the influence of conductor magnetization and dynamic effects.

  17. Black Holes and the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Roy, Arunava

    2011-12-01

    The European Center for Nuclear Research or CERN's Large Hadron Collider (LHC) has caught our attention partly due to the film "Angels and Demons." In the movie, an antimatter bomb attack on the Vatican is foiled by the protagonist. Perhaps just as controversial is the formation of mini black holes (BHs). Recently, the American Physical Society website featured an article on BH formation at the LHC. This article examines some aspects of mini BHs and explores the possibility of their detection at the LHC.

  18. Expanding the Clean Energy Economy for Chemical Companies | Working with Us

    Science.gov Websites

    Partner with NREL to accelerate the research, development, and commercialization of ethanol.

  19. Advancing the Growth of the U.S. Wind Industry: Federal Incentives, Funding, and Partnership Opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The U.S. Department of Energy’s (DOE’s) Wind Energy Technologies Office (WETO) works to accelerate the development and deployment of wind power. The office provides information for researchers, developers,businesses, manufacturers, communities, and others seeking various types of federal assistance available for advancing wind projects.

  20. The QuarkNet CMS masterclass: bringing the LHC to students

    NASA Astrophysics Data System (ADS)

    Cecire, Kenneth; McCauley, Thomas

    2016-04-01

    QuarkNet is an educational program which brings high school teachers and their students into the particle physics research community. The program supports research experiences and professional development workshops and provides inquiry-oriented investigations, some using real experimental data. The CMS experiment at the LHC has released several thousand proton-proton collision events for use in education and outreach. QuarkNet, in collaboration with CMS, has developed a physics masterclass and e-Lab based on this data. A masterclass is a day-long educational workshop where high school students travel to nearby universities and research laboratories. There they learn from LHC physicists about the basics of particle physics and detectors. They then perform a simple measurement using LHC data, and share their results with other students around the world via videoconference. Since 2011 thousands of students from over 25 countries have participated in the CMS masterclass as organized by QuarkNet and the International Particle Physics Outreach Group (IPPOG). We describe here the masterclass exercise: the physics, the online event display and database preparation behind it, the measurement the students undertake, their results and experiences, and future plans for the exercise.

  1. FOREWORD: International Conference on Heavy Ion Collisions in the LHC Era

    NASA Astrophysics Data System (ADS)

    Arleo, Francois; Salgado, Carlos A.; Tran Thanh Van, Jean

    2013-03-01

    The International Conference on Heavy Ion Collisions in the LHC Era was held in Quy Nhon, Vietnam, on 16-20 July 2012. The series Rencontres du Vietnam, created by Jean Tran Thanh Van in 1993, consists of international meetings aimed at stimulating the development of advanced research in Vietnam and more generally in South East Asia, and at establishing collaborative research networks with Western scientific communities. This conference, like the whole series, also supports the International Center for Interdisciplinary Science Education being built in Quy Nhon. The articles published in this volume present the latest results from the heavy-ion collision programs of RHIC and the LHC as well as the corresponding theoretical interpretation and future perspectives. Lower-energy nuclear programs were also reviewed, providing a rather complete picture of the state of the art in the field. We wish to thank the sponsors of the Conference on Heavy Ion Collisions in the LHC Era: the European Research Council; Xunta de Galicia (Spain); EMMI (Germany); and the Agence Nationale de la Recherche (France). François Arleo (Laboratoire d'Annecy-le-Vieux de Physique Théorique, France), Carlos A. Salgado and Jean Tran Thanh Van

  2. Space trajectory calculation based on G-sensor

    NASA Astrophysics Data System (ADS)

    Xu, Biya; Zhan, Yinwei; Shao, Yang

    2017-08-01

    At present, most research in the field of human body posture recognition uses cameras or portable acceleration sensors to collect data, without making full use of the mobile phones around us. In this paper, the G-sensor built into a mobile phone is used to collect data. After processing the data with a moving average filter and integrating the acceleration, the three-dimensional spatial coordinates of joint points can be obtained accurately.
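    The processing chain described in the abstract can be sketched in a few lines: smooth the raw accelerometer samples with a moving average filter, then integrate the acceleration twice to recover displacement. The sample data, window size and sampling rate below are made-up illustrative values, not the paper's:

    ```python
    # Minimal sketch: moving-average smoothing of G-sensor samples,
    # followed by double integration of acceleration along one axis.

    def moving_average(samples, window=3):
        """Simple boxcar filter; shorter windows are used near the start."""
        out = []
        for i in range(len(samples)):
            lo = max(0, i - window + 1)
            out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
        return out

    def integrate(values, dt):
        """Cumulative trapezoidal integration starting from zero."""
        total, out = 0.0, [0.0]
        for a, b in zip(values, values[1:]):
            total += 0.5 * (a + b) * dt
            out.append(total)
        return out

    dt = 0.02  # 50 Hz sampling, assumed
    accel = [0.0, 0.1, 0.4, 0.3, 0.1, 0.0, -0.2, -0.1, 0.0]  # m/s^2, synthetic
    smooth = moving_average(accel)
    velocity = integrate(smooth, dt)      # first integral: velocity
    position = integrate(velocity, dt)    # second integral: displacement
    print(f"final displacement ~ {position[-1]:.4f} m")
    ```

    In practice, double integration accumulates sensor bias rapidly, which is why the smoothing step matters so much in this kind of pipeline.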

  3. ATLAS with CARIBU: A laboratory portrait

    DOE PAGES

    Pardo, Richard C.; Savard, Guy; Janssens, Robert V. F.

    2016-03-21

    The Argonne Tandem Linac Accelerator System (ATLAS) is the world's first superconducting accelerator for projectiles heavier than the electron. This unique system is a U.S. Department of Energy (DOE) national user research facility open to scientists from all over the world. ATLAS is located within the Physics Division at Argonne National Laboratory and is one of five large scientific user facilities at the laboratory.

  4. Simultaneous analysis of neutrinoless double beta decay and LHC pp-cross sections: limits on the left-right mixing angle

    NASA Astrophysics Data System (ADS)

    Civitarese, O.; Suhonen, J.; Zuber, K.

    2015-09-01

    The extension of the Standard Model of electroweak interactions to accommodate massive neutrinos and/or right-handed currents is one of the fundamental questions to answer at the crossroads of particle and nuclear physics. The consequences of such extensions would reflect upon nuclear decays, like the very exotic nuclear double-beta decay, as well as upon high-energy proton-proton reactions of the type performed at the LHC accelerator. In this talk we address this question by looking at the results reported by the ATLAS and CMS collaborations, where the excitation and decay of a heavy-mass boson may be mediated by a heavy-mass neutrino in proton-proton reactions leading to two jets and two leptons, and by extracting limits on the left-right mixing from the latest measurements of nuclear double-beta decays reported by the GERDA and EXO collaborations.

  5. Extracting information from 0νββ decay and LHC pp-cross sections: Limits on the left-right mixing angle and right-handed boson mass

    NASA Astrophysics Data System (ADS)

    Civitarese, O.; Suhonen, J.; Zuber, K.

    2015-10-01

    The existence of massive neutrinos forces the extension of the Standard Model of electroweak interactions to accommodate them and/or right-handed currents. This is one of the fundamental questions in today's physics. Its consequences would reflect upon several decay processes, like the very exotic nuclear double-beta decay. On the other hand, high-energy proton-proton reactions of the type performed at the LHC accelerator can provide information about the existence of a right-handed generation of the W and Z bosons. Here we shall address the possibility of performing a joint analysis of the results reported by the ATLAS and CMS collaborations (σ(pp → 2l + jets)) and the latest measurements of nuclear double-beta decays reported by the GERDA and EXO collaborations.

  6. SPS Beam Steering for LHC Extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gianfelice-Wendt, Eliana; Bartosik, Hannes; Cornelis, Karel

    2014-07-01

    The CERN Super Proton Synchrotron accelerates beams for the Large Hadron Collider to 450 GeV. In addition it produces beams for fixed-target facilities, which adds complexity to the SPS operation. During the 2012-2013 run, drifts of the extracted beam trajectories were observed and lengthy optimizations in the transfer lines were performed to reduce particle losses in the LHC. The observed trajectory drifts are consistent with the measured SPS orbit drifts at extraction. While extensive studies are going on to understand, and possibly suppress, the source of such SPS orbit drifts, the feasibility of an automatic beam steering towards a “golden” orbit at the extraction septa, by means of the interlocked correctors, is also being investigated. The challenges and constraints related to the implementation of such a correction in the SPS are described. Simulation results are presented and a possible operational steering strategy is proposed.

  7. Module and electronics developments for the ATLAS ITk pixel system

    NASA Astrophysics Data System (ADS)

    Muñoz, F. J.

    2018-03-01

    The ATLAS experiment is preparing for an extensive modification of its detectors in the course of the planned HL-LHC accelerator upgrade around 2025. The ATLAS upgrade includes the replacement of the entire tracking system by an all-silicon detector (Inner Tracker, ITk). The five innermost layers of ITk will be a pixel detector built of new sensor and readout electronics technologies to improve the tracking performance and cope with the severe HL-LHC environment in terms of occupancy and radiation. The total area of the new pixel system could measure up to 14 m2, depending on the final layout choice, which is expected to take place in 2018. In this paper an overview of the ongoing R&D activities on modules and electronics for the ATLAS ITk is given including the main developments and achievements in silicon planar and 3D sensor technologies, readout and power challenges.

  8. The management of large cabling campaigns during the Long Shutdown 1 of LHC

    NASA Astrophysics Data System (ADS)

    Meroli, S.; Machado, S.; Formenti, F.; Frans, M.; Guillaume, J. C.; Ricci, D.

    2014-03-01

    The Large Hadron Collider at CERN entered its first 18-month-long shutdown period in February 2013. During this period the entire CERN accelerator complex will undergo major consolidation and upgrade works, preparing the machines for LHC operation at nominal energy (7 TeV/beam). One of the most challenging activities concerns the cabling infrastructure (copper and optical-fibre cables) serving the CERN data acquisition, networking and control systems. About 1000 kilometres of cables, distributed in different machine areas, will be installed, representing an investment of about 15 MCHF. This implies an extraordinary challenge in terms of project management, including resource and activity planning, work execution and quality control. The preparation phase of this project started well before its implementation, by defining technical solutions and setting financial plans for staff recruitment and material supply. Enhanced task coordination was further ensured by deploying selected competences to form a central support team.

  9. Extracting information from 0νββ decay and LHC pp-cross sections: Limits on the left-right mixing angle and right-handed boson mass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Civitarese, O., E-mail: osvaldo.civitarese@fisica.unlp.edu.ar; Suhonen, J.; Zuber, K.

    2015-10-28

    The existence of massive neutrinos forces the extension of the Standard Model of electroweak interactions to accommodate them and/or right-handed currents. This is one of the fundamental questions in today's physics. Its consequences would reflect upon several decay processes, like the very exotic nuclear double-beta decay. On the other hand, high-energy proton-proton reactions of the type performed at the LHC accelerator can provide information about the existence of a right-handed generation of the W and Z bosons. Here we shall address the possibility of performing a joint analysis of the results reported by the ATLAS and CMS collaborations (σ(pp → 2l + jets)) and the latest measurements of nuclear double-beta decays reported by the GERDA and EXO collaborations.

  10. CVD diamond detectors for ionizing radiation

    NASA Astrophysics Data System (ADS)

    Friedl, M.; Adam, W.; Bauer, C.; Berdermann, E.; Bergonzo, P.; Bogani, F.; Borchi, E.; Brambilla, A.; Bruzzi, M.; Colledani, C.; Conway, J.; Dabrowski, W.; Delpierre, P.; Deneuville, A.; Dulinski, W.; van Eijk, B.; Fallou, A.; Fizzotti, F.; Foulon, F.; Gan, K. K.; Gheeraert, E.; Grigoriev, E.; Hallewell, G.; Hall-Wilton, R.; Han, S.; Hartjes, F.; Hrubec, J.; Husson, D.; Kagan, H.; Kania, D.; Kaplon, J.; Karl, C.; Kass, R.; Knöpfle, K. T.; Krammer, M.; Logiudice, A.; Lu, R.; Manfredi, P. F.; Manfredotti, C.; Marshall, R. D.; Meier, D.; Mishina, M.; Oh, A.; Pan, L. S.; Palmieri, V. G.; Pernegger, H.; Pernicka, M.; Peitz, A.; Pirollo, S.; Polesello, P.; Pretzl, K.; Re, V.; Riester, J. L.; Roe, S.; Roff, D.; Rudge, A.; Schnetzer, S.; Sciortino, S.; Speziali, V.; Stelzer, H.; Stone, R.; Tapper, R. J.; Tesarek, R.; Thomson, G. B.; Trawick, M.; Trischuk, W.; Vittone, E.; Walsh, A. M.; Wedenig, R.; Weilhammer, P.; Ziock, H.; Zoeller, M.; RD42 Collaboration

    1999-10-01

    In future HEP accelerators, such as the LHC (CERN), detectors and electronics in the vertex region of the experiments will suffer from extreme radiation. Thus radiation hardness is required for both detectors and electronics to survive in this harsh environment. CVD diamond, which is investigated by the RD42 Collaboration at CERN, can meet these requirements. Samples of up to 2×4 cm2 have been grown and refined for better charge collection properties, which are measured with a β source or in a testbeam. A large number of diamond samples has been irradiated with hadrons to fluences of up to 5×10^15 cm^-2 to study the effects of radiation. Both strip and pixel detectors were prepared in various geometries. Samples with strip metallization have been tested with both slow and fast readout electronics, and the first diamond pixel detector proved fully functional with LHC electronics.

  11. Registration: Science Adventures

    Science.gov Websites

    To attend classes, students must pre-register using one of two methods: an online form or a downloadable form.

  12. Three Generations of FPGA DAQ Development for the ATLAS Pixel Detector

    NASA Astrophysics Data System (ADS)

    Mayer, Joseph A., II

    The Large Hadron Collider (LHC) at the European Center for Nuclear Research (CERN) follows a schedule of long physics runs, followed by periods of inactivity known as Long Shutdowns (LS). During these LS phases both the LHC and the experiments around its ring undergo maintenance and upgrades. For the LHC these upgrades improve its ability to create data for physicists; the more data the LHC can create, the more opportunities there are for rare events of interest to appear. The experiments upgrade so they can record the data and ensure such events won't be missed. Currently the LHC is in Run 2, having completed the first of three LSs. This thesis focuses on the development of Field-Programmable Gate Array (FPGA)-based readout systems that span three major tasks of the ATLAS Pixel data acquisition (DAQ) system. The evolution of the Pixel DAQ's Readout Driver (ROD) card is presented: starting from improvements made to the new Insertable B-Layer (IBL) ROD design, which was part of the LS1 upgrade, to upgrading the old RODs from Run 1 to help them run more efficiently in Run 2. It also includes the research and development of FPGA-based DAQs and integrated-circuit emulators for the ITk upgrade, which will occur during LS3 in 2025.

  13. Robustness of waves with a high phase velocity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tajima, T., E-mail: ttajima@uci.edu; Tri Alpha Energy, Inc., P.O. Box 7010, Rancho Santa Margarita, CA 92688; Necas, A., E-mail: anecas@trialphaenergy.com

    Norman Rostoker pioneered research on (1) plasma-driven accelerators and (2) beam-driven fusion reactors. Collective acceleration, a concept coined by Veksler, advocates driving above-ionization plasma waves with an electron beam to accelerate ions. The research on this, among others, by the Rostoker group incubated the idea that eventually led to the birth of laser wakefield acceleration (LWFA), by which large and robust collective accelerating fields may be generated in plasma while the plasma remains robust and undisrupted. Besides the emergence of LWFA, the Rostoker research spawned lessons on the importance of adiabatic acceleration of ions in collective accelerators, including the recent rebirth of laser-driven ion acceleration efforts conducted in a smooth adiabatic fashion by a variety of ingenious methods. Following Rostoker's research in (2), the beam-driven Field Reversed Configuration (FRC) has accomplished breakthroughs in recent years. The beam-driven kinetic plasma instabilities have been found to drive the reactivity of deuteron-deuteron fusion beyond the thermonuclear yield in the C-2U plasma that Rostoker started. This remarkable result in FRCs, as well as the above-mentioned LWFA, may be understood with the aid of the newly introduced “robustness hypothesis of waves with a high phase velocity”. It posits that when the wave driven by a particle beam (or laser pulse) has a high phase velocity, its amplitude can be high without disrupting the supporting bulk plasma. This hypothesis may guide us toward more robust and efficient fusion reactors and more compact accelerators.

  14. The photo-philic QCD axion

    DOE PAGES

    Farina, Marco; Pappadopulo, Duccio; Rompineve, Fabrizio; ...

    2017-01-23

    Here, we propose a framework in which the QCD axion has an exponentially large coupling to photons, relying on the “clockwork” mechanism. We discuss the impact of present and future axion experiments on the parameter space of the model. In addition to the axion, the model predicts a large number of pseudoscalars which can be light and observable at the LHC. In the most favorable scenario, axion Dark Matter will give a signal in multiple axion detection experiments and the pseudoscalars will be discovered at the LHC, allowing us to determine most of the parameters of the model.

  15. Theoretical & Experimental Research in Weak, Electromagnetic & Strong Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nandi, Satyanarayan; Babu, Kaladi; Rizatdinova, Flera

    The conducted research spans a wide range of topics in the theoretical, experimental, and phenomenological aspects of elementary particle interactions. Theory projects involve topics in both the energy frontier and the intensity frontier. The experimental research addresses the energy frontier with the ATLAS Collaboration at the Large Hadron Collider (LHC). In theoretical research, novel ideas going beyond the Standard Model with strong theoretical motivations were proposed, and their experimental tests at the LHC and forthcoming neutrino facilities were outlined. These efforts fall into the following broad categories: (i) TeV-scale new physics models for LHC Run 2, including left-right symmetry and trinification symmetry, (ii) unification of elementary particles and forces, including the unification of gauge and Yukawa interactions, (iii) supersymmetry and mechanisms of supersymmetry breaking, (iv) superworld without supersymmetry, (v) general models of extra dimensions, (vi) comparing signals of extra dimensions with those of supersymmetry, (vii) models with mirror quarks and mirror leptons at the TeV scale, (viii) models with singlet quarks and singlet Higgs and their implications for Higgs physics at the LHC, (ix) new models for the dark matter of the universe, (x) lepton flavor violation in Higgs decays, (xi) leptogenesis in radiative models of neutrino masses, (xii) light-mediator models of non-standard neutrino interactions, (xiii) anomalous muon decay and short-baseline neutrino anomalies, (xiv) baryogenesis linked to nucleon decay, and (xv) a new model for the recently observed diboson resonance at the LHC and its other phenomenological implications. The experimental High Energy Physics group has been, and continues to be, a successful and productive contributor to the ATLAS experiment at the LHC. Members of the group performed searches for gluinos decaying to stop and top quarks, new heavy gauge bosons decaying to top and bottom quarks, and vector-like quarks produced in pairs and decaying to light quarks. Members of the OSU group played a leading role in the detailed optimization studies for the future ATLAS Inner Tracker (ITk), which will be installed during the Phase-II upgrade, replacing the current tracking system. The proposed studies aim to enhance the ATLAS discovery potential in the high-luminosity LHC era. The group members have contributed to the calibration of algorithms for identifying boosted vector bosons and b-jets, which will help expand the ATLAS reach in many searches for new physics.

  16. Review of hydrodynamic tunneling issues in high power particle accelerators

    NASA Astrophysics Data System (ADS)

    Tahir, N. A.; Burkart, F.; Schmidt, R.; Shutov, A.; Piriz, A. R.

    2018-07-01

    The full impact of one Large Hadron Collider (LHC) 7 TeV proton beam on solid targets made of different materials, including copper and carbon, was simulated iteratively using an energy-deposition code, FLUKA, and a two-dimensional hydrodynamic code, BIG2. These studies showed that the penetration depth of the entire beam, comprising 2808 proton bunches, increases significantly due to a phenomenon named hydrodynamic tunneling of the protons and their shower. For example, the static range of a single 7 TeV proton and its shower is about 1 m in solid copper, but the full LHC beam will penetrate up to about 35 m into the target when hydrodynamic effects are included. Due to the potential implications of this result for machine protection, it was decided to verify the hydrodynamic tunneling effect experimentally. For this purpose, experiments were carried out at the CERN HiRadMat (High Radiation to Materials) facility in which extended solid copper cylindrical targets were irradiated with the 440 GeV proton beam generated by the Super Proton Synchrotron (SPS). Simulations of beam-target heating with the same beam parameters used in the experiments were also performed. These experiments not only confirmed the existence of hydrodynamic tunneling, but the measurements also showed very good agreement with the simulation results. This provided confidence in the work on LHC-related beam-matter heating simulations. Currently, a design study is being carried out by the international community (with CERN taking the leading role) for a post-LHC collider named the Future Circular Collider (FCC), which will accelerate two counter-rotating proton beams up to a particle energy of 50 TeV. Simulations of the full impact of one FCC beam, comprising 10,600 proton bunches, on a solid copper target have also been done. These simulations have shown that although the static range of a single 50 TeV proton and its shower in solid copper is around 1.8 m, the entire beam will penetrate up to about 350 m into the target. Feasibility studies of a water beam dump for the FCC have also been carried out. A review of this work and its implications for the machine protection system is presented in this paper.

  17. Continuous wave superconducting radio frequency electron linac for nuclear physics research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reece, Charles E.

    CEBAF, the Continuous Electron Beam Accelerator Facility, has been actively serving the nuclear physics research community as a unique forefront international resource since 1995. This cw electron linear accelerator (linac) at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility (Jefferson Lab) has continued to evolve as a precision tool for discerning the structure and dynamics within nuclei. Superconducting rf (SRF) technology has been the essential foundation for CEBAF, first as a 4 GeV machine, then 6 GeV, and currently capable of 12 GeV. We review the development, implementation, and performance of SRF systems for CEBAF from its early beginnings to the commissioning of the 12 GeV era.

  18. Continuous wave superconducting radio frequency electron linac for nuclear physics research

    DOE PAGES

    Reece, Charles E.

    2016-12-28

    CEBAF, the Continuous Electron Beam Accelerator Facility, has been actively serving the nuclear physics research community as a unique forefront international resource since 1995. This cw electron linear accelerator (linac) at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility (Jefferson Lab) has continued to evolve as a precision tool for discerning the structure and dynamics within nuclei. Superconducting rf (SRF) technology has been the essential foundation for CEBAF, first as a 4 GeV machine, then 6 GeV, and currently capable of 12 GeV. We review the development, implementation, and performance of SRF systems for CEBAF from its early beginnings to the commissioning of the 12 GeV era.

  19. Accelerated Learning: Undergraduate Research Experiences at the Texas A&M Cyclotron Institute

    NASA Astrophysics Data System (ADS)

    Yennello, S. J.

    The Texas A&M Cyclotron Institute (TAMU CI) has had an NSF-funded Research Experiences for Undergraduates (REU) program since 2004. Each summer about a dozen students from across the country join us for the 10-week program. Each is embedded in one of the research groups of the TAMU CI and given their own research project. While the main focus of their effort is their individual research project, we also have other activities to broaden their experience; one of these has been involvement in a dedicated group experiment. Because not every experimental group runs during those 10 weeks, and because some of the students are in theory research groups, a group research experience allows everyone to be involved in an experiment using the accelerator. In contrast to the REU students' very focused summer experience, Texas A&M undergraduates can be involved in research projects at the Cyclotron throughout the year, often for multiple years. This extended exposure gives Texas A&M students a learning experience that cannot be duplicated without a local accelerator. The motivation for the REU program was to share this accelerator experience with students who do not have that opportunity at their home institution.

  20. Reflections on Preparing Educators to Evaluate the Efficacy of Educational Technology: An Interview with Joseph South

    ERIC Educational Resources Information Center

    Bull, Glen; Spector, J. Michael; Persichitte, Kay; Meiers, Ellen

    2017-01-01

    Joseph South, an educational researcher, technology consultant, and former director of the U.S. Office of Educational Technology participated in a research initiative on Educational Technology Efficacy Research organized by the Jefferson Education Accelerator, Digital Promise, and the Curry School of Education at the University of Virginia. The…

  1. Exclusive and diffractive μ+μ- production in p p collisions at the LHC

    NASA Astrophysics Data System (ADS)

    Gonçalves, V. P.; Jaime, M. M.; Martins, D. E.; Rangel, M. S.

    2018-04-01

    In this paper, we estimate the production of dimuons (μ+μ-) in exclusive photon-photon (γγ) and diffractive Pomeron-Pomeron (IPIP), Pomeron-Reggeon (IPIR), and Reggeon-Reggeon (IRIR) interactions in pp collisions at LHC energies. The invariant mass, rapidity, and transverse momentum distributions are calculated using the Forward Physics Monte Carlo (FPMC), which allows us to obtain realistic predictions for dimuon production with two leading intact hadrons. In particular, predictions taking into account the CMS and LHCb acceptances are presented. Moreover, the contribution of single diffraction to dimuon production is also estimated. Our results demonstrate that the experimental separation of these different mechanisms is feasible. In particular, events characterized by pairs with large squared transverse momentum are dominated by diffractive interactions, which allows us to investigate the underlying assumptions present in the description of these processes.

  2. Underground neutrino detectors for particle and astroparticle Science: The Giant Liquid Argon Charge Imaging ExpeRiment (GLACIER)

    NASA Astrophysics Data System (ADS)

    Rubbia, André

    2009-06-01

    The current focus of the CERN program is the Large Hadron Collider (LHC); however, CERN is also engaged in long-baseline neutrino physics with the CNGS project and supports T2K as recognized experiment CERN RE13, and for good reasons: a number of observed phenomena in high-energy physics and cosmology remain unresolved within the Standard Model of particle physics. These puzzles include the origin of neutrino masses, CP violation in the leptonic sector, and the baryon asymmetry of the Universe, and they will only partially be addressed at the LHC. A positive measurement of sin²2θ13 > 0.01 would certainly give a tremendous boost to neutrino physics by opening the possibility to study CP violation in the lepton sector and to determine the neutrino mass hierarchy with upgraded conventional super-beams. These experiments (so-called 'Phase II') require, in addition to an upgraded beam power, next-generation very massive neutrino detectors with excellent energy resolution and high detection efficiency over a wide neutrino energy range, to cover the 1st and 2nd oscillation maxima, as well as excellent particle identification and π0 background suppression. Two generations of large water Cherenkov detectors at Kamioka (Kamiokande and Super-Kamiokande) have been extremely successful, and there are good reasons to consider a third-generation water Cherenkov detector with an order of magnitude larger mass than Super-Kamiokande for both non-accelerator (proton decay, supernovae, ...) and accelerator-based physics. On the other hand, a very massive underground liquid argon detector of about 100 kton could represent a credible alternative for the precision measurements of 'Phase II' and aim at significantly new results in neutrino astroparticle and non-accelerator-based particle physics (e.g., proton decay).

  3. Kinetic Studies on the Xanthophyll Cycle in Barley Leaves (Influence of Antenna Size and Relations to Nonphotochemical Chlorophyll Fluorescence Quenching).

    PubMed Central

    Hartel, H.; Lokstein, H.; Grimm, B.; Rank, B.

    1996-01-01

    Xanthophyll-cycle kinetics as well as the relationship between the xanthophyll de-epoxidation state and Stern-Volmer type nonphotochemical chlorophyll (Chl) fluorescence quenching (qN) were investigated in barley (Hordeum vulgare L.) leaves comprising a stepwise reduced antenna system. For this purpose plants of the wild type (WT) and the Chl b-less mutant chlorina 3613 were cultivated under either continuous (CL) or intermittent light (IML). Violaxanthin (V) availability varied from about 70% in the WT up to 97 to 98% in the mutant and IML-grown plants. In CL-grown mutant leaves, de-epoxidation rates were strongly accelerated compared to the WT. This is ascribed to a different accessibility of V to the de-epoxidase due to the existence of two V pools: one bound to light-harvesting Chl a/b-binding complexes (LHC) and the other one not bound. Epoxidation rates (k) were decreased with reduction in LHC protein contents: kWT > kmutant >> kIML plants. This supports the idea that the epoxidase activity resides on certain LHC proteins. Irrespective of huge zeaxanthin and antheraxanthin accumulation, the capacity to develop qN was reduced stepwise with antenna size. The qN level obtained in dithiothreitol-treated CL- and IML-grown plants was almost identical with that in untreated IML-grown plants. The findings provide evidence that structural changes within the LHC proteins, mediated by xanthophyll-cycle operation, form the basis for the development of a major proportion of qN. PMID:12226199

  4. Viewpoint: the End of the World at the Large Hadron Collider?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peskin, Michael E. (SLAC)

    New arguments based on astrophysical phenomena constrain the possibility that dangerous black holes will be produced at the CERN Large Hadron Collider. On 8 August, the Large Hadron Collider (LHC) at CERN injected its first beams, beginning an experimental program that will produce proton-proton collisions at an energy of 14 TeV. Particle physicists are waiting expectantly. The reason is that the Standard Model of strong, weak, and electromagnetic interactions, despite its many successes, is clearly incomplete. Theory says that the holes in the model should be filled by new physics in the energy region that will be studied by the LHC. Some candidate theories are simple quick fixes, but the most interesting ones involve new concepts of spacetime waiting to be discovered. Look up the LHC on Wikipedia, however, and you will find considerable space devoted to safety concerns. At the LHC, we will probe energies beyond those explored at any previous accelerator, and we hope to create particles that have never been observed. Couldn't we, then, create particles that would actually be dangerous, for example, ones that would eat normal matter and eventually turn the earth into a blob of unpleasantness? It is morbid fun to speculate about such things, and candidates for such dangerous particles have been suggested. These suggestions have been analyzed in an article in Reviews of Modern Physics by Jaffe, Busza, Wilczek, and Sandweiss and excluded on the basis of constraints from observation and from the known laws of physics. These conclusions have been upheld by subsequent studies conducted at CERN.

  5. The practical Pomeron for high energy proton collimation

    NASA Astrophysics Data System (ADS)

    Appleby, R. B.; Barlow, R. J.; Molson, J. G.; Serluca, M.; Toader, A.

    2016-10-01

    We present a model which describes proton scattering data from ISR to Tevatron energies, and which can be applied to collimation in high energy accelerators, such as the LHC and FCC. Collimators remove beam halo particles, so that they do not impinge on vulnerable regions of the machine, such as the superconducting magnets and the experimental areas. In simulating the effect of the collimator jaws it is crucial to model the scattering of protons at small momentum transfer t, as these protons can subsequently survive several turns of the ring before being lost. At high energies these soft processes are well described by Pomeron exchange models. We study the behaviour of elastic and single-diffractive dissociation cross sections over a wide range of energy, and show that the model can be used as a global description of the wide variety of high energy elastic and diffractive data presently available. In particular it models low mass diffraction dissociation, where a rich resonance structure is present, and thus predicts the differential and integrated cross sections in the kinematical range appropriate to the LHC. We incorporate the physics of this model into the beam tracking code MERLIN and use it to simulate the resulting loss maps of the beam halo lost in the collimators in the LHC.
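    As a minimal sketch of how small-|t| elastic scattering can enter such a halo-tracking simulation (this is not the MERLIN or Pomeron-model implementation itself), one can inverse-transform sample the momentum transfer t from an approximately exponential elastic cross section dσ/dt ∝ exp(Bt), t ≤ 0, and convert it to a small deflection angle. The slope B and beam momentum below are illustrative assumed values, not parameters taken from this paper.

    ```python
    import math
    import random

    def sample_elastic_t(b_slope: float, rng: random.Random) -> float:
        """Sample momentum transfer t (GeV^2, t <= 0) from dsigma/dt ~ exp(b*t).

        Inverse-transform sampling: for u ~ U(0,1), t = ln(u)/b is
        exponentially distributed in |t| with slope b.
        """
        u = rng.random()
        return math.log(u) / b_slope  # t <= 0 since ln(u) <= 0

    def scattering_angle(t: float, beam_momentum_gev: float) -> float:
        """Small-angle approximation: theta ~ sqrt(|t|) / p."""
        return math.sqrt(-t) / beam_momentum_gev

    rng = random.Random(42)
    B = 19.9      # assumed elastic slope in GeV^-2 (illustrative only)
    P = 7000.0    # 7 TeV proton momentum in GeV
    samples = [sample_elastic_t(B, rng) for _ in range(100_000)]
    mean_abs_t = -sum(samples) / len(samples)  # should be close to 1/B
    print(f"mean |t| = {mean_abs_t:.4f} GeV^2 (expected ~ {1.0/B:.4f})")
    print(f"kick angle at mean |t| ~ {scattering_angle(-1.0/B, P)*1e6:.1f} microrad")
    ```

    The microradian-scale deflections this produces are why scattered halo protons can survive many turns before being intercepted, which is the regime the collimation simulation must resolve.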

  6. LHCSR3 affects de-coupling and re-coupling of LHCII to PSII during state transitions in Chlamydomonas reinhardtii

    PubMed Central

    Roach, Thomas; Na, Chae Sun

    2017-01-01

    Photosynthetic organisms have to tolerate rapid changes in light intensity, which is facilitated by non-photochemical quenching (NPQ) and involves modification of energy transfer from light-harvesting complexes (LHC) to the photosystem reaction centres. NPQ includes dissipating excess light energy to heat (qE) and the reversible coupling of LHCII to photosystems (state transitions/qT), which are considered separate NPQ mechanisms. In the model alga Chlamydomonas reinhardtii the LHCSR3 protein has a well characterised role in qE. Here, it is shown in the npq4 mutant, deficient in LHCSR3, that energy coupling to photosystem II (PSII) more akin to qT is also disrupted, but no major differences in LHC phosphorylation or LHC compositions were found in comparison to wild-type cells. The qT of wild-type cells possessed two kinetically distinguishable phases, with LHCSR3 participating in the more rapid (<2 min) phase. This LHCSR3-mediated qT was sensitive to physiological levels of H2O2, which accelerated qE induction, revealing a way that may help C. reinhardtii tolerate a sudden increase in light intensity. Overall, a clear mechanistic overlap between qE and qT is shown. PMID:28233792

  7. Magnetic Frequency Response of HL-LHC Beam Screens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrone, M.; Martino, M.; De Maria, R.

    Magnetic fields used to control particle beams in accelerators are usually regulated via the electrical current of the power converters. In order to minimize lifetime degradation and ultimately luminosity loss in circular colliders, current noise is a highly critical figure of merit of power converters, in particular for magnets located in areas with a high beta function, like the High Luminosity Large Hadron Collider (HL-LHC) insertions. However, what directly acts upon the beam is the magnetic field, not the current of the power converter, which undergoes several frequency-dependent transformations until the desired magnetic field, seen by the beam, is obtained. Beam screens are very rarely considered when assessing or specifying the noise figure of merit, but their magnetic frequency response is such that they realize relatively effective low-pass filtering of the magnetic field produced by the magnet-power-converter system. This work aims at filling this gap by quantifying the expected impact of different beam screen layouts for the most relevant HL-LHC insertion magnets. A well-defined post-processing technique is used to derive the frequency response of the different multipoles from multi-physics Finite Element Method (FEM) simulation results. In addition, a well-approximated analytical formula for the low-frequency range of multi-layered beam screens is presented.

  8. Searches for Physics Beyond the Standard Model and Triggering on Proton-Proton Collisions at 14 TEV LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wittich, Peter

    2011-10-14

    This document describes the work achieved under the OJI award received in May 2008 by Peter Wittich as Principal Investigator. The proposal covers an experimental particle physics project searching for physics beyond the Standard Model at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finnell, Joshua Eugene

    US President Barack Obama issued Executive Order 13642, "Making Open and Machine Readable the New Default for Government," on May 9, 2013, mandating, wherever legally permissible and possible, that US Government information be made open to the public.[1] This edict accelerated the construction of frameworks for data repositories, such as data.gov, and of data citation principles and practices. As a corollary, researchers across the country's national laboratories found themselves creating data management plans, applying dataset metadata standards, and ensuring long-term access to data from federally funded scientific research.

  10. What We Do | FNLCR Staging

    Cancer.gov

    The Frederick National Laboratory is the only U.S. national lab wholly focused on research, technology, and collaboration in the biomedical sciences, working to discover, to innovate, and to improve human health. We accelerate progress against cancer.

  11. Matter, Energy, Space and Time: The International Linear Collider Physics Prospects and International Aspects

    NASA Astrophysics Data System (ADS)

    Wagner, Albrecht

    2006-04-01

    Over the past century, physicists have sought to explain the character of the matter and energy in our universe, to show how the basic forces of nature and the building blocks of matter come about, and to explore the fabric of space and time. In the past three decades, experiments at laboratories around the world have given us a precise confirmation of the underlying theory called the standard model. These particle physics advances have a direct impact on our understanding of the structure of the universe, both at its inception in the Big Bang, and in its evolution to the present and future. The final synthesis is not yet fully clear, but we know with confidence that major discoveries expanding the standard model framework will occur at the next generation of accelerators. The Large Hadron Collider (LHC) being built at CERN will take us into the discovery realm. The proposed International Linear Collider (ILC) will extend the discoveries and provide a wealth of precision measurements that are essential for giving deeper understanding of their meaning, and pointing the way to further evolution of particle physics in the future. A world-wide consensus has formed for a baseline ILC project at energies of 500 GeV and beyond. The choice of superconducting technology as the basis for the ILC has paved the way for a global design effort which is now proceeding at full speed.

  12. High fidelity 3-dimensional models of beam-electron cloud interactions in circular accelerators

    NASA Astrophysics Data System (ADS)

    Feiz Zarrin Ghalam, Ali

    Electron cloud is a low-density electron population created inside the vacuum chamber of circular machines with positively charged beams. The electron cloud limits the peak current of the beam and degrades beam quality through luminosity degradation, emittance growth, and head-tail or bunch-to-bunch instabilities. The adverse effects of the electron cloud on long-term beam dynamics become more and more important as beams go to higher and higher energies. This problem has become a major concern in the design of many future circular machines, like the Large Hadron Collider (LHC) under construction at the European Center for Nuclear Research (CERN). Due to the importance of the problem, several simulation models have been developed to model the long-term beam-electron cloud interaction. These models are based on the "single-kick approximation," in which the electron cloud is assumed to be concentrated in one thin slab around the ring. While this model is efficient in terms of computational cost, it does not reflect the real physical situation, as the forces from the electron cloud on the beam are non-linear, contrary to the model's assumption. To address the limitations of the existing codes, in this thesis a new model is developed to continuously model the beam-electron cloud interaction. The code is derived from a 3-D parallel Particle-In-Cell (PIC) model (QuickPIC) originally used for plasma wakefield acceleration research. To make the original model fit the circular-machine environment, betatron and synchrotron equations of motion have been added to the code, and the effects of chromaticity and lattice structure have been included. QuickPIC is then benchmarked against one of the codes developed based on the single-kick approximation (HEAD-TAIL) for the transverse spot size of the beam in the CERN LHC. The growth predicted by QuickPIC is less than that predicted by HEAD-TAIL. 
The code is then used to investigate the effect of electron cloud image charges on long-term beam dynamics, particularly on the transverse tune shift of the beam in the CERN Super Proton Synchrotron (SPS) ring. The force from the electron cloud image charges on the beam cancels the force due to the cloud compression formed on the beam axis, and therefore the tune shift is mainly due to the uniform electron cloud density. (Abstract shortened by UMI.)
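    As a hedged sketch (not QuickPIC itself) of the betatron motion the abstract describes adding to the code, linear betatron oscillation can be modeled as a one-turn rotation in normalized phase space by the tune Q; a tune shift from the electron cloud would then appear as a change in the measured oscillation frequency of turn-by-turn data. The tune value and turn count below are illustrative, not the machine's working point.

    ```python
    import math

    def one_turn_map(x: float, px: float, tune: float) -> tuple[float, float]:
        """Linear betatron one-turn map in normalized phase space:
        a rotation by the phase advance 2*pi*Q per turn."""
        phi = 2.0 * math.pi * tune
        c, s = math.cos(phi), math.sin(phi)
        return c * x + s * px, -s * x + c * px

    def track(x0: float, px0: float, tune: float, n_turns: int) -> list[float]:
        """Track one particle for n_turns; return its x coordinate each turn."""
        x, px = x0, px0
        xs = []
        for _ in range(n_turns):
            x, px = one_turn_map(x, px, tune)
            xs.append(x)
        return xs

    def measure_tune(xs: list[float]) -> float:
        """Recover the tune from turn-by-turn data via a naive DFT peak search."""
        n = len(xs)
        best_q, best_amp = 0.0, -1.0
        for k in range(1, n // 2):
            re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(xs))
            im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(xs))
            amp = re * re + im * im
            if amp > best_amp:
                best_q, best_amp = k / n, amp
        return best_q

    Q = 0.31  # illustrative fractional tune, not the SPS working point
    xs = track(1.0, 0.0, Q, 512)
    print(f"set tune {Q}, measured {measure_tune(xs):.4f}")
    ```

    The DFT peak recovers the set tune to within the 1/N frequency resolution of the turn-by-turn record; in a full simulation, comparing the measured tune with and without the electron-cloud force yields the tune shift discussed above.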

  13. Grand Challenges: High Performance Computing and Communications. The FY 1992 U.S. Research and Development Program.

    ERIC Educational Resources Information Center

    Federal Coordinating Council for Science, Engineering and Technology, Washington, DC.

    This report presents a review of the High Performance Computing and Communications (HPCC) Program, which has as its goal the acceleration of the commercial availability and utilization of the next generation of high performance computers and networks in order to: (1) extend U.S. technological leadership in high performance computing and computer…

  14. Computational Accelerator Physics. Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bisognano, J.J.; Mondelli, A.A.

    1997-04-01

    The sixty-two papers appearing in this volume were presented at CAP96, the Computational Accelerator Physics Conference held in Williamsburg, Virginia from September 24-27, 1996. Science Applications International Corporation (SAIC) and the Thomas Jefferson National Accelerator Facility (Jefferson Lab) jointly hosted CAP96, with financial support from the U.S. Department of Energy's Office of Energy Research and the Office of Naval Research. Topics ranged from descriptions of specific codes to advanced computing techniques and numerical methods. Update talks were presented on nearly all of the accelerator community's major electromagnetic and particle-tracking codes. Among all the papers, thirty are abstracted for the Energy Science and Technology database. (AIP)

  15. Final Report, Next-Generation Mega-Voltage Cargo-Imaging System for Cargo Container Inspection, March 2007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. James Clayton, Ph.D., Varian Medical Systems-Security & Inspection Products; Dr. Emma Regentova, Ph.D, University of Nevada Las Vegas; Dr. Evangelos Yfantis, Ph.D., University of Nevada, Las Vegas

    The UNLV Research Foundation, as the primary award recipient, teamed with Varian Medical Systems-Security & Inspection Products and the University of Nevada Las Vegas (UNLV) for the purpose of conducting research and engineering related to a "next-generation" mega-voltage cargo-imaging (MVCI) system for inspection of cargo in large containers. The procurement and build-out of hardware for the MVCI project has been completed. The K-9 linear accelerator and an optimized X-ray detection system, capable of efficiently detecting X-rays emitted from the accelerator after they have passed through the device, are under test. The Office of Science financial assistance award has made possible the development of a system utilizing a technology that will have a profound positive impact on the security of U.S. seaports. The project will ultimately result in critical research and development advances for the "next-generation" Linatron X-ray accelerator technology, thereby providing a safe, reliable, and efficient fixed and mobile cargo inspection system, which will very significantly increase the fraction of cargo containers undergoing reliable inspection as they enter U.S. ports. Both NNSA/NA-22 and the Department of Homeland Security's Domestic Nuclear Detection Office are collaborating with UNLV and its team to make this technology available as soon as possible.

  16. The Fermilab Connection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fermilab

    More than 4,000 scientists in 53 countries use Fermilab and its particle accelerators, detectors and computers for their research. That includes about 2,500 scientists from 223 U.S. institutions in 42 states, plus the District of Columbia and Puerto Rico.

  17. Dissecting the Science of "Angels and Demons" or Antimatter and Other Matters (Vernon W. Hughes Memorial Lecture)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon, Howard

    2009-05-27

    Howard Gordon, a physicist from the U.S. Department of Energy’s Brookhaven National Laboratory, and local educators will separate the science facts from the science fiction of “Angels & Demons,” a major motion picture based on Dan Brown’s best-selling novel. The film, which opens nationally in theaters today, focuses on a plot to destroy the Vatican using antimatter stolen from the Large Hadron Collider (LHC) at the European particle physics laboratory CERN. Speakers will explain the real science of the LHC, including antimatter – oppositely charged cousins of ordinary matter with intriguing properties.

  18. University of Oklahoma - High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skubic, Patrick L.

    2013-07-31

    The High Energy Physics program at the University of Oklahoma, Pat Skubic, Principal Investigator, is attempting to understand nature at the deepest level using the most advanced experimental and theoretical tools. The four experimental faculty, Brad Abbott, Phil Gutierrez, Pat Skubic, and Mike Strauss, together with post-doctoral associates and graduate students, are finishing their work as part of the D0 collaboration at Fermilab and increasingly focusing their investigations at the Large Hadron Collider (LHC) as part of the ATLAS Collaboration. Work at the LHC has become even more exciting with the recent discovery by ATLAS and the other collaboration, CMS, of the long-sought Higgs boson, which plays a key role in generating masses for the elementary constituents of matter. Work of the OUHEP group has been in the three areas of hardware, software, and analysis. Now that the Higgs boson has been discovered, completing the Standard Model of fundamental physics, new efforts will focus on finding hints of physics beyond the Standard Model, such as supersymmetry. The OUHEP theory group (Kim Milton, PI) also consists of four faculty members, Howie Baer, Chung Kao, Kim Milton, and Yun Wang, and associated students and postdocs. They are involved in understanding fundamental issues in formulating theories of the microworld and in proposing models that carry us past the Standard Model, which is an incomplete description of nature. They therefore work in close concert with their experimental colleagues. One can also study fundamental physics by looking at the large-scale structure of the universe; in particular, the "dark energy" that seems to be causing the universe to expand at an accelerating rate effectively makes up about 3/4 of the energy in the universe, and yet is totally unidentified. 
Dark energy and dark matter, which together account for nearly all of the energy in the universe, are an important probe of fundamental physics at the very shortest distances, or at the very highest energies. The outcomes of the group's combined experimental and theoretical research will be an improved understanding of nature at the highest energies reachable, from which applications to technological innovation will surely result, as they always have from such studies in the past.

  19. Full steam ahead

    NASA Astrophysics Data System (ADS)

    Heuer, Rolf-Dieter

    2008-03-01

    When the Economist recently reported the news of Rolf-Dieter Heuer's appointment as the next director-general of CERN, it depicted him sitting cross-legged in the middle of a circular track steering a model train around him - smiling. It was an apt cartoon for someone who is about to take charge of the world's most powerful particle accelerator: the 27 km-circumference Large Hadron Collider (LHC), which is nearing completion at the European laboratory just outside Geneva. What the cartoonist did not know is that model railways are one of Heuer's passions.

  20. Integrating Traumatic Brain Injury Model Systems Data into the Federal InteragencyTraumatic Brain Injury Research Informatics Systems

    DTIC Science & Technology

    2017-12-01

    Final report, December 2017. Principal Investigator: Cynthia Harrison-Felix, PhD. Contracting Organization: Craig Hospital, Englewood, CO 80113. Prepared for: U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland 21702. Abstract fragment: "... methodologies, and associated tools, rather than summaries or interpretations of this information, can accelerate research progress by allowing re-analysis of ..."

  1. Transportation innovation at the U.S. Department of Transportation (USDOT).

    DOT National Transportation Integrated Search

    2014-01-01

    The USDOT is one of the biggest sources of funding for intelligent transportation system (ITS) research and development nationwide. We have invested over $100 million per year to spur technology innovation and accelerate technology deployment. Our vi...

  2. Collider Signal II:. Missing ET Signatures and Dark Matter Connection

    NASA Astrophysics Data System (ADS)

    Baer, Howard

    2010-08-01

    These lectures give an overview of aspects of missing ET signatures from new physics at the LHC, along with their important connection to dark matter physics. Mostly, I will concentrate on supersymmetric (SUSY) sources of ɆT, but will also mention Little Higgs models with T-parity (LHT) and universal extra dimensions (UED) models with KK-parity. Lecture 1 covers SUSY basics, model building and spectra computation. Lecture 2 addresses sparticle production and decay mechanisms at hadron colliders and event generation. Lecture 3 covers SUSY signatures at LHC, along with LHT and UED signatures for comparison. In Lecture 4, I address the dark matter connection, and how direct and indirect dark matter searches, along with LHC collider searches, may allow us to both discover and characterize dark matter in the next several years. Finally, the interesting scenario of Yukawa-unified SUSY is examined; this case works best if the dark matter turns out to be a mixture of axion/axino states, rather than neutralinos.

  3. Second-generation coil design of the Nb 3Sn low-β quadrupole for the high luminosity LHC

    DOE PAGES

    Bermudez, S. Izquierdo; Ambrosio, G.; Ballarino, A.; ...

    2016-01-18

    As part of the Large Hadron Collider Luminosity upgrade (HiLumi-LHC) program, the US LARP collaboration and CERN are working together to design and build a 150 mm aperture Nb 3Sn quadrupole for the LHC interaction regions. A first series of 1.5 m long coils was fabricated and assembled in a first short model. A detailed visual inspection of the coils was carried out to investigate cable dimensional changes during heat treatment and the position of the windings in the coil straight section and in the end region. The analyses identified a set of design changes which, combined with a fine tuning of the cable geometry and a field quality optimization, were implemented in a new, second-generation coil design. In this study, we review the main characteristics of the first-generation coils, describe the modifications in coil layout, and discuss their impact on parts design and magnet analysis.

  4. Intercontinental Multi-Domain Monitoring for LHC with perfSONAR

    NASA Astrophysics Data System (ADS)

    Vicinanza, D.

    2012-12-01

    The Large Hadron Collider (LHC) is currently running at CERN in Geneva, Switzerland. Physicists are using the LHC to recreate the conditions just after the Big Bang, by colliding two beams of particles and heavy ions head-on at very high energy. The project is generating more than 15 TB of raw data per year, plus 10 TB of “event summary data”. This data is sent out from CERN to eleven Tier 1 research centres in Europe, Asia, and North America using a multi-gigabit Optical Private Network (OPN), the LHCOPN. Tier 1 sites are then connected to 100+ academic and research institutions in the world (the Tier 2s) through a multipoint-to-multipoint network, the LHC Open Network Environment (LHCONE). Network monitoring on such a complex network architecture, to ensure robust and reliable operation, is of crucial importance. The chosen approach for monitoring the OPN and ONE is based on the perfSONAR framework, which is designed for multi-domain monitoring environments. perfSONAR (www.perfsonar.net) is an infrastructure for performance-monitoring data exchange between networks, making it easier to solve performance problems occurring between network measurement points interconnected through several network domains.
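As a schematic of the kind of multi-domain fault localization such monitoring enables (this is not the perfSONAR API; the segment names and throughput numbers are invented for illustration): given per-segment throughput measurements along a Tier-0 to Tier-2 path, the bottleneck domain is simply the minimum.

```python
# Toy multi-domain path diagnosis: with per-segment throughput measurements
# (as a perfSONAR-like deployment would provide), the bottleneck segment of
# a CERN -> Tier 1 -> Tier 2 path is the one with the lowest throughput.
# Segment names and numbers are invented for illustration.

segments = {
    "CERN->Tier1 (LHCOPN)": 9.4,    # Gbit/s
    "Tier1->Tier2 (LHCONE)": 2.1,
    "Tier2 campus": 8.7,
}

bottleneck = min(segments, key=segments.get)
print(bottleneck, segments[bottleneck])  # Tier1->Tier2 (LHCONE) 2.1
```

Having comparable measurement points in every administrative domain is exactly what turns this trivial minimum into a practical diagnosis.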

  5. Design optimization of an ironless inductive position sensor for the LHC collimators

    NASA Astrophysics Data System (ADS)

    Danisi, A.; Masi, A.; Losito, R.; Perriard, Y.

    2013-09-01

    The Ironless Inductive Position Sensor (I2PS) is an air-cored displacement sensor which has been conceived to be totally immune to external DC/slowly-varying magnetic fields. It can thus be used as a valid alternative to Linear Variable Differential Transformers (LVDTs), which can show a position error in magnetic environments. In addition, since it retains the excellent properties of LVDTs, the I2PS can be used in harsh environments, such as nuclear plants, plasma control and particle accelerators. This paper focuses on the design optimization of the sensor, considering the CERN LHC Collimators as application. In particular, the optimization comes after a complete review of the electromagnetic and thermal modeling of the sensor, as well as the proper choice of the reading technique. The design optimization stage is firmly based on these preliminary steps. Therefore, the paper summarises the sensor's complete development, from its modeling to its actual implementation. A set of experimental measurements demonstrates the sensor's performances to be those expected in the design phase.
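The comparison with LVDTs rests on the standard ratiometric reading technique for differential-transformer displacement sensors. As a minimal illustrative sketch (not the I2PS reading algorithm from the paper; the function name, calibration constant, and voltages are hypothetical), the core position is recovered from the two demodulated secondary amplitudes:

```python
# Illustrative sketch of LVDT-style ratiometric position reading.
# Assumes symmetric secondary coils and a linear calibration; all names
# and numbers are hypothetical, not taken from the I2PS paper.

def ratiometric_position(v_a: float, v_b: float, half_stroke_mm: float) -> float:
    """Estimate core displacement from the demodulated amplitudes of the
    two secondary coils: x is proportional to (Va - Vb) / (Va + Vb)."""
    return half_stroke_mm * (v_a - v_b) / (v_a + v_b)

# Centered core: equal amplitudes give zero displacement.
print(ratiometric_position(2.5, 2.5, 10.0))   # 0.0
# Core displaced toward coil A.
print(ratiometric_position(3.0, 2.0, 10.0))   # 2.0 (mm)
```

The ratio cancels common-mode drift of the excitation amplitude, which is part of why such sensors tolerate harsh environments.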

  6. The MoEDAL Experiment at the LHC

    NASA Astrophysics Data System (ADS)

    Pinfold, James L.

    2014-04-01

    In 2010 the CERN (European Centre for Particle Physics Research) Research Board unanimously approved MoEDAL, the 7th international experiment at the Large Hadron Collider (LHC), which is designed to search for avatars of new physics signified by highly ionizing particles. The MoEDAL detector is like a giant camera ready to reveal "photographic" evidence for new physics and also to actually trap long-lived new particles for further study. The MoEDAL experiment will significantly expand the horizon for discovery at the LHC, in a complementary way. A MoEDAL discovery would have revolutionary implications for our understanding of the microcosm, providing insights into such fundamental questions as: do magnetic monopoles exist, are there extra dimensions or new symmetries of nature; what is the mechanism for the generation of mass; what is the nature of dark matter and how did the big-bang unfurl at the earliest times.

  7. Detector Developments for the High Luminosity LHC Era (2/4)

    ScienceCinema

    Straessner, Arno

    2018-04-16

    Calorimetry and Muon Spectrometers - Part II: When upgrading the LHC to higher luminosities, the detector and trigger performance shall be preserved - if not improved - with respect to the nominal performance. The ongoing R&D on new radiation-tolerant front-end electronics for calorimeters with higher read-out bandwidth is summarized, and new possibilities for the trigger systems are presented. Similar developments are foreseen for the muon spectrometers, where radiation tolerance of the muon detectors and functioning at high background rates are also important. The corresponding plans and research work for the calorimeter and muon detectors at an LHC with the highest luminosity are presented.

  8. US EPA - A*Star Partnership - Accelerating the Acceptance of ...

    EPA Pesticide Factsheets

    The path for incorporating new alternative methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. Some of these challenges include development of relevant and predictive test systems and computational models to integrate and extrapolate experimental data, and rapid characterization and acceptance of these systems and models. The series of presentations will highlight a collaborative effort between the U.S. Environmental Protection Agency (EPA) and the Agency for Science, Technology and Research (A*STAR) that is focused on developing and applying experimental and computational models for predicting chemical-induced liver and kidney toxicity, brain angiogenesis, and blood-brain-barrier formation. In addressing some of these challenges, the U.S. EPA and A*STAR collaboration will provide a glimpse of what chemical risk assessments could look like in the 21st century. Presentation on US EPA – A*STAR Partnership at international symposium on Accelerating the acceptance of next-generation sciences and their application to regulatory risk assessment in Singapore.

  9. Phenomenological study of the interplay between IR-improved DGLAP-CS theory and the precision of an NLO ME matched parton shower MC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Majhi, S.K., E-mail: tpskm@iacs.res.in; Mukhopadhyay, A., E-mail: aditi_mukhopadhyay@baylor.edu; Ward, B.F.L., E-mail: bfl_ward@baylor.edu

    2014-11-15

    We present a phenomenological study of the current status of the application of our approach of exact amplitude-based resummation in quantum field theory to precision QCD calculations, by realistic MC event generator methods, as needed for precision LHC physics. We discuss recent results as they relate to the interplay of the attendant IR-improved DGLAP-CS theory of one of us and the precision of exact NLO matrix-element matched parton shower MCs in the Herwig6.5 environment, as determined by comparison to recent LHC experimental observations on single heavy gauge boson production and decay. The level of agreement between the new theory and the data continues to be a reason for optimism. In the spirit of completeness, we discuss as well other approaches to the same theoretical predictions that we make here from the standpoint of physical precision, with an eye toward the (sub-)1% QCD⊗EW total theoretical precision regime for LHC physics. - Highlights: • Using LHC data, we show that IR-improved DGLAP-CS kernels with exact NLO Shower/ME matching improve MC precision. • We discuss other possible approaches in comparison with ours. • We propose experimental tests to discriminate between competing approaches.

  10. Simulation of the cabling process for Rutherford cables: An advanced finite element model

    NASA Astrophysics Data System (ADS)

    Cabanes, J.; Garlasche, M.; Bordini, B.; Dallocchio, A.

    2016-12-01

    In all existing large particle accelerators (Tevatron, HERA, RHIC, LHC) the main superconducting magnets are based on Rutherford cables, which are characterized by having: strands fully transposed with respect to the magnetic field, a significant compaction that assures a large engineering critical current density and a geometry that allows efficient winding of the coils. The Nb3Sn magnets developed in the framework of the HL-LHC project for improving the luminosity of the Large Hadron Collider (LHC) are also based on Rutherford cables. Due to the characteristics of Nb3Sn wires, the cabling process has become a crucial step in the magnet manufacturing. During cabling the wires experience large plastic deformations that strongly modify the geometrical dimensions of the sub-elements constituting the superconducting strand. These deformations are particularly severe on the cable edges and can result in a significant reduction of the cable critical current as well as of the Residual Resistivity Ratio (RRR) of the stabilizing copper. In order to understand the main parameters that rule the cabling process and their impact on the cable performance, CERN has developed a 3D Finite Element (FE) model based on the LS-Dyna® software that simulates the whole cabling process. In the paper the model is presented together with a comparison between experimental and numerical results for a copper cable produced at CERN.

  11. Radial scaling in inclusive jet production at hadron colliders

    NASA Astrophysics Data System (ADS)

    Taylor, Frank E.

    2018-03-01

    Inclusive jet production in p-p and p̄-p collisions shows many of the same kinematic systematics as observed in single-particle inclusive production at much lower energies. In an earlier study (1974) a phenomenology, called radial scaling, was developed for the single-particle inclusive cross sections that attempted to capture the essential underlying physics of pointlike parton scattering and the fragmentation of partons into hadrons suppressed by the kinematic boundary. The phenomenology was successful in emphasizing the underlying systematics of inclusive particle production. Here we demonstrate that inclusive jet production at the Large Hadron Collider (LHC) in high-energy p-p collisions and at the Tevatron in p̄-p inelastic scattering shows similar behavior. ATLAS inclusive jet production plotted as a function of this scaling variable is studied for √s of 2.76, 7 and 13 TeV and is compared to p̄-p inclusive jet production at 1.96 TeV measured by CDF and D0 at the Tevatron, and to p-Pb inclusive jet production at √s_NN = 5.02 TeV measured by ATLAS at the LHC. Inclusive single-particle production at Fermi National Accelerator Laboratory fixed-target and Intersecting Storage Rings energies is compared to inclusive J/ψ production at the LHC measured in ATLAS, CMS and LHCb. Striking common features of the data are discussed.
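For orientation, the radial scaling variable of the 1974 phenomenology is, in its usual form, the produced particle's (or jet's) center-of-mass energy divided by its kinematic maximum, commonly approximated as x_R ≈ 2E*/√s. A minimal sketch, with that light-product approximation assumed and the function name our own:

```python
def x_r(e_star_gev: float, sqrt_s_gev: float) -> float:
    """Radial scaling variable: CM energy of the produced particle or jet
    divided by its kinematic maximum, approximated here as sqrt(s)/2."""
    return 2.0 * e_star_gev / sqrt_s_gev

# A 500 GeV jet at the 13 TeV LHC and a 75 GeV jet at the 1.96 TeV Tevatron
# sit at nearly the same x_R, which is what makes cross-machine comparison
# in this variable meaningful.
print(round(x_r(500.0, 13000.0), 4))   # 0.0769
print(round(x_r(75.0, 1960.0), 4))     # 0.0765
```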

  12. Jerome Lewis Duggan: A Nuclear Physicist and a Well-Known, Six-Decade Accelerator Application Conference (CAARI) Organizer

    NASA Astrophysics Data System (ADS)

    Del McDaniel, Floyd; Doyle, Barney L.

    Jerry Duggan was an experimental MeV-accelerator-based nuclear and atomic physicist who, over the past few decades, played a key role in the important transition of this field from basic to applied physics. His fascination for and application of particle accelerators spanned almost 60 years, and led to important discoveries in the following fields: accelerator-based analysis (accelerator mass spectrometry, ion beam techniques, nuclear-based analysis, nuclear microprobes, neutron techniques); accelerator facilities, stewardship, and technology development; accelerator applications (industrial, medical, security and defense, and teaching with accelerators); applied research with accelerators (advanced synthesis and modification, radiation effects, nanosciences and technology); physics research (atomic and molecular physics, and nuclear physics); and many other areas and applications. Here we describe Jerry’s physics education at the University of North Texas (B. S. and M. S.) and Louisiana State University (Ph.D.). We also discuss his research at UNT, LSU, and Oak Ridge National Laboratory, his involvement with the industrial aspects of accelerators, and his impact on many graduate students, colleagues at UNT and other universities, national laboratories, and industry and acquaintances around the world. Along the way, we found it hard not to also talk about his love of family, sports, fishing, and other recreational activities. While these were significant accomplishments in his life, Jerry will be most remembered for his insight in starting and his industry in maintaining and growing what became one of the most diverse accelerator conferences in the world — the International Conference on the Application of Accelerators in Research and Industry, or what we all know as CAARI. 
Through this conference, which he ran almost single-handedly for decades, Jerry came to know, and became well known by, literally thousands of atomic and nuclear physicists, accelerator engineers and vendors, medical doctors, cultural heritage experts... the list goes on and on. While thousands of his acquaintances already miss Jerry, this is being felt most by his family and us (B.D. and F.D.M.).

  14. New seismic hazard maps for Puerto Rico and the U.S. Virgin Islands

    USGS Publications Warehouse

    Mueller, C.; Frankel, A.; Petersen, M.; Leyendecker, E.

    2010-01-01

    The probabilistic methodology developed by the U.S. Geological Survey is applied to a new seismic hazard assessment for Puerto Rico and the U.S. Virgin Islands. Modeled seismic sources include gridded historical seismicity, subduction-interface and strike-slip faults with known slip rates, and two broad zones of crustal extension with seismicity rates constrained by GPS geodesy. We use attenuation relations from western North American and worldwide data, as well as a Caribbean-specific relation. Results are presented as maps of peak ground acceleration and 0.2- and 1.0-second spectral response acceleration for 2% and 10% probabilities of exceedance in 50 years (return periods of about 2,500 and 500 years, respectively). This paper describes the hazard model and maps that were balloted by the Building Seismic Safety Council and recommended for the 2003 NEHRP Provisions and the 2006 International Building Code. © 2010, Earthquake Engineering Research Institute.
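The correspondence quoted between exceedance probabilities and return periods follows from a Poisson occurrence model, T = -t / ln(1 - P). A quick check (the function name is ours):

```python
import math

def return_period_years(p_exceed: float, window_years: float = 50.0) -> float:
    """Return period implied by a probability of exceedance over a time
    window, assuming Poisson (memoryless) event occurrence."""
    return -window_years / math.log(1.0 - p_exceed)

print(round(return_period_years(0.02)))  # 2475 -> "about 2,500 years"
print(round(return_period_years(0.10)))  # 475  -> "about 500 years"
```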

  15. Dark-matter production through loop-induced processes at the LHC: the s-channel mediator case.

    PubMed

    Mattelaer, Olivier; Vryonidou, Eleni

    We show how studies relevant for mono-X searches at the LHC in simplified models featuring a dark-matter candidate and an s-channel mediator can be performed within the MadGraph5_aMC@NLO framework. We focus on gluon-initiated loop-induced processes, mostly relevant to the case where the mediator couples preferentially to third generation quarks and in particular to the top quark. Our implementation allows us to study signatures at hadron colliders involving missing transverse energy plus jets or plus neutral bosons ([Formula: see text]), possibly including the effects of extra radiation by multi-parton merging and matching to the parton shower.

  16. Communicating and Translating EPA's Computational Toxicology Research (WC10)

    EPA Science Inventory

    US EPA’s National Center for Computational Toxicology (NCCT) develops and uses alternative testing methods to accelerate the pace of chemical evaluations, reduce reliance on animal testing, and address the significant lack of chemical data. The chemical data is generated through ...

  17. ToxCast Communications and Outreach Strategy (SETAC)

    EPA Science Inventory

    US EPA's Chemical Safety for Sustainability Research Program has been using in vitro testing methods in an effort to accelerate the pace of chemical evaluations and address the significant lack of health and environmental data on the thousands of chemicals found in commonly used ...

  18. NREL National Bioenergy Center Overview

    ScienceCinema

    Foust, Thomas; Pienkos, Phil; Sluiter, Justin; Magrini, Kim; McMillan, Jim

    2018-01-16

    The demand for clean, sustainable, secure energy is growing... and the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) is answering the call. NREL's National Bioenergy Center is pioneering biofuels research and development and accelerating the pace at which these technologies move into the marketplace.

  19. The pressing energy innovation challenge of the US National Laboratories

    NASA Astrophysics Data System (ADS)

    Anadon, Laura Diaz; Chan, Gabriel; Bin-Nun, Amitai Y.; Narayanamurti, Venkatesh

    2016-10-01

    Accelerating the development and deployment of energy technologies is a pressing challenge. Doing so will require policy reform that improves the efficacy of public research organizations and strengthens the links between public and private innovators. With their US$14 billion annual budget and unique mandates, the US National Laboratories have the potential to critically advance energy innovation, yet reviews of their performance find several areas of weak organizational design. Here, we discuss the challenges the National Laboratories face in engaging the private sector, increasing their contributions to transformative research, and developing culture and management practices to better support innovation. We also offer recommendations for how policymakers can address these challenges.

  20. The US Navy/Canadian DCIEM research initiative on pressure breathing physiology

    NASA Technical Reports Server (NTRS)

    Whitley, Phillip E.

    1994-01-01

    Development of improved positive pressure breathing garments for altitude and acceleration protection has occurred without collection of sufficient physiological data to understand the mechanisms of the improvement. Furthermore, modeling of the predicted response of future enhanced garments is greatly hampered by this lack of information. A joint, international effort is under way between Canada's Defense and Civil Institute for Environmental Medicine (DCIEM) and the US Navy's Naval Air Warfare Center Aircraft Division, Warminster (NAWCACDIVWAR). Using a Canadian subject pool, experiments at both the DCIEM altitude facility and the NAWCACDIVWAR Dynamic Flight Simulator have been conducted to determine the cardiovascular and respiratory consequences of high levels of positive pressure breathing for altitude and positive pressure breathing for acceleration protection. Various improved pressure breathing garments were used to collect comparative physiological and performance data. New pressure breathing level and duration capabilities have been demonstrated. Further studies will address further improvements in pressure suit design and correlation of altitude and acceleration data.

  1. Measurement of the charge asymmetry of top quark-antiquark pairs in dilepton final states with the D0 and ATLAS detectors (in French)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapelain, Antoine

    Particle physics aims to give a coherent description of the nature and the behavior of the elementary particles of matter. Particle accelerators (colliders) allow us to push back our knowledge in this domain by producing particles that cannot be observed by other means. This thesis contributes to this research field and focuses on the study of the top quark, which is the most recently discovered constituent of matter and the heaviest known elementary particle. The property of the top quark studied here, the charge asymmetry of top quark-antiquark pairs, drew a great deal of attention in 2011 because of measurements released by the Tevatron experiments. These measurements showed deviations from the predictions made in the framework of the standard model of particle physics. New measurements of the charge asymmetry performed at the Tevatron (with the D0 detector) and at the LHC (with the ATLAS detector) are presented in this thesis.

  2. Integrated analysis of particle interactions at hadron colliders Report of research activities in 2010-2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nadolsky, Pavel M.

    2015-08-31

    The report summarizes research activities of the project "Integrated analysis of particle interactions" at Southern Methodist University, funded by 2010 DOE Early Career Research Award DE-SC0003870. The goal of the project is to provide state-of-the-art predictions in quantum chromodynamics in order to achieve the objectives of the LHC program for studies of electroweak symmetry breaking and new physics searches. We published 19 journal papers focusing on in-depth studies of proton structure and integration of advanced calculations from different areas of particle phenomenology: multi-loop calculations, accurate long-distance hadronic functions, and precise numerical programs. Methods for factorization of QCD cross sections were advanced in order to develop new generations of CTEQ parton distribution functions (PDFs), CT10 and CT14. These distributions provide the core theoretical input for multi-loop perturbative calculations by LHC experimental collaborations. A novel "PDF meta-analysis" technique was invented to streamline applications of PDFs in numerous LHC simulations and to combine PDFs from various groups using multivariate stochastic sampling of PDF parameters. The meta-analysis will help to bring the LHC perturbative calculations to a new level of accuracy, while reducing computational effort. The work on parton distributions was complemented by development of advanced perturbative techniques to predict observables dependent on several momentum scales, including production of massive quarks and transverse momentum resummation at the next-to-next-to-leading order in QCD.
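The "multivariate stochastic sampling of PDF parameters" mentioned above can be caricatured in a few lines: draw random parameter replicas, evaluate an observable for each, and take the ensemble mean and spread as the combined prediction and its uncertainty. This is a schematic toy, not the CT10/CT14 machinery; the observable, sensitivity coefficients, and replica count are invented for illustration.

```python
import random
import statistics

def observable(a: float, b: float) -> float:
    # Stand-in for a cross-section prediction that depends linearly on
    # two PDF fit parameters; the coefficients are invented.
    return 1.0 + 0.10 * a - 0.05 * b

random.seed(0)  # reproducible toy ensemble
replicas = [(random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)) for _ in range(1000)]
values = [observable(a, b) for a, b in replicas]

# The ensemble mean recovers the central prediction (roughly 1.0); the
# spread combines the two parameter sensitivities in quadrature (roughly 0.11).
print(statistics.mean(values), statistics.pstdev(values))
```

The same sampling idea extends to many parameters and to non-Gaussian priors, which is where the multivariate machinery earns its keep.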

  3. Upgrade of the cryogenic CERN RF test facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pirotte, O.; Benda, V.; Brunner, O.

    2014-01-29

    With the large number of superconducting radiofrequency (RF) cryomodules to be tested for the former LEP and the present LHC accelerator, an RF test facility was erected in the early 1990s in the largest cryogenic test facility at CERN, located at Point 18. This facility consisted of four vertical test stands for single cavities and originally one, then two, horizontal test benches for RF cryomodules operating at 4.5 K in saturated helium. CERN is presently working on the upgrade of its accelerator infrastructure, which requires new superconducting cavities operating below 2 K in saturated superfluid helium. Consequently, the RF test facility has been renovated in order to allow efficient cavity and cryomodule tests in superfluid helium and to improve its thermal performance. The new RF test facility is described and its performance is presented.

  4. B, D and K decays

    NASA Astrophysics Data System (ADS)

    Buchalla, G.; Komatsubara, T. K.; Muheim, F.; Silvestrini, L.; Artuso, M.; Asner, D. M.; Ball, P.; Baracchini, E.; Bell, G.; Beneke, M.; Berryhill, J.; Bevan, A.; Bigi, I. I.; Blanke, M.; Bobeth, Ch.; Bona, M.; Borzumati, F.; Browder, T.; Buanes, T.; Buchmüller, O.; Buras, A. J.; Burdin, S.; Cassel, D. G.; Cavanaugh, R.; Ciuchini, M.; Colangelo, P.; Crosetti, G.; Dedes, A.; de Fazio, F.; Descotes-Genon, S.; Dickens, J.; Doležal, Z.; Dürr, S.; Egede, U.; Eggel, C.; Eigen, G.; Fajfer, S.; Feldmann, Th.; Ferrandes, R.; Gambino, P.; Gershon, T.; Gibson, V.; Giorgi, M.; Gligorov, V. V.; Golob, B.; Golutvin, A.; Grossman, Y.; Guadagnoli, D.; Haisch, U.; Hazumi, M.; Heinemeyer, S.; Hiller, G.; Hitlin, D.; Huber, T.; Hurth, T.; Iijima, T.; Ishikawa, A.; Isidori, G.; Jäger, S.; Khodjamirian, A.; Koppenburg, P.; Lagouri, T.; Langenegger, U.; Lazzeroni, C.; Lenz, A.; Lubicz, V.; Lucha, W.; Mahlke, H.; Melikhov, D.; Mescia, F.; Misiak, M.; Nakao, M.; Napolitano, J.; Nikitin, N.; Nierste, U.; Oide, K.; Okada, Y.; Paradisi, P.; Parodi, F.; Patel, M.; Petrov, A. A.; Pham, T. N.; Pierini, M.; Playfer, S.; Polesello, G.; Policicchio, A.; Poschenrieder, A.; Raimondi, P.; Recksiegel, S.; Řezníček, P.; Robert, A.; Rosner, J. L.; Ruggiero, G.; Sarti, A.; Schneider, O.; Schwab, F.; Simula, S.; Sivoklokov, S.; Slavich, P.; Smith, C.; Smizanska, M.; Soni, A.; Speer, T.; Spradlin, P.; Spranger, M.; Starodumov, A.; Stech, B.; Stocchi, A.; Stone, S.; Tarantino, C.; Teubert, F.; T'jampens, S.; Toms, K.; Trabelsi, K.; Trine, S.; Uhlig, S.; Vagnoni, V.; van Hunen, J. J.; Weiglein, G.; Weiler, A.; Wilkinson, G.; Xie, Y.; Yamauchi, M.; Zhu, G.; Zupan, J.; Zwicky, R.

    2008-09-01

    The present report documents the results of Working Group 2: B, D and K decays, of the workshop on Flavor in the Era of the LHC, held at CERN from November 2005 through March 2007. With the advent of the LHC, we will be able to probe New Physics (NP) up to energy scales almost one order of magnitude larger than it has been possible with present accelerator facilities. While direct detection of new particles will be the main avenue to establish the presence of NP at the LHC, indirect searches will provide precious complementary information, since most probably it will not be possible to measure the full spectrum of new particles and their couplings through direct production. In particular, precision measurements and computations in the realm of flavor physics are expected to play a key role in constraining the unknown parameters of the Lagrangian of any NP model emerging from direct searches at the LHC. The aim of Working Group 2 was twofold: on the one hand, to provide a coherent up-to-date picture of the status of flavor physics before the start of the LHC; on the other hand, to initiate activities on the path towards integrating information on NP from high- p T and flavor data. This report is organized as follows: in Sect. 1, we give an overview of NP models, focusing on a few examples that have been discussed in some detail during the workshop, with a short description of the available computational tools for flavor observables in NP models. Section 2 contains a concise discussion of the main theoretical problem in flavor physics: the evaluation of the relevant hadronic matrix elements for weak decays. Section 3 contains a detailed discussion of NP effects in a set of flavor observables that we identified as “benchmark channels” for NP searches. The experimental prospects for flavor physics at future facilities are discussed in Sect. 4. Finally, Sect. 5 contains some assessments on the work done at the workshop and the prospects for future developments.

  5. Evaluating strategies for sustainable intensification of U.S. agriculture through the Long-Term Agroecosystem Research network

    USDA-ARS?s Scientific Manuscript database

    Sustainable intensification is an emerging model for agriculture designed to reconcile accelerating global demand for agricultural products with long-term environmental stewardship. Defined here as increasing agricultural production while maintaining or improving environmental quality, sustainable i...

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benton, J; Wall, D; Parker, E

    This paper presents the latest information on one of the Accelerated Highly Enriched Uranium (HEU) Disposition initiatives that resulted from the May 2002 Summit meeting between Presidents George W. Bush and Vladimir V. Putin. These initiatives are meant to strengthen nuclear nonproliferation objectives by accelerating the disposition of nuclear weapons-usable materials. The HEU Transparency Implementation Program (TIP), within the National Nuclear Security Administration (NNSA), is working to implement one of the selected initiatives that would purchase excess Russian HEU (93% 235U) for use as fuel in U.S. research reactors over the next ten years. This will parallel efforts to convert the reactors' fuel cores from HEU to low enriched uranium (LEU) material, where feasible. The paper will examine important aspects associated with the U.S. research reactor HEU purchase. In particular: (1) the establishment of specifications for the Russian HEU, and (2) transportation safeguard considerations for moving the HEU from the Mayak Production Facility in Ozersk, Russia, to the Y-12 National Security Complex in Oak Ridge, TN.

  7. Heavy flavor results at RHIC - A comparative overview

    DOE PAGES

    Dong, Xin

    2012-01-01

    I review the latest heavy flavor measurements at the RHIC experiments. Measurements from RHIC, together with preliminary results from the LHC, offer us an opportunity to systematically study the sQGP medium properties. Finally, I give an outlook on the prospects for precision heavy flavor measurements with detector upgrades at RHIC.

  8. Use of a wire scanner for monitoring residual gas ionization in Soreq Applied Research Accelerator Facility 20 keV/u proton/deuteron low energy beam transport beam line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vainas, B.; Eliyahu, I.; Weissman, L.

    2012-02-15

    The ion source end of the Soreq Applied Research Accelerator Facility accelerator consists of a proton/deuteron ECR ion source and a low energy beam transport (LEBT) beam line. An observed reduction of the radio frequency quadrupole transmission with increasing LEBT current prompted additional study of the LEBT beam properties. Numerous measurements have been made with the LEBT beam profiler wire biased by a variable voltage. Current-voltage characteristics in the presence of the proton beam were measured even when the wire was far out of the beam. The current-voltage characteristic in this case strongly resembles an asymmetric diode-like characteristic, which is typical of Langmuir probes monitoring plasma. The measurement of biased wire currents, outside the beam, enables us to estimate the effective charge density in vacuum.

  9. Proposal for an Accelerator R&D User Facility at Fermilab's Advanced Superconducting Test Accelerator (ASTA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Church, M.; Edwards, H.; Harms, E.

    2013-10-01

    Fermilab is the nation’s particle physics laboratory, supported by the DOE Office of High Energy Physics (OHEP). Fermilab is a world leader in accelerators, with a demonstrated track record, spanning four decades, of excellence in accelerator science and technology. We describe the significant opportunity to complete, in a highly leveraged manner, a unique accelerator research facility that supports the broad strategic goals in accelerator science and technology within the OHEP. While the US accelerator-based HEP program is oriented toward the Intensity Frontier, which requires modern superconducting linear accelerators and advanced high-intensity storage rings, there are no accelerator test facilities that support the accelerator science of the Intensity Frontier. Further, nearly all proposed future accelerators for Discovery Science will rely on superconducting radiofrequency (SRF) acceleration, yet there are no dedicated test facilities to study SRF capabilities for beam acceleration and manipulation in prototypic conditions. Finally, there are a wide range of experiments and research programs beyond particle physics that require the unique beam parameters that will only be available at Fermilab’s Advanced Superconducting Test Accelerator (ASTA). To address these needs we submit this proposal for an Accelerator R&D User Facility at ASTA. The ASTA program is based on the capability provided by an SRF linac (which provides electron beams from 50 MeV to nearly 1 GeV) and a small storage ring (with the ability to store either electrons or protons) to enable a broad range of beam-based experiments to study fundamental limitations to beam intensity and to develop transformative approaches to particle-beam generation, acceleration and manipulation which cannot be done elsewhere. It will also establish a unique resource for R&D towards Energy Frontier facilities and a test-bed for SRF accelerators and high-brightness beam applications in support of the OHEP mission of Accelerator Stewardship.

  10. Advancing the Growth of the U.S. Wind Industry: Federal Incentives, Funding, and Partnership Opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The U.S. Department of Energy's (DOE's) Wind Energy Technologies Office (WETO) works to accelerate the development and deployment of wind power. The office provides information for researchers, developers, businesses, manufacturers, communities, and others seeking various types of federal assistance available for advancing wind projects. This fact sheet outlines the primary federal incentives for developing and investing in wind power, resources for funding wind power, and opportunities to partner with DOE and other federal agencies on efforts to move the U.S. wind industry forward.

  11. Progress towards next generation hadron colliders: FCC-hh, HE-LHC, and SPPC

    NASA Astrophysics Data System (ADS)

    Zimmermann, Frank; EuCARD-2 Extreme Beams Collaboration; Future Circular Collider (FCC) Study Collaboration

    2017-01-01

    A higher-energy circular proton collider is generally considered to be the only path available in this century for exploring energy scales well beyond the reach of the Large Hadron Collider (LHC) presently in operation at CERN. In response to the 2013 Update of the European Strategy for Particle Physics and aligned with the 2014 US "P5" recommendations, the international Future Circular Collider (FCC) study, hosted by CERN, is designing such a future frontier hadron collider. This so-called FCC-hh will provide proton-proton collisions at a centre-of-mass energy of 100 TeV, with unprecedented luminosity. The FCC-hh energy goal is reached by combining higher-field, 16 T magnets, based on Nb3Sn superconductor, and a new 100 km tunnel connected to the LHC complex. In addition to the FCC-hh proper, the FCC study is also exploring the possibility of a High-Energy LHC (HE-LHC), with a centre-of-mass energy of 25-27 TeV, as could be achieved in the existing 27 km LHC tunnel using the FCC-hh magnet technology. A separate design effort centred at IHEP Beijing aims at developing and constructing a similar collider in China, with a smaller circumference of about 54 km, called SPPC. Assuming even higher-field 20 T magnets, relying on high-temperature superconductor, the SPPC could reach a c.m. energy of about 70 TeV. This presentation will report the motivation and the present status of the R&D for future hadron colliders, a comparison of the three designs under consideration, the major challenges, R&D topics, the international technology programs, and the emerging global collaboration. Work supported by the European Commission under Capacities 7th Framework Programme project EuCARD-2, Grant Agreement 312453, and the HORIZON 2020 project EuroCirCol, Grant Agreement 654305.

  12. From scientific discovery to health outcomes: A synergistic model of doctoral nursing education.

    PubMed

    Michael, Melanie J; Clochesy, John M

    2016-05-01

    Across the globe, health system leaders and stakeholders are calling for system-level reforms in education, research, and practice to accelerate the uptake and application of new knowledge in practice and to improve health care delivery and health outcomes. An evolving bi-dimensional research-practice focused model of doctoral nursing education in the U.S. is creating unprecedented opportunities for collaborative translational and investigative efforts for nurse researchers and practitioners. The nursing academy must commit to a shared goal of preparing future generations of nurse scientists and practitioners with the capacity and motivation to work together to accelerate the translation of evidence into practice in order to place nursing at the forefront of health system improvement efforts and advance the profession. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Design and Vertical Tests of SPS-series Double-Quarter Wave (DQW) Cavity Prototypes for the HL-LHC Crab Cavity System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verdú-Andrés, S.; et al.

    Crab crossing is essential for high-luminosity colliders. The High Luminosity Large Hadron Collider (HL-LHC) will equip one of its Interaction Points (IP1) with Double-Quarter Wave (DQW) crab cavities. A DQW cavity is a new generation of deflecting RF cavities that stands out for its compactness and broad frequency separation between the fundamental and first high-order modes. The deflecting kick is provided by its fundamental mode. Each HL-LHC DQW cavity shall provide a nominal deflecting voltage of 3.4 MV, although up to 5.0 MV may be required. A Proof-of-Principle (PoP) DQW cavity was limited by quench at 4.6 MV. This paper describes a new, highly optimized cavity, designated DQW SPS-series, which satisfies dimensional, cryogenic, manufacturing and impedance requirements for beam tests at SPS and operation in LHC. Two prototypes of this DQW SPS-series were fabricated by US industry and cold tested after following conventional SRF surface treatment. Both units outperformed the PoP cavity, reaching a deflecting voltage of 5.3-5.9 MV. This voltage - the highest reached by a DQW cavity - is well beyond the nominal voltage of 3.4 MV, and the cavity may even operate at the ultimate voltage of 5.0 MV with sufficient margin. This paper covers fabrication, surface preparation and cryogenic RF test results and implications.

  14. Scalar production in association with a Z boson at the LHC and ILC: The mixed Higgs-radion case of warped models

    NASA Astrophysics Data System (ADS)

    Angelescu, Andrei; Moreau, Grégory; Richard, François

    2017-07-01

    The radion scalar field might be the lightest new particle predicted by extradimensional extensions of the standard model. It could thus lead to the first signatures of new physics at the LHC collider. We perform a complete study of the radion production in association with the Z gauge boson in the custodially protected warped model with a brane-localized Higgs boson addressing the gauge hierarchy problem. Radion-Higgs mixing effects are present. Such a radion production receives possibly resonant contributions from the Kaluza-Klein excitations of the Z boson as well as the extra neutral gauge boson (Z'). All the exchange and mixing effects induced by those heavy bosons are taken into account in the radion coupling and rate calculations. The investigation of the considered radion production at the LHC allows us to be sensitive to some parts of the parameter space but only the ILC program at high luminosity would cover most of the theoretically allowed parameter space via the studied reaction. Complementary tests of the same theoretical parameters can be realized through the high accuracy measurements of the Higgs couplings at the ILC. The generic sensitivity limits on the rates discussed for the LHC and ILC potential reach can be applied to the searches for other (light) exotic scalar bosons.

  15. Direct and indirect signals of natural composite Higgs models

    NASA Astrophysics Data System (ADS)

    Niehoff, Christoph; Stangl, Peter; Straub, David M.

    2016-01-01

    We present a comprehensive numerical analysis of a four-dimensional model with the Higgs as a composite pseudo-Nambu-Goldstone boson that features a calculable Higgs potential and protective custodial and flavour symmetries to reduce electroweak fine-tuning. We employ a novel numerical technique that allows us for the first time to study constraints from radiative electroweak symmetry breaking, Higgs physics, electroweak precision tests, flavour physics, and direct LHC bounds on fermion and vector boson resonances in a single framework. We consider four different flavour symmetries in the composite sector, one of which we show to no longer be viable in view of strong precision constraints. In the other cases, all constraints can be passed with sub-percent electroweak fine-tuning. The models can explain the excesses recently observed in WW, WZ, Wh and ℓ⁺ℓ⁻ resonance searches by ATLAS and CMS and the anomalies in angular observables and branching ratios of rare semi-leptonic B decays observed by LHCb. Solving the B physics anomalies predicts the presence of a dijet or tt̄ resonance around 1 TeV, just below the sensitivity of LHC run 1. We discuss the prospects to probe the models at run 2 of the LHC. As a side product, we identify several gaps in the searches for vector-like quarks at hadron colliders that could be closed by reanalyzing existing LHC data.

  16. Translating Computational Toxicology Data Through Stakeholder Outreach & Engagement (SOT)

    EPA Science Inventory

    US EPA has been using in vitro testing methods in an effort to accelerate the pace of chemical evaluations and address the significant lack of health and environmental data on the thousands of chemicals found in commonly used products. Since 2005, EPA’s researchers have generated...

  17. Committee approves bill to boost NIH funding.

    PubMed

    2015-08-01

    A U.S. House of Representatives committee approved the 21st Century Cures Act. If passed by Congress, the bill would boost funding for the NIH and FDA and introduce new strategies for accelerating the approval of drugs and devices. ©2015 American Association for Cancer Research.

  18. Lessons and prospects from the pMSSM after LHC Run I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cahill-Rowley, M.; Hewett, J. L.; Ismail, A.

    2015-03-01

    We study SUSY signatures at the 7, 8 and 14 TeV LHC employing the 19-parameter, R-parity conserving p(henomenological)MSSM, in the scenario with a neutralino lightest supersymmetric particle (LSP). Our results were obtained via a fast Monte Carlo simulation of the ATLAS SUSY analysis suite. The flexibility of this framework allows us to study a wide variety of SUSY phenomena simultaneously and to probe for weak spots in existing SUSY search analyses. We determine the ranges of the sparticle masses that are either disfavored or allowed after the searches with the 7 and 8 TeV data sets are combined. We find that natural SUSY models with light squarks and gluinos remain viable. We extrapolate to 14 TeV with both 300 fb⁻¹ and 3 ab⁻¹ of integrated luminosity and determine the expected sensitivity of the jets + MET and stop searches to the pMSSM parameter space. We find that the high-luminosity LHC will be powerful in probing SUSY with neutralino LSPs and can provide a more definitive statement on the existence of natural supersymmetry.

  19. Gravitino LSP and leptogenesis after the first LHC results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heisig, Jan, E-mail: heisig@physik.rwth-aachen.de

    2014-04-01

    Supersymmetric scenarios where the lightest superparticle (LSP) is the gravitino are an attractive alternative to the widely studied case of a neutralino LSP. A strong motivation for a gravitino LSP arises from the possibility of achieving higher reheating temperatures, potentially allowing for thermal leptogenesis. The predictions for the primordial abundances of light elements in the presence of a late-decaying next-to-LSP (NLSP), as well as the currently measured dark matter abundance, allow us to probe the cosmological viability of such a scenario. Here we consider a gravitino-stau scenario. Utilizing a pMSSM scan, we work out the implications of the 7 and 8 TeV LHC results as well as other experimental and theoretical constraints on the highest reheating temperatures that are cosmologically allowed. Our analysis shows that points with T_R ≳ 10⁹ GeV survive only in a very particular corner of the SUSY parameter space. Those spectra feature a distinct signature at colliders that could be looked for in the upcoming LHC run.

  20. Simultaneous production of lepton pairs in ultraperipheral relativistic heavy ion collisions

    NASA Astrophysics Data System (ADS)

    Kurban, E.; Güçlü, M. C.

    2017-10-01

    We calculate the total cross sections and probabilities for the simultaneous electromagnetic production of electron, muon, and tauon pairs. At the CERN Large Hadron Collider (LHC), the available electromagnetic energy is sufficient to produce all kinds of leptons coherently. The masses of muons and tauons are large, so their Compton wavelengths are small enough to interact with the colliding nuclei. Therefore, realistic nuclear form factors are included in the calculations of electromagnetic pair production. The cross section calculations show that, at LHC energies, the probabilities of simultaneous production of all kinds of leptons are increased significantly compared to energies available at the BNL Relativistic Heavy Ion Collider (RHIC). Experimentally, observing this simultaneous production can give us important information about strong QED.

  1. A Model of Human Orientation and Self Motion Perception during Body Acceleration: The Orientation Modeling System

    DTIC Science & Technology

    2016-09-28

    previous research and modeling results. The OMS and Perception Toolbox were used to perform a case study of an F18 mishap. Model results imply that...

  2. Improving Design Efficiency for Large-Scale Heterogeneous Circuits

    NASA Astrophysics Data System (ADS)

    Gregerson, Anthony

    Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms are efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment. 
Together, these research efforts help to improve the efficiency and decrease the cost of developing the large-scale, heterogeneous circuits needed to enable large-scale applications in high-energy physics and other important areas.
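The abstract above builds on the classical balanced min-cut graph partitioning problem: assign circuit nodes to devices so that few edges (inter-chip wires) cross the boundary while the sides stay balanced. As an illustrative sketch only, and not the Multi-Personality or Information-Aware algorithms of the thesis, a minimal greedy two-way refinement over a hypothetical netlist graph looks like this:

```python
def cut_size(edges, side):
    """Number of edges whose endpoints fall on opposite sides of the partition."""
    return sum(1 for u, v in edges if side[u] != side[v])

def greedy_partition(nodes, edges, max_imbalance=2):
    """Greedy refinement of a two-way partition: repeatedly move one node
    across the cut whenever that lowers the cut size and keeps the sides
    balanced. A single-node move changes the balance by 2, so
    max_imbalance must be at least 2 for any move to be possible."""
    half = len(nodes) // 2
    side = {n: i < half for i, n in enumerate(nodes)}  # arbitrary even split
    improved = True
    while improved:
        improved = False
        for n in nodes:
            before = cut_size(edges, side)
            side[n] = not side[n]  # tentative move
            a = sum(side.values())
            balanced = abs(2 * a - len(nodes)) <= max_imbalance
            if balanced and cut_size(edges, side) < before:
                improved = True        # keep the move
            else:
                side[n] = not side[n]  # revert
    return side

# Two triangles joined by one bridge edge: the optimal cut separates them.
edges = [("a", "b"), ("b", "c"), ("a", "c"),
         ("d", "e"), ("e", "f"), ("d", "f"), ("c", "d")]
side = greedy_partition(["a", "d", "b", "e", "c", "f"], edges)
print(cut_size(edges, side))  # 1
```

Production tools replace this local search with multilevel heuristics (coarsen, partition, refine); the thesis extends the objective itself with per-device "personalities" and data-entropy awareness, which this sketch does not model.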

  3. Plant biology research and training for the 21st century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, K.

    1992-12-31

    The committee was assembled in response to a request from the National Science Foundation (NSF), the US Department of Agriculture (USDA), and the US Department of Energy (DoE). The leadership of these agencies asked the National Academy of Sciences through the National Research Council (NRC) to assess the status of plant-science research in the United States in light of the opportunities arising from advances in other areas of biology. NRC was asked to suggest ways of accelerating the application of these new biological concepts and tools to research in plant science with the aim of enhancing the acquisition of new knowledge about plants. The charge to the committee was to examine the following: organizations, departments, and institutions conducting plant biology research; human resources involved in plant biology research; graduate training programs in plant biology; federal, state, and private sources of support for plant-biology research; the role of industry in conducting and supporting plant-biology research; the international status of US plant-biology research; and the relationship of plant biology to leading-edge research in biology.

  5. Federated software defined network operations for LHC experiments

    NASA Astrophysics Data System (ADS)

    Kim, Dongkyun; Byeon, Okhwan; Cho, Kihyeon

    2013-09-01

    The most well-known high-energy physics collaboration, the Large Hadron Collider (LHC), which is based on e-Science, has been facing several challenges presented by its extraordinary instruments in terms of the generation, distribution, and analysis of large amounts of scientific data. Currently, data distribution issues are being addressed by adopting an advanced Internet technology called software defined networking (SDN). Stable SDN operation and management are required to keep the federated LHC data distribution networks reliable. Therefore, in this paper, an SDN operation architecture based on the distributed virtual network operations center (DvNOC) is proposed to enable LHC researchers to assume full control of their own global end-to-end data dissemination. This may enhance data delivery performance through data traffic offloading based on delay variation. The evaluation results indicate that the overall end-to-end data delivery performance can be improved over multi-domain SDN environments based on the proposed federated SDN/DvNOC operation framework.

  6. Al-Qaeda arrest casts shadow over the LHC

    NASA Astrophysics Data System (ADS)

    Dacey, James

    2009-11-01

    CERN remains on course for the imminent switch-on of the Large Hadron Collider (LHC) despite the media frenzy following the recent arrest of a physicist who had been working at the facility. The researcher in question is a 32-year-old man of Algerian descent who is expected to face trial in France, the country in which he was arrested. He has yet to be formally named under French judicial rules.

  7. Optical fibres in the radiation environment of CERN

    NASA Astrophysics Data System (ADS)

    Guillermain, E.

    2017-11-01

    CERN, the European Organization for Nuclear Research (in Geneva, Switzerland), is home to a complex scientific instrument: the 27-kilometre Large Hadron Collider (LHC) collides beams of high-energy particles at close to the speed of light. Optical fibres are widely used at CERN, both in surface areas (e.g. for inter-building IT networks) and in the accelerator complex underground (e.g. for cryogenics, vacuum, safety systems). Optical fibres in the accelerator are exposed to mixed radiation fields (mainly composed of protons, pions, neutrons and other hadrons, gamma rays and electrons), with dose rates depending on the particular installation zone, and with radiation levels often significantly higher than those encountered in space. In the LHC and its injector chain, radiation levels range from relatively low annual doses of a few Gy up to hundreds of kGy. Optical fibres suffer from Radiation Induced Attenuation (RIA, expressed in dB per unit length) that affects light transmission and depends on the irradiation conditions (e.g. dose rate, total dose, temperature). In the CERN accelerator complex, the failure of an optical link can affect the proper functionality of control or monitoring systems and induce the interruption of the accelerator operation. The qualification of optical fibres for installation in critical radiation areas is therefore crucial. Thus, all optical fibre types installed in radiation areas at CERN are subject to laboratory irradiation tests, in order to evaluate their RIA at different total doses and dose rates. This allows the selection of the appropriate optical fibre type (conventional or radiation resistant) compliant with the requirements of each installation. Irradiation tests are performed in collaboration with Fraunhofer INT (irradiation facilities and expert team in Euskirchen, Germany). Conventional off-the-shelf optical fibres can be installed for optical links exposed to low radiation levels (i.e.
annual dose typically below a few kGy). Nevertheless, conventional optical fibres must be carefully qualified, as a spread in RIA of a factor of 10 is observed among optical fibres of different types and dopants. In higher radiation areas, special radiation-resistant optical fibres are installed. For total doses above 1 kGy, the RIA of these special optical fibres is at least 10 times lower than that of conventional optical fibres under the same irradiation conditions. 2400 km of these special radiation-resistant optical fibres were recently procured at CERN. As part of this procurement process, a quality assurance plan including the irradiation testing of all 65 produced batches was set up. This presentation will review the selection process for the appropriate optical fibre types to be installed in the radiation environment of CERN. The methodology for choosing the irradiation parameters for the laboratory tests will be discussed together with an overview of the RIA of different optical fibre types under several irradiation conditions.
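The role RIA plays in fibre selection can be sketched as a simple power-budget check: total link loss is the intrinsic fibre attenuation plus the RIA accumulated under irradiation, both scaling with length. All numbers below (loss budget, link length, RIA values) are hypothetical illustrations, not CERN qualification data; the abstract only gives orders of magnitude.

```python
def link_loss_db(length_km, intrinsic_db_per_km, ria_db_per_km):
    """Total attenuation: intrinsic fibre loss plus radiation-induced
    attenuation (RIA), both proportional to the exposed length."""
    return length_km * (intrinsic_db_per_km + ria_db_per_km)

def link_within_budget(budget_db, length_km, intrinsic_db_per_km, ria_db_per_km):
    """True if the link still closes its optical power budget after irradiation."""
    return link_loss_db(length_km, intrinsic_db_per_km, ria_db_per_km) <= budget_db

# A 0.5 km link with a 6 dB margin: a conventional fibre whose RIA has grown
# to 20 dB/km after the accumulated dose fails, while a radiation-resistant
# fibre with ~10x lower RIA (as the abstract reports above 1 kGy) still passes.
print(link_within_budget(6.0, 0.5, 0.4, 20.0))  # False
print(link_within_budget(6.0, 0.5, 0.4, 2.0))   # True
```

In practice RIA also depends on dose rate, temperature, and dopants, which is why the qualification campaign described above measures each fibre type under several irradiation conditions rather than using a single number.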

  8. A Prediction of Response of the Head and Neck of the U.S. Adult Military Population to Dynamic Impact Acceleration from Selected Dynamic Test Subjects.

    DTIC Science & Technology

    1976-05-01

    to Review Grants for Clinical Research and Investigation Involving Human Beings, Medical School, The University of Michigan. ...of biomechanical models...human volunteers in dynamic sled tests found no clinically observable effects due to acceleration on a subject in which the peak mouth angular...minutes of rest between trials, and the average force of each set computed. Figure 2.7 shows typical force curves and the EMG signal resulting from

  9. A Variable Energy CW Compact Accelerator for Ion Cancer Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnstone, Carol J.; Taylor, J.; Edgecock, R.

    2016-03-10

    Cancer is the second-largest cause of death in the U.S., and approximately two-thirds of all cancer patients will receive radiation therapy, with the majority of the radiation treatments performed using x-rays produced by electron linacs. Charged particle beam radiation therapy, with both protons and light ions, however, offers advantageous physical-dose distributions over conventional photon radiotherapy and, for particles heavier than protons, a significant biological advantage. Despite recognition of these potential advantages, there is almost no research activity in this field in the U.S. due to the lack of clinical accelerator facilities offering light ion therapy in the States. In January 2013, a joint DOE/NCI workshop was convened to address the challenges of light ion therapy [1], inviting more than 60 experts from diverse fields related to radiation therapy. This paper reports on the conclusions of the workshop, then translates the clinical requirements into accelerator and beam-delivery technical specifications. A comparison of available or feasible accelerator technologies is presented, including a new concept for a compact, CW, variable-energy light ion accelerator currently under development. This new light ion accelerator is based on advances in nonscaling Fixed-Field Alternating gradient (FFAG) accelerator design. The new design concepts combine isochronous orbits with long (up to 4 m) straight sections in a compact racetrack format, allowing inner circulating orbits to be energy-selected for low-loss CW extraction, effectively eliminating the high-loss energy degrader in conventional CW cyclotron designs.

  10. Accelerating the commercialization of university technologies for military healthcare applications: the role of the proof of concept process

    NASA Astrophysics Data System (ADS)

    Ochoa, Rosibel; DeLong, Hal; Kenyon, Jessica; Wilson, Eli

    2011-06-01

    The von Liebig Center for Entrepreneurism and Technology Advancement at UC San Diego (vonliebig.ucsd.edu) is focused on accelerating technology transfer and commercialization through programs and education on entrepreneurism. Technology Acceleration Projects (TAPs), which offer pre-venture grants and extensive mentoring on technology commercialization, are a key component of its model, developed over the past ten years with the support of a grant from the von Liebig Foundation. In 2010, the von Liebig Entrepreneurism Center partnered with the U.S. Army Telemedicine and Advanced Technology Research Center (TATRC) to develop a regional Technology Acceleration Program model, initially focused on military research, to be deployed across the nation to increase awareness of military medical needs and to accelerate the commercialization of novel technologies to treat the patient. Participants in these challenges are multi-disciplinary teams of graduate students and faculty in engineering, medicine and business, representing universities and research institutes in a region and selected via a competitive process, who receive commercialization assistance and funding grants to support translation of their research discoveries into products or services. To validate this model, a pilot program focused on commercialization of wireless healthcare technologies targeting campuses in Southern California has been conducted with the additional support of Qualcomm, Inc. Three projects representing three different universities in Southern California were selected out of forty-five applications from ten different universities and research institutes. Over the next twelve months, these teams will conduct proof-of-concept studies, technology development and preliminary market research to determine the commercial feasibility of their technologies. This first regional program will help build the tools and processes needed to adapt and replicate this model across other regions in the country.

  11. Does one need a 4.5 K screen in cryostats of superconducting accelerator devices operating in superfluid helium? Lessons from the LHC

    NASA Astrophysics Data System (ADS)

    Lebrun, Philippe; Parma, Vittorio; Tavian, Laurent

    2014-01-01

    Superfluid helium is increasingly used as a coolant for superconducting devices in particle accelerators: the lower temperature enhances the performance of superconductors in high-field magnets and reduces BCS losses in RF accelerating cavities, while the excellent transport properties of superfluid helium can be put to work in efficient distributed cooling systems. The thermodynamic penalty of operating at lower temperature, however, requires careful management of the heat loads, achieved inter alia through proper design and construction of the cryostats. A recurrent question is that of the need for, and practical feasibility of, an additional screen cooled by normal helium at around 4.5 K surrounding the cold mass at about 2 K, in cryostats already equipped with a standard 80 K screen. We introduce the issue in terms of first principles applied to the configuration of the cryostats, discuss technical constraints and economic limitations, and illustrate the argument with examples taken from large projects confronted with this issue, i.e. CEBAF, SPL, ESS, LHC, TESLA, European X-FEL, ILC.
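    The first-principles trade-off discussed above can be illustrated with a rough numerical sketch: the radiative load a warm screen imposes on a 2 K cold mass, and the ideal (Carnot) room-temperature power needed to remove one watt of heat at 2 K versus 4.5 K. The emissivity and temperatures below are illustrative assumptions, not values from the paper.

```python
# Radiative heat flux from a warm screen to a cold surface (gray-body,
# parallel-plate idealization) and the Carnot penalty of extracting heat
# at low temperature. Emissivity and temperatures are illustrative.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant [W m^-2 K^-4]

def radiative_flux(t_hot, t_cold, emissivity=0.1):
    """Net radiative flux [W/m^2] between surfaces at t_hot and t_cold."""
    return emissivity * SIGMA * (t_hot ** 4 - t_cold ** 4)

def carnot_input_power(q, t_cold, t_room=300.0):
    """Ideal room-temperature power [W] needed to extract q watts at t_cold."""
    return q * (t_room - t_cold) / t_cold

q_80_to_2 = radiative_flux(80.0, 2.0)   # load with only an 80 K screen
q_45_to_2 = radiative_flux(4.5, 2.0)    # residual load behind a 4.5 K screen

print(f"80 K -> 2 K flux:  {q_80_to_2 * 1e3:.1f} mW/m^2")
print(f"4.5 K -> 2 K flux: {q_45_to_2 * 1e6:.3f} uW/m^2")
print(f"Carnot W per W at 2 K:   {carnot_input_power(1.0, 2.0):.0f}")
print(f"Carnot W per W at 4.5 K: {carnot_input_power(1.0, 4.5):.0f}")
```

    The sketch shows why intercepting heat at an intermediate temperature can pay off: a watt removed at 4.5 K costs less than half the ideal input power of a watt removed at 2 K.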

  12. Numerate Intends to Join ATOM Consortium to Rapidly Accelerate Preclinical Drug Development | Frederick National Laboratory for Cancer Research

    Cancer.gov

    SAN FRANCISCO – Computational drug design company Numerate has signed a letter of intent to join an open consortium of scientists staffed from two U.S. national laboratories, industry, and academia working to transform drug discovery and development.

  13. The MoEDAL Experiment at the Lhc — a New Light on the High Energy Frontier

    NASA Astrophysics Data System (ADS)

    Pinfold, James L.

    2014-01-01

    In 2010, the CERN (European Centre for Particle Physics Research) Research Board unanimously approved MoEDAL, the seventh international experiment at the Large Hadron Collider (LHC), which is designed to search for avatars of new physics signified by highly ionizing particles. A MoEDAL discovery would have revolutionary implications for our understanding of the microcosm, providing insights into such fundamental questions as: do magnetic monopoles exist; are there extra dimensions or new symmetries of nature; what is the mechanism for the generation of mass; what is the nature of dark matter; and how did the big bang unfurl at the earliest times?

  14. The MoEDAL Experiment at the Lhc -- a New Light on the High Energy Frontier

    NASA Astrophysics Data System (ADS)

    Pinfold, James L.

    2014-04-01

    In 2010, the CERN (European Centre for Particle Physics Research) Research Board unanimously approved MoEDAL, the seventh international experiment at the Large Hadron Collider (LHC), which is designed to search for avatars of new physics signified by highly ionizing particles. A MoEDAL discovery would have revolutionary implications for our understanding of the microcosm, providing insights into such fundamental questions as: do magnetic monopoles exist; are there extra dimensions or new symmetries of nature; what is the mechanism for the generation of mass; what is the nature of dark matter; and how did the big bang unfurl at the earliest times?

  15. Beyond the Large Hadron Collider: A First Look at Cryogenics for CERN Future Circular Colliders

    NASA Astrophysics Data System (ADS)

    Lebrun, Philippe; Tavian, Laurent

    Following the first experimental discoveries at the Large Hadron Collider (LHC) and the recent update of the European strategy in particle physics, CERN has undertaken an international study of possible future circular colliders beyond the LHC. The study, conducted with the collaborative participation of interested institutes world-wide, considers several options for very high energy hadron-hadron, electron-positron and hadron-electron colliders to be installed in a quasi-circular underground tunnel in the Geneva basin, with a circumference of 80 km to 100 km. All these machines would make intensive use of advanced superconducting devices, i.e. high-field bending and focusing magnets and/or accelerating RF cavities, thus requiring large helium cryogenic systems operating at 4.5 K or below. Based on preliminary sets of parameters and layouts for the particle colliders under study, we discuss the main challenges of their cryogenic systems and present first estimates of the cryogenic refrigeration capacities required, with emphasis on the qualitative and quantitative steps to be accomplished with respect to the present state-of-the-art.

  16. Accelerating Science with Generative Adversarial Networks: An Application to 3D Particle Showers in Multilayer Calorimeters

    NASA Astrophysics Data System (ADS)

    Paganini, Michela; de Oliveira, Luke; Nachman, Benjamin

    2018-01-01

    Physicists at the Large Hadron Collider (LHC) rely on detailed simulations of particle collisions to build expectations of what experimental data may look like under different theoretical modeling assumptions. Petabytes of simulated data are needed to develop analysis techniques, though they are expensive to generate using existing algorithms and computing resources. The modeling of detectors and the precise description of particle cascades as they interact with the material in the calorimeter are the most computationally demanding steps in the simulation pipeline. We therefore introduce a deep neural network-based generative model to enable high-fidelity, fast electromagnetic calorimeter simulation. There are still challenges for achieving precision across the entire phase space, but our current solution can reproduce a variety of particle shower properties while achieving speedup factors of up to 100,000×. This opens the door to a new era of fast simulation that could save significant computing time and disk space, while extending the reach of physics searches and precision measurements at the LHC and beyond.

  17. Challenging data and workload management in CMS Computing with network-aware systems

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Wildish, T.

    2014-06-01

    After a successful first run at the LHC, and during the Long Shutdown (LS1) of the accelerator, the workload and data management sectors of the CMS Computing Model are entering an operational review phase in order to concretely assess areas of possible improvement and paths to exploit promising new technology trends. In particular, since the preparation activities for LHC start-up, the networks have constantly been of paramount importance for the execution of CMS workflows, exceeding the original expectations - as from the MONARC model - in terms of performance, stability and reliability. The low-latency transfer of petabytes of CMS data among dozens of WLCG Tiers worldwide using the PhEDEx dataset replication system is one example of the importance of reliable networks. Another is the exploitation of WAN data access over data federations in CMS. A new emerging area of work is the exploitation of intelligent network services, including bandwidth-on-demand concepts. In this paper, we review the work done in CMS on this front and the next steps.

  18. Accidental Beam Losses and Protection in the LHC

    NASA Astrophysics Data System (ADS)

    Schmidt, R.; Working Group On Machine Protection

    2005-06-01

    At top energy (proton momentum 7 TeV/c) with nominal beam parameters, each of the two LHC proton beams has a stored energy of 350 MJ threatening to damage accelerator equipment in case of accidental beam loss. It is essential that the beams are properly extracted onto the dump blocks in case of failure since these are the only elements that can withstand full beam impact. Although the energy stored in the beams at injection (450 GeV/c) is about 15 times smaller compared to top energy, the beams must still be properly extracted in case of large accidental beam losses. Failures must be detected at a sufficiently early stage and initiate a beam dump. Quenches and power converter failures will be detected by monitoring the correct functioning of the hardware systems. In addition, safe operation throughout the cycle requires the use of beam loss monitors, collimators and absorbers. Ideas of detection of fast beam current decay, monitoring of fast beam position changes and monitoring of fast magnet current changes are discussed, to provide the required redundancy for machine protection.
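    The 350 MJ figure quoted above can be checked with simple arithmetic from nominal LHC beam parameters (2808 bunches of 1.15 × 10^11 protons each, which are assumptions not stated in this abstract), as is the "about 15 times smaller" ratio at injection:

```python
# Back-of-envelope check of the stored beam energy quoted in the abstract.
E_PROTON_TEV = 7.0            # proton energy at collision [TeV]
TEV_TO_JOULE = 1.602e-7       # 1 TeV = 1.602e-19 J/eV * 1e12 eV
N_BUNCHES = 2808              # nominal number of bunches (assumption)
PROTONS_PER_BUNCH = 1.15e11   # nominal bunch intensity (assumption)

stored_energy = N_BUNCHES * PROTONS_PER_BUNCH * E_PROTON_TEV * TEV_TO_JOULE
print(f"Stored energy per beam: {stored_energy / 1e6:.0f} MJ")  # ~360 MJ

# The "about 15 times smaller" figure at injection follows from the ratio
# of beam energies (450 GeV/c injection vs 7 TeV/c top energy):
print(f"7 TeV / 450 GeV = {7000.0 / 450.0:.1f}")
```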

  19. Polycrystalline CdTe detectors: A luminosity monitor for the LHC

    NASA Astrophysics Data System (ADS)

    Gschwendtner, E.; Placidi, M.; Schmickler, H.

    2003-09-01

    The luminosity at the four interaction points of the Large Hadron Collider must be continuously monitored in order to provide an adequate tool for the control and optimization of the collision parameters and the beam optics. At both sides of the interaction points, absorbers are installed to protect the superconducting accelerator elements from quenches caused by the deposited energy of collision products. The luminosity detectors will be installed in the copper core of these absorbers to measure the electromagnetic and hadronic showers caused by neutral particles produced in the proton-proton collisions at the interaction points. The detectors have to withstand extreme radiation levels (10^8 Gy/yr at the design luminosity) and their long-term operation has to be assured without requiring human intervention. In addition, the demand for bunch-by-bunch luminosity measurements, i.e. 40 MHz detection speed, puts severe constraints on the detectors. Polycrystalline CdTe detectors have a high potential to fulfill these requirements and are considered as LHC luminosity monitors. In this paper the interaction region is shown and the characteristics of the CdTe detectors are presented.

  20. Final Report: High Energy Physics at the Energy Frontier at Louisiana Tech

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sawyer, Lee; Wobisch, Markus; Greenwood, Zeno D.

    The Louisiana Tech University High Energy Physics group has developed a research program aimed at experimentally testing the Standard Model of particle physics and searching for new phenomena through a focused set of analyses in collaboration with the ATLAS experiment at the Large Hadron Collider (LHC) at the CERN laboratory in Geneva. This research program includes involvement in the current operation and maintenance of the ATLAS experiment and full involvement in Phase 1 and Phase 2 upgrades in preparation for future high luminosity (HL-LHC) operation of the LHC. Our focus is solely on the ATLAS experiment at the LHC, with some related detector development and software efforts. We have established important service roles on ATLAS in five major areas: Triggers, especially jet triggers; Data Quality monitoring; grid computing; GPU applications for upgrades; and radiation testing for upgrades. Our physics research is focused on multijet measurements and top quark physics in final states containing tau leptons, which we propose to extend into related searches for new phenomena. Focusing on closely related topics in the jet and top analyses and coordinating these analyses in our group has led to high efficiency and increased visibility inside the ATLAS collaboration and beyond. Based on our work in the DØ experiment in Run II of the Fermilab Tevatron Collider, Louisiana Tech has developed a reputation as one of the leading institutions pursuing jet physics studies. Currently we are applying this expertise to the ATLAS experiment, with several multijet analyses in progress.

  1. Design study for a staged Very Large Hadron Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peter J. Limon et al.

    Advancing accelerator designs and technology to achieve the highest energies has enabled remarkable discoveries in particle physics. This report presents the results of a design study for a new collider at Fermilab that will create exceptional opportunities for particle physics--a two-stage very large hadron collider. In its first stage, the machine provides a facility for energy-frontier particle physics research, at an affordable cost and on a reasonable time scale. In a second-stage upgrade in the same tunnel, the VLHC offers the possibility of reaching 100 times the collision energy of the Tevatron. The existing Fermilab accelerator complex serves as the injector, and the collision halls are on the Fermilab site. The Stage-1 VLHC reaches a collision energy of 40 TeV and a luminosity comparable to that of the LHC, using robust superferric magnets of elegant simplicity housed in a large-circumference tunnel. The Stage-2 VLHC, constructed after the scientific potential of the first stage has been fully realized, reaches a collision energy of at least 175 TeV with the installation of high-field magnets in the same tunnel. It makes optimal use of the infrastructure developed for the Stage-1 machine, using the Stage-1 accelerator itself as the injector. The goals of this study, commissioned by the Fermilab Director in November 2000, are: to create reasonable designs for the Stage-1 and Stage-2 VLHC in the same tunnel; to discover the technical challenges and potential impediments to building such a facility at Fermilab; to determine the approximate costs of the major elements of the Stage-1 VLHC; and to identify areas requiring significant R and D to establish the basis for the design.

  2. U.S. Climate Change Science Program. Vision for the Program and Highlights of the Scientific Strategic Plan

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The vision document provides an overview of the Climate Change Science Program (CCSP) long-term strategic plan to enhance scientific understanding of global climate change. This document is a companion to the comprehensive Strategic Plan for the Climate Change Science Program. The report responds to the President's direction that climate change research activities be accelerated to provide the best possible scientific information to support public discussion and decision making on climate-related issues. The plan also responds to Section 104 of the Global Change Research Act of 1990, which mandates the development and periodic updating of a long-term national global change research plan coordinated through the National Science and Technology Council. This is the first comprehensive update of a strategic plan for U.S. global change and climate change research since the original plan for the U.S. Global Change Research Program was adopted at the inception of the program in 1989.

  3. The F-18 systems research aircraft facility

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.

    1992-01-01

    To help ensure that new aerospace initiatives rapidly transition to competitive U.S. technologies, NASA Dryden Flight Research Facility has dedicated a systems research aircraft facility. The primary goal is to accelerate the transition of new aerospace technologies to commercial, military, and space vehicles. Key technologies include more-electric aircraft concepts, fly-by-light systems, flush airdata systems, and advanced computer architectures. Future aircraft that will benefit are the high-speed civil transport and the National AeroSpace Plane. This paper describes the systems research aircraft flight research vehicle and outlines near-term programs.

  4. Adventures in the enormous: a 1.8 million clone BAC library for the 21.7 Gb genome of loblolly pine

    Treesearch

    Zenaida V. Magbanua; Seval Ozkan; Benjamin D. Bartlett; Philippe Chouvarine; Christopher A. Saski; Aaron Liston; Richard C. Cronn; C. Dana Nelson; Daniel G. Peterson

    2011-01-01

    Loblolly pine (LP; Pinus taeda L.) is the most economically important tree in the U.S. and a cornerstone species in southeastern forests. However, genomics research on LP and other conifers has lagged behind studies on flowering plants due, in part, to the large size of conifer genomes. As a means to accelerate conifer genome research, we...

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Dominic F.; Burwell, Malcolm; Stillman, H.

    This report documents the findings of an Ultraconductive Copper Strategy Meeting held on March 11, 2015 in Washington DC. The aim of this meeting was to bring together researchers of ultraconductive copper in the U.S. to identify and prioritize critical non-proprietary research activities that will enhance understanding of the material and accelerate its development into practical conductors. Every effort has been made to ensure that the discussion and findings are accurately reported in this document.

  6. The “Common Solutions” Strategy of the Experiment Support group at CERN for the LHC Experiments

    NASA Astrophysics Data System (ADS)

    Girone, M.; Andreeva, J.; Barreiro Megino, F. H.; Campana, S.; Cinquilli, M.; Di Girolamo, A.; Dimou, M.; Giordano, D.; Karavakis, E.; Kenyon, M. J.; Kokozkiewicz, L.; Lanciotti, E.; Litmaath, M.; Magini, N.; Negri, G.; Roiser, S.; Saiz, P.; Saiz Santos, M. D.; Schovancova, J.; Sciabà, A.; Spiga, D.; Trentadue, R.; Tuckett, D.; Valassi, A.; Van der Ster, D. C.; Shiers, J. D.

    2012-12-01

    After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments’ computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management monitoring, File Transfer monitoring) and the Site Status Board. This talk focuses primarily on the strategic aspects of providing such common solutions and how this relates to the overall goals of long-term sustainability and the relationship to the various WLCG Technical Evolution Groups. The success of the service components has given us confidence in the process, and has developed the trust of the stakeholders. We are now attempting to expand the development of common solutions into the more critical workflows. The first is a feasibility study of common analysis workflow execution elements between ATLAS and CMS. We look forward to additional common development in the future.

  7. Physics History Books in the Fermilab Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sara Tompson

    Fermilab is a basic research high-energy physics laboratory operated by Universities Research Association, Inc. under contract to the U.S. Department of Energy. Fermilab researchers utilize the Tevatron particle accelerator (currently the world's most powerful accelerator) to better understand subatomic particles as they exist now and as they existed near the birth of the universe. A collection review of the Fermilab Library monographs was conducted during the summers of 1998 and 1999. While some items were identified for deselection, the review proved most fruitful in highlighting some of the strengths of the Fermilab monograph collection. One of these strengths is history of physics, including biographies and astrophysics. A bibliography of the physics history books in the collection as of Summer, 1999 follows, arranged by author. Note that the call numbers are Library of Congress classification.

  8. Physics History Books in the Fermilab Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sara Tompson

    Fermilab is a basic research high-energy physics laboratory operated by Universities Research Association, Inc. under contract to the U.S. Department of Energy. Fermilab researchers utilize the Tevatron particle accelerator (currently the world's most powerful accelerator) to better understand subatomic particles as they exist now and as they existed near the birth of the universe. A collection review of the Fermilab Library monographs was conducted during the summers of 1998 and 1999. While some items were identified for deselection, the review proved most fruitful in highlighting some of the strengths of the Fermilab monograph collection. One of these strengths is history of physics, including biographies and astrophysics. A bibliography of the physics history books in the collection as of Summer, 1999 follows, arranged by author. Note that the call numbers are Library of Congress classification.

  9. Measurements and analysis of dynamic effects in the LARP model quadrupole HQ02b during rapid discharge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sorbi, Massimo; Ambrosio, Giorgio; Bajas, Hugo

    This paper presents the analysis of quench tests aimed at studying the dynamic effects in the 1-m-long 120-mm-aperture Nb3Sn quadrupole magnet HQ02b, designed, fabricated, and tested by the LHC Accelerator Research Program. The magnet has a short sample gradient of 205 T/m at 1.9 K and a peak field of 14.2 T. The test campaign was performed at CERN in April 2014. In the specific tests, which were dedicated to measuring the dynamic inductance of the magnet during the rapid current discharge following a quench, the protection heaters were activated only in some windings, in order to measure the resistive and inductive voltages separately. The analysis of the results confirms a very low value of the dynamic inductance at the beginning of the discharge, which later approaches the nominal value. Indications of dynamic inductance variation were already found from the analysis of current decay during quenches in the previous magnets HQ02a and HQ02a2; however, with this dedicated test of HQ02b, a quantitative measurement and assessment has been possible. An analytical model using the interfilament coupling current influence for the inductance lowering has been implemented in the quench calculation code QLASA, and a comparison with experimental data is given. In conclusion, the agreement of the model with the experimental results is very good and allows predicting more accurately the critical parameters in quench analysis (MIITs, hot spot temperature) for the MQXF Nb3Sn quadrupoles, which will be installed in the High Luminosity LHC.
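    As a side note on the MIITs quantity mentioned in the abstract above (the time integral of the squared current, used to estimate the hot-spot temperature), a minimal worked example for an idealized exponential current decay can be sketched as follows; the current and time constant are hypothetical placeholders, not HQ02b data:

```python
# MIITs = time integral of I(t)^2 over the discharge, usually quoted in
# units of 1e6 A^2 s. For an exponential decay I(t) = I0 * exp(-t / tau),
# the closed form is I0^2 * tau / 2. I0 and tau below are hypothetical.
import math

I0 = 15.0e3   # initial magnet current [A] (hypothetical)
tau = 0.1     # current-decay time constant [s] (hypothetical)

miits_closed_form = I0 ** 2 * tau / 2.0 / 1e6

# Cross-check by crude left-endpoint numerical integration of I(t)^2
# over 0..1 s (ten time constants, so the tail is negligible):
dt = 1e-5
miits_numeric = sum((I0 * math.exp(-t * dt / tau)) ** 2 * dt
                    for t in range(int(1.0 / dt))) / 1e6

print(f"MIITs (closed form): {miits_closed_form:.2f}")
print(f"MIITs (numeric):     {miits_numeric:.2f}")
```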

  10. Measurements and analysis of dynamic effects in the LARP model quadrupole HQ02b during rapid discharge

    DOE PAGES

    Sorbi, Massimo; Ambrosio, Giorgio; Bajas, Hugo; ...

    2016-06-01

    This paper presents the analysis of quench tests aimed at studying the dynamic effects in the 1-m-long 120-mm-aperture Nb3Sn quadrupole magnet HQ02b, designed, fabricated, and tested by the LHC Accelerator Research Program. The magnet has a short sample gradient of 205 T/m at 1.9 K and a peak field of 14.2 T. The test campaign was performed at CERN in April 2014. In the specific tests, which were dedicated to measuring the dynamic inductance of the magnet during the rapid current discharge following a quench, the protection heaters were activated only in some windings, in order to measure the resistive and inductive voltages separately. The analysis of the results confirms a very low value of the dynamic inductance at the beginning of the discharge, which later approaches the nominal value. Indications of dynamic inductance variation were already found from the analysis of current decay during quenches in the previous magnets HQ02a and HQ02a2; however, with this dedicated test of HQ02b, a quantitative measurement and assessment has been possible. An analytical model using the interfilament coupling current influence for the inductance lowering has been implemented in the quench calculation code QLASA, and a comparison with experimental data is given. In conclusion, the agreement of the model with the experimental results is very good and allows predicting more accurately the critical parameters in quench analysis (MIITs, hot spot temperature) for the MQXF Nb3Sn quadrupoles, which will be installed in the High Luminosity LHC.

  11. Superconducting accelerator magnet technology in the 21st century: A new paradigm on the horizon?

    NASA Astrophysics Data System (ADS)

    Gourlay, S. A.

    2018-06-01

    Superconducting magnets for accelerators were first suggested in the mid-1960s and have since become one of the major components of modern particle colliders. Technological progress has been slow but steady for the last half-century, based primarily on Nb-Ti superconductor. That technology has reached its peak with the Large Hadron Collider (LHC). Despite the superior electromagnetic properties of Nb3Sn and its adoption by early magnet pioneers, it is only now coming into use in accelerators, though it has not yet reliably achieved fields close to the theoretical limit. The discovery of the high-temperature superconductors (HTS) in the late 1980s created tremendous excitement, but these materials, with tantalizing performance at high fields and temperatures, have not yet been successfully developed into accelerator magnet configurations. Thanks to relatively recent developments in both Bi-2212 and REBCO, and a more focused international effort on magnet development, the situation has changed dramatically. Early optimism has been replaced with a reality that could create a new paradigm in superconducting magnet technology. Using selected examples of magnet technology from the previous century to define the context, this paper will describe the possible innovations using HTS materials as the basis for a new paradigm.

  12. New theories of relativistic hydrodynamics in the LHC era

    NASA Astrophysics Data System (ADS)

    Florkowski, Wojciech; Heller, Michal P.; Spaliński, Michał

    2018-04-01

    The success of relativistic hydrodynamics as an essential part of the phenomenological description of heavy-ion collisions at RHIC and the LHC has motivated a significant body of theoretical work concerning its fundamental aspects. Our review presents these developments from the perspective of the underlying microscopic physics, using the language of quantum field theory, relativistic kinetic theory, and holography. We discuss the gradient expansion, the phenomenon of hydrodynamization, as well as several models of hydrodynamic evolution equations, highlighting the interplay between collective long-lived and transient modes in relativistic matter. Our aim to provide a unified presentation of this vast subject—which is naturally expressed in diverse mathematical languages—has also led us to include several new results on the large-order behaviour of the hydrodynamic gradient expansion.

  13. Note: The design of thin gap chamber simulation signal source based on field programmable gate array.

    PubMed

    Hu, Kun; Lu, Houbing; Wang, Xu; Li, Feng; Liang, Futian; Jin, Ge

    2015-01-01

    The Thin Gap Chamber (TGC) is an important part of the ATLAS detector at the LHC accelerator. To match the characteristics of the TGC detector's output signal, we have designed a simulation signal source. The core of the design is based on a field programmable gate array, which randomly outputs 256 channels of simulation signals. The signals are generated by a true random number generator whose source of randomness is the timing jitter in ring oscillators. The experimental results show that the random numbers are uniformly distributed in a histogram, and that the whole system has high reliability.
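    A software analogue of the uniformity check described above can illustrate the idea: draw jitter-randomized bits, assemble them into bytes, and verify that the resulting histogram is close to flat. This is only an illustrative model in Python, not the FPGA implementation from the note:

```python
# Software model of a jitter-based TRNG byte stream and a uniformity check.
# Each bit models whether a jitter-randomized sampling instant falls in the
# first or second half of a ring-oscillator period. Illustrative only.
import random

def trng_byte(rng):
    b = 0
    for _ in range(8):
        phase = rng.random()          # jitter-randomized sampling phase in [0, 1)
        b = (b << 1) | (phase < 0.5)  # one "random" bit per oscillator sample
    return b

rng = random.Random(12345)  # seeded here only to make the demo repeatable
N = 100_000
counts = [0] * 256
for _ in range(N):
    counts[trng_byte(rng)] += 1

# Uniform expectation is N/256 per bin; report the worst relative deviation.
expected = N / 256
max_rel_dev = max(abs(c - expected) / expected for c in counts)
print(f"max relative deviation from a flat histogram: {max_rel_dev:.2f}")
```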

  14. Note: The design of thin gap chamber simulation signal source based on field programmable gate array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Kun; Wang, Xu; Li, Feng

    The Thin Gap Chamber (TGC) is an important part of the ATLAS detector at the LHC accelerator. To match the characteristics of the TGC detector's output signal, we have designed a simulation signal source. The core of the design is based on a field programmable gate array, which randomly outputs 256 channels of simulation signals. The signals are generated by a true random number generator whose source of randomness is the timing jitter in ring oscillators. The experimental results show that the random numbers are uniformly distributed in a histogram, and that the whole system has high reliability.

  15. Two-photon production of leptons at hadron colliders in semielastic and elastic cases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manko, A. Yu., E-mail: andrej.j.manko@gmail.com; Shulyakovsky, R. G., E-mail: shul@ifanbel.bas-net.by, E-mail: shulyakovsky@iaph.bas-net.by

    The mechanism of two-photon dilepton production is studied in the equivalent-photon (Weizsäcker–Williams) approximation. This approximation is shown to describe well experimental data from hadron accelerators. The respective total and differential cross sections were obtained for the LHC and for the Tevatron collider at various energies of colliding hadrons. The differential cross sections were studied versus the dilepton invariant mass, transverse momentum, and emission angle in the reference frame comoving with the center of mass of colliding hadrons. The cases of semielastic and inelastic collisions were examined.
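    For orientation, the equivalent-photon spectrum in its leading-log, point-like form can be sketched as below; the virtuality limits are illustrative assumptions, and real proton form factors (as treated in the paper) modify this shape:

```python
# Leading-log equivalent-photon (Weizsäcker-Williams) number spectrum for a
# point-like charge: n(x) = (alpha/2pi) * [1 + (1-x)^2]/x * ln(Q2max/Q2min).
# The Q^2 limits below are illustrative; proton form factors suppress the
# large-Q^2 region and modify this point-like shape.
import math

ALPHA = 1.0 / 137.036  # fine-structure constant

def epa_spectrum(x, q2_max=1.0, q2_min=1e-4):
    """dN/dx for photons carrying momentum fraction x (Q^2 in GeV^2, assumed)."""
    return (ALPHA / (2.0 * math.pi) * (1.0 + (1.0 - x) ** 2) / x
            * math.log(q2_max / q2_min))

# The spectrum is strongly peaked at small x: soft photons dominate, which is
# why two-photon dilepton production favors low invariant masses.
for x in (0.01, 0.1, 0.5):
    print(f"x = {x}: n(x) = {epa_spectrum(x):.3f}")
```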

  16. Being Relevant in Tough Times: TRIUMF's Five-Year Plan

    ScienceCinema

    Mayer, Tim [TRIUMF]

    2017-12-09

    Perhaps better known to the international community than its own neighbors, TRIUMF is Canada's national laboratory for particle and nuclear physics.  Working with the Canadian scientific community, TRIUMF has formulated a new vision to transform the laboratory and deliver a whole new level of performance and impact.  The plan capitalizes on platform technologies (superconducting RF cavities for accelerator physics and radiotracers in nuclear medicine) and exploits Canada's role in ATLAS and the LHC.  I will describe the key elements of the plan and discuss the science-policy landscape in which TRIUMF must make its case.

  17. Proton-Proton and Proton-Antiproton Colliders

    NASA Astrophysics Data System (ADS)

    Scandale, Walter

    In the last five decades, proton-proton and proton-antiproton colliders have been the most powerful tools for high energy physics investigations. They have also deeply catalyzed innovation in accelerator physics and technology. Among the large number of proposed colliders, only four have really succeeded in becoming operational: the ISR, the Sp̄pS, the Tevatron and the LHC. Another hadron collider, RHIC, originally conceived for ion-ion collisions, has also been operated part-time with polarized protons. Although a vast literature documenting them is available, this paper is intended to provide a quick synthesis of their main features and key performance.

  18. Proton-Proton and Proton-Antiproton Colliders

    NASA Astrophysics Data System (ADS)

    Scandale, Walter

    2014-04-01

    In the last five decades, proton-proton and proton-antiproton colliders have been the most powerful tools for high energy physics investigations. They have also deeply catalyzed innovation in accelerator physics and technology. Among the large number of proposed colliders, only four have really succeeded in becoming operational: the ISR, the Sp̄pS, the Tevatron and the LHC. Another hadron collider, RHIC, originally conceived for ion-ion collisions, has also been operated part-time with polarized protons. Although a vast literature documenting them is available, this paper is intended to provide a quick synthesis of their main features and key performance.

  19. Proton-Proton and Proton-Antiproton Colliders

    NASA Astrophysics Data System (ADS)

    Scandale, Walter

    2015-02-01

    In the last five decades, proton-proton and proton-antiproton colliders have been the most powerful tools for high energy physics investigations. They have also deeply catalyzed innovation in accelerator physics and technology. Among the large number of proposed colliders, only four have really succeeded in becoming operational: the ISR, the SppbarS, the Tevatron and the LHC. Another hadron collider, RHIC, originally conceived for ion-ion collisions, has also been operated part-time with polarized protons. Although a vast literature documenting them is available, this paper is intended to provide a quick synthesis of their main features and key performance.

  20. Proceedings of the NATO Nuclear Human Response Subject Matter Expert Review Meeting, 23-25 June 2008, Albuquerque, New Mexico, United States of America

    DTIC Science & Technology

    2009-08-01

    Radiobiological Research Institute (AFRRI) Mr. Michael Leggeiri, Jr, US Army Medical Research and Material Command Dr. Gene McClellan, Applied Research ...to 6 weeks with the radiation injury alone but is accelerated with other injuries; with other injuries death may occur within 2 weeks ≥ 8.3 Bone...Fluence Burn Surface Area Insult Ranges E. Recommendations/ Next Actions: Based on this meeting, the following additional tasks were recommended: 1

  1. Teaching And Training Tools For The Undergraduate: Experience With A Rebuilt AN-400 Accelerator

    NASA Astrophysics Data System (ADS)

    Roberts, Andrew D.

    2011-06-01

    There is an increasingly recognized need for people trained in a broad range of applied nuclear science techniques, indicated by reports from the American Physical Society and elsewhere. Anecdotal evidence suggests that opportunities for hands-on training with small particle accelerators have diminished in the US, as development programs established in the 1960s and 1970s have been decommissioned over recent decades. Despite the reduced interest in the use of low energy accelerators in fundamental research, these machines can offer a powerful platform for bringing unique training opportunities to the undergraduate curriculum in nuclear physics, engineering and technology. We report here on the new MSU Applied Nuclear Science Lab, centered on the rebuild of an AN400 electrostatic accelerator. This machine is run entirely by undergraduate students under faculty supervision, allowing a great deal of freedom in its use without restrictions from graduate or external project demands.

  2. Teaching And Training Tools For The Undergraduate: Experience With A Rebuilt AN-400 Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Andrew D.

    2011-06-01

    There is an increasingly recognized need for people trained in a broad range of applied nuclear science techniques, indicated by reports from the American Physical Society and elsewhere. Anecdotal evidence suggests that opportunities for hands-on training with small particle accelerators have diminished in the US, as development programs established in the 1960s and 1970s have been decommissioned over recent decades. Despite the reduced interest in the use of low energy accelerators in fundamental research, these machines can offer a powerful platform for bringing unique training opportunities to the undergraduate curriculum in nuclear physics, engineering and technology. We report here on the new MSU Applied Nuclear Science Lab, centered on the rebuild of an AN400 electrostatic accelerator. This machine is run entirely by undergraduate students under faculty supervision, allowing a great deal of freedom in its use without restrictions from graduate or external project demands.

  3. Laboratory and field testing of an accelerated bridge construction demonstration bridge : US Highway 6 bridge over Keg Creek : tech transfer summary.

    DOT National Transportation Integrated Search

    2013-04-01

    To address the need for the development of a fast, repeatable, and dependable way to replace typical bridges across the country, the Transportation Research Board (TRB) developed project R04, Innovative Designs for Rapid Renewal, as part of the...

  4. Bell-Plesset effects in Rayleigh-Taylor instability of finite-thickness spherical and cylindrical shells

    NASA Astrophysics Data System (ADS)

    Velikovich, A. L.; Schmit, P. F.

    2015-11-01

    Bell-Plesset effects accounting for the time dependence of the radius, velocity and acceleration of the Rayleigh-Taylor-unstable surface are ubiquitous in the instability of spherical laser targets and magnetically driven cylindrical liners. We present an analytical model that, for an ideal incompressible fluid and small perturbation amplitudes, exactly accounts for the Bell-Plesset effects in finite-thickness targets and liners through acceleration and deceleration phases. We derive the time-dependent dispersion equations determining the ``instantaneous growth rate'' and demonstrate that by integrating this growth rate over time (the WKB approximation) we accurately evaluate the number of perturbation e-foldings during the acceleration phase. In the limit of the small target/liner thickness, we obtain the exact thin-shell perturbation equations and approximate thin-shell dispersion relations, generalizing the earlier results of Harris (1962), Ott (1972) and Bud'ko et al. (1989). This research was supported by the US DOE/NNSA (A.L.V.), and in part by appointment to the Sandia National Laboratories Truman Fellowship in National Security Science and Engineering (P.F.S.), which is part of the Laboratory Directed Research and Development (LDRD) Program, Project No. 165746, and sponsored by Sandia Corporation (a wholly owned subsidiary of Lockheed Martin Corporation) as Operator of Sandia National Laboratories under its U.S. Department of Energy Contract No. DE-AC04-94AL85000.
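    The WKB estimate described in this abstract can be written as a one-line sketch. The notation here is assumed for illustration, not taken from the paper: γ(t) denotes the "instantaneous growth rate" obtained from the time-dependent dispersion equation, and a(t) the perturbation amplitude.

    ```latex
    % Sketch of the WKB estimate: the number of perturbation e-foldings n(t)
    % accumulated during the acceleration phase is the time integral of the
    % instantaneous growth rate gamma(t), so the amplitude grows roughly as exp(n).
    n(t) = \int_{t_0}^{t} \gamma(t')\,\mathrm{d}t' ,
    \qquad
    \frac{a(t)}{a(t_0)} \approx \exp\!\big(n(t)\big)
    ```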

  5. Experimental Nuclear Physics Activity in Italy

    NASA Astrophysics Data System (ADS)

    Chiavassa, E.; de Marco, N.

    2003-04-01

    The experimental nuclear physics activity of Italian researchers is briefly reviewed. The experiments, which are financially supported by the INFN, are carried out in close collaboration by more than 500 INFN and university researchers. They cover all the most important fields of modern nuclear physics, with probes extremely different in energy and interaction. Research is done in all four National Laboratories of the INFN, with a deeper involvement of the two laboratories expressly dedicated to nuclear physics: LNL (Laboratorio Nazionale di Legnaro) and LNS (Laboratorio Nazionale del Sud), where nuclear spectroscopy and reaction dynamics are investigated. The activities with electromagnetic probes take place in laboratories abroad, such as TJNAF, DESY, MAMI and ESRF, and are dedicated to studies of spin physics and of nucleon resonances; hypernuclear and kaon physics is investigated at LNF. A strong community of researchers works in the relativistic and ultra-relativistic heavy-ion field, in particular at CERN with the SPS Pb beam and in the construction of the ALICE detector for heavy-ion physics at the LHC collider. Experiments of astrophysical interest are done with ions of very low energy; in particular, the LUNA accelerator facility at LNGS (Laboratorio Nazionale del Gran Sasso) has succeeded in measuring cross sections at solar energies, below or near the solar Gamow peak. Interdisciplinary research on anti-hydrogen atom spectroscopy and on measurements of neutron cross sections of interest for ADS development is also supported.

  6. NCI and FDA to Study Cancer Proteogenomics Together | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The National Cancer Institute (NCI) Office of Cancer Clinical Proteomics Research (OCCPR), part of the National Institutes of Health, and the U.S. Food and Drug Administration (FDA) have signed a Memorandum of Understanding (MOU) on proteogenomic regulatory science. This will allow the agencies to share information that will accelerate the development of proteogenomic technologies and biomarkers as they relate to precision medicine in cancer.

  7. Flavorful Z′ signatures at LHC and ILC

    NASA Astrophysics Data System (ADS)

    Chen, Shao-Long; Okada, Nobuchika

    2008-10-01

    Many new physics models predict an extra neutral gauge boson, referred to as the Z′-boson. In a certain class of these models, the Z′-boson has flavor-dependent couplings to the fermions of the Standard Model (SM). Based on a simple model in which the couplings of the third-generation SM fermions to the Z′-boson differ from those of the corresponding fermions in the first two generations, we study the signatures of the Z′-boson at the Large Hadron Collider (LHC) and the International Linear Collider (ILC). We show that at the LHC, a Z′-boson with mass around 1 TeV can be produced through the Drell-Yan processes, and its dilepton decay modes provide clean signatures not only of the resonant production of the Z′-boson but also of the flavor dependence of the production cross sections. We also study fermion pair production at the ILC involving virtual Z′-boson exchange. Even though the center-of-mass energy of the ILC is much lower than the Z′-boson mass, the angular distributions and forward-backward asymmetries of fermion pair production show not only sizable deviations from the SM predictions but also significant flavor dependence.

  8. Mechanical qualification of the support structure for MQXF, the Nb 3Sn low-β quadrupole for the high luminosity LHC

    DOE PAGES

    Juchno, M.; Ambrosio, G.; Anerella, M.; ...

    2016-01-26

    Within the scope of the High Luminosity LHC project, the collaboration between CERN and US LARP is developing new low-β quadrupoles using Nb3Sn superconducting technology for the upgrade of the LHC interaction regions. The magnet support structure of the first short model was designed, and two units were fabricated and tested at CERN and at LBNL. The structure provides the preload to the collar-coil subassembly through an arrangement of outer aluminum shells pre-tensioned with water-pressurized bladders. For the mechanical qualification of the structure and the assembly procedure, the superconducting coils were replaced with solid aluminum "dummy coils", the structure was preloaded at room temperature and then cooled down to 77 K. The mechanical behavior of the magnet structure was monitored with strain gauges installed on the aluminum shells, the dummy coils and the axial preload system. This paper reports on the outcome of the assembly and cool-down tests with dummy coils, which were performed at CERN and at LBNL, and presents the strain gauge measurements compared to the 3D finite element model predictions.

  9. PROCEEDINGS OF RIKEN/BNL RESEARCH CENTER WORKSHOP, EQUILIBRIUM AND NON-EQUILIBRIUM ASPECTS OF HOT, DENSE QCD, VOLUME 28.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DE VEGA,H.J.; BOYANOVSKY,D.

    The Relativistic Heavy Ion Collider (RHIC) at Brookhaven, beginning operation this year, and the Large Hadron Collider (LHC) at CERN, beginning operation approximately 2005, will provide an unprecedented range of energies and luminosities that will allow us to probe the Gluon-Quark plasma. At RHIC and LHC, typical estimates of the energy densities and temperatures at central rapidity are ε ≈ 1-10 GeV/fm³ and T₀ ≈ 300-900 MeV. Such energies are well above current estimates for the GQ plasma. Initially, this hot, dense plasma is far from local thermal equilibrium, making the theoretical study of transport phenomena, kinetic and chemical equilibration in dense and hot plasmas, and related issues a matter of fundamental importance. During the last few years a consistent framework to study collective effects in the Gluon-Quark plasma, and a microscopic description of transport in terms of the hard thermal (and dense) loops resummation program, has emerged. This approach has the potential of providing a microscopic formulation of transport in the regime of temperatures and densities to be achieved at RHIC and LHC. A parallel development over the last few years has provided a consistent formulation of non-equilibrium quantum field theory that gives a real-time description of phenomena out of equilibrium. Novel techniques, including non-perturbative approaches and dynamical renormalization group techniques, lead to new insights into transport and relaxation. A deeper understanding of collective excitations and transport phenomena in the GQ plasma could lead to the recognition of novel potential experimental signatures. New insights into small-x physics reveal a striking similarity between small-x and hard thermal loops, and novel real-time numerical simulations have recently studied parton distributions and their thermalization in the initial stages of a heavy-ion collision.

  11. Characterisation of novel thin n-in-p planar pixel modules for the ATLAS Inner Tracker upgrade

    NASA Astrophysics Data System (ADS)

    Beyer, J.-C.; La Rosa, A.; Macchiolo, A.; Nisius, R.; Savic, N.; Taibah, R.

    2018-01-01

    In view of the high luminosity phase of the LHC (HL-LHC) to start operation around 2026, a major upgrade of the tracker system for the ATLAS experiment is in preparation. The expected neutron equivalent fluence of up to 2.4×10¹⁶ 1 MeV neq./cm² at the innermost layer of the pixel detector poses the most severe challenge. Thanks to their low material budget and high charge collection efficiency after irradiation, modules made of thin planar pixel sensors are promising candidates to instrument these layers. To optimise the sensor layout for the decreased pixel cell size of 50×50 μm², TCAD device simulations are being performed to investigate the charge collection efficiency before and after irradiation. In addition, sensors of 100-150 μm thickness, interconnected to FE-I4 read-out chips featuring the previous generation pixel cell size of 50×250 μm², are characterised with testbeams at the CERN-SPS and DESY facilities. The performance of sensors with various designs, irradiated up to a fluence of 1×10¹⁶ neq./cm², is compared in terms of charge collection and hit efficiency. A replacement of the two innermost pixel layers is foreseen during the lifetime of the HL-LHC. The replacement will require several months of intervention, during which the remaining detector modules cannot be cooled. They are kept at room temperature, thus inducing annealing. The performance of irradiated modules will be investigated with testbeam campaigns and the method of accelerated annealing at higher temperatures.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    DOE-EERE's Bioenergy Technologies Office (BETO) works to accelerate the development of a sustainable, cost-competitive advanced biofuel industry that can strengthen U.S. energy security, environmental quality, and economic vitality, through research, development, and demonstration projects in partnership with industry, academia, and national laboratories. BETO's Advanced Algal Systems Program (also called the Algae Program) has a long-term applied research and development (R&D) strategy to increase the yields and lower the costs of algal biofuels. The team works with partners to develop new technologies, to integrate technologies at commercially relevant scales, and to conduct crosscutting analyses to better understand the potential and challenges of the algal biofuels industry. Research has indicated that this industry is capable of producing billions of gallons of renewable diesel, gasoline, and jet fuel annually. R&D activities are integrated with BETO's longstanding effort to accelerate the commercialization of lignocellulosic biofuels.

  13. LHCNet: Wide Area Networking and Collaborative Systems for HEP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, H. B.

    2007-08-20

    This proposal presents the status and progress in 2006-7, and the technical and financial plans for 2008-2010, for the US LHCNet transatlantic network supporting U.S. participation in the LHC physics program. US LHCNet provides transatlantic connections of the Tier1 computing facilities at Fermilab and Brookhaven with the Tier0 and Tier1 facilities at CERN, as well as Tier1s elsewhere in Europe and Asia. Together with ESnet, Internet2, the GEANT pan-European network, and NSF's UltraLight project, US LHCNet also supports connections between the Tier2 centers (where most of the analysis of the data will take place, starting this year) and the Tier1s as needed.

  14. Higgsino dark matter or not: Role of disappearing track searches at the LHC and future colliders

    NASA Astrophysics Data System (ADS)

    Fukuda, Hajime; Nagata, Natsumi; Otono, Hidetoshi; Shirai, Satoshi

    2018-06-01

    Higgsino in supersymmetric standard models is known to be a promising candidate for dark matter in the Universe. Its phenomenological property is strongly affected by the gaugino fraction in the Higgsino-like state. If this is sizable, in other words, if gaugino masses are less than O (10) TeV, we may probe the Higgsino dark matter in future non-accelerator experiments such as dark matter direct searches and measurements of electric dipole moments. On the other hand, if gauginos are much heavier, then it is hard to search for Higgsino in these experiments. In this case, due to a lack of gaugino components, the mass difference between the neutral and charged Higgsinos is uniquely determined by electroweak interactions to be around 350 MeV, which makes the heavier charged state rather long-lived, with a decay length of about 1 cm. In this letter, we argue that a charged particle with a flight length of O (1) cm can be probed in disappearing-track searches if we require only two hits in the pixel detector. Even in this case, we can reduce background events with the help of the displaced-vertex reconstruction technique. We study the prospects of this search strategy at the LHC and future colliders for the Higgsino dark matter scenario. It is found that an almost pure Higgsino is indeed within the reach of the future 33 TeV collider experiments. We then discuss that the interplay among collider and non-accelerator experiments plays a crucial role in testing the Higgsino dark matter scenarios. Our strategy for disappearing-track searches can also enlarge the discovery potential of pure wino dark matter as well as other electroweak-charged dark matter candidates.

  15. Accelerator-driven Medical Sterilization to Replace Co-60 Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroc, Thomas K.; Thangaraj, Jayakar C.T.; Penning, Richard T.

    This report documents the results of a study prepared at the request of the Office of Radiological Security of the National Nuclear Security Administration (NNSA), as part of the Domestic Protect and Reduce mission, by the Illinois Accelerator Research Center (IARC) of Fermi National Accelerator Laboratory. The study included a literature survey of over 80 relevant documents and articles including industry standards, regulatory documents, technical papers, a court case, previous task force reports and industry white papers. The team also conducted interviews or had conversations with over 40 individuals representing over a dozen organizations over the course of its 10-month program. This report summarizes our findings, addresses the specific questions posed to us by NNSA, and concludes with a list of actionable recommendations.

  16. Study of radiation damage to the CMS Hadronic Endcap Calorimeter and investigation into new physics using multi-boson measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belloni, Alberto

    This document is the final report for U.S. D.O.E. Grant No. DE-SC0014088, which covers the period from May 15, 2015 to March 31, 2016. The funded research covered the study of multi-boson final states, culminating in the measurement of the W±γγ and, for the first time at a hadron collider, the Zγγ production cross sections. These processes, among the rarest multi-boson final states measurable by LHC experiments, allow us to investigate the possibility of new physics in a model-independent way, by looking for anomalies in the standard model couplings among electroweak bosons. In particular, these 3-boson final states access quartic gauge couplings; the W±γγ analysis performed as part of this proposal sets limits on anomalies in the WWγγ quartic gauge coupling. The award also covered R&D activities to define a radiation-tolerant material to be used in the upcoming upgrade of the CMS hadronic endcap calorimeter. In particular, the use of a liquid-scintillator-based detector was investigated. The research work performed in this direction has been collected in a paper recently submitted for publication in the Journal of Instrumentation (JINST).

  17. Sensors, nano-electronics and photonics for the Army of 2030 and beyond

    NASA Astrophysics Data System (ADS)

    Perconti, Philip; Alberts, W. C. K.; Bajaj, Jagmohan; Schuster, Jonathan; Reed, Meredith

    2016-02-01

    The US Army's future operating concept will rely heavily on sensors, nano-electronics and photonics technologies to rapidly develop situational understanding in challenging and complex environments. Recent technology breakthroughs in integrated 3D multiscale semiconductor modeling (from atoms-to-sensors), combined with ARL's Open Campus business model for collaborative research provide a unique opportunity to accelerate the adoption of new technology for reduced size, weight, power, and cost of Army equipment. This paper presents recent research efforts on multi-scale modeling at the US Army Research Laboratory (ARL) and proposes the establishment of a modeling consortium or center for semiconductor materials modeling. ARL's proposed Center for Semiconductor Materials Modeling brings together government, academia, and industry in a collaborative fashion to continuously push semiconductor research forward for the mutual benefit of all Army partners.

  18. Report of an EU-US symposium on understanding nutrition-related consumer behavior: strategies to promote a lifetime of healthy food choices

    PubMed Central

    Friedl, Karl E.; Rowe, Sylvia; Bellows, Laura L.; Johnson, Susan L.; Hetherington, Marion M.; de Froidmont-Görtz, Isabelle; Lammens, Veerle; Hubbard, Van S.

    2014-01-01

    This report summarizes an EU-US Task Force on Biotechnology Research symposium on healthy food choices and nutrition-related purchasing behaviors. This meeting was unique in its transdisciplinary approach to obesity and for bringing together scientists from academia, government, and industry. Discussion relevant to funders and researchers centered on: (1) increased use of public-private partnerships; (2) the complexity of food behaviors and obesity risk and multilevel aspects that must be considered; and (3) the importance of transatlantic cooperation and collaboration that could accelerate advances in this field. A call to action stressed these points along with a commitment to enhanced communication strategies. PMID:24974355

  19. Beyond Subprime Learning: Accelerating Progress in Early Education. Policy Brief

    ERIC Educational Resources Information Center

    Bornfreund, Laura; McCann, Clare; Williams, Conor; Guernsey, Lisa

    2014-01-01

    Earlier this year, in "Subprime Learning: Early Education in America since the Great Recession," the current state of early education in the U.S. was surveyed by examining progress over the last five years. It was found that while the public, political, and research consensus is stronger than ever, the field remains in dire need of…

  20. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    ScienceCinema

    None

    2017-12-09

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.

  1. From RHIC to LHC: Lessons on the QGP

    NASA Astrophysics Data System (ADS)

    Heinz, Ulrich

    2011-10-01

    Recent data from heavy-ion collisions at RHIC and the LHC, together with significant advances in theory, have allowed us to take the first steps from a qualitative understanding of high-energy collision dynamics to a quantitative characterization of the transport properties of the hot and dense QCD matter created in these collisions. The almost perfectly liquid nature of the Quark-Gluon Plasma (QGP) created at RHIC has recently also been confirmed at the much higher LHC energies, and we can now constrain the specific QGP shear viscosity (η/s)_QGP to within a factor of 2.5 of its conjectured lower quantum bound. Viscous hydrodynamics, coupled to a microscopic hadron cascade at late times, has proven to be an extremely successful and highly predictive model for the QGP evolution at RHIC and the LHC. The experimental discovery of higher-order harmonic flow coefficients and their theoretically predicted differential sensitivity to shear viscosity promises additional gains in precision, by about a factor of 5 in (η/s)_QGP, in the very near future. The observed modification of jets and suppression of high-pT hadrons confirms the picture of the QGP as a strongly coupled colored liquid, and recent LHC data yield strong constraints on parton energy loss models, putting significant strain on some theoretical approaches, tuned to RHIC data, that are based on leading-order perturbative QCD. Thermal photon radiation provides important cross-checks on the early stages of dynamical evolution models and constrains the initial QGP temperature, but the recently measured strong photon elliptic flow challenges our present understanding of photon emission rates in the hadronic phase. Recent progress in developing a complete theoretical model for all stages of the QGP fireball expansion, from strongly fluctuating gluon fields at its beginning to final hadronic freeze-out, and remaining challenges will be discussed.
Work supported by DOE (grants DE-SC0004286 and DE-SC0004104 (JET Collaboration)).

  2. Global analysis of the pMSSM in light of the Fermi GeV excess: prospects for the LHC Run-II and astroparticle experiments

    NASA Astrophysics Data System (ADS)

    Bertone, Gianfranco; Calore, Francesca; Caron, Sascha; Ruiz, Roberto; Kim, Jong Soo; Trotta, Roberto; Weniger, Christoph

    2016-04-01

    We present a new global fit of the 19-dimensional phenomenological Minimal Supersymmetric Standard Model (pMSSM-19) that complies with all the latest experimental results from indirect, direct and accelerator dark matter searches. We show that the model provides a satisfactory explanation of the excess of gamma rays from the Galactic centre observed by the Fermi Large Area Telescope, assuming that it is produced by the annihilation of neutralinos in the Milky Way halo. We identify two regions that pass all the constraints: the first corresponds to neutralinos with a mass of about 80-100 GeV annihilating into WW with a branching ratio of 95%; the second to heavier neutralinos, with a mass of about 180-200 GeV, annihilating into t̄t with a branching ratio of 87%. We show that neutralinos compatible with the Galactic centre GeV excess will soon be within the reach of LHC Run-II, notably through searches for charginos and neutralinos, squarks and light smuons, and of XENON1T, thanks to its unprecedented sensitivity to the spin-dependent cross-section off neutrons.

  3. 36th International Conference on High Energy Physics

    NASA Astrophysics Data System (ADS)

    The Australian particle physics community was honoured to host the 36th ICHEP conference in 2012 in Melbourne. This conference has long been the reference event for our international community. The announcement of the discovery of the Higgs boson at the LHC was a major highlight, with huge international press coverage. ICHEP2012 was described by CERN Director-General, Professor Rolf Heuer, as a landmark conference for our field. In addition to the Higgs announcement, important results from neutrino physics, from flavour physics, and from physics beyond the standard model also provided great interest. There were also updates on key accelerator developments such as the new B-factories, plans for the LHC upgrade, neutrino facilities and associated detector developments. ICHEP2012 exceeded the promise expected of the key conference for our field, and really did provide a reference point for the future. Many thanks to the contribution reviewers: Andy Bakich, Csaba Balazs, Nicole Bell, Catherine Buchanan, Will Crump, Cameron Cuthbert, Ben Farmer, Sudhir Gupta, Elliot Hutchison, Paul Jackson, Geng-Yuan Jeng, Archil Kobakhidze, Doyoun Kim, Tong Li, Antonio Limosani (Head Editor), Kristian McDonald, Nikhul Patel, Aldo Saavedra, Mark Scarcella, Geoff Taylor, Ian Watson, Graham White, Tony Williams and Bruce Yabsley.

  4. U.S. national report to the International Union of Geodesy and Geophysics

    NASA Technical Reports Server (NTRS)

    Gorney, D. J.

    1987-01-01

    This paper highlights progress by U.S. authors during 1983-1986 in the broad area of auroral research. Atmospheric emissions and their use as a tool for remote-sensing the dynamics, energetics, and effects of auroral activity is a subject which is emphasized here because of the vast progress made in this area on both observational and theoretical fronts. The evolution of primary auroral electrons, the acceleration of auroral ions, small-scale electric fields, auroral kilometric radiation, auroral empirical models and activity indices are also reviewed. An extensive bibliography is supplied.

  5. US national report to the International Union of Geodesy and Geophysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorney, D.J.

    1987-04-01

    This paper highlights progress by U.S. authors during 1983-1986 in the broad area of auroral research. Atmospheric emissions and their use as a tool for remote-sensing the dynamics, energetics, and effects of auroral activity is a subject which is emphasized here because of the vast progress made in this area on both observational and theoretical fronts. The evolution of primary auroral electrons, the acceleration of auroral ions, small-scale electric fields, auroral kilometric radiation, auroral empirical models and activity indices are also reviewed. An extensive bibliography is supplied.

  6. Public health impact of accelerated immunization against rotavirus infection among children aged less than 6 months in the United States

    PubMed Central

    Weycker, Derek; Atwood, Mark Andrew; Standaert, Baudouin; Krishnarajah, Girishanthy

    2014-01-01

    We developed a cohort model to evaluate the expected public health impact of accelerated regimens for immunization against rotavirus gastroenteritis (RVGE). Alternative strategies for vaccination with the pentavalent human-bovine reassortant vaccine, RotaTeq® (RV5, Merck) and the oral live attenuated human rotavirus vaccine, Rotarix® (RV1, GlaxoSmithKline Vaccines) were considered, including acceleration of the 1st dose only (by 2 weeks) as well as acceleration of the 1st (by 2 weeks) and subsequent doses (by up to 10 weeks). Assuming vaccine coverage levels consistent with current US clinical practice, accelerated regimens would be expected to reduce annual numbers of RVGE-related hospitalizations by 300–400, emergency department visits by 3000–4000, and outpatient visits by 3000–4000 (i.e., by 9–14%) among US children aged <6 months. Accordingly, accelerating the immunization of children against RVGE may yield substantive reductions in the number of RV-related encounters in US clinical practice. PMID:25424813
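    The reported reductions can be sanity-checked with simple arithmetic. A minimal sketch, using a hypothetical baseline count (the paper's own baseline figures are not quoted in the abstract):

```python
def percent_reduction(avoided: float, baseline: float) -> float:
    """Share of baseline events avoided, expressed in percent."""
    return 100.0 * avoided / baseline

# Hypothetical baseline (not from the paper): if ~3000 RVGE-related
# hospitalizations occur annually among US infants aged <6 months,
# avoiding 300-400 of them corresponds to a 10-13% reduction,
# consistent in magnitude with the 9-14% range quoted above.
low = percent_reduction(300.0, 3000.0)
high = percent_reduction(400.0, 3000.0)
```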

  7. The NASA Short-term Prediction Research and Transition (SPoRT) Center: A Collaborative Model for Accelerating Research into Operations

    NASA Technical Reports Server (NTRS)

    Goodman, S. J.; Lapenta, W.; Jedlovec, G.; Dodge, J.; Bradshaw, T.

    2003-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center in Huntsville, Alabama was created to accelerate the infusion of NASA earth science observations, data assimilation and modeling research into NWS forecast operations and decision-making. The principal focus of experimental products is on the regional scale with an emphasis on forecast improvements on a time scale of 0-24 hours. The SPoRT Center research is aligned with the regional prediction objectives of the US Weather Research Program dealing with 0-1 day forecast issues ranging from convective initiation to 24-hr quantitative precipitation forecasting. The SPoRT Center, together with its other interagency partners, universities, and the NASA/NOAA Joint Center for Satellite Data Assimilation, provides a means and a process to effectively transition NASA Earth Science Enterprise observations and technology to National Weather Service operations and decision makers at both the global/national and regional scales. This paper describes the process for the transition of experimental products into forecast operations, current products undergoing assessment by forecasters, and plans for the future.

  8. Design of an rf quadrupole for Landau damping

    NASA Astrophysics Data System (ADS)

    Papke, K.; Grudiev, A.

    2017-08-01

    The recently proposed superconducting quadrupole resonator for Landau damping in accelerators is subjected to a detailed design study. The optimization process of two different cavity types is presented following the requirements of the High Luminosity Large Hadron Collider (HL-LHC), with the main focus on quadrupolar strength, surface peak fields, and impedance. The lower-order and higher-order mode (LOM and HOM) spectrum of the optimized cavities is investigated and different approaches for their damping are proposed. On the basis of an example, the first two higher-order multipole errors are calculated; for the same example, the required rf power and the optimal external quality factor of the input coupler are derived.

  9. PERSISTENT CURRENT EFFECT IN 15-16 T NB3SN ACCELERATOR DIPOLES AND ITS CORRECTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kashikhin, V. V.; Zlobin, A. V.

    2016-11-08

    Nb3Sn magnets with operating fields of 15-16 T are considered for the LHC Energy Doubler and a future Very High Energy pp Collider. Due to the large coil volume, high critical current density and large superconducting (SC) filament size, the persistent current effect is very large in Nb3Sn dipoles at low fields. This paper presents the results of analysis of the persistent current effect in the 15 T Nb3Sn dipole demonstrator being developed at FNAL, and describes different possibilities for its correction, including passive SC wires, iron shims and coil geometry.
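    The scale of the effect can be estimated with the Bean critical-state model, in which the filament magnetization grows with the critical current density and the effective filament diameter. A minimal sketch; the Jc and filament-size values below are illustrative assumptions, not values from the paper:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def filament_magnetization(jc: float, d_eff: float) -> float:
    """Bean critical-state magnetization of a round filament, in A/m.

    jc    -- critical current density, A/m^2
    d_eff -- effective filament diameter, m
    """
    return (2.0 / (3.0 * math.pi)) * jc * d_eff

# Illustrative Nb3Sn numbers (assumed): Jc ~ 2e9 A/m^2 near 1 T and 4.2 K,
# effective filament diameter ~ 50 um.
jc = 2.0e9
d_eff = 50e-6
m = filament_magnetization(jc, d_eff)
b_pc = MU0 * m  # persistent-current field contribution, tesla (~tens of mT)
```

This is why field quality at injection is the concern: a few tens of millitesla of magnetization field is negligible at 15 T but large relative to a ~1 T injection field.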

  10. Accelerating the pace of discovery in orthopaedic research: A vision toward team science.

    PubMed

    Bahney, Chelsea S; Bruder, Scott P; Cain, Jarrett D; Keyak, Joyce H; Killian, Megan L; Shapiro, Irving M; Jones, Lynne C

    2016-10-01

    The landscape of basic science in the United States and around the world is changing, and the field of orthopaedic research is positioned to lead by embracing a culture of collaborative team science that reflects our field's interdisciplinary nature. In this article we hope to address some of the cultural challenges and programmatic barriers that impede a team science approach in the US and suggest opportunities for change. © 2016 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 34:1673-1679, 2016.

  11. Ultra-Compact Accelerator Technologies for Application in Nuclear Techniques

    NASA Astrophysics Data System (ADS)

    Sampayan, S.; Caporaso, G.; Chen, Y.-J.; Carazo, V.; Falabella, S.; Guethlein, G.; Guse, S.; Harris, J. R.; Hawkins, S.; Holmes, C.; Krogh, M.; Nelson, S.; Paul, A. C.; Pearson, D.; Poole, B.; Schmidt, R.; Sanders, D.; Selenes, K.; Sitaraman, S.; Sullivan, J.; Wang, L.; Watson, J.

    2009-12-01

    We report on compact accelerator technology development for potential use as a pulsed-neutron-source quantitative post-verifier. The technology is derived from our ongoing compact accelerator technology development program for radiography under the US Department of Energy and for clinic-sized compact proton therapy systems under an industry-sponsored Cooperative Research and Development Agreement. The accelerator technique relies on the discharge of a prompt-pulse-generating stacked transmission line structure synchronized with the beam transit. The goal of this technology is to achieve ˜10 MV/m gradients for pulses of tens of nanoseconds and ˜100 MV/m gradients for ˜1 ns systems. As a post-verifier supplementing existing x-ray equipment, this system can remain in a charged, stand-by state with little or no energy consumption. We describe the progress of our overall component development effort with the multilayer dielectric wall insulators (i.e., the accelerator wall), compact power supply technology, kHz repetition-rate surface flashover ion sources, and the prompt pulse generation system consisting of wide-bandgap switches and high-performance dielectric materials.
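    The quoted gradients translate directly into machine length, since the ideal energy gain is just gradient times length. A minimal sketch, with an assumed 2 m structure (the structure length is not given in the abstract):

```python
def energy_gain_mev(gradient_mv_per_m: float, length_m: float) -> float:
    """Ideal energy gain (MeV) for a particle riding the full gradient."""
    return gradient_mv_per_m * length_m

# Assumed example: a 100 MV/m short-pulse structure only 2 m long would,
# ideally, deliver a 200 MeV proton beam, the scale needed for a
# clinic-sized therapy system.
gain = energy_gain_mev(100.0, 2.0)
```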

  12. Electrical Engineering in Los Alamos Neutron Science Center Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, Michael James

    The field of electrical engineering plays a significant role in particle accelerator design and operations. Los Alamos National Laboratory's LANSCE facility utilizes the electrical energy concepts of power distribution, plasma generation, radio frequency energy, electrostatic acceleration, signals and diagnostics. The culmination of these fields produces a machine of incredible potential, with uses such as isotope production, neutron spallation, neutron imaging and particle analysis. The key isotope produced in the LANSCE isotope production facility is strontium-82, which is used in medicine for cancer treatment and for positron emission tomography (PET) scans. Neutron spallation is one of the very few methods used to produce neutrons for scientific research; the main alternative is the natural decay of transuranic elements in nuclear reactors. Accelerators produce neutrons by driving charged particles into neutron-dense elements such as tungsten, imparting kinetic energy to neutral particles; this has the benefit of producing a large number of neutrons while minimizing the waste generated. Utilizing the accelerator, scientists can gain an understanding of how various particles behave and interact with matter, and so better understand the natural laws of physics and the universe around us.

  13. New geochronological constraints on the thermal and exhumation history of the Lesser and Higher Himalayan Crystalline Units in the Kullu–Kinnaur area of Himachal Pradesh (India)

    PubMed Central

    Thöni, M.; Miller, C.; Hager, C.; Grasemann, B.; Horschinegg, M.

    2012-01-01

    New geochronological, petrological and structural data from the Beas–Sutlej area of Himachal Pradesh (India) are used to reconstruct the tectonothermal and exhumation history of this part of the Himalayan orogen. Sm–Nd garnet ages at 40.5 ± 1.3 Ma obtained on a pegmatoid from the inverse metamorphic High Himalayan Crystalline (HHC) in the Malana–Parbati area probably mark local melting during initial decompression. Ongoing exhumation in ductilely deformed leuco-gneiss is constrained by Sm–Nd garnet ages at 29 ± 1 Ma and white mica Rb–Sr ages around 24–20 Ma, while Bt Rb–Sr ages indicate a drop of regional metamorphic temperatures below 300 °C between 15 and 12 Ma. The deep Sutlej gorge exposes medium-grade paragneisses and Proterozoic orthogneisses of the Lesser Himalayan Crystalline (LHC), overthrust by the HHC along the Main Central Thrust (MCT). Mica cooling ages in the HHC are in the range of 14–11 Ma. Above the extruded wedge of the HHC, the Leo Pargil leucogranite and associated dykes intrude the Haimanta Unit (HU) below the weakly metamorphic Palaeo-Mesozoic sediments of the Tethyan Himalayas (TH). The Leo Pargil leucogranite yielded a mean Sm–Nd garnet age of 19 ± 1 Ma and Rb–Sr muscovite and biotite cooling ages between 16.4 and 11.6 Ma. Marked young extrusion of LHC units resulted in differentiated exhumation/cooling of more frontal parts of the orogen. Very young ductile deformation of LHC gneisses near Wangtu is constrained by late-kinematic pegmatite intrusions crosscutting the main mylonitic foliation. Sm–Nd garnet and Rb–Sr muscovite ages of these pegmatites range between 7.9 ± 0.9 and 5.5 ± 0.1 Ma. Published apatite FT ages down to 0.6 Ma also document accelerated diachronous sub-recent exhumation of different parts of the orogen. 
Together with geochronological data from the literature, the new results demonstrate that the HHC and the HU were deformed by shortening and crustal thickening during the Eohimalayan phase (Late Eocene–Oligocene), followed by a strong thermal overprint and intrusions of granitoids during the Neohimalayan Phase (Early to Middle Miocene). The LHC experienced amphibolite facies metamorphic conditions in the Late Miocene prior to extrusion between the HHC and the very low-grade Lesser Himalayan sediments. In conjunction with climate changes, young tectonic activity in this central part of the Himalayan orogen may have strongly influenced fluvial incision and erosion, and therefore, contributed to the accelerated uplift, as indicated by extensive accumulation of Late Miocene to Early Pleistocene fluviatile–lacustrine sediments in the Zanda basin, the Transhimalayan headwaters of the Sutlej, in Western Tibet. PMID:27570473

  14. Modern Elementary Particle Physics

    NASA Astrophysics Data System (ADS)

    Kane, Gordon

    2017-02-01

    1. Introduction; 2. Relativistic notation, Lagrangians, and interactions; 3. Gauge invariance; 4. Non-abelian gauge theories; 5. Dirac notation for spin; 6. The Standard Model Lagrangian; 7. The electroweak theory and quantum chromodynamics; 8. Masses and the Higgs mechanism; 9. Cross sections, decay widths, and lifetimes: W and Z decays; 10. Production and properties of W± and Z⁰; 11. Measurement of electroweak and QCD parameters: the muon lifetime; 12. Accelerators - present and future; 13. Experiments and detectors; 14. Low energy and non-accelerator experiments; 15. Observation of the Higgs boson at the CERN LHC: is it the Higgs boson?; 16. Colliders and tests of the Standard Model: particles are pointlike; 17. Quarks and gluons, confinement and jets; 18. Hadrons, heavy quarks, and strong isospin invariance; 19. Coupling strengths depend on momentum transfer and on virtual particles; 20. Quark (and lepton) mixing angles; 21. CP violation; 22. Overview of physics beyond the Standard Model; 23. Grand unification; 24. Neutrino masses; 25. Dark matter; 26. Supersymmetry.

  15. Development of a 15 T Nb 3Sn accelerator dipole demonstrator at Fermilab

    DOE PAGES

    Novitski, I.; Andreev, N.; Barzi, E.; ...

    2016-06-01

    Here, a 100 TeV scale Hadron Collider (HC) with a nominal operation field of at least 15 T is being considered for the post-LHC era, which requires using the Nb3Sn technology. Practical demonstration of this field level in an accelerator-quality magnet and substantial reduction of the magnet costs are the key conditions for realization of such a machine. FNAL has started the development of a 15 T Nb3Sn dipole demonstrator for a 100 TeV scale HC. The magnet design is based on 4-layer shell-type coils, graded between the inner and outer layers to maximize the performance and reduce the cost. The experience gained during the Nb3Sn magnet R&D is applied to different aspects of the magnet design. This paper describes the magnetic and structural designs and parameters of the 15 T Nb3Sn dipole and the steps towards the demonstration model fabrication.

  16. Identifying WIMP dark matter from particle and astroparticle data

    NASA Astrophysics Data System (ADS)

    Bertone, Gianfranco; Bozorgnia, Nassim; Kim, Jong Soo; Liem, Sebastian; McCabe, Christopher; Otten, Sydney; Ruiz de Austri, Roberto

    2018-03-01

    One of the most promising strategies to identify the nature of dark matter consists in the search for new particles at accelerators and with so-called direct detection experiments. Working within the framework of simplified models, and making use of machine learning tools to speed up statistical inference, we address the question of what we can learn about dark matter from a detection at the LHC and a forthcoming direct detection experiment. We show that with a combination of accelerator and direct detection data, it is possible to identify newly discovered particles as dark matter, by reconstructing their relic density assuming they are weakly interacting massive particles (WIMPs) thermally produced in the early Universe, and demonstrating that it is consistent with the measured dark matter abundance. An inconsistency between these two quantities would instead point either towards additional physics in the dark sector, or towards a non-standard cosmology, with a thermal history substantially different from that of the standard cosmological model.
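    The relic-density consistency check described above can be illustrated with the standard thermal-WIMP rule of thumb, Ωh² ≈ 3×10⁻²⁷ cm³ s⁻¹ / ⟨σv⟩. A minimal sketch; the tolerance and the rounded measured abundance are illustrative choices, not values from the paper:

```python
def relic_density_h2(sigma_v: float) -> float:
    """Rule-of-thumb relic abundance for a thermal WIMP.

    sigma_v -- thermally averaged annihilation cross section, cm^3/s.
    Uses the standard approximation Omega*h^2 ~ 3e-27 / <sigma v>.
    """
    return 3e-27 / sigma_v

OMEGA_H2_MEASURED = 0.12  # measured dark matter abundance (rounded)

def consistent_with_wimp(sigma_v: float, tol: float = 0.5) -> bool:
    """Crude check: does the reconstructed abundance match the measured one?"""
    rel_diff = abs(relic_density_h2(sigma_v) - OMEGA_H2_MEASURED)
    return rel_diff / OMEGA_H2_MEASURED < tol

# The canonical thermal cross section ~3e-26 cm^3/s lands near the
# measured abundance, the hallmark of a thermally produced WIMP.
```

A mismatch between the two quantities, as the abstract notes, would instead point to additional dark-sector physics or a non-standard thermal history.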

  17. Tandem accelerators in Romania: Multi-tools for science, education and technology

    NASA Astrophysics Data System (ADS)

    Burducea, I.; Ghiţă, D. G.; Sava, T. B.; Straticiuc, M.

    2017-06-01

    An educated selection of the main beam parameters (particle type, velocity and intensity) can result in a cutting-edge scalpel to remove tumors, sanitize sewage, act as a nuclear forensics detective, date an artefact, clean up air, improve a microprocessor, transmute nuclear waste, detect a counterfeit or even look into the stars. Nowadays, a vast number of particle accelerators operate worldwide in medicine, industry and basic research. For example, the proton therapy market is expected to reach US$1 billion per year in 2019, with almost 330 proton therapy rooms, while the annual market for the ion implantation industry has already reached US$1.5 billion in revenue [1,2]. A brief history of the Tandem Accelerators Complex at IFIN-HH [3], with emphasis on its applications and the physics behind the scenes, is also presented [4-6].

  18. Optimizing pulse shaping and zooming for acceleration to high velocities and fusion neutron production on the Nike laser

    NASA Astrophysics Data System (ADS)

    Karasik, Max; Weaver, J. L.; Aglitskiy, Y.; Zalesak, S. T.; Velikovich, A. L.; Oh, J.; Obenschain, S. P.; Arikawa, Y.; Watari, T.

    2010-11-01

    We will present results from follow-on experiments to the record-high velocities of 1000 km/s achieved on Nike [Karasik et al., Phys. Plasmas 17, 056317 (2010) ], in which highly accelerated planar foils of deuterated polystyrene were made to collide with a witness foil to produce extreme shock pressures and result in heating of matter to thermonuclear temperatures. Still higher velocities and higher target densities are required for impact fast ignition. The aim of these experiments is shaping the driving pulse to minimize shock heating of the accelerated target and using the focal zoom capability of Nike to achieve higher densities and velocities. Spectroscopic measurements of electron temperature achieved upon impact will complement the neutron time-of-flight ion temperature measurement. Work is supported by US DOE and Office of Naval Research.

  19. Comparison of different hadron production models for the study of π±, K±, protons and antiprotons production in proton-carbon interactions at 90 GeV/c

    NASA Astrophysics Data System (ADS)

    Ajaz, M.; Ali, Y.; Ullah, S.; Ali, Q.; Tabassam, U.

    2018-05-01

    In this paper, comprehensive results on the double differential yields of π± and K± mesons, protons and antiprotons as a function of laboratory momentum are reported in several polar angle ranges: 0-420 mrad for pions and 0-360 mrad for kaons, protons and antiprotons. The EPOS 1.99, EPOS-LHC and QGSJETII-04 models are used to perform simulations. The predictions of these models at 90 GeV/c are plotted for comparison, which shows that the QGSJETII-04 model gives an overall higher yield for π+ mesons in the polar angle interval of 0-40 mrad, but for π‑ the yield is higher only up to 20 mrad. For π+ mesons beyond 40 mrad, EPOS-LHC predicts a higher yield than EPOS 1.99 and QGSJETII-04, while for π‑ mesons EPOS-LHC and EPOS 1.99 give similar behavior in these intervals. For K± mesons, the QGSJETII-04 model gives higher predictions in all cases from 0-300 mrad, while EPOS 1.99 and EPOS-LHC show similar distributions. In the case of protons, all models give similar distributions, but this is not true for antiprotons. All models are in good agreement for p > 20 GeV/c. EPOS 1.99 produces a lower yield compared to the other two models in the 60-360 mrad polar angle interval.

  20. TEST/QA PLAN FOR THE VERIFICATION TESTING OF ALTERNATIVES OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSONS, AND LUBRICANTS FOR HIGHWAY AND NONROAD USE HEAVY DUTY DIESEL ENGINES AND LIGHT DUTY GASOLINE ENGINES AND VEHICLES

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  1. The US Spallation Neutron Source Project

    NASA Astrophysics Data System (ADS)

    Olsen, David K.

    1997-10-01

    Slow neutrons, with wavelengths between a few tenths and a few tens of angstroms, are an important probe for condensed-matter physics and are produced with either fission reactors or accelerator-based spallation sources. The Spallation Neutron Source (SNS) is a collaborative project between DOE National Laboratories including LBNL, LANL, BNL, ANL and ORNL to build the next research neutron source in the US. This source will be sited at ORNL and is being designed to serve the needs of the neutron science community well into the next century. The SNS consists of a 1.1-mA H- front end and a 1.0-GeV high-intensity pulsed proton linac. The 1-ms pulses from the linac will be compressed in a 221-m-circumference accumulator ring to produce 600-ns pulses at a 60-Hz rate. This accelerator system will produce spallation neutrons from a 1.0-MW liquid Hg target for a broad spectrum of neutron scattering research, with an initial target hall containing 18 instruments. The baseline conceptual design, critical issues, upgrade possibilities, and the collaborative arrangement will be discussed. It is expected that SNS construction will commence in FY99 and, following a seven-year project, start operation in 2006.
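    The quoted parameters are mutually consistent, as a quick back-of-the-envelope check shows: 1 GeV at roughly 1 mA average current gives about 1 MW on target, and a 600-ns compressed pulse fits within one revolution of the 221-m ring. A sketch of that arithmetic:

```python
import math

C = 2.99792458e8   # speed of light, m/s
MP_GEV = 0.93827   # proton rest mass, GeV

def beam_power_mw(energy_gev: float, avg_current_ma: float) -> float:
    """Average beam power in MW (1 GeV x 1 mA = 1 MW)."""
    return energy_gev * avg_current_ma

def revolution_time_ns(kinetic_gev: float, circumference_m: float) -> float:
    """Revolution period of a proton ring, in nanoseconds."""
    gamma = 1.0 + kinetic_gev / MP_GEV
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return circumference_m / (beta * C) * 1e9

# Numbers from the abstract: 1.0 GeV beam, 221 m accumulator ring.
power = beam_power_mw(1.0, 1.0)         # ~1 MW at ~1 mA average current
t_rev = revolution_time_ns(1.0, 221.0)  # ~840 ns, so a 600 ns pulse fits
```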

  2. Laser-driven ion acceleration at BELLA

    NASA Astrophysics Data System (ADS)

    Bin, Jianhui; Steinke, Sven; Ji, Qing; Nakamura, Kei; Treffert, Franziska; Bulanov, Stepan; Roth, Markus; Toth, Csaba; Schroeder, Carl; Esarey, Eric; Schenkel, Thomas; Leemans, Wim

    2017-10-01

    BELLA is a high-repetition-rate PW laser that we used for high-intensity laser plasma acceleration experiments. The BELLA-i program is focused on relativistic laser plasma interactions such as laser-driven ion acceleration, aiming at establishing a unique collaborative research facility providing beam time to selected external groups for fundamental physics and advanced applications. Here we present our first parameter study of ion acceleration driven by the BELLA PW laser at a truly high repetition rate. The laser repetition rate of 1 Hz allows scanning of the laser pulse duration, relative focus location and target thickness for the first time at laser peak powers above 1 PW. Furthermore, the long focal length geometry of the experiment (f/65), and hence the large focus size, provided ion beams of reduced divergence and unprecedented charge density. This work was supported by the Director, Office of Science, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.

  3. Simplified limits on resonances at the LHC

    NASA Astrophysics Data System (ADS)

    Chivukula, R. Sekhar; Ittisamai, Pawin; Mohan, Kirtimaan; Simmons, Elizabeth H.

    2016-11-01

    In the earliest stages of evaluating new collider data, especially if a small excess may be present, it would be useful to have a method for comparing the data with entire classes of models, to get an immediate sense of which classes could conceivably be relevant. In this paper, we propose a method that applies when the new physics invoked to explain the excess corresponds to the production and decay of a single, relatively narrow, s-channel resonance. A simplified model of the resonance allows us to convert an estimated signal cross section into general bounds on the product of the branching ratios corresponding to the dominant production and decay modes. This quickly reveals whether a given class of models could possibly produce a signal of the required size at the LHC. Our work sets up a general framework, outlines how it operates for resonances with different numbers of production and decay modes, and analyzes cases of current experimental interest, including resonances decaying to dibosons, diphotons, dileptons, or dijets. If the LHC experiments were to report their searches for new resonances beyond the standard model in the simplified limits variable ζ defined in this paper, that would make it far easier to avoid blind alleys and home in on the most likely candidate models to explain any observed excesses.

  4. LHC - a "Why" Facility

    ScienceCinema

    Gordon Kane

    2017-12-09

    The Standard Models of particle physics and cosmology describe the world we see, and how it works, very well. But we want to understand (not just accommodate) much more – how does the Higgs mechanism work, what is the dark matter, why is the universe matter and not antimatter, why is parity violated, why are the particles (quarks and leptons) what they are, and why are the forces that act on them to make our world what they are, and more. Today is an exciting time to be doing particle physics – on the experimental side we have data coming from LHC and dark matter experiments that will provide clues to these questions, and on the theoretical side we have a framework (string theory) that addresses all these “why” questions. LHC data will not qualitatively improve our description – rather, it may provide the data that will allow us to learn about the dark matter, the Higgs physics, the matter asymmetry, etc, to test underlying theories such as string theory, and begin to answer the “why” questions. Supersymmetry is the best motivated discovery, and it would also open a window to the underlying theory near the Planck scale.

  5. Beam-dynamics driven design of the LHeC energy-recovery linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pellegrini, Dario; Latina, Andrea; Schulte, Daniel

    The LHeC study is a possible upgrade of the LHC that aims at delivering an electron beam for collisions with the existing hadron beams. The current baseline design for the electron facility consists of a multi-pass superconducting energy-recovery linac (ERL) operating in continuous-wave mode. Here, we summarize the overall layout of the ERL complex located on the LHC site and introduce the most recent developments. We review the lattice components, presenting their baseline design along with possible alternatives that aim at improving the overall machine performance. The detector bypass has been designed and integrated into the lattice. Tracking simulations allowed us to verify the high-current (~150 mA in the linacs) beam operation required for the LHeC to serve as a Higgs factory. The impact of single- and multi-bunch wakefields, synchrotron radiation and beam-beam effects is assessed in this paper.

  6. Beam-dynamics driven design of the LHeC energy-recovery linac

    DOE PAGES

    Pellegrini, Dario; Latina, Andrea; Schulte, Daniel; ...

    2015-12-23

    The LHeC study is a possible upgrade of the LHC that aims at delivering an electron beam for collisions with the existing hadron beams. The current baseline design for the electron facility consists of a multi-pass superconducting energy-recovery linac (ERL) operating in continuous-wave mode. Here, we summarize the overall layout of the ERL complex located on the LHC site and introduce the most recent developments. We review the lattice components, presenting their baseline design along with possible alternatives that aim at improving the overall machine performance. The detector bypass has been designed and integrated into the lattice. Tracking simulations allowed us to verify the high-current (~150 mA in the linacs) beam operation required for the LHeC to serve as a Higgs factory. The impact of single- and multi-bunch wakefields, synchrotron radiation and beam-beam effects is assessed in this paper.

  7. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.

  8. Constraining anomalous Higgs boson couplings to the heavy-flavor fermions using matrix element techniques

    NASA Astrophysics Data System (ADS)

    Gritsan, Andrei V.; Röntsch, Raoul; Schulze, Markus; Xiao, Meng

    2016-09-01

    In this paper, we investigate anomalous interactions of the Higgs boson with heavy fermions, employing shapes of kinematic distributions. We study the processes pp → tt̄+H, bb̄+H, tq+H, and pp → H → τ⁺τ⁻ and present applications of event generation, reweighting techniques for fast simulation of anomalous couplings, as well as matrix element techniques for optimal sensitivity. We extend the matrix element likelihood approach (MELA) technique, which proved to be a powerful matrix element tool for Higgs boson discovery and characterization during Run I of the LHC, and implement all analysis tools in the JHU generator framework. A next-to-leading-order QCD description of the pp → tt̄+H process allows us to investigate the performance of the MELA in the presence of extra radiation. Finally, projections for LHC measurements through the end of Run III are presented.

  9. Top B physics at the LHC.

    PubMed

    Gedalia, Oram; Isidori, Gino; Maltoni, Fabio; Perez, Gilad; Selvaggi, Michele; Soreq, Yotam

    2013-06-07

    In top-pair events where at least one of the tops decays semileptonically, the identification of the lepton charge allows us to tag not only the top quark charge but also that of the subsequent b quark. In cases where the b also decays semileptonically, the charge of the two leptons can be used to probe CP violation in heavy flavor mixing and decays. This strategy to measure CP violation is independent of those adopted so far in experiments, and can already constrain non-standard-model sources of CP violation with current and near-future LHC data. To demonstrate the potential of this method we construct two CP asymmetries based on same-sign and opposite-sign leptons and estimate their sensitivities. This proposal opens a new window for precision measurements of CP violation in b and c quark physics via high-pT processes at ATLAS and CMS.

  10. Mass hierarchy and energy scaling of the Tsallis - Pareto parameters in hadron productions at RHIC and LHC energies

    NASA Astrophysics Data System (ADS)

    Bíró, Gábor; Barnaföldi, Gergely Gábor; Biró, Tamás Sándor; Shen, Keming

    2018-02-01

    The latest, high-accuracy identified hadron spectra measurements in high-energy nuclear collisions led us to investigate the strongly interacting particles and collective effects in small systems. Since microscopical processes result in a statistical Tsallis - Pareto distribution, the fit parameters q and T are well suited for identifying system-size scalings and initial conditions. Moreover, the parameter values provide information on the deviation from the extensive Boltzmann - Gibbs statistics in finite volumes. We apply here the fit procedure developed in our earlier study for proton-proton collisions [1, 2]. The observed mass and center-of-mass energy trends in hadron production are compared to RHIC dAu and LHC pPb data in different centrality/multiplicity classes. Here we present new results on mass hierarchy in pp and pA from light to heavy hadrons.
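    A minimal sketch of the Tsallis - Pareto spectrum shape used in such fits, which reduces to the Boltzmann - Gibbs exponential as q → 1. The parameter values below are illustrative round numbers, not the fitted ones from the study:

```python
import math

def tsallis_pareto(mt: float, q: float, T: float, A: float = 1.0) -> float:
    """Tsallis-Pareto transverse-mass spectrum shape.

    mt, T in GeV; q is the non-extensivity parameter.
    Reduces to the Boltzmann-Gibbs form A*exp(-mt/T) as q -> 1.
    """
    if abs(q - 1.0) < 1e-9:
        return A * math.exp(-mt / T)
    return A * (1.0 + (q - 1.0) * mt / T) ** (-1.0 / (q - 1.0))

# Illustrative pp-like parameters: q ~ 1.1, T ~ 0.15 GeV.
y_tsallis = tsallis_pareto(1.0, 1.1, 0.15)
y_boltzmann = tsallis_pareto(1.0, 1.0, 0.15)
# The power-law tail makes the Tsallis form fall off more slowly than
# the exponential at high transverse mass, which is what the fits exploit.
```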

  11. Preliminary consideration of a double, 480 GeV, fast cycling proton accelerator for production of neutrino beams at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piekarz, Henryk; Hays, Steven; /Fermilab

    We propose to build the DSF-MR (Double Super-Ferric Main Ring), a 480 GeV, fast-cycling (2-second repetition period) two-beam proton accelerator in the Main Ring tunnel of Fermilab. The accelerator design is based on the super-ferric magnet technology developed for the VLHC and extended recently to the proposed LER injector for the LHC and the fast-cycling SF-SPS at CERN. The DSF-MR accelerator system will constitute the final stage of a proton source enabling production of two neutrino beams separated by a 2-second interval. These beams will be sent alternately to two detectors located ~3000 km and ~7500 km from Fermilab. The combination of results from these experiments is expected to offer more than three orders of magnitude greater sensitivity for the detection and measurement of neutrino oscillations than any current experiment, and thus may truly open the window into physics beyond the Standard Model. We examine potential sites for the long-baseline neutrino detectors accepting beams from Fermilab. The current injection system, consisting of the 400 MeV Linac, 8 GeV Booster, and Main Injector, can be used to accelerate protons to 45 GeV before transferring them to the DSF-MR. The implementation of the DSF-MR will allow an 8-fold increase in beam power on the neutrino production target. In this note we outline the proposed new arrangement of the Fermilab accelerator complex. We also briefly describe the DSF-MR magnet design and its power supply, and discuss the necessary upgrade of the Tevatron RF system for use with the DSF-MR accelerator. Finally, we outline the required R&D, cost estimate, and possible timeline for implementation of the DSF-MR accelerator.

  12. US Particle Accelerators at Age 50.

    ERIC Educational Resources Information Center

    Wilson, R. R.

    1981-01-01

    Reviews the development of accelerators over the past 50 years. Topics include: types of accelerators, including cyclotrons; sociology of accelerators (motivation, financing, construction, and use); impact of war; national laboratories; funding; applications; future projects; foreign projects; and international collaborations. (JN)

  13. Measurement of the semileptonic CP violating asymmetry a(sl)s in B(s)0 decays and the D(s)+ - D(s)- production asymmetry in 7 TeV pp collisions

    NASA Astrophysics Data System (ADS)

    Xing, Zhou

    The Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) in Geneva is the world's largest and highest-energy particle accelerator. It is located in a tunnel with a circumference of 27 kilometers (17 miles), and its synchrotron is designed mainly to collide opposing proton beams with energies up to 7 TeV (in 2011) and 8 TeV (in 2012). The LHC is designed to address some of the fundamental open questions in physics regarding the basic laws governing the interactions and forces among the elementary particles. Among the four major experiments at the LHC, namely A Toroidal LHC Apparatus (ATLAS), the Compact Muon Solenoid (CMS), Large Hadron Collider beauty (LHCb), and A Large Ion Collider Experiment (ALICE), LHCb is the one that specializes in heavy-flavor physics, with the goal of measuring charge-parity violation (CPV) parameters and rare decays of beauty and charm hadrons. Such studies can help explain the matter-antimatter asymmetry of the Universe. CP-violation searches are performed at LHCb in a number of probing decay channels and systems. In B_s^0 - B̄_s^0 mixing, CP violation is expected to be tiny in the Standard Model, but it can be significantly enhanced in the presence of new CP-violating phases in generic new-physics models. This thesis presents, in Chapter 5, the measurement of the semileptonic asymmetry a_sl^s in the B_s^0 - B̄_s^0 mixing system at LHCb. The CP-violating asymmetry a_sl^s is studied using samples of B_s^0 and B̄_s^0 semileptonic decays in pp collisions at a centre-of-mass energy of 7 TeV, in a data sample corresponding to an integrated luminosity of 1 fb^-1 collected by LHCb. The detected final states are D_s^± μ^∓, with the D_s^± reconstructed in the φπ^± mode. The D_s^± μ^∓ yields are summed over untagged B_s^0 and B̄_s^0 initial states and integrated with respect to decay time. Data-driven methods are used to measure all the efficiency ratios needed to determine a_sl^s. 
We obtain a_sl^s = (-0.06 ± 0.50 ± 0.36)%, where the first uncertainty is statistical and the second systematic. Specific attention is drawn to an elegant data-driven approach, developed to determine the relative pion detection efficiency, as described in Chapter 3. It is a key building block of the a_sl^s measurement and can open many other doors to CP searches at LHCb. As a "litmus test" for this tool, we measure the D_s^+ - D_s^- production asymmetry in the φπ^± mode in 7 TeV pp collisions at the LHC in Chapter 4. Heavy-quark production in pp collisions at a 7 TeV center-of-mass energy at the LHC is not necessarily flavor symmetric. The production asymmetry, A_P, between D_s^+ and D_s^- mesons is studied using the φπ^± decay mode in a data sample of 1.0 fb^-1 collected with the LHCb detector. The difference between π^+ and π^- detection efficiencies is determined using the ratios of fully reconstructed to partially reconstructed D*^± decays. The overall production asymmetry in the D_s^± rapidity region 2.0 to 4.5, with transverse momentum larger than 2 GeV, is measured to be A_P = (-0.33 ± 0.22 ± 0.10)%. While theoretical predictions are difficult and vague, a precise measurement of the production asymmetry constrains future heavy-quark models and can be used as input for other CP measurements.
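The first-order logic of a detection-efficiency-corrected production asymmetry, as described above, can be sketched as follows. All yields and efficiencies below are invented for the sketch, not LHCb numbers; the actual analysis extracts the efficiency ratio from data using partially versus fully reconstructed D* decays.

```python
# Illustrative first-order correction of a raw charge asymmetry for a
# pi+/pi- detection-efficiency difference. All numbers are invented.

def raw_asymmetry(n_plus, n_minus):
    """A_raw = (N+ - N-) / (N+ + N-) from observed signal yields."""
    return (n_plus - n_minus) / (n_plus + n_minus)

def detection_asymmetry(eff_plus, eff_minus):
    """A_det = (eps+ - eps-) / (eps+ + eps-) from the pion efficiencies."""
    return (eff_plus - eff_minus) / (eff_plus + eff_minus)

a_raw = raw_asymmetry(100_300, 100_700)    # hypothetical D_s+ / D_s- yields
a_det = detection_asymmetry(0.850, 0.853)  # hypothetical pi+ / pi- efficiencies
a_prod = a_raw - a_det                     # valid to first order in small asymmetries
print(f"A_raw = {a_raw:+.4f}, A_det = {a_det:+.4f}, A_P ~ {a_prod:+.4f}")
```

Because both asymmetries are at the permille level, the linear subtraction is adequate: the neglected terms enter at O(A^2) ~ 10^-6, well below the quoted uncertainties.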

  14. Search strategies for pair production of heavy Higgs bosons decaying invisibly at the LHC

    NASA Astrophysics Data System (ADS)

    Arganda, E.; Diaz-Cruz, J. L.; Mileo, N.; Morales, R. A.; Szynkman, A.

    2018-04-01

    The search for heavy Higgs bosons at the LHC represents an intense experimental program, carried out by the ATLAS and CMS collaborations, which includes the hunt for invisible Higgs decays and dark-matter candidates. No significant deviations from the SM backgrounds have been observed in any of these searches, imposing significant constraints on the parameter space of new-physics models with an extended Higgs sector. Here we discuss an alternative search strategy for heavy Higgs bosons decaying invisibly at the LHC, focusing on the pair production of a heavy scalar H together with a pseudoscalar A through the production mode q q̄ → Z* → HA. We identify as the most promising signal the final state 4b + ET^miss, coming from the heavy-scalar decay mode H → hh → b b̄ b b̄, with h being the discovered SM-like Higgs boson with m_h = 125 GeV, together with the invisible channel of the pseudoscalar. We work within the context of simplified MSSM scenarios that contain quite heavy sfermions of most types, with O(10) TeV masses, while the stops are heavy enough to reproduce the 125 GeV mass of the lightest SM-like Higgs boson. By contrast, the gauginos/higgsinos and the heavy MSSM Higgs bosons have masses near the EW scale. Our search strategies, for an LHC center-of-mass energy of √s = 14 TeV, allow us to obtain statistical significances of the signal over the SM backgrounds of up to ~1.6σ and ~3σ for total integrated luminosities of 300 fb^-1 and 1000 fb^-1, respectively.
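The two quoted significances are consistent with the usual background-dominated scaling Z ∝ √L (signal and background both growing linearly with luminosity, systematics neglected). A quick check of that scaling, taking the 300 fb^-1 value from the abstract:

```python
import math

def scaled_significance(z_ref, lumi_ref, lumi_new):
    """Extrapolate a significance assuming S and B both scale linearly with
    luminosity, so Z ~ S / sqrt(B) grows as sqrt(L)."""
    return z_ref * math.sqrt(lumi_new / lumi_ref)

z_300 = 1.6  # significance quoted in the abstract at 300 fb^-1
z_1000 = scaled_significance(z_300, 300.0, 1000.0)
print(f"Extrapolated significance at 1000 fb^-1: {z_1000:.2f} sigma")  # 2.92 sigma
```

The result, ~2.9σ, matches the ~3σ quoted for 1000 fb^-1, confirming the statistics-limited interpretation of the projection.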

  15. Production of vector resonances at the LHC via WZ-scattering: a unitarized EChL analysis

    NASA Astrophysics Data System (ADS)

    Delgado, R. L.; Dobado, A.; Espriu, D.; Garcia-Garcia, C.; Herrero, M. J.; Marcano, X.; Sanz-Cillero, J. J.

    2017-11-01

    In the present work we study the production of vector resonances at the LHC by means of the vector boson scattering WZ → WZ and explore the sensitivity to these resonances for the expected future LHC luminosities. We assume that these vector resonances are generated dynamically from the self-interactions of the longitudinal gauge bosons, W_L and Z_L, and work within the framework of the electroweak chiral Lagrangian to describe, in a model-independent way, the supposedly strong dynamics of these modes. The properties of the vector resonances (mass, width, and couplings to the W and Z gauge bosons) are derived from the inverse amplitude method (IAM). We implement all these features into a single model, the IAM-MC, adapted for Monte Carlo use and built in a Lagrangian language in terms of the electroweak chiral Lagrangian plus a chiral Lagrangian for the vector resonances, which mimics the resonant behavior of the IAM and provides unitary amplitudes. The model has been implemented in MadGraph, allowing us to perform a realistic study of signal versus background events at the LHC. In particular, we focus on pp → WZjj events, first discussing the potential of the hadronic and semileptonic channels of the final WZ, and then exploring in more detail the cleanest signals. These are provided by the leptonic decays of the gauge bosons, leading to a final state with ℓ1+ ℓ1- ℓ2+ ν jj, ℓ = e, μ, which has a very distinctive signature and clearly shows the emergence of resonances with masses in the explored range of 1.5-2.5 TeV.

  16. R&D Toward a Neutrino Factory and Muon Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zisman, Michael S

    2011-03-20

    Significant progress has been made in recent years in R&D towards a neutrino factory and muon collider. The U.S. Muon Accelerator Program (MAP) has been formed recently to expedite the R&D efforts. This paper will review the U.S. MAP R&D programs for a neutrino factory and muon collider. Muon ionization cooling research is the key element of the program. The first muon ionization cooling demonstration experiment, MICE (Muon Ionization Cooling Experiment), is under construction now at RAL (Rutherford Appleton Laboratory) in the UK. The current status of MICE will be described.

  17. ARPA-E: Accelerating U.S. Energy Innovation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manser, Joseph S.; Rollin, Joseph A.; Brown, Kristen E.

    With aggressive commitments to mitigate the impacts of climate change and an emphasis on maintaining an advantage in technological development in an increasingly globalized marketplace, the U.S. government is actively taking measures to ensure the nation's environmental and economic health and sustainability. As part of this broader strategy, and with motivation from the National Academies,(1) the United States established the Advanced Research Projects Agency-Energy (ARPA-E) within the Department of Energy (DOE) through the America COMPETES Act in 2007.(2) The agency was allotted an initial appropriation of $400 million in 2009 as part of the American Recovery and Reinvestment Act.

  18. Reconstitution of the Light Harvesting Chlorophyll a/b Pigment-Protein Complex into Developing Chloroplast Membranes Using a Dialyzable Detergent 1

    PubMed Central

    Darr, Sylvia C.; Arntzen, Charles J.

    1986-01-01

    Conditions were developed to isolate the light-harvesting chlorophyll-protein complex serving photosystem II (LHC-II) using a dialyzable detergent, octylpolyoxyethylene. This LHC-II was successfully reconstituted into partially developed chloroplast thylakoids of Hordeum vulgare var Morex (barley) seedlings which were deficient in LHC-II. Functional association of LHC-II with the photosystem II (PSII) core complex was measured by two independent functional assays of PSII sensitization by LHC-II. A 3-fold excess of reconstituted LHC-II was required to equal the activity of LHC developing in vivo. We suggest that a linker component, required for specific association of the PSII core complex and LHC-II, may be absent in the partially developed membranes. PMID:16664744

  19. Historical review of tactical missile airframe developments

    NASA Technical Reports Server (NTRS)

    Spearman, M. L.

    1992-01-01

    A comprehensive development history of missile airframe aerodynamics is presented, encompassing ground-, ground vehicle-, ship-, and air-launched categories of all ranges short of strategic. Emphasis is placed on the swift acceleration of missile configuration aerodynamics by German researchers in the course of the Second World War and by U.S. research establishments thereafter, often on the foundations laid by German workers. Examples are given of foundational airframe design criteria established by systematic researches undertaken in the 1950s, regarding L/D ratios, normal force and pitching moment characteristics, minimum drag forebodies and afterbodies, and canard and delta winged configuration aerodynamics.

  20. Theoretical and Experimental Studies in Accelerator Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenzweig, James

    This report describes research supported by the US Dept. of Energy Office of High Energy Physics (OHEP), performed by the UCLA Particle Beam Physics Laboratory (PBPL). The UCLA PBPL has, over the last two decades and more, played a critical role in the development of advanced accelerators, fundamental beam physics, and new applications enabled by these thrusts, such as new types of accelerator-based light sources. As the PBPL mission is broad, it is natural that it has grown within the context of the accelerator science and technology stewardship of the OHEP. Indeed, steady OHEP support for the program has always been central to the success of the PBPL; it has provided stability, and above all has set the over-arching themes for our research directions, which have produced over 500 publications (>120 in high-level journals). While other agency support has grown notably in recent years, permitting more vigorous pursuit of the program, it is transient by comparison. Beyond permitting program growth in a time of flat OHEP budgets, the influence of other agency missions is found in the push to adapt advanced accelerator methods to applications, in light of the success the field has had in proof-of-principle experiments supported first by the DoE OHEP. This three-pronged PBPL program (advanced accelerators, fundamental beam physics and technology, and revolutionary applications) has produced a generation of students that have had a profound effect on the US accelerator physics community. PBPL graduates, numbering 28 in total, form a significant group in the accelerator community, playing key roles as university faculty, scientific leaders in national labs (two have been named Panofsky Fellows at SLAC), and vigorous proponents of industrial application of accelerators. 
Indeed, the development of advanced RF, optical, and magnet technology at the PBPL has led directly to the spin-off company RadiaBeam Technologies, now a leading industrial accelerator firm. We note also that PBPL graduates remain close collaborators with the program after leaving UCLA. The UCLA PBPL program is a foremost developer of on-campus facilities, such as the Neptune and Pegasus Laboratories, providing a uniquely strong environment for student-based research. In addition, the PBPL is a strong user of off-campus national-lab facilities, such as SLAC FACET and NLCTA, and the BNL ATF. UCLA has also participated vigorously in the development of these facilities. The dual emphases on off- and on-campus opportunities permit the PBPL to address, in an agile way, a wide selection of cutting-edge research topics. The topics embraced by this proposal illustrate this program aspect well. These include: GV/m dielectric wakefield acceleration/coherent Cerenkov radiation experiments at FACET (E-201) and the ATF; synergistic laser-excited dielectric accelerator and light-source development; plasma wakefield acceleration (PWFA) experiments on “Trojan horse” ionization injection (FACET E-210), quasi-nonlinear PWFA at BNL, and the production at Neptune of high-transformer-ratio plasma wakes; the inauguration of a new type of RF photoinjector, termed “hybrid,” at UCLA, and its application to PWFA; space-charge-dominated beam and cathode/near-cathode physics; the study of advanced IFEL systems, for very high energy gain and utilization of novel OAM modes; the physics of inverse Compton scattering (ICS), with applications to e+ production and γγ colliders; electron diffraction; and advanced beam diagnostics using coherent imaging techniques. These subjects are addressed under the leadership of PBPL director Prof. James Rosenzweig in Task A, and Prof. Pietro Musumeci in Task J, which was initiated following his OHEP Outstanding Junior Investigator award.

  1. Use of zooming and pulseshaping for acceleration to high velocities and fusion neutron production on the Nike laser

    NASA Astrophysics Data System (ADS)

    Karasik, Max; Weaver, J. L.; Aglitskiy, Y.; Kehne, D. M.; Zalesak, S. T.; Velikovich, A. L.; Oh, J.; Obenschain, S. P.; Arikawa, Y.

    2011-10-01

    We will present results from follow-on experiments to the record-high velocities of 1000 km/s achieved on Nike [Karasik et al., Phys. Plasmas 17, 056317 (2010)], in which highly accelerated planar foils of deuterated polystyrene were made to collide with a witness foil, producing ~1 Gbar shock pressures and heating matter to thermonuclear temperatures. Still higher velocities and higher target densities are required for impact fast ignition. The aim of these experiments is to use the focal-zoom capability of Nike and to shape the driving pulse so as to minimize shock heating of the accelerated target, achieving higher densities and velocities. The in-flight target density is inferred from target heating upon collision via a DD neutron time-of-flight ion-temperature measurement. Work is supported by the US DOE (NNSA) and the Office of Naval Research.

  2. High peak current acceleration of narrow divergence ions beams with the BELLA-PW laser

    NASA Astrophysics Data System (ADS)

    Steinke, Sven; Ji, Qing; Treffert, Franziska; Bulanov, Stepan; Bin, Jianhui; Nakamura, Kei; Gonsalves, Anthony; Toth, Csaba; Park, Jaehong; Roth, Markus; Esarey, Eric; Schenkel, Thomas; Leemans, Wim

    2017-10-01

    We present a parameter study of ion acceleration driven by the BELLA-PW laser. The laser repetition rate of 1 Hz allowed scanning of the laser pulse duration, relative focus location, and target thickness for the first time at laser peak powers above 1 petawatt. Further, the long-focal-length geometry of the experiment (f/65), and hence the large focus size, provided ion beams of reduced divergence and unprecedented charge density. This work was supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231 and by Laboratory Directed Research and Development (LDRD) funding from Lawrence Berkeley National Laboratory.

  3. Experimental Results from a Resonant Dielectric Laser Accelerator

    NASA Astrophysics Data System (ADS)

    Yoder, Rodney; McNeur, Joshua; Sozer, Esin; Travish, Gil; Hazra, Kiran Shankar; Matthews, Brian; England, Joel; Peralta, Edgar; Wu, Ziran

    2015-04-01

    Laser-powered accelerators have the potential to operate with very large accelerating gradients (~ GV/m) and represent a path toward extremely compact colliders and accelerator technology. Optical-scale laser-powered devices based on field-shaping structures (known as dielectric laser accelerators, or DLAs) have been described and demonstrated recently. Here we report on the first experimental results from the Micro-Accelerator Platform (MAP), a DLA based on a slab-symmetric resonant optical-scale structure. As a resonant (rather than near-field) device, the MAP is distinct from other DLAs. Its cavity resonance enhances its accelerating field relative to the incoming laser fields, which are coupled efficiently through a diffractive optic on the upper face of the device. The MAP demonstrated modest accelerating gradients in recent experiments, in which it was powered by a Ti:Sapphire laser well below its breakdown limit. More detailed results and some implications for future developments will be discussed. Supported in part by the U.S. Defense Threat Reduction Agency (UCLA); U.S. Dept of Energy (SLAC); and DARPA (SLAC).

  4. Accelerated Leach Testing of GLASS: ALTGLASS Version 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trivelpiece, Cory L.; Jantzen, Carol M.; Crawford, Charles L.

    The Accelerated Leach Testing of GLASS (ALTGLASS) database is a collection of data from short- and long-term product consistency tests (PCT, ASTM C1285 A and B) on high-level waste (HLW) as well as low-activity waste (LAW) glasses. The database provides both U.S. and international researchers with an archive of experimental data for the purpose of studying, modeling, or validating existing models of nuclear waste glass corrosion. The ALTGLASS database is maintained and updated by researchers at the Savannah River National Laboratory (SRNL). The newest version, ALTGLASS Version 3.0, has been updated with an additional 503 rows of data representing PCT results from corrosion experiments conducted in the United States by the Savannah River National Laboratory, Pacific Northwest National Laboratory, Argonne National Laboratory, and the Vitreous State Laboratory (SRNL, PNNL, ANL, and VSL, respectively), as well as by the National Nuclear Laboratory (NNL) in the United Kingdom.

  5. Overview of Heavy Ion Fusion Accelerator Research in the U. S.

    NASA Astrophysics Data System (ADS)

    Friedman, Alex

    2002-12-01

    This article provides an overview of current U.S. research on accelerators for Heavy Ion Fusion, that is, inertial fusion driven by intense beams of heavy ions with the goal of energy production. The concept, beam requirements, approach, and major issues are introduced. An overview of a number of new experiments is presented. These include: the High Current Experiment now underway at Lawrence Berkeley National Laboratory; studies of advanced injectors (in particular, an approach based on the merging of multiple beamlets) being investigated experimentally at Lawrence Livermore National Laboratory; the Neutralized (chamber) Transport Experiment being assembled at Lawrence Berkeley National Laboratory; and smaller experiments at the University of Maryland and at Princeton Plasma Physics Laboratory. The comprehensive program of beam simulations and theory is outlined. Finally, prospects and plans for further development of this promising approach to fusion energy are discussed.

  6. Phenotype characterization of embryoid body structures generated by a crystal comet effect tail in an intercellular cancer collision scenario.

    PubMed

    Diaz, Jairo A; Murillo, Mauricio F

    2012-01-01

    Cancer is, by definition, the uncontrolled growth of autonomous cells that eventually destroy adjacent tissues and generate architectural disorder. However, this concept cannot be totally true. In three well documented studies, we have demonstrated that cancer tissues produce order zones that evolve over time and generate embryoid body structures in a space-time interval. The authors decided to revise the macroscopic and microscopic material in well-developed malignant tumors in which embryoid bodies were identified to determine the phenotype characterization that serves as a guideline for easy recognition. The factors responsible for this morphogenesis are physical, bioelectric, and magnetic susceptibilities produced by crystals that act as molecular designers for the topographic gradients that guide the surrounding silhouette and establish tissue head-tail positional identities. The structures are located in amniotic-like cavities and show characteristic somite-like embryologic segmentation. Immunophenotypic study has demonstrated exclusion factor positional identity in relation to enolase-immunopositive expression of embryoid body and human chorionic gonadotropin immunopositivity exclusion factor expression in the surrounding tissues. The significance of these observations is that they can also be predicted by experimental image data collected by the Large Hadron Collider (LHC) accelerator at the European Organization for Nuclear Research, in which two-beam subatomic collision particles in the resulting debris show hyperorder domains similar to those identified by us in intercellular cancer collisions. Our findings suggest that we are dealing with true reverse biologic system information in an activated collective cancer stem cell memory, in which physics participates in the elaboration of geometric complexes and chiral biomolecules that serve to build bodies with embryoid print as it develops during gestation. 
Reversal mechanisms in biology are intimately linked with DNA repair. Further genotype studies must be carried out to determine whether the subproducts of these structures can be used in novel strategies to treat cancer.

  8. MR-2016 US-Japan Workshop on Magnetic Reconnection Travel Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forest, Cary

    The US-Japan Workshop on Magnetic Reconnection (MR2016) was held in Napa, California, from March 7th through 11th, 2016. Details about the program, including invited speakers, can be found here: (http://www.magneticreconnection.org/mr2016). Background: The MR Workshop is an international meeting that began in 2000, with its original focus on magnetic reconnection serving as a link between research groups in the US and Japan. Since then, the meeting has grown and is now recognized as one of the primary international workshops on magnetic reconnection. In this format, researchers from the laboratory community and from the space research community have held 12 workshops bringing together diverse researchers from the space and laboratory experimental fields. Plasma physics is the common language that ties together all scientists who study waves, particle acceleration and heating, magnetic reconnection, dynamos, global and micro-stability of plasmas, magnetic turbulence, and plasma transport problems. The meeting received $9,575 in funding from the U.S. Dept. of Energy. This support was used to cover the registration fees ($575 per person) and accommodations for ten junior colleagues (graduate students and postdocs). Applications were solicited and then reviewed by the program committee based on recommendations from the applicants' advisers.

  9. Simulation of orientational coherent effects via Geant4

    NASA Astrophysics Data System (ADS)

    Bagli, E.; Asai, M.; Brandt, D.; Dotti, A.; Guidi, V.; Verderi, M.; Wright, D.

    2017-10-01

    Beam manipulation of high- and very-high-energy particle beams is a hot topic in accelerator physics. Coherent effects of ultra-relativistic particles in bent crystals allow the steering of particle trajectories thanks to the strong electric field generated between atomic planes. Recently, a collimation experiment with bent crystals was carried out at the CERN LHC, paving the way for the use of such technology in current and future accelerators. Geant4 is a widely used object-oriented toolkit for the Monte Carlo simulation of the interaction of particles with matter in high-energy physics. Its areas of application also include nuclear and accelerator physics, as well as studies in medical and space science. We present the first Geant4 extension for the simulation of orientational effects in straight and bent crystals for high-energy charged particles. The model allows the manipulation of particle trajectories by means of straight and bent crystals and the scaling of the cross sections of hadronic and electromagnetic processes for channeled particles. Based on this model, an extension of the Geant4 toolkit has been developed. The code and the model have been validated by comparison with published experimental data on deflection efficiency via channeling and on the variation of the rate of inelastic nuclear interactions.

  10. pp interaction at very high energies in cosmic ray experiments

    NASA Astrophysics Data System (ADS)

    Kendi Kohara, A.; Ferreira, Erasmo; Kodama, Takeshi

    2014-11-01

    An analysis of p-air cross section data from extensive air shower measurements is presented, based on an analytical representation of the pp scattering amplitudes that describes with high precision all available accelerator data at ISR, SPS, and LHC energies. The theoretical basis of the representation, together with the very smooth energy dependence of parameters controlled by unitarity and dispersion relations, permits reliable extrapolation to the high-energy cosmic-ray (CR) and asymptotic energy ranges. Calculations of σ_p-air^prod based on the Glauber formalism are made using as inputs the quantities σ, ρ, B_I, and B_R at high energies, with attention given to the independence of the slope parameters B_I and B_R.

  11. Laboratory and field testing of an accelerated bridge construction demonstration bridge : US Highway 6 bridge over Keg Creek.

    DOT National Transportation Integrated Search

    2013-04-01

    The US Highway 6 Bridge over Keg Creek outside of Council Bluffs, Iowa, is a demonstration bridge site chosen to put into practice newly developed Accelerated Bridge Construction (ABC) concepts. One of these new concepts is the use of prefabricated ...

  12. An experimental research program on chirality at the LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markert, Christina

    Heavy-ion collisions provide a unique opportunity to investigate the fundamental laws of physics of the strong force. The extreme conditions created by the collisions within a finite volume are akin to the properties of the deconfined partonic state which existed very shortly after the Big Bang and just prior to visible matter formation in the Universe. In this state massless quarks and gluons (partons) are "quasi-free" particles, the so-called Quark Gluon Plasma (QGP). By following the expansion and cooling of this state, we will map out the process of nucleonic matter formation, which occurs during the phase transition. The fundamental properties of this early partonic phase of matter are not well understood, but they are essential for confirming QCD (Quantum Chromo-Dynamics) and the Standard Model. The specific topic, chiral symmetry restoration, has been called "the remaining puzzle of QCD." This puzzle can only be studied in the dense partonic medium generated in heavy-ion collisions. The research objectives of this proposal are the development and application of new analysis strategies to study chirality and the properties of the medium above the QGP phase transition using hadronic resonances detected with the ALICE experiment at the Large Hadron Collider (LHC) at the CERN research laboratory in Switzerland. This grant funded a new effort at the University of Texas at Austin (UT Austin) to investigate the Quark Gluon Plasma (QGP) at the highest possible energy of 2.76 TeV per nucleon at the Large Hadron Collider (LHC) at CERN via the ALICE experiment. The findings added to our knowledge of the dynamical evolution and the properties of the hot, dense matter produced in heavy-ion collisions, and provided a deeper understanding of multi-hadron interactions in these extreme nuclear matter systems. Our group also contributed to the hardware and software for the ALICE USA-funded Calorimeter Detector (EMCal).
The LHC research program and its connection to fundamental questions in high-energy, nuclear and astrophysics have triggered the imagination of many young students worldwide. The studies also promoted the early involvement of students and young postdocs in a large, multi-national research effort abroad, which provided them with substantial experience and skills prior to choosing their career path. The undergraduate program, in conjunction with the Freshman Research Initiative at UT Austin, allowed the students to complete a research project within the field of Nuclear Physics.

  13. A review of polymer electrolyte membrane fuel cell durability test protocols

    NASA Astrophysics Data System (ADS)

    Yuan, Xiao-Zi; Li, Hui; Zhang, Shengsheng; Martin, Jonathan; Wang, Haijiang

    Durability is one of the major barriers to polymer electrolyte membrane fuel cells (PEMFCs) being accepted as a commercially viable product. It is therefore important to understand their degradation phenomena and analyze degradation mechanisms from the component level to the cell and stack level so that novel component materials can be developed and novel designs for cells/stacks can be achieved to mitigate insufficient fuel cell durability. It is generally impractical and costly to operate a fuel cell under its normal conditions for several thousand hours, so accelerated test methods are preferred to facilitate rapid learning about key durability issues. Based on the US Department of Energy (DOE) and US Fuel Cell Council (USFCC) accelerated test protocols, as well as degradation tests performed by researchers and published in the literature, we review degradation test protocols at both component and cell/stack levels (driving cycles), aiming to gather the available information on accelerated test methods and degradation test protocols for PEMFCs, and thereby provide practitioners with a useful toolbox to study durability issues. These protocols help prevent the prolonged test periods and high costs associated with real lifetime tests, assess the performance and durability of PEMFC components, and ensure that the generated data can be compared.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    East, D. R.; Sexton, J.

    This was a collaborative effort between Lawrence Livermore National Security, LLC, as manager and operator of Lawrence Livermore National Laboratory (LLNL), and the IBM TJ Watson Research Center to research, assess the feasibility of, and develop an implementation plan for a High Performance Computing Innovation Center (HPCIC) in the Livermore Valley Open Campus (LVOC). The ultimate goal of this work was to help advance the State of California's and U.S. commercial competitiveness in the arena of High Performance Computing (HPC) by accelerating the adoption of computational science solutions, consistent with recent DOE strategy directives. The desired result of this CRADA was a well-researched, carefully analyzed market evaluation that would identify those firms in core sectors of the US economy seeking to adopt or expand their use of HPC to become more competitive globally, and define how those firms could be helped by the HPCIC with IBM as an integral partner.

  15. Feasibility Studies for Single Transverse-Spin Asymmetry Measurements at a Fixed-Target Experiment Using the LHC Proton and Lead Beams (AFTER@LHC)

    DOE PAGES

    Kikoła, Daniel; Echevarria, Miguel García; Hadjidakis, Cynthia; ...

    2017-05-17

    Measurement of the single transverse-spin asymmetry $$A_N$$ for various quarkonium states and Drell-Yan lepton pairs can shed light on the orbital angular momentum of quarks and gluons, a fundamental ingredient of the spin puzzle of the proton. The AFTER@LHC experiment combines a unique kinematic coverage with the large luminosities of the Large Hadron Collider beams to deliver precise measurements, complementary to the knowledge provided by collider experiments such as RHIC. Here, we report on sensitivity studies for $$J/\Psi$$, $$\Upsilon$$ and Drell-Yan $$A_N$$ done using the performance of LHCb-like and ALICE-like detectors, combined with polarised hydrogen and $^3$He targets. In particular, such research will provide new insights into transverse-momentum-dependent parton distribution functions for quarks and gluons and into twist-3 collinear matrix elements in the proton and the neutron.

  16. Qualification of Sub-Atmospheric Pressure Sensors for the Cryomagnet Bayonet Heat Exchangers of the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Bager, T.; Casas-Cubillos, J.; Jeanmonod, N.

    2006-04-01

    The superconducting magnets of the Large Hadron Collider (LHC) will be cooled to 1.9 K by distributed cooling loops working with saturated two-phase superfluid helium flowing in 107 m long bayonet heat exchangers located in each magnet cold-mass cell. The temperature of the magnets could be difficult to control because of the large dynamic heat load variations. It is therefore foreseen to measure the heat-exchanger pressure and feed the regulation loops with the corresponding saturation temperature. The required uncertainty of the sub-atmospheric saturation pressure measurement shall be of the same order as that of the magnet thermometers, which in pressure translates to ±5 Pa at 1.6 kPa. The transducers shall be radiation hard, as they will endure, in the worst case, doses up to 10 kGy and 10^15 neutrons·cm^-2 over 10 years. The sensors under evaluation were installed underground in the dump section of the SPS accelerator, with a radiation environment close to the one expected for the LHC. The monitoring equipment was installed in a remote radiation-protected area. This paper presents the results of the radiation qualification campaign with emphasis on the reliability and accuracy of the pressure sensors under the test conditions.

  17. Integration of Panda Workload Management System with supercomputers

    NASA Astrophysics Data System (ADS)

    De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Novikov, A.; Oleynik, D.; Panitkin, S.; Poyda, A.; Read, K. F.; Ryabinkin, E.; Teslyuk, A.; Velikhov, V.; Wells, J. C.; Wenaus, T.

    2016-09-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited with the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing across over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3+ petaFLOPS, the next LHC data-taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at the integration of the PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and for local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes.
This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms. We will present our current accomplishments in running PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facility's infrastructure for High Energy and Nuclear Physics, as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
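    The light-weight wrapper pattern described above, many independent single-threaded payloads packed onto one multi-core node, can be sketched as follows. This is an illustrative stand-in, not PanDA's actual pilot code: it uses Python threads in place of MPI ranks to keep the sketch self-contained, and all names are assumptions.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def run_payload(task):
        """Stand-in for one single-threaded workload (e.g. one Monte-Carlo job).

        In the real system each MPI rank runs one payload per core of a worker
        node; here the payload just squares a number so the sketch runs anywhere.
        """
        task_id, seed = task
        return task_id, seed * seed

    def node_wrapper(tasks, cores=4):
        """Fan independent single-threaded tasks out across `cores` workers,
        mimicking a wrapper that packs serial jobs onto a multi-core node.
        (Threads are used here only to keep the example portable.)"""
        with ThreadPoolExecutor(max_workers=cores) as pool:
            return dict(pool.map(run_payload, tasks))

    results = node_wrapper([(i, i + 1) for i in range(8)])
    ```

    The key design point is that each payload stays serial and unmodified; only the thin wrapper knows about the node's core count and the batch system's allocation.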

  18. Report of an EU-US symposium on understanding nutrition-related consumer behavior: strategies to promote a lifetime of healthy food choices.

    PubMed

    Friedl, Karl E; Rowe, Sylvia; Bellows, Laura L; Johnson, Susan L; Hetherington, Marion M; de Froidmont-Görtz, Isabelle; Lammens, Veerle; Hubbard, Van S

    2014-01-01

    This report summarizes an EU-US Task Force on Biotechnology Research symposium on healthy food choices and nutrition-related purchasing behaviors. This meeting was unique in its transdisciplinary approach to obesity and in bringing together scientists from academia, government, and industry. Discussion relevant to funders and researchers centered on (1) increased use of public-private partnerships, (2) the complexity of food behaviors and obesity risk and multilevel aspects that must be considered, and (3) the importance of transatlantic cooperation and collaboration that could accelerate advances in this field. A call to action stressed these points along with a commitment to enhanced communication strategies. Copyright © 2014 Society for Nutrition Education and Behavior. All rights reserved.

  19. ELIMED: a new hadron therapy concept based on laser driven ion beams

    NASA Astrophysics Data System (ADS)

    Cirrone, Giuseppe A. P.; Margarone, Daniele; Maggiore, Mario; Anzalone, Antonello; Borghesi, Marco; Jia, S. Bijan; Bulanov, Stepan S.; Bulanov, Sergei; Carpinelli, Massimo; Cavallaro, Salvatore; Cutroneo, Mariapompea; Cuttone, Giacomo; Favetta, Marco; Gammino, Santo; Klimo, Ondrej; Manti, Lorenzo; Korn, Georg; La Malfa, Giuseppe; Limpouch, Jiri; Musumarra, Agatino; Petrovic, Ivan; Prokupek, Jan; Psikal, Jan; Ristic-Fira, Aleksandra; Renis, Marcella; Romano, Francesco P.; Romano, Francesco; Schettino, Giuseppe; Schillaci, Francesco; Scuderi, Valentina; Stancampiano, Concetta; Tramontana, Antonella; Ter-Avetisyan, Sargis; Tomasello, Barbara; Torrisi, Lorenzo; Tudisco, Salvo; Velyhan, Andriy

    2013-05-01

    Laser-accelerated proton beams have been proposed for use in different research fields. Great interest has arisen in the potential replacement of conventional accelerating machines with laser-based accelerators, and in particular in the development of new concepts for more compact and cheaper hadrontherapy centers. In this context the ELIMED (ELI MEDical applications) research project has been launched by INFN-LNS and ASCR-FZU researchers within the pan-European ELI-Beamlines facility framework. The ELIMED project aims to demonstrate the potential clinical applicability of optically accelerated proton beams and to realize a laser-accelerated ion transport beamline for multi-disciplinary user applications. In this framework ocular melanoma, for instance uveal melanoma, normally treated with 62 MeV proton beams produced by standard accelerators, will be considered as a model system to demonstrate the potential clinical use of laser-driven protons in hadrontherapy, especially because of the limited constraints in terms of proton energy and irradiation geometry for this particular tumour treatment. Several challenges, from laser-target interaction and beam transport development up to dosimetry and radiobiology, need to be overcome in order to reach the ELIMED final goals. A crucial role will be played by the final design and realization of a transport beamline capable of providing ion beams with the proper characteristics in terms of energy spectrum and angular distribution, allowing dosimetric tests and biological cell irradiation to be performed. A first prototype of the transport beamline has already been designed, and other transport elements are under construction in order to perform a first experimental test with the TARANIS laser system by the end of 2013.
A wide international collaboration of specialists from different disciplines such as physics, biology, chemistry and medicine, including medical doctors, from Europe, Japan, and the US is growing around the ELIMED project, with the aim of working on the conceptual design and the technical and experimental realization of this core beamline of the ELI-Beamlines facility.

  20. Connecting LHC signals with deep physics at the TeV scale and baryogenesis

    NASA Astrophysics Data System (ADS)

    Shu, Jing

    We address in this dissertation two primary questions aimed at deciphering collider signals at the Large Hadron Collider (LHC): to build a deep and concrete understanding of TeV-scale physics, and to interpret the origin of the baryon asymmetry in our universe. We are at a stage of exploring new physics at the terascale, which is responsible for electroweak symmetry breaking (EWSB) in the Standard Model (SM) of particle physics. The LHC, which begins its operation this year, will take us into this new energy frontier and search for possible signals of new physics. Theorists have proposed many models beyond the SM to explain the origin of EWSB. However, how we will determine the underlying physics from LHC data is still an open question. In the first part of this dissertation, we consider several examples that connect the expected LHC signals to the underlying physics in a completely model-independent way. We first explore the Randall-Sundrum (RS) scenario, and use the collider signals of the first Kaluza-Klein (KK) excitations of gluons to discriminate among several commonly considered theories which attempt to render RS consistent with precision electroweak data. We then investigate top compositeness. We derive a bound on the energy scale of right-handed top compositeness from top pair production at the Tevatron, and we find that the cross section to produce four tops will be amplified by 3 orders of magnitude. We next consider the possibility that the gauge symmetry of the underlying theory is violated in the incomplete theory that we can reconstruct from LHC observables. We derive a model-independent bound on the scale of new physics from unitarity of the S-matrix if we observe a new massive vector boson with nonzero axial couplings to fermions at the LHC. Finally, we derive a generalized Landau-Yang theorem and apply it to the Z' decay into two Z bosons.
We show that there is a phase shift in the azimuthal angle distribution in the normalized differential cross section, and that the anomalous Z'-Z-Z coupling can be discriminated from the regular one at the 3σ level when both Z bosons decay leptonically at the LHC. The origin of the baryon asymmetry of the Universe (BAU) remains an important, unsolved problem for particle physics and cosmology, and is one of the motivations to search for possible new physics beyond the SM. In the second part of this dissertation, we attempt to account for the baryon number generation in our universe through some novel mechanisms. We first systematically investigate models of baryogenesis from a spontaneously Lorentz-violating background (SLVB). We find that sphaleron transitions will generate a nonzero B+L asymmetry in the presence of SLVB, and we identify two scenarios of interest. We then consider the possibility of generating a baryon asymmetry through an earlier-time phase transition, and address the question of whether we can still test the baryogenesis mechanism at the LHC/ILC if the electroweak phase transition is not strongly first order. We find a general framework and realize this idea in the top flavor model. We show that a realistic baryon density can be achieved in the natural parameter space of the top flavor model.

  1. Expected Improvements in Work Truck Efficiency Through Connectivity and Automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walkowicz, Kevin A

    This presentation focuses on the potential impact of connected and automated technologies on commercial vehicle operations. It includes topics such as the U.S. Department of Energy's Energy Efficient Mobility Systems (EEMS) program and the Systems and Modeling for Accelerated Research in Transportation (SMART) Mobility Initiative. It also describes National Renewable Energy Laboratory (NREL) research findings pertaining to the potential energy impacts of connectivity and automation and stresses the need for integration and optimization to take advantage of the benefits offered by these transformative technologies while mitigating the potential negative consequences.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Derrick, M.

    These proceedings document a number of aspects of a big science facility and its impact on science, on technology, and on the continuing program of a major US research institution. The Zero Gradient Synchrotron (ZGS) was a 12.5 GeV weak-focusing proton accelerator that operated at Argonne for fifteen years, from 1964 to 1979. It was a major user facility which led to new close links between the Laboratory and university groups: in the research program; in the choice of experiments to be carried out; in the design and construction of beams and detectors; and even in the Laboratory management. For Argonne, it marked a major move from being a Laboratory dominated by nuclear reactor development to one with a stronger basic research orientation. The present meeting covered the progress in accelerator science, in the applications of technology pioneered or developed by people working at the ZGS, as well as in physics research and detector construction. At this time, when the future of US research programs in science is being questioned as a result of the ending of the Cold War and plans to balance the Federal budget, the specific place of the National Laboratories in the spectrum of research activities is under particular examination. This Symposium highlights one case history of a major science program that was completed more than a decade ago, so that the further developments of both the science and the technology can be seen in some perspective. The subsequent activities of the people who had worked in the ZGS program, as well as the redeployment of the ZGS facilities, were addressed. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  3. The Awful Truth About Zero-Gravity: Space Acceleration Measurement System; Orbital Acceleration Research Experiment

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Earth's gravity holds the Shuttle in orbit, as it does satellites and the Moon. The apparent weightlessness experienced by astronauts and experiments on the Shuttle is a balancing act, the result of free-fall, or continuously falling around Earth. An easy way to visualize what is happening is with a thought experiment that Sir Isaac Newton did in 1686. Newton envisioned a mountain extending above Earth's atmosphere so that friction with the air would be eliminated. He imagined a cannon atop the mountain and aimed parallel to the ground. Firing the cannon propels the cannonball forward. At the same time, Earth's gravity pulls the cannonball down to the surface and eventual impact. Newton visualized using enough powder to just balance gravity so the cannonball would circle the Earth. Like the cannonball, objects orbiting Earth are in continuous free-fall, and it appears that gravity has been eliminated. Yet, that appearance is deceiving. Activities aboard the Shuttle generate a range of accelerations that have effects similar to those of gravity. The crew works and exercises. The main data relay antenna quivers 17 times per second to prevent 'stiction,' where parts stick then release with a jerk. Cooling pumps, air fans, and other systems add vibration. And traces of Earth's atmosphere, even 200 miles up, drag on the Shuttle. While imperceptible to us, these vibrations can have a profound impact on the commercial research and scientific experiments aboard the Shuttle. Measuring these forces is necessary so that researchers and scientists can see what may have affected their experiments when analyzing data. On STS-107 this service is provided by the Space Acceleration Measurement System for Free Flyers (SAMS-FF) and the Orbital Acceleration Research Experiment (OARE). Precision data from these two instruments will help scientists analyze data from their experiments and eliminate outside influences from the phenomena they are studying during the mission.

  4. Results and prospects in multi-messenger particle astrophysics

    NASA Astrophysics Data System (ADS)

    Mostafa, Miguel

    2017-01-01

    In high-energy particle astrophysics the old days were certainly not better than these. Our field has thrived in the past decade with experiments covering thousands of square kilometers to measure the suppression in the flux of the highest energy cosmic rays ever observed, instrumenting a cubic kilometer of Antarctic ice to discover astrophysical neutrinos, and measuring a change in arm length as small as 10^-19 m for the ground-breaking direct observation of gravitational waves. Additionally, the current generation of space-borne and ground-based gamma-ray experiments have revealed a plethora of gamma-ray sources, including pulsars, compact binaries, the galactic center, and extragalactic sources such as starburst galaxies and radio galaxies. Before the next generation of instruments bring us yet another order of magnitude in sensitivity, we can combine current observations to probe physics beyond the standard model, and to extend the high-energy frontier well above the energies accessible to laboratory accelerators. One example of this potential is the search for dark-matter annihilation and decay products. To use the multi-messenger approach effectively for probing dark-matter signatures and physics beyond the LHC energy requires understanding the origin (or acceleration mechanism) and the propagation processes. High energy protons and nuclei, neutrinos, gamma-rays, X-rays, and gravitational waves bring new and complementary views of the astrophysical sources. By comparing observations through different windows, we can use the sites of violent phenomena as a laboratory to probe the physical processes under extreme conditions throughout the Universe, and to test the fundamental laws of particle physics and gravitation. As a community we need to engage in a bold synergistic approach to understanding the violent processes that give rise to the high-energy cosmic phenomena in the Universe.
In this invited talk, I will present on-going multi-messenger studies to obtain new information about cosmic sources, and I will discuss the prospects of combining data from the electromagnetic, particle, and gravitational windows to advance high energy astrophysics into a new era.

  5. Achieving production-level use of HEP software at the Argonne Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Uram, T. D.; Childers, J. T.; LeCompte, T. J.; Papka, M. E.; Benjamin, D.

    2015-12-01

    HEP's demand for computing resources has grown beyond the capacity of the Grid, and these demands will accelerate with the higher energy and luminosity planned for Run II. Mira, the ten-petaFLOPS supercomputer at the Argonne Leadership Computing Facility, is a potentially significant compute resource for HEP research. Through an award of fifty million hours on Mira, we have delivered millions of events to LHC experiments by establishing the means of marshaling jobs through serial stages on local clusters, and parallel stages on Mira. We are running several HEP applications, including Alpgen, Pythia, Sherpa, and Geant4. Event generators, such as Sherpa, typically have a split workload: a small-scale integration phase, and a second, more scalable, event-generation phase. To accommodate this workload on Mira we have developed two Python-based Django applications, Balsam and ARGO. Balsam is a generalized scheduler interface which uses a plugin system for interacting with scheduler software such as HTCondor, Cobalt, and TORQUE. ARGO is a workflow manager that submits jobs to instances of Balsam. Through these mechanisms, the serial and parallel tasks within jobs are executed on the appropriate resources. This approach and its integration with the PanDA production system will be discussed.
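    The plugin idea behind a generalized scheduler interface like Balsam's can be sketched as a small registry of scheduler adapters. The class, method, and scheduler names below are illustrative assumptions, not Balsam's real API:

    ```python
    from abc import ABC, abstractmethod

    class SchedulerPlugin(ABC):
        """Minimal adapter contract a batch-scheduler plugin must satisfy."""
        name = "base"

        @abstractmethod
        def submit(self, script_path):
            """Submit a job script; return a scheduler-assigned job id."""

    _REGISTRY = {}

    def register(cls):
        """Class decorator adding a plugin instance to the registry by name."""
        _REGISTRY[cls.name] = cls()
        return cls

    @register
    class CobaltPlugin(SchedulerPlugin):
        """Hypothetical Cobalt adapter; a real one would shell out to qsub."""
        name = "cobalt"

        def submit(self, script_path):
            return "cobalt-1234"   # fake job id so the sketch stays runnable

    def submit(scheduler, script_path):
        """Dispatch to whichever scheduler backend was registered."""
        return _REGISTRY[scheduler].submit(script_path)

    job_id = submit("cobalt", "job.sh")
    ```

    The workflow manager above the interface (ARGO, in the paper's architecture) then only ever calls the generic `submit`, so adding support for a new batch system means writing one adapter class, not touching the workflow logic.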

  6. Precision timing detectors with cadmium-telluride sensor

    NASA Astrophysics Data System (ADS)

    Bornheim, A.; Pena, C.; Spiropulu, M.; Xie, S.; Zhang, Z.

    2017-09-01

    Precision timing detectors for high energy physics experiments with temporal resolutions of a few 10 ps are of pivotal importance to master the challenges posed by the highest energy particle accelerators such as the LHC. Calorimetric timing measurements have been a focus of recent research, enabled by exploiting the temporal coherence of electromagnetic showers. Scintillating crystals with high light yield as well as silicon sensors are viable sensitive materials for sampling calorimeters. Silicon sensors have very high efficiency for charged particles. However, their sensitivity to photons, which comprise a large fraction of the electromagnetic shower, is limited. To enhance the efficiency of detecting photons, materials with higher atomic numbers than silicon are preferable. In this paper we present test beam measurements with a Cadmium-Telluride (CdTe) sensor as the active element of a secondary emission calorimeter with focus on the timing performance of the detector. A Schottky type CdTe sensor with an active area of 1cm2 and a thickness of 1 mm is used in an arrangement with tungsten and lead absorbers. Measurements are performed with electron beams in the energy range from 2 GeV to 200 GeV. A timing resolution of 20 ps is achieved under the best conditions.
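    A standard way to extract a single-channel resolution such as the 20 ps figure quoted above is from the spread of time differences between two identical channels measuring the same particle: since the two smearings are independent, σ(Δt) = √2·σ_single. A minimal sketch with simulated Gaussian data (the 20 ps input is taken from the abstract; everything else is illustrative):

    ```python
    import random
    import statistics

    random.seed(1)

    # Simulate two identical timing channels seeing the same particle,
    # each smeared by an assumed 20 ps single-channel resolution.
    sigma_single_ps = 20.0
    n = 20000
    dt = [random.gauss(0, sigma_single_ps) - random.gauss(0, sigma_single_ps)
          for _ in range(n)]

    # The difference of two independent channels has sigma = sqrt(2) * sigma_single,
    # so the per-channel resolution is recovered as stdev(dt) / sqrt(2).
    sigma_recovered = statistics.stdev(dt) / 2 ** 0.5
    ```

    In a real test beam the same construction is applied to the measured Δt distribution between the device under test and a reference counter, with the reference's known resolution subtracted in quadrature.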

  7. Beam Dynamics Considerations in Electron Ion Colliders

    NASA Astrophysics Data System (ADS)

    Krafft, Geoffrey

    2015-04-01

    The nuclear physics community is converging on the idea that the next large project after FRIB should be an electron-ion collider. Both Brookhaven National Lab and Thomas Jefferson National Accelerator Facility have developed accelerator designs, both of which need novel solutions to accelerator physics problems. In this talk we discuss some of the problems that must be solved and their solutions. Examples in novel beam optics systems, beam cooling, and beam polarization control will be presented. Authored by Jefferson Science Associates, LLC under U.S. DOE Contract No. DE-AC05-06OR23177. The U.S. Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce this manuscript for U.S. Government purposes.

  8. Precision searches in dijets at the HL-LHC and HE-LHC

    NASA Astrophysics Data System (ADS)

    Chekanov, S. V.; Childers, J. T.; Proudfoot, J.; Wang, R.; Frizzell, D.

    2018-05-01

    This paper explores the physics reach of the High-Luminosity Large Hadron Collider (HL-LHC) for searches of new particles decaying to two jets. We discuss inclusive searches in dijets and b-jets, as well as searches in semi-inclusive events by requiring an additional lepton that increases sensitivity to different aspects of the underlying processes. We discuss the expected exclusion limits for generic models predicting new massive particles that result in resonant structures in the dijet mass. Prospects of the Higher-Energy LHC (HE-LHC) collider are also discussed. The study is based on the Pythia8 Monte Carlo generator using representative event statistics for the HL-LHC and HE-LHC running conditions. The event samples were created using supercomputers at NERSC.
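    The resonant structures searched for above appear as bumps in the dijet invariant mass, m_jj² = (E₁+E₂)² − |p⃗₁+p⃗₂|². A minimal calculation from two massless-jet four-vectors (illustrative kinematics, not the paper's analysis code):

    ```python
    import math

    def dijet_mass(pt1, eta1, phi1, pt2, eta2, phi2):
        """Invariant mass of two massless jets given (pt, eta, phi).

        Uses E = pt*cosh(eta) and pz = pt*sinh(eta) for massless four-vectors.
        """
        def four_vec(pt, eta, phi):
            return (pt * math.cosh(eta),        # E
                    pt * math.cos(phi),         # px
                    pt * math.sin(phi),         # py
                    pt * math.sinh(eta))        # pz

        e1, px1, py1, pz1 = four_vec(pt1, eta1, phi1)
        e2, px2, py2, pz2 = four_vec(pt2, eta2, phi2)
        m2 = ((e1 + e2) ** 2 - (px1 + px2) ** 2
              - (py1 + py2) ** 2 - (pz1 + pz2) ** 2)
        return math.sqrt(max(m2, 0.0))

    # Two back-to-back central jets of 500 GeV pt reconstruct to m_jj = 1 TeV.
    m = dijet_mass(500.0, 0.0, 0.0, 500.0, 0.0, math.pi)
    ```

    A search then histograms m_jj over the full event sample and fits the smooth QCD continuum, flagging any localized excess as a resonance candidate.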

  9. Unveiling the top secrets with the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Chierici, R.

    2013-12-01

    Top quark physics is one of the pillars of fundamental research in the field of high energy physics. It not only gives access to precision measurements for constraining the Standard Model of particles and interactions but also it represents a privileged domain for new physics searches. This contribution summarizes the main results in top quark physics obtained with the two general-purpose detectors ATLAS and CMS during the first two years of operations of the Large Hadron Collider (LHC) at CERN. It covers the 2010 and 2011 data taking periods, where the LHC ran at a centre-of-mass energy of 7 TeV.

  10. U.S. Department of Energy physical protection upgrades at the Latvian Academy of Sciences Nuclear Research Center, Latvia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haase, M.; Hine, C.; Robertson, C.

    1996-12-31

    Approximately five years ago, the Safe, Secure Dismantlement program was started between the US and countries of the Former Soviet Union (FSU). The purpose of the program is to accelerate progress toward reducing the risk of nuclear weapons proliferation, including such threats as theft, diversion, and unauthorized possession of nuclear materials. This would be accomplished by strengthening the material protection, control, and accounting systems within the FSU countries. Under the US Department of Energy's program of providing cooperative assistance to the FSU countries in the areas of Material Protection, Control, and Accounting (MPC and A), the Latvian Academy of Sciences Nuclear Research Center (LNRC) near Riga, Latvia, was identified as a candidate site for a cooperative MPC and A project. The LNRC is the site of a 5-megawatt IRT-C pool-type research reactor. This paper describes: the process involved, from initial contracting to project completion, for the physical protection upgrades now in place at the LNRC; the intervening activities; and a brief overview of the technical aspects of the upgrades.

  11. The ALICE TPC Upgrade

    NASA Astrophysics Data System (ADS)

    Castro, Andrew; Alice-Usa Collaboration; Alice-Tpc Collaboration

    2017-09-01

    The Time Projection Chamber (TPC) currently used by ALICE (A Large Ion Collider Experiment at CERN) is a gaseous tracking detector used to study both proton-proton and heavy-ion collisions at the Large Hadron Collider (LHC). To accommodate the higher-luminosity collisions planned for LHC Run 3 starting in 2021, the ALICE TPC will undergo a major upgrade during the next LHC shutdown. The TPC is limited to a readout rate of 1000 Hz for minimum-bias events due to the intrinsic dead time associated with ion back-flow in the multi-wire proportional chambers (MWPCs) in the TPC. The TPC upgrade will handle the increase in event readout to 50 kHz for heavy-ion minimum-bias triggered events expected at the Run-3 luminosity by replacing the MWPCs with stacks of four Gaseous Electron Multiplier (GEM) foils. The GEM layers will combine different hole pitches to reduce the dead time while maintaining the current spatial and energy resolution of the existing TPC. Undertaking the upgrade of the TPC represents a massive endeavor in terms of design, production, construction, quality assurance, and installation, and the upgrade is therefore coordinated across a number of institutes worldwide. The talk will cover the physics motivation for the upgrade, the ALICE-USA contribution to the construction of Inner Readout Chambers (IROCs), and quality assurance from the first chambers built in the U.S.

  12. Soviet Free-Electron Laser Research

    DTIC Science & Technology

    1985-05-01

    Free-electron lasers can generate narrow-band electromagnetic radiation over a wide frequency range that can potentially extend from microwaves through the visible. The excerpt also includes Table 2, a comparison of Soviet and U.S. high-current FEL experiments on pulse-line accelerators; representative parameters include a 3 cm wavelength, 10 MW power, 0.7 μs pulse length, and 1.5% efficiency (Columbia hollow-electron-beam experiment, 2 February 1977 [9]).

  13. How America Can Look Within to Achieve Energy Security and Reduce Global Warming

    DTIC Science & Technology

    2008-09-01

    The excerpt lists study participants, including the Linear Accelerator Center (Stanford University), Maxine Savitz, Daniel Sperling (University of California, Davis), Don Von Dollen (Electric Power Research Institute), Anant Vyas (Argonne National Laboratory), and E.D. Tate (General Motors). It also references a figure (Figure 6, U.S. fuel economy vs. fuel efficiency, passenger cars; source: Lutsey and Sperling, 2005) and the observation that the U.S. consumes more than any other nation on Earth except China [EIA, 2007b].

  14. Ribbon electron beam formation by a forevacuum plasma electron source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klimov, A. S., E-mail: klimov@main.tusur.ru; Burdovitsin, V. A.; Grishkov, A. A.

    2016-01-15

    Results of the numerical analysis and experimental research on ribbon electron beam generation based on a hollow-cathode discharge at forevacuum gas pressures are presented. The geometry of the accelerating gap has been modified, which lets us focus the ribbon electron beam and transport it over a distance of several tens of centimeters in the absence of an axial magnetic field. The results of the numerical simulations are confirmed by experiment.

  15. 46th Annual Gun and Missile Systems Conference and Exhibition. Volume 2. Wednesday

    DTIC Science & Technology

    2011-09-01

    The excerpt covers designing for operational challenges in gun-hardened systems, including multiple charges and angular acceleration variation; an observation that the industrial base overestimated readiness at SDD start, with naive analyses/models of impulsive loads and pressure variation; and contributors including the Manufacture and Producibility Branch, US Army Armament Research, Development and Engineering Center, and Alan Sweet and William Goldberg, Packaging Division.

  16. Nb3Sn SRF Cavities for Nuclear Physics Applications

    NASA Astrophysics Data System (ADS)

    Eremeev, Grigory

    2017-01-01

    Nuclear physics experiments rely increasingly on accelerators that employ superconducting RF (SRF) technology. CEBAF, SNS, FRIB, and ESS, among others, exploit the low surface resistance of SRF cavities to efficiently accelerate particle beams towards experimental targets. Niobium is the cavity material of choice for all current or planned SRF accelerators, but it has long been recognized that other superconductors with higher superconducting transition temperatures have the potential to surpass niobium for SRF applications. Among the alternatives, Nb3Sn-coated cavities are the most advanced on the path to practical applications: Nb3Sn coatings on R&D cavities have Tc consistently close to the optimal 18 K and very low RF surface resistances, and very recently were shown to reach fields above Hc1 without an anomalous increase in RF surface resistance. In my talk I will discuss the prospects of Nb3Sn SRF cavities, the research efforts to realize Nb3Sn coatings on practical multi-cell accelerating structures, and the path toward possible inclusion in CEBAF. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics.

  17. An assessment of research and development leadership in advanced batteries for electric vehicles

    NASA Astrophysics Data System (ADS)

    Bruch, V. L.

    1994-02-01

    Due to the recently enacted California regulations requiring that zero-emission vehicles be sold in the marketplace by 1998, electric vehicle research and development (R&D) is accelerating. Much of the R&D work is focusing on the Achilles' heel of electric vehicles -- advanced batteries. This report provides an assessment of the R&D work currently underway in advanced batteries and electric vehicles in the following countries: Denmark, France, Germany, Italy, Japan, Russia, and the United Kingdom. Although the US can be considered one of the leading countries in terms of advanced battery and electric vehicle R&D work, it lags other countries, particularly France, in producing and promoting electric vehicles. The US is focusing strictly on regulations to promote electric vehicle usage, while other countries are using a wide variety of policy instruments (regulations, educational outreach programs, tax breaks and subsidies) to encourage the use of electric vehicles. The US should consider implementing additional policy instruments to ensure a domestic market exists for electric vehicles. The domestic market is the largest and most important one for the US auto industry.

  18. Clinical research for neuropathies.

    PubMed

    Kaufmann, Petra

    2012-05-01

    The National Institutes of Health (NIH) has a long-standing commitment to neuropathy research. From 2005-2009, the NIH has committed US $115 million each year. A collaborative effort between researchers and patients can accelerate the translation of pre-clinical discoveries into better treatments for neuropathy patients. Clinical trials are needed to test these new treatments, but they can only be implemented in a timely fashion if patients with neuropathies are willing to participate. This perspective focuses on the value of having various outlets for informing both the patients and the physicians about existing clinical research opportunities and on the potential benefit of establishing patient registries to help with trial recruitment. Once data have been collected, there is a need to broadly share the data in order to inform future trials, and a first step would be to harmonize data collection by using Common Data Elements (CDEs). Published 2012. This article is a U.S. Government work and is in the public domain in the USA.

  19. Particle Theory & Cosmology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shafi, Qaisar; Barr, Steven; Gaisser, Thomas

    1. Executive Summary (April 1, 2012 - March 31, 2015) Title: Particle Theory, Particle Astrophysics and Cosmology. Qaisar Shafi, University of Delaware (Principal Investigator); Stephen M. Barr, University of Delaware (Co-Principal Investigator); Thomas K. Gaisser, University of Delaware (Co-Principal Investigator); Todor Stanev, University of Delaware (Co-Principal Investigator). The proposed research was carried out at the Bartol Research Institute; the group included Professors Qaisar Shafi, Stephen Barr, Thomas K. Gaisser, and Todor Stanev, two postdoctoral fellows (Ilia Gogoladze and Liucheng Wang), and several graduate students. Five students of Qaisar Shafi completed their PhD during the period August 2011 - August 2014. Measures of the group's high-caliber performance during the 2012-2015 funding cycle included publications in excellent refereed journals, contributions to working groups as well as white papers, and conference activities, which together provide an exceptional record of both individual performance and overall strength. Another important indicator of success is the outstanding quality of the past and current cohort of graduate students. The PhD students under our supervision regularly win the top departmental and university awards, and their publication records show excellence in both quality and quantity. The topics covered under this grant span the frontline research areas in today's High Energy Theory & Phenomenology. For Professors Shafi and Barr these include LHC-related topics such as supersymmetry, collider physics, flavor physics, dark matter physics, Higgs boson and seesaw physics, grand unification, and neutrino physics. Two years ago the LHC discovered the Standard Model Higgs boson, thereby at least partially unlocking the secrets behind electroweak symmetry breaking. We remain optimistic that new and exciting physics will be found at LHC 14, which explains our focus on physics beyond the Standard Model.
Professor Shafi continued his investigations in cosmology, specifically on supergravity and GUT inflation models, primordial gravity waves, and dark matter models. The origin of baryon and dark matter in the universe has been explored by Professors Barr and Shafi. The research program of Professors Gaisser and Stanev addresses current research topics in Particle Astrophysics, in particular atmospheric and cosmogenic neutrinos and ultra-high energy cosmic rays. Work also included the use of LHC data to improve tools for interpreting cascades generated in the atmosphere by high-energy particles from the cosmos. Cosmogenic neutrinos produced by interactions of ultra-high energy cosmic rays as they propagate through the cosmic microwave background radiation provide insight into the origin of the highest energy particles in nature. Overall, the research covered topics in the energy, cosmic and intensity frontiers.

  20. A beam radiation monitor based on CVD diamonds for SuperB

    NASA Astrophysics Data System (ADS)

    Cardarelli, R.; Di Ciaccio, A.

    2013-08-01

    Chemical Vapor Deposition (CVD) diamond particle detectors are in use in the CERN experiments at the LHC and at particle accelerator laboratories in Europe, the USA and Japan, mainly as beam monitors. CVD diamond is now considered a proven technology, with very fast signal readout and very high radiation tolerance, suitable for measurements in high-radiation zones, i.e., near accelerator beam pipes. The specific properties of CVD diamonds make them a prime candidate for measuring single particles as well as high-intensity particle cascades, for timing measurements on the sub-nanosecond scale, and for beam protection systems in hostile environments. A single-crystalline CVD (scCVD) diamond sensor, read out with a new generation of fast, high-transition-frequency SiGe bipolar transistor amplifiers, has been tested as a radiation monitor to safeguard the silicon vertex tracker of the SuperB detector from excessive radiation damage, cumulative dose and instantaneous dose rates. Test results with 5.5 MeV alpha particles from a 241Am radioactive source and with electrons from a 90Sr radioactive source are presented in this paper.

  1. Pinning down the large-x gluon with NNLO top-quark pair differential distributions

    NASA Astrophysics Data System (ADS)

    Czakon, Michał; Hartland, Nathan P.; Mitov, Alexander; Nocera, Emanuele R.; Rojo, Juan

    2017-04-01

    Top-quark pair production at the LHC is directly sensitive to the gluon PDF at large x. While total cross-section data are already included in several PDF determinations, differential distributions are not, because the corresponding NNLO calculations have become available only recently. In this work we study the impact on the large-x gluon of top-quark pair differential distributions measured by ATLAS and CMS at √s = 8 TeV. Our analysis, performed in the NNPDF3.0 framework at NNLO accuracy, allows us to identify the optimal combination of LHC top-quark pair measurements that maximizes the constraints on the gluon, as well as to assess the compatibility between ATLAS and CMS data. We find that differential distributions from top-quark pair production provide significant constraints on the large-x gluon, comparable to those obtained from inclusive jet production data, and thus should become an important ingredient for the next generation of global PDF fits.

  2. Search for heavy right-handed neutrinos at the LHC and beyond in the same-sign same-flavor leptons final state

    NASA Astrophysics Data System (ADS)

    Ng, John N.; de la Puente, Alejandro; Pan, Bob Wei-Ping

    2015-12-01

    In this study we explore the LHC Run II potential for the discovery of heavy Majorana neutrinos, with luminosities between 30 and 3000 fb-1, in the l±l± jj final state. Given that there exist many models for neutrino mass generation, even within the Type I seesaw framework, we use a simplified-model approach and study two simple extensions of the Standard Model: one with a single heavy Majorana neutrino, a singlet under the Standard Model gauge group, and a limiting case of the left-right symmetric model. We then extend the analysis to a future hadron collider running at 100 TeV center-of-mass energy. This extrapolation in energy allows us to study the relative importance of resonant production versus gauge boson fusion processes in the study of Majorana neutrinos at hadron colliders. We analyze and propose different search strategies designed to maximize the discovery potential in either the resonant production or the gauge boson fusion modes.

  3. AmeriFlux US-IB2 Fermi National Accelerator Laboratory- Batavia (Prairie site)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matamala, Roser

    2016-01-01

    This is the AmeriFlux version of the carbon flux data for the site US-IB2 Fermi National Accelerator Laboratory- Batavia (Prairie site). Site Description - Two eddy correlation systems are installed at Fermi National Accelerator Laboratory: one on a restored prairie (established October 2004) and one on a corn/soybean rotation agricultural field (established in July 2005). The prairie site had been farmed for more than 100 years but was converted to prairie in 1989. Annual to bi-annual prescribed burns in April took place from 1994 to 2007.

  4. Acceleration in U.S. Mean Sea Level? A New Insight using Improved Tools

    NASA Astrophysics Data System (ADS)

    Watson, Phil J.

    2016-08-01

    The detection of acceleration in mean sea level around the data-rich margins of the United States has been a keen endeavour of sea-level researchers following the seminal work of Bruce Douglas in 1992. Over the past decade, such investigations have taken on greater prominence, given that mean sea level remains a key proxy by which to measure a changing climate system. Physics-based climate projection models forecast that the current global average rate of mean sea-level rise (≈3 mm/y) might climb to rates in the range of 10 to 20 mm/y by 2100. Most research in this area has centred on reconciling current rates of rise with the significant accelerations required to meet the forecast projections of climate models. The analysis in this paper is based on a recently developed analytical package titled "msltrend," specifically designed to enhance estimates of trend, real-time velocity and acceleration in the relative mean sea-level signal derived from long annual average ocean-water-level time series. Key findings are that, at the 95% confidence level, no consistent or substantial evidence (yet) exists that recent rates of rise are higher or abnormal in the context of the historical records available for the United States, nor does any evidence exist that geocentric rates of rise are above the global average. It is likely that a further 20 years of data will identify whether recent increases east of Galveston and along the east coast are evidence of the onset of climate-change-induced acceleration.
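    The notions of trend and acceleration in a mean-sea-level record can be illustrated with the simplest version of the idea: an ordinary least-squares quadratic fit y = a + b·t + (c/2)·t², where b is the rate (mm/y) and c the acceleration (mm/y²). This is a hedged sketch of the classic approach, not the "msltrend" package itself (an R package with far more sophisticated time-series decomposition); the synthetic record and its coefficients are invented for illustration.

    ```python
    def sea_level_acceleration(years, msl_mm):
        """Estimate the linear rate (mm/yr) and acceleration (mm/yr^2) of an
        evenly spaced annual mean-sea-level series via a least-squares
        quadratic fit y = a + b*t + (c/2)*t**2. Centering the time axis makes
        t orthogonal to both the constant and t**2 on a symmetric grid, so the
        fit decouples into two simple regressions."""
        n = len(years)
        tbar = sum(years) / n
        t = [y - tbar for y in years]
        ybar = sum(msl_mm) / n
        # Linear rate: simple regression of y on centered t.
        rate = sum(ti * yi for ti, yi in zip(t, msl_mm)) / sum(ti * ti for ti in t)
        # Quadratic coefficient: regression of (y - ybar) on centered t**2.
        t2 = [ti * ti for ti in t]
        t2bar = sum(t2) / n
        num = sum((a - t2bar) * (yi - ybar) for a, yi in zip(t2, msl_mm))
        den = sum((a - t2bar) ** 2 for a in t2)
        return rate, 2.0 * num / den  # acceleration c = 2 * quadratic coefficient

    # Synthetic 96-year record: 3 mm/yr trend plus 0.02 mm/yr^2 acceleration.
    years = list(range(1920, 2016))
    tbar = sum(years) / len(years)
    msl = [7000.0 + 3.0 * (y - tbar) + 0.5 * 0.02 * (y - tbar) ** 2 for y in years]
    rate, accel = sea_level_acceleration(years, msl)
    print(round(rate, 4), round(accel, 4))  # recovers 3.0 and 0.02
    ```

    Because the synthetic series is an exact quadratic, the fit recovers the coefficients to machine precision; on real tide-gauge data the same estimator carries large uncertainties, which is why the abstract stresses confidence levels and record length.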

  5. Future cancer research priorities in the USA: a Lancet Oncology Commission.

    PubMed

    Jaffee, Elizabeth M; Dang, Chi Van; Agus, David B; Alexander, Brian M; Anderson, Kenneth C; Ashworth, Alan; Barker, Anna D; Bastani, Roshan; Bhatia, Sangeeta; Bluestone, Jeffrey A; Brawley, Otis; Butte, Atul J; Coit, Daniel G; Davidson, Nancy E; Davis, Mark; DePinho, Ronald A; Diasio, Robert B; Draetta, Giulio; Frazier, A Lindsay; Futreal, Andrew; Gambhir, Sam S; Ganz, Patricia A; Garraway, Levi; Gerson, Stanton; Gupta, Sumit; Heath, James; Hoffman, Ruth I; Hudis, Cliff; Hughes-Halbert, Chanita; Ibrahim, Ramy; Jadvar, Hossein; Kavanagh, Brian; Kittles, Rick; Le, Quynh-Thu; Lippman, Scott M; Mankoff, David; Mardis, Elaine R; Mayer, Deborah K; McMasters, Kelly; Meropol, Neal J; Mitchell, Beverly; Naredi, Peter; Ornish, Dean; Pawlik, Timothy M; Peppercorn, Jeffrey; Pomper, Martin G; Raghavan, Derek; Ritchie, Christine; Schwarz, Sally W; Sullivan, Richard; Wahl, Richard; Wolchok, Jedd D; Wong, Sandra L; Yung, Alfred

    2017-11-01

    We are in the midst of a technological revolution that is providing new insights into human biology and cancer. In this era of big data, we are amassing large amounts of information that is transforming how we approach cancer treatment and prevention. Enactment of the Cancer Moonshot within the 21st Century Cures Act in the USA arrived at a propitious moment in the advancement of knowledge, providing nearly US$2 billion of funding for cancer research and precision medicine. In 2016, the Blue Ribbon Panel (BRP) set out a roadmap of recommendations designed to exploit new advances in cancer diagnosis, prevention, and treatment. Those recommendations provided a high-level view of how to accelerate the conversion of new scientific discoveries into effective treatments and prevention for cancer. The US National Cancer Institute is already implementing some of those recommendations. As experts in the priority areas identified by the BRP, we bolster those recommendations to implement this important scientific roadmap. In this Commission, we examine the BRP recommendations in greater detail and expand the discussion to include additional priority areas, including surgical oncology, radiation oncology, imaging, health systems and health disparities, regulation and financing, population science, and oncopolicy. We prioritise areas of research in the USA that we believe would accelerate efforts to benefit patients with cancer. Finally, we hope the recommendations in this report will facilitate new international collaborations to further enhance global efforts in cancer control. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Accelerating reproductive and child health programme impact with community-based services: the Navrongo experiment in Ghana.

    PubMed Central

    Phillips, James F.; Bawah, Ayaga A.; Binka, Fred N.

    2006-01-01

    OBJECTIVE: To determine the demographic and health impact of deploying health service nurses and volunteers to village locations with a view to scaling up results. METHODS: A four-celled plausibility trial was used for testing the impact of aligning community health services with the traditional social institutions that organize village life. Data from the Navrongo Demographic Surveillance System that tracks fertility and mortality events over time were used to estimate impact on fertility and mortality. RESULTS: Assigning nurses to community locations reduced childhood mortality rates by over half in 3 years and accelerated the time taken for attainment of the child survival Millennium Development Goal (MDG) in the study areas to 8 years. Fertility was also reduced by 15%, representing a decline of one birth in the total fertility rate. Programme costs added US$ 1.92 per capita to the US$ 6.80 per capita primary health care budget. CONCLUSION: Assigning nurses to community locations where they provide basic curative and preventive care substantially reduces childhood mortality and accelerates progress towards attainment of the child survival MDG. Approaches using community volunteers, however, have no impact on mortality. The results also demonstrate that increasing access to contraceptive supplies alone fails to address the social costs of fertility regulation. Effective deployment of volunteers and community mobilization strategies offsets the social constraints on the adoption of contraception. The research in Navrongo thus demonstrates that affordable and sustainable means of combining nurse services with volunteer action can accelerate attainment of both the International Conference on Population and Development agenda and the MDGs. PMID:17242830

  7. Proposal to search for mu- N -> e- N with a single event sensitivity below 10 -16

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carey, R.M.; Lynch, K.R.; Miller, J.P.

    2008-10-01

    We propose a new experiment, Mu2e, to search for charged lepton flavor violation with unprecedented sensitivity. We will measure the ratio of the coherent neutrinoless conversion of a negatively charged muon into an electron in the field of a nucleus to the muon capture process: R{sub {mu}e} = [{mu}{sup -} + A(Z,N) {yields} e{sup -} + A(Z,N)] / [{mu}{sup -} + A(Z,N) {yields} {nu}{sub {mu}} + A(Z-1,N)], with a sensitivity R{sub {mu}e} {le} 6 x 10{sup -17} at 90% CL. This is almost a four order-of-magnitude improvement over the existing limit. The observation of such a process would be unambiguous evidence of physics beyond the Standard Model. Since the discovery of the muon in 1936, physicists have attempted to answer I.I. Rabi's famous question: 'Who ordered that?' Why is there a muon? What role does it play in the larger questions of why there are three families and flavors of quarks, leptons, and neutrinos? We know quarks mix through a mechanism described by the Cabibbo-Kobayashi-Maskawa matrix, which has been studied for forty years. Neutrino mixing has been observed in the last decade, but mixing among the family of charged leptons has never been seen. The current limits are of order 10{sup -11} - 10{sup -13}, so the process is rare indeed. Why is such an experiment important and timely? A major motivation for experiments at the Large Hadron Collider (LHC) is the possible observation of supersymmetric particles in the TeV mass range. Many of these supersymmetric models predict a {mu}-e conversion signal at R{sub {mu}e} {approx} 10{sup -15}. We propose to search for {mu}-e conversion at a sensitivity that exceeds this by more than an order of magnitude. The LHC may not be able to conclusively distinguish among supersymmetric models, so Mu2e will provide invaluable information should the LHC observe a signal.
In the case where the LHC finds no evidence of supersymmetry, or other beyond-the-standard-model physics, Mu2e will probe for new physics at mass scales up to 10{sup 4} TeV, far beyond the reach of any planned accelerator.

  8. Direct investigations of supersymmetry: subgroup summary report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephen P Martin et al.

    A recurring element of the discussions in the Snowmass study is that there is a need and opportunity for improved theoretical tools in preparation for the discovery of Supersymmetry (SUSY). In order to be competitive with mass measurements at the LHC and a linear collider (LC), predictions of sparticle and Higgs masses from given model parameters need to be improved by an order of magnitude in some cases. There is also room for growth and improvement in (Monte Carlo) SUSY event generators. It seems injudicious to discuss priorities in the field of direct SUSY detection independently of having directly established the existence and mass of the Higgs, since it is the particle that led to the founding of SUSY models. But under the hypothesis that a light Higgs exists with a mass compatible with SUSY, such priorities can be discussed. As outlined in section III in the context of the flavor-respecting minimal supersymmetric standard model (MSSM), there is no fundamental symmetry to tie, e.g., squark masses to slepton masses, the gluino mass to the chargino mass, or the chargino mass to the neutralino mass. However, models such as mSUGRA do lead to such relationships, and LHC studies show that the heaviest superpartners, the squarks and the gluino, should be observable for masses up to about 2.5 TeV at the LHC in such models. Depending on the actual decay chains, some other superpartners may be identifiable in the cascade decays of the squarks and the gluino. On the other hand, an LC with a CM energy of 1 TeV could comprehensively explore and discover superpartners with masses less than 0.5 TeV, largely independently of their nature (neutral, charged, strong, electroweak) and decay modes. In most supersymmetric models, the chargino and neutralino, and often the sleptons, are much lighter than the squarks and gluino. A VLHC could extend the mass reach for squarks and the gluino but would not necessarily add much value if these had already been seen at the LHC.
In summary, it would appear that if SUSY is accessible at near-future accelerators, the most promising new direction for understanding its nature is an LC with sufficient CM energy.

  9. A Bridge Too Far: The Demise of the Superconducting Super Collider, 1989-1993

    NASA Astrophysics Data System (ADS)

    Riordan, Michael

    2015-04-01

    In October 1993 the US Congress terminated the Superconducting Super Collider -- at over $10 billion the largest and costliest basic-science project ever attempted. It was a disastrous loss for the nation's once-dominant high-energy physics community, which has been slowly declining since then. With the 2012 discovery of the Higgs boson at CERN's Large Hadron Collider, Europe has assumed world leadership in this field. A combination of fiscal austerity, continuing SSC cost overruns, intense Congressional scrutiny, lack of major foreign contributions, waning Presidential support, and the widespread public perception of mismanagement led to the project's demise nearly five years after it had begun. Its termination occurred against the political backdrop of changing scientific needs as US science policy shifted to a post-Cold War footing during the early 1990s. And the growing cost of the SSC inevitably exerted undue pressure upon other worthy research, thus weakening its support in Congress and the broader scientific community. As underscored by the Higgs boson discovery, at a mass substantially below that of the top quark, the SSC did not need to collide protons at 40 TeV in order to attain its premier physics goal. The selection of this design energy was governed more by politics than by physics, given that Europeans could build the LHC by eventually installing superconducting magnets in the LEP tunnel under construction in the mid-1980s. In hindsight, there were good alternative projects the US high-energy physics community could have pursued that did not involve building a gargantuan, multibillion-dollar machine at a green-field site in Texas. Research supported by the National Science Foundation, Department of Energy, and the Richard Lounsbery Foundation.

  10. Proceedings of the Annual Precise Time and Time Interval (PTTI) Planning Meeting (6th). Held at U.S. Naval Research Laboratory, December 3-5, 1974

    DTIC Science & Technology

    1974-01-01

    The excerpt includes fragments on a developing general agreement that the geophysical system should be defined in terms of a large number of points; a reference to "A Laser-Interferometer System for the Absolute Determination of the Acceleration due to Gravity," in Proc. Int. Conf. on Precision Measurement; and a discussion of the ratio of the plasmaspheric to the total time delays.

  11. Physics perspectives with AFTER@LHC (A Fixed Target ExpeRiment at LHC)

    NASA Astrophysics Data System (ADS)

    Massacrier, L.; Anselmino, M.; Arnaldi, R.; Brodsky, S. J.; Chambert, V.; Da Silva, C.; Didelez, J. P.; Echevarria, M. G.; Ferreiro, E. G.; Fleuret, F.; Gao, Y.; Genolini, B.; Hadjidakis, C.; Hřivnáčová, I.; Kikola, D.; Klein, A.; Kurepin, A.; Kusina, A.; Lansberg, J. P.; Lorcé, C.; Lyonnet, F.; Martinez, G.; Nass, A.; Pisano, C.; Robbe, P.; Schienbein, I.; Schlegel, M.; Scomparin, E.; Seixas, J.; Shao, H. S.; Signori, A.; Steffens, E.; Szymanowski, L.; Topilskaya, N.; Trzeciak, B.; Uggerhøj, U. I.; Uras, A.; Ulrich, R.; Wagner, J.; Yamanaka, N.; Yang, Z.

    2018-02-01

    AFTER@LHC is an ambitious fixed-target project aiming to address open questions in the domains of proton and neutron spin, the Quark Gluon Plasma and high-x physics, at the highest energy ever reached in fixed-target mode. Indeed, thanks to the highly energetic 7 TeV proton and 2.76 A.TeV lead LHC beams, center-of-mass energies as large as √s = 115 GeV in pp/pA and √s_NN = 72 GeV in AA can be reached, corresponding to an uncharted energy domain between SPS and RHIC. We report two main ways of performing fixed-target collisions at the LHC, both allowing for the use of one of the existing LHC experiments. In these proceedings, after discussing the projected luminosities considered for one year of data taking at the LHC, we present a selection of projections for light- and heavy-flavour production.
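    The quoted center-of-mass energies follow from standard fixed-target kinematics: for a beam of energy E_beam striking a nucleon at rest, √s ≈ √(2 m_N E_beam) when E_beam ≫ m_N. A minimal sketch checking the two numbers above (the nucleon mass of 0.938 GeV is the usual approximation, an assumption of this sketch):

    ```python
    import math

    M_N = 0.938  # nucleon mass in GeV (approximate value, assumption of this sketch)

    def fixed_target_com_energy(beam_energy_gev, target_mass_gev=M_N):
        """Nucleon-nucleon center-of-mass energy for a beam on a fixed target,
        neglecting the beam-particle mass: sqrt(s) ~ sqrt(2 * m_target * E_beam)."""
        return math.sqrt(2.0 * target_mass_gev * beam_energy_gev)

    # 7 TeV proton beam on a fixed target -> sqrt(s) ~ 115 GeV (pp/pA)
    print(round(fixed_target_com_energy(7000.0)))   # -> 115
    # 2.76 TeV-per-nucleon lead beam -> sqrt(s_NN) ~ 72 GeV (AA)
    print(round(fixed_target_com_energy(2760.0)))   # -> 72
    ```

    The same formula shows why fixed-target √s grows only as the square root of the beam energy, which is what places AFTER@LHC between the SPS and RHIC energy domains.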

  12. Analysis of the quench propagation along Nb3Sn Rutherford cables with the THELMA code. Part I: Geometric and thermal models

    NASA Astrophysics Data System (ADS)

    Manfreda, G.; Bellina, F.

    2016-12-01

    The paper describes the new lumped thermal model recently implemented in the THELMA code for the coupled electromagnetic-thermal analysis of superconducting cables. A new geometrical model is also presented, which describes the Rutherford cables used in accelerator magnets. A first validation of these models has been given by the analysis of the longitudinal quench propagation velocity in the Nb3Sn prototype coil SMC3, built and tested in the framework of the EUCARD project for the development of high-field magnets for the LHC. This paper presents the models in detail, while their application to the quench propagation analysis is presented in a companion paper.

  13. Quench Protection of SC Quadrupole Magnets

    NASA Astrophysics Data System (ADS)

    Feher, S.; Bossert, R.; Dimarco, J.; Mitchell, D.; Lamm, M. J.; Limon, P. J.; Mazur, P.; Nobrega, F.; Orris, D.; Ozelis, J. P.; Strait, J. B.; Tompkins, J. C.; Zlobin, A. V.; McInturff, A. D.

    1997-05-01

    The energy stored in a superconducting accelerator magnet is dissipated after a quench in the coil's normal zones, heating the coil and generating turn-to-turn and coil-to-ground voltage drops. Quench heaters are used to protect the superconducting magnet by greatly enlarging the coil normal zone, thus allowing the energy to be dissipated over a larger conductor volume. Such heaters will be required for the Fermilab/LBNL design of the high gradient quadrupoles (HGQ) designed for the LHC interaction regions. As a first step, heaters were installed and tested in several Tevatron low-β superconducting quadrupoles. Experimental studies in normal and superfluid helium are presented which show the heater-induced quench response as a function of magnet excitation current, magnet temperature and peak heater energy density.
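    The protection logic above can be made concrete with a back-of-the-envelope calculation: the stored energy is E = ½LI², and the adiabatic hot-spot temperature rise falls as the conductor mass absorbing that energy grows, which is exactly why heaters enlarge the normal zone. The inductance, current, masses and constant specific heat below are hypothetical illustrations, not the HGQ design values:

    ```python
    def stored_energy_joules(inductance_h, current_a):
        """Magnetic energy stored in a magnet: E = 1/2 * L * I**2."""
        return 0.5 * inductance_h * current_a ** 2

    def adiabatic_temp_rise(energy_j, mass_kg, specific_heat_j_per_kg_k):
        """Crude adiabatic temperature rise if the energy is absorbed uniformly
        by a given conductor mass (constant specific heat assumed, which is a
        strong simplification at cryogenic temperatures)."""
        return energy_j / (mass_kg * specific_heat_j_per_kg_k)

    # Hypothetical magnet: 50 mH at 10 kA stores 2.5 MJ.
    E = stored_energy_joules(0.05, 10000.0)
    print(E)  # -> 2500000.0
    # Spreading 2.5 MJ over 500 kg of conductor vs. only 50 kg
    # (385 J/kg/K is copper's room-temperature specific heat, used as a stand-in):
    print(round(adiabatic_temp_rise(E, 500.0, 385.0), 1))  # -> 13.0
    print(round(adiabatic_temp_rise(E, 50.0, 385.0), 1))   # -> 129.9
    ```

    The tenfold difference in temperature rise between the two cases is the essence of quench-heater protection: the same stored energy, spread over more conductor, produces a far smaller hot-spot temperature.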

  14. Heaven’s Carousel

    NASA Image and Video Library

    2014-03-26

Last week researchers from around the world gathered at the Accademia dei Lincei in Rome for the Science with the Hubble Space Telescope IV conference. The event celebrated the history of Hubble's extraordinary achievements and looked ahead to what might yet be achieved and how the James Webb Space Telescope will build on our knowledge of the Universe. As part of this celebration, the German artist and composer Tim Otto Roth unveiled a new installation, Heaven's Carousel, which links together the fields of art, music and astronomy. The work is inspired by research on the accelerating expansion of the Universe by Nobel laureate Adam Riess (STScI), by Greek cosmology and by Renaissance astronomers. Read more here: www.spacetelescope.org/news/heic1407/ Credit: NASA, ESA, and Pam Jeffries (STScI)

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kazmerski, L. L.

". . . with robust investments in research and market development, the picture changes dramatically." Thus, the realigned U.S. Photovoltaic Industry Roadmap highlights R&D as critical to the tipping point that will make solar photovoltaics (PV) significant in the U.S. energy portfolio--part of a well-designed plan that would bring "2034 expectations" to reality by 2020. Technology improvement and introduction depend on key, focused, and pertinent research contributions that range from the most fundamental through the applied. In this paper, we underscore the successes and relevance of our current systems-driven PV R&D programs, which are built on integrated capabilities. These capabilities span atomic-level characterization, nanotechnology, new materials design, interface and device engineering, theoretical guidance and modeling, processing, measurements and analysis, and process integration. This presentation identifies and provides examples of critical research tipping points needed to foster now and near technologies (primarily crystalline silicon and thin films) and to introduce coming generations of solar PV that provide options to push us to the next performance levels (devices with ultra-high efficiencies and with ultra-low cost). The serious importance of science and creativity to U.S. PV technology ownership--and the increased focus to accelerate the time from laboratory discovery to industry adoption--are emphasized at this "tipping point" for solar PV.

  16. Homage to Professor Meinhart H. Zenk: Crowd accelerated research and innovation.

    PubMed

    Heinz, Nanna; Møller, Birger Lindberg

    2013-07-01

Professor Meinhart H. Zenk has had an enormous impact within the plant biochemistry area. Throughout his entire career he was able to identify and address key scientific issues within chemistry and plant secondary metabolism. Meinhart H. Zenk and his research associates have provided seminal scientific contributions within a multitude of research topics. A hallmark of Meinhart H. Zenk's research was to rapidly introduce and apply new technologies and to initiate cross-disciplinary collaborations, providing groundbreaking new knowledge within research areas that at the time appeared highly complex and inaccessible to experimentation. He strove for, and achieved, scientific excellence, and in this way he was an eminent mentor within the plant biochemistry research community. Today, few single individuals possess so much knowledge. However, web-based social platforms enable fast and global distribution and sharing of information, including science-related matters, unfortunately often prior to any assessment of its correctness. Thus the scientific mentoring that Meinhart H. Zenk offered the science community is in demand as much as ever. In honor of Meinhart H. Zenk, let us keep up that tradition and widen our engagement to encompass the new social media and benefit from the opportunities offered by crowd accelerated innovation.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bingaman, Jeff

During the opening session of the EFRC Summit, Senator Jeff Bingaman (D-NM) explained how the EFRCs play an important role in the U.S. energy innovation ecosystem. The 2011 EFRC Summit and Forum brought together the EFRC community and science and policy leaders from universities, national laboratories, industry and government to discuss "Science for our Nation's Energy Future." In August 2009, the Office of Science established 46 Energy Frontier Research Centers. The EFRCs are collaborative research efforts intended to accelerate high-risk, high-reward fundamental research, the scientific basis for transformative energy technologies of the future. These Centers involve universities, national laboratories, nonprofit organizations, and for-profit firms, singly or in partnerships, selected by scientific peer review. They are funded at $2 to $5 million per year for a total planned DOE commitment of $777 million over the initial five-year award period, pending Congressional appropriations. These integrated, multi-investigator Centers are conducting fundamental research focusing on one or more of several “grand challenges” and use-inspired “basic research needs” recently identified in major strategic planning efforts by the scientific community. The purpose of the EFRCs is to integrate the talents and expertise of leading scientists in a setting designed to accelerate research that transforms the future of energy and the environment.

  18. Performance of the CMS precision electromagnetic calorimeter at LHC Run II and prospects for High-Luminosity LHC

    NASA Astrophysics Data System (ADS)

    Zhang, Zhicai

    2018-04-01

Many physics analyses using the Compact Muon Solenoid (CMS) detector at the LHC require accurate, high-resolution electron and photon energy measurements. Following the excellent performance achieved during LHC Run I at center-of-mass energies of 7 and 8 TeV, the CMS electromagnetic calorimeter (ECAL) is operating at the LHC with proton-proton collisions at 13 TeV center-of-mass energy. The instantaneous luminosity delivered by the LHC during Run II has achieved unprecedented levels. The average number of concurrent proton-proton collisions per bunch-crossing (pileup) has reached up to 40 interactions in 2016 and may increase further in 2017. These high pileup levels necessitate a retuning of the ECAL readout and trigger thresholds and reconstruction algorithms. In addition, the energy response of the detector must be precisely calibrated and monitored. We present new reconstruction algorithms and calibration strategies that were implemented to maintain the excellent performance of the CMS ECAL throughout Run II. We will show performance results from the 2015-2016 data taking periods and provide an outlook on the expected Run II performance in the years to come. Beyond the LHC, challenging running conditions for CMS are expected after the High-Luminosity upgrade of the LHC (HL-LHC). We review the design and R&D studies for the CMS ECAL and present first test beam studies. Particular challenges at the HL-LHC are the harsh radiation environment, the increasing data rates, and the extreme level of pile-up events, with up to 200 simultaneous proton-proton collisions. We present test beam results of hadron-irradiated PbWO4 crystals up to the fluences expected at the HL-LHC. We also report on the R&D for the new readout and trigger electronics, which must be upgraded due to the increased trigger and latency requirements at the HL-LHC.

  19. Taxonomic distribution and origins of the extended LHC (light-harvesting complex) antenna protein superfamily

    PubMed Central

    2010-01-01

Background The extended light-harvesting complex (LHC) protein superfamily is a centerpiece of eukaryotic photosynthesis, comprising the LHC family and several families involved in photoprotection, like the LHC-like and the photosystem II subunit S (PSBS). The evolution of this complex superfamily has long remained elusive, partially due to previously missing families. Results In this study we present a meticulous search for LHC-like sequences in public genome and expressed sequence tag databases covering twelve representative photosynthetic eukaryotes from the three primary lineages of plants (Plantae): glaucophytes, red algae and green plants (Viridiplantae). By introducing a coherent classification of the different protein families based on both hidden Markov model analyses and structural predictions, numerous new LHC-like sequences were identified and several new families were described, including the red lineage chlorophyll a/b-binding-like protein (RedCAP) family from red algae and diatoms. The test of alternative topologies of sequences of the highly conserved chlorophyll-binding core structure of LHC and PSBS proteins significantly supports the independent origins of LHC and PSBS families via two unrelated internal gene duplication events. This result was confirmed by the application of cluster likelihood mapping. Conclusions The independent evolution of LHC and PSBS families is supported by strong phylogenetic evidence. In addition, a possible origin of LHC and PSBS families from different homologous members of the stress-enhanced protein subfamily, a diverse and anciently paralogous group of two-helix proteins, seems likely. The new hypothesis for the evolution of the extended LHC protein superfamily proposed here is in agreement with the character evolution analysis that incorporates the distribution of families and subfamilies across taxonomic lineages.
Intriguingly, stress-enhanced proteins, which are universally found in the genomes of green plants, red algae, glaucophytes and in diatoms with complex plastids, could represent an important and previously missing link in the evolution of the extended LHC protein superfamily. PMID:20673336

  20. Illinois Cleantech Ecosystem Consortium (ICE) for the Department of Energy Innovation Ecosystem Development Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zielke, Jason

The DoE Innovation Ecosystem Initiative was a game-changer for Clean Energy Trust. The grant accelerated our development from a concept to a real company in 2010, seeding us with the capital to begin our mission to “accelerate the growth of clean energy businesses in the Midwest.” Now, three years later, we have scores of partners that fund us through sponsorship donations to our programs, and we have played a key role in launching several new companies and in helping them acquire funding and reach their milestones. In three years we have grown from two people to nine, now with an annual budget of over $3M. We started with the following simple plan (verbatim from our original submission): “The short-term objective of ICE is to fortify and enhance the platform for collaboration necessary to create a robust ecosystem for clean energy innovation. This includes launching a number of initiatives designed to source, evaluate, and launch new clean energy businesses derived from university research.”

  1. Radiation protection and environmental management at the relativistic heavy ion collider.

    PubMed

    Musolino, S V; Briggs, S L; Stevens, A J

    2001-01-01

The Relativistic Heavy Ion Collider (RHIC) is a high energy hadron accelerator built to study basic nuclear physics. It consists of two counter-rotating beams of fully stripped gold ions that are accelerated in two rings to an energy of 100 GeV/nucleon, or protons at 250 GeV/c. The beams can be stored for a period of five to ten hours and brought into collision for experiments during that time. The first major physics objective is to recreate a state of matter, the quark-gluon plasma, that is predicted to have existed shortly after the creation of the universe. Because there are only a few other high energy particle accelerators like RHIC in the world, neither the rules promulgated in the US Code of Federal Regulations under the Atomic Energy Act, nor State regulations, nor international guidance documents directly govern prompt radiation from accelerators in the design and operation of a superconducting collider. Special design criteria for prompt radiation were therefore developed to provide guidance for the design of the radiation shielding. Environmental management at RHIC is accomplished through the ISO 14001 Environmental Management System. The applicability, benefits, and implementation of ISO 14001 within the framework of a large research accelerator complex are discussed in the paper.

  2. Accelerating Innovation Through Coopetition: The Innovation Learning Network Experience.

    PubMed

    McCarthy, Chris; Ford Carleton, Penny; Krumpholz, Elizabeth; Chow, Marilyn P

    Coopetition, the simultaneous pursuit of cooperation and competition, is a growing force in the innovation landscape. For some organizations, the primary mode of innovation continues to be deeply secretive and highly competitive, but for others, a new style of shared challenges, shared purpose, and shared development has become a superior, more efficient way of working to accelerate innovation capabilities and capacity. Over the last 2 decades, the literature base devoted to coopetition has gradually expanded. However, the field is still in its infancy. The majority of coopetition research is qualitative, primarily consisting of case studies. Few studies have addressed the nonprofit sector or service industries such as health care. The authors believe that this article may offer a unique perspective on coopetition in the context of a US-based national health care learning alliance designed to accelerate innovation, the Innovation Learning Network or ILN. The mission of the ILN is to "Share the joy and pain of innovation," accelerating innovation by sharing solutions, teaching techniques, and cultivating friendships. These 3 pillars (sharing, teaching, and cultivating) form the foundation for coopetition within the ILN. Through the lens of coopetition, we examine the experience of the ILN over the last 10 years and provide case examples that illustrate the benefits and challenges of coopetition in accelerating innovation in health care.

  3. Joint Center for Artificial Photosynthesis

    ScienceCinema

    Koval, Carl; Lee, Kenny; Houle, Frances; Lewis, Na

    2018-05-30

    The Joint Center for Artificial Photosynthesis (JCAP) is the nation's largest research program dedicated to the development of an artificial solar-fuel generation technology. Established in 2010 as a U.S. Department of Energy (DOE) Energy Innovation Hub, JCAP aims to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide as inputs. JCAP brings together more than 140 top scientists and researchers from the California Institute of Technology and its lead partner, Berkeley Lab, along with collaborators from the SLAC National Accelerator Laboratory, and the University of California campuses at Irvine and San Diego.

  4. Joint Center for Artificial Photosynthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koval, Carl; Lee, Kenny; Houle, Frances

    2013-12-10

    The Joint Center for Artificial Photosynthesis (JCAP) is the nation's largest research program dedicated to the development of an artificial solar-fuel generation technology. Established in 2010 as a U.S. Department of Energy (DOE) Energy Innovation Hub, JCAP aims to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide as inputs. JCAP brings together more than 140 top scientists and researchers from the California Institute of Technology and its lead partner, Berkeley Lab, along with collaborators from the SLAC National Accelerator Laboratory, and the University of California campuses at Irvine and San Diego.

  5. Dark energy two decades after: observables, probes, consistency tests.

    PubMed

    Huterer, Dragan; Shafer, Daniel L

    2018-01-01

    The discovery of the accelerating universe in the late 1990s was a watershed moment in modern cosmology, as it indicated the presence of a fundamentally new, dominant contribution to the energy budget of the universe. Evidence for dark energy, the new component that causes the acceleration, has since become extremely strong, owing to an impressive variety of increasingly precise measurements of the expansion history and the growth of structure in the universe. Still, one of the central challenges of modern cosmology is to shed light on the physical mechanism behind the accelerating universe. In this review, we briefly summarize the developments that led to the discovery of dark energy. Next, we discuss the parametric descriptions of dark energy and the cosmological tests that allow us to better understand its nature. We then review the cosmological probes of dark energy. For each probe, we briefly discuss the physics behind it and its prospects for measuring dark energy properties. We end with a summary of the current status of dark energy research.
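Of the parametric descriptions mentioned above, the one most widely used in the dark-energy literature (not spelled out in this abstract) is the Chevallier-Polarski-Linder form of the dark-energy equation of state:

```latex
w(a) = w_0 + w_a\,(1 - a)
```

Here \(a\) is the cosmological scale factor; a cosmological constant corresponds to \(w_0 = -1\), \(w_a = 0\), and the cosmological probes discussed in the review constrain the pair \((w_0, w_a)\).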

  6. MeV electron acceleration at 1 kHz with <10 mJ laser pulses

    NASA Astrophysics Data System (ADS)

    Salehi, Fatholah; Goers, Andy; Hine, George; Feder, Linus; Kuk, Donghoon; Miao, Bo; Woodbury, Daniel; Kim, Ki-Yong; Milchberg, Howard

    2017-01-01

We demonstrate laser-driven acceleration of electrons to MeV-scale energies at a 1 kHz repetition rate using <10 mJ pulses focused on near-critical density He and H2 gas jets. Using the H2 gas jet, electron acceleration to 0.5 MeV in 10 fC bunches was observed with laser pulse energy as low as 1.3 mJ. Increasing the pulse energy to 10 mJ, we measure 1 pC charge bunches with >1 MeV energy for both the He and H2 gas jets. Such a high-repetition-rate, high-flux ultrafast source has immediate application to time-resolved probing of matter for scientific, medical, or security applications, either using the electrons directly or using a high-Z foil converter to generate ultrafast γ-rays. This work is supported by the US Department of Energy, the National Science Foundation, and the Air Force Office of Scientific Research.

  7. Laboratory Astrophysics Prize: Laboratory Astrophysics with Nuclei

    NASA Astrophysics Data System (ADS)

    Wiescher, Michael

    2018-06-01

Nuclear astrophysics is concerned with the nuclear reaction and decay processes that, from the Big Bang to the present generation of stars, control the chemical evolution of our universe. Such nuclear reactions maintain stellar life, determine stellar evolution, and finally drive stellar explosions in the cycle of stellar life. Laboratory nuclear astrophysics seeks to simulate and understand the underlying processes using a broad portfolio of nuclear instrumentation, from reactors to accelerators and from stable to radioactive beams, to map the broad spectrum of nucleosynthesis processes. This talk focuses on only two aspects of the broad field: the need for deep-underground accelerator facilities in cosmic-ray-free environments in order to understand nucleosynthesis in stars, and the need for high-intensity radioactive beam facilities to recreate the conditions found in stellar explosions. These concepts represent the two main frontiers of the field, which are being pursued in the US with the CASPAR accelerator at the Sanford Underground Research Facility in South Dakota and the FRIB facility at Michigan State University.

  8. Quantum supercharger library: hyper-parallelism of the Hartree-Fock method.

    PubMed

    Fernandes, Kyle D; Renison, C Alicia; Naidoo, Kevin J

    2015-07-05

We present here a set of algorithms that completely rewrites the Hartree-Fock (HF) computations common to many legacy electronic structure packages (such as GAMESS-US, GAMESS-UK, and NWChem) into a massively parallel compute scheme that takes advantage of hardware accelerators such as Graphics Processing Units (GPUs). The HF compute algorithm is core to a library of routines that we name the Quantum Supercharger Library (QSL). We briefly evaluate the QSL's performance and report that it accelerates an HF 6-31G Self-Consistent Field (SCF) computation by up to 20 times for medium-sized molecules (such as a buckyball) when compared with mature Central Processing Unit algorithms available in the legacy codes in regular use by researchers. It achieves this acceleration by massive parallelization of the one- and two-electron integrals and optimization of the SCF and Direct Inversion in the Iterative Subspace routines through the use of GPU linear algebra libraries. © 2015 Wiley Periodicals, Inc.
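As a rough sketch of the kind of tensor contraction such a library parallelizes (the two-electron Fock build at the core of each SCF iteration), consider the following; the function and argument names are illustrative, not the QSL API:

```python
import numpy as np

def fock_build(hcore, eri, density):
    """Fock matrix for restricted Hartree-Fock.

    hcore:   one-electron (core) Hamiltonian, shape (n, n)
    eri:     two-electron integrals (pq|rs), shape (n, n, n, n)
    density: one-particle density matrix, shape (n, n)

    The two contractions below are the O(n^4) hot spot that a GPU
    library parallelizes; on CPU they map onto einsum/BLAS calls.
    """
    coulomb = np.einsum('pqrs,rs->pq', eri, density)    # J matrix
    exchange = np.einsum('prqs,rs->pq', eri, density)   # K matrix
    return hcore + 2.0 * coulomb - exchange
```

In a full SCF loop this Fock build is followed by diagonalization, a density update, and DIIS extrapolation until self-consistency, which is where the abstract's "SCF and Direct Inversion in the Iterative Subspace" optimizations enter.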

  9. Turning the LHC ring into a new physics search machine

    NASA Astrophysics Data System (ADS)

    Orava, Risto

    2017-03-01

    The LHC Collider Ring is proposed to be turned into an ultimate automatic search engine for new physics in four consecutive phases: (1) Searches for heavy particles produced in Central Exclusive Process (CEP): pp → p + X + p based on the existing Beam Loss Monitoring (BLM) system of the LHC; (2) Feasibility study of using the LHC Ring as a gravitation wave antenna; (3) Extensions to the current BLM system to facilitate precise registration of the selected CEP proton exit points from the LHC beam vacuum chamber; (4) Integration of the BLM based event tagging system together with the trigger/data acquisition systems of the LHC experiments to facilitate an on-line automatic search machine for the physics of tomorrow.

  10. Search for New Phenomena Using W/Z + (b)-Jets Measurements Performed with the ATLAS Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beauchemin, Pierre-Hugues

    2015-06-30

The Project proposed to use data of the ATLAS experiment, obtained during the 2011 and 2012 data-taking campaigns, to pursue studies of the strong interaction (QCD) and to examine promising signatures of new physics. The Project also contained a service component dedicated to a detector development initiative. The objective of the strong interaction studies was to determine how various predictions of QCD compare to the data. Results of a set of measurements developed by the Tufts team indicate that the dominant source of discrepancy between data and QCD predictions is the mis-modeling of low-energy gluon radiation as described by algorithms called parton showers. The discrepancies introduced by parton showers in LHC predictions could be even larger than the effects of completely new phenomena (dark matter, supersymmetry, etc.) and could thus block further discoveries at the LHC. Some of the results obtained in the course of this Project also specify how QCD predictions must be improved in order to open the possibility of discovering something completely new at the LHC during Run-II. This has been integrated into the Run-II ATLAS physics program. Another objective of the Tufts studies of the strong interaction was to determine how the hypothesis of an intrinsic heavy-quark component of the proton (strange, charm or bottom quarks) could be tested at the LHC. This hypothesis was proposed by theorists 30 years ago and is still controversial. The Tufts team demonstrated that intrinsic charm can be observed, or severely constrained, at the LHC, and determined how the measurement should be performed in order to maximize its sensitivity to such an intrinsic heavy-quark component of the proton. Tufts has also embarked on this measurement, which is in progress, but final results are not yet available. They should shed light on the fundamental structure of the proton.
Determining the nature of dark matter particles, which make up about 25% of the mass-energy content of the universe, is one of the most exciting research goals at the LHC. Within this Project, the Tufts team proposed a way to improve on the standard approach used to look for dark matter at the LHC in events involving jets and a large amount of unbalanced energy in the detector (jets+ETmiss). The Tufts team developed a measurement to test these improvements on the data available (the ATLAS 2012 dataset), in order to be ready to apply them to the new Run-II data that will be available at the end of 2015. Preliminary results of the proposed measurement indicate that very high precision can be obtained on results free of detector effects. That will allow for better constraints on dark matter theories and will spare the need for huge computing resources when comparing dark matter theories to data. In addition, the Tufts team played a leading role in the development and the organization of the ETmiss trigger, the detector component needed to collect the data used in dark matter searches and in many other analyses. The team compared the performance of the various algorithms capable of reconstructing the value of ETmiss in each LHC collision event, and developed a strategy to commission these algorithms online. Tufts also contributed to the development of the ETmiss trigger monitoring software. Finally, the PI of this Project acted as the co-coordinator of the group of researchers at CERN taking care of the development and the operation of this detector component. The ETmiss trigger is now taking data, opening the possibility for the discovery of otherwise undetectable particles at the LHC.

  11. Performance profiling for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Choi, Wonqook; Cho, Kihyeon; Yeo, Insung

    2018-05-01

In many physics applications, a significant amount of software (e.g. R, ROOT and Geant4) is developed on novel computing architectures, and much effort is expended to ensure the software is efficient in terms of central processing unit (CPU) time and memory usage. Profiling tools are used to evaluate this efficiency; however, few such tools are able to accommodate low-energy physics regions. To address this limitation, we developed a low-energy physics profiling system in Geant4 to profile the CPU time and memory usage of brachytherapy applications. This paper describes and evaluates specific physics lists that are applied to brachytherapy applications in Geant4, such as QGSP_BIC_LIV, QGSP_BIC_EMZ, and QGSP_BIC_EMY. The physics range of this tool allows it to be used to generate low-energy profiles for brachytherapy applications. This was a limitation in previous studies, which led us to develop a new profiling tool that supports profiling in the MeV range, in contrast to the TeV range supported by existing high-energy profiling tools. In order to easily compare profiling results between low-energy and high-energy modes, we employed the same software architecture as that of the SimpliCarlo tool developed at the Fermi National Accelerator Laboratory (FNAL) for the Large Hadron Collider (LHC). The results show that the newly developed profiling system for low-energy physics (MeV scale) complements the current profiling system used for high-energy physics (TeV scale) applications.

  12. The new CMS DAQ system for run-2 of the LHC

    DOE PAGES

    Bawej, Tomasz; Behrens, Ulf; Branson, James; ...

    2015-05-21

The data acquisition (DAQ) system of the CMS experiment at the CERN Large Hadron Collider assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high level trigger (HLT) farm. The HLT farm selects interesting events for storage and offline analysis at a rate of around 1 kHz. The DAQ system has been redesigned during the accelerator shutdown in 2013/14. The motivation is twofold: Firstly, the current compute nodes, networking, and storage infrastructure will have reached the end of their lifetime by the time the LHC restarts. Secondly, in order to handle higher LHC luminosities and event pileup, a number of sub-detectors will be upgraded, increasing the number of readout channels and replacing the off-detector readout electronics with a μTCA implementation. The new DAQ architecture will take advantage of the latest developments in the computing industry. For data concentration, 10/40 Gb/s Ethernet technologies will be used, as well as an implementation of a reduced TCP/IP in FPGA for reliable transport between custom electronics and commercial computing hardware. A Clos network based on 56 Gb/s FDR Infiniband has been chosen for the event builder with a throughput of ~ 4 Tb/s. The HLT processing is entirely file based. This allows the DAQ and HLT systems to be independent, and to use the HLT software in the same way as for the offline processing. The fully built events are sent to the HLT with 1/10/40 Gb/s Ethernet via network file systems. Hierarchical collection of HLT accepted events and monitoring meta-data are stored into a global file system. This paper presents the requirements, technical choices, and performance of the new system.
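The quoted rates imply a mean event size and an HLT rejection factor that follow from simple arithmetic (the numbers are taken from the abstract; the variable names are ours):

```python
# Figures quoted in the abstract for the Run-2 CMS DAQ.
L1_ACCEPT_RATE_HZ = 100e3        # event-builder input rate, events/s
BUILDER_THROUGHPUT_BPS = 100e9   # aggregate event-builder throughput, bytes/s
HLT_OUTPUT_RATE_HZ = 1e3         # events kept for storage/offline analysis

# Implied mean size of a fully built event: 100 GB/s over 100 kHz.
mean_event_size_bytes = BUILDER_THROUGHPUT_BPS / L1_ACCEPT_RATE_HZ

# The HLT keeps roughly one event in this many.
hlt_rejection_factor = L1_ACCEPT_RATE_HZ / HLT_OUTPUT_RATE_HZ
```

That is, roughly 1 MB per built event and an HLT rejection factor of about 100, which sets the scale for the ~4 Tb/s event-builder network described above.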

  13. Analysis of 440 GeV proton beam-matter interaction experiments at the High Radiation Materials test facility at CERN

    NASA Astrophysics Data System (ADS)

    Burkart, F.; Schmidt, R.; Raginel, V.; Wollmann, D.; Tahir, N. A.; Shutov, A.; Piriz, A. R.

    2015-08-01

In a previous paper [Schmidt et al., Phys. Plasmas 21, 080701 (2014)], we presented the first results of beam-matter interaction experiments that were carried out at the High Radiation Materials test facility at CERN. In these experiments, extended cylindrical targets of solid copper were irradiated with a beam of 440 GeV protons delivered by the Super Proton Synchrotron (SPS). The beam comprised a large number of high intensity proton bunches, each bunch having a length of 0.5 ns with a 50 ns gap between two neighboring bunches, while the length of the entire bunch train was about 7 μs. These experiments established the existence of the hydrodynamic tunneling phenomenon for the first time. Detailed numerical simulations of these experiments were also carried out and reported in detail in another paper [Tahir et al., Phys. Rev. E 90, 063112 (2014)]. Excellent agreement was found between the experimental measurements and the simulation results, which validates our previous simulations done using the Large Hadron Collider (LHC) beam of 7 TeV protons [Tahir et al., Phys. Rev. Spec. Top.--Accel. Beams 15, 051003 (2012)]. According to these simulations, the range of the full LHC proton beam and the hadronic shower can be increased by more than an order of magnitude due to hydrodynamic tunneling, compared to that of a single proton. This effect is of considerable importance for the design of machine protection systems for hadron accelerators such as the SPS, LHC, and Future Circular Collider. Recently, using metal cutting technology, the targets used in these experiments were dissected into finer pieces for visual and microscopic inspection in order to establish the precise penetration depth of the protons and the corresponding hadronic shower. This, we believe, will be helpful in studying the very important phenomenon of hydrodynamic tunneling in a more quantitative manner.
The details of this experimental work together with a comparison with the numerical simulations are presented in this paper.

  14. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De, K; Jha, S; Klimentov, A

    2016-01-01

The Large Hadron Collider (LHC), operating at the international CERN laboratory in Geneva, Switzerland, is leading Big Data driven scientific exploration. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited with the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Distributed Analysis) Workload Management System for managing the workflow for all data processing at over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data-taking runs require more resources than Grid computing can possibly provide. To alleviate these challenges, the LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We describe a project aimed at integrating the PanDA WMS with supercomputers in the United States, Europe, and Russia, in particular the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the Mira supercomputer at the Argonne Leadership Computing Facility (ALCF), the supercomputer at the National Research Center Kurchatov Institute, IT4 in Ostrava, and others. The current approach uses a modified PanDA pilot framework for job submission to the supercomputers' batch queues and for local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes.
This implementation was tested with a variety of Monte Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments, and has been in full production for the ATLAS experiment since September 2015. We present our accomplishments in running the PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal, independent of the computing facilities' infrastructure, for high energy and nuclear physics as well as other data-intensive science applications, such as bioinformatics and astroparticle physics.
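The wrapper pattern described above, in which one multi-node batch job fans a single-threaded payload out across many cores, with each instance selecting its share of the work from its MPI rank, can be sketched as follows. This is a hypothetical illustration, not PanDA's actual API: the rank and size values that an MPI communicator would supply (via Get_rank/Get_size) are passed in explicitly so the sketch runs without an MPI installation, and all function names are invented.

```python
# Hypothetical sketch of a light-weight MPI-wrapper work partition.
# With real MPI, each rank is a separate process on a worker-node core;
# here we emulate the ranks in a loop to keep the example self-contained.

def events_for_rank(rank, size, n_events):
    """Partition event indices [0, n_events) across `size` ranks."""
    per_rank = (n_events + size - 1) // size          # ceiling division
    start = rank * per_rank
    return list(range(start, min(start + per_rank, n_events)))

def run_payload(rank, size, n_events):
    """Stand-in for one single-threaded Monte Carlo payload (dummy work)."""
    return [e * e for e in events_for_rank(rank, size, n_events)]

# Emulate a 4-rank job over 10 events.
results = [run_payload(r, 4, 10) for r in range(4)]
processed = sorted(e for chunk in results for e in chunk)
assert processed == [e * e for e in range(10)]  # every event done exactly once
```

The design point is that the payload itself stays single-threaded and unmodified; only the thin wrapper knows about ranks, which is what makes the approach portable across batch systems.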

  15. INTEGRATION OF PANDA WORKLOAD MANAGEMENT SYSTEM WITH SUPERCOMPUTERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De, K; Jha, S; Maeno, T

The Large Hadron Collider (LHC), operating at the international CERN laboratory in Geneva, Switzerland, is leading Big Data driven scientific exploration. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited with the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Distributed Analysis) Workload Management System for managing the workflow for all data processing at over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of over 0.3 petaFLOPS, the next LHC data-taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, the LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We describe a project aimed at integrating the PanDA WMS with supercomputers in the United States, Europe, and Russia, in particular the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the supercomputer at the National Research Center Kurchatov Institute, IT4 in Ostrava, and others. The current approach uses a modified PanDA pilot framework for job submission to the supercomputers' batch queues and for local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes.
This implementation was tested with a variety of Monte Carlo workloads on several supercomputing platforms. We present our accomplishments in running the PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal, independent of the computing facility's infrastructure, for high energy and nuclear physics as well as other data-intensive science applications, such as bioinformatics and astroparticle physics.

  16. Model for fluorescence quenching in light harvesting complex II in different aggregation states.

    PubMed

    Andreeva, Atanaska; Abarova, Silvia; Stoitchkova, Katerina; Busheva, Mira

    2009-02-01

Low-temperature (77 K) steady-state fluorescence emission spectroscopy and dynamic light scattering were applied to the main chlorophyll a/b protein light-harvesting complex of photosystem II (LHC II) in different aggregation states to elucidate the mechanism of fluorescence quenching within LHC II oligomers. Evidence is presented that LHC II oligomers are heterogeneous and consist of large and small particles with different fluorescence yields. At intermediate detergent concentrations, the mean size of the small particles is similar to that of trimers, while the size of the large particles is comparable to that of trimers aggregated without added detergent. It is suggested that in small particles and trimers the emitter is monomeric chlorophyll, whereas in large aggregates there is also another emitter, a poorly fluorescing chlorophyll associate. A model describing the populations of antenna chlorophyll molecules in small and large aggregates in their ground and first singlet excited states is considered. The model enables us to obtain the ratio of the singlet excited-state lifetimes in small and large particles, the relative amount of chlorophyll molecules in large particles, and the amount of quenchers as a function of the degree of aggregation. These dependencies reveal that the quenching of the Chl a fluorescence upon aggregation is due to the formation of large aggregates and the increase in the amount of chlorophyll molecules forming them. As a consequence, the amount of quenchers located in large aggregates increases, and their singlet excited-state lifetimes steeply decrease.
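The two-population picture in this abstract can be summarized with a minimal sketch. The functional form below is our assumption for illustration, not the paper's stated model: each pool's fluorescence yield is taken to be proportional to its singlet excited-state lifetime at a fixed radiative rate.

```python
# Minimal two-population fluorescence sketch (assumed form, not the
# paper's full kinetic model). A fraction x_large of the chlorophylls
# sits in quenched large aggregates with shortened singlet lifetime;
# the rest sits in small, trimer-like particles.

def relative_yield(x_large, tau_ratio):
    """Fluorescence yield relative to a fully 'small-particle' sample.

    x_large   : fraction of chlorophyll in large aggregates (0..1)
    tau_ratio : tau_large / tau_small (< 1 when aggregates are quenched)
    """
    return (1.0 - x_large) + x_large * tau_ratio

# Aggregation moves chlorophyll into the quenched pool, so the overall
# yield drops monotonically with the degree of aggregation.
assert relative_yield(0.0, 0.2) == 1.0
assert relative_yield(0.8, 0.2) < relative_yield(0.3, 0.2)
```

Fitting such a yield curve against the measured degree of aggregation is one simple way the lifetime ratio and the quenched fraction could be extracted, in the spirit of the dependencies the abstract describes.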

  17. Status and Plan for The Upgrade of The CMS Pixel Detector

    NASA Astrophysics Data System (ADS)

    Lu, Rong-Shyang; CMS Collaboration

    2016-04-01

The silicon pixel detector is the innermost component of the CMS tracking system and plays a crucial role in the all-silicon CMS tracker. While the current pixel tracker is designed for, and performs well at, an instantaneous luminosity of up to 1×10^34 cm^-2 s^-1, it can no longer be operated efficiently at significantly higher values. Based on the strong performance of the LHC accelerator, it is anticipated that peak luminosities of twice the design value will be reached before 2018 and perhaps significantly exceeded in the running period until 2022, referred to as LHC Run 3. Therefore, an upgraded pixel detector, referred to as the Phase 1 upgrade, is planned for the year-end technical stop in 2016. With a new pixel readout chip (ROC), an additional fourth layer, two additional endcap disks, and a significantly reduced material budget, the upgraded pixel detector will be able to maintain the efficiency of the pixel tracker under the increased demands imposed by high luminosities and pile-up. The main new features of the upgraded pixel detector will be an ultra-light mechanical design, a digital readout chip with higher rate capability, and a new cooling system. These and other design improvements, along with results of Monte Carlo simulation studies of the expected performance of the new pixel detector, are discussed and compared with those of the current CMS detector.

  18. Shedding light on neutrino masses with dark forces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Batell, Brian; Pospelov, Maxim; Shuve, Brian

Heavy right-handed neutrinos, N, provide the simplest explanation for the origin of light neutrino masses and mixings. If M_N is at or below the weak scale, direct experimental discovery of these states is possible at accelerator experiments such as the LHC or at new dedicated beam-dump experiments; in these experiments, N decays after traversing a macroscopic distance from the collision point. The experimental sensitivity to right-handed neutrinos is significantly enhanced if there is a new "dark" gauge force connecting them to the Standard Model (SM), and detection of N can be the primary discovery mode for the new dark force itself. We take the well-motivated example of a B-L gauge symmetry and analyze the sensitivity to displaced decays of N produced via the new gauge interaction in two experiments: the LHC and the proposed SHiP beam-dump experiment. In the most favorable case, in which the mediator can be produced on shell and decays to right-handed neutrinos (pp → X + V_{B-L} → X + NN), the sensitivity reach is controlled by the square of the B-L gauge coupling. We demonstrate that these experiments could access the neutrino parameters responsible for the observed SM neutrino masses and mixings in the most straightforward implementation of the see-saw mechanism.

  19. Applications of electron lenses: scraping of high-power beams, beam-beam compensation, and nonlinear optics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stancari, Giulio

Electron lenses are pulsed, magnetically confined electron beams whose current-density profile is shaped to obtain the desired effect on the circulating beam. Electron lenses were used in the Fermilab Tevatron collider for bunch-by-bunch compensation of long-range beam-beam tune shifts, for removal of uncaptured particles from the abort gap, for preliminary experiments on head-on beam-beam compensation, and for the demonstration of halo scraping with hollow electron beams. Electron lenses for beam-beam compensation are being commissioned in the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory (BNL). Hollow-electron-beam collimation and halo control were studied as an option to complement the collimation system for the upgrades of the Large Hadron Collider (LHC) at CERN; a conceptual design was recently completed. Because of their electric charge and the absence of materials close to the proton beam, electron lenses may also provide an alternative to wires for long-range beam-beam compensation in LHC luminosity-upgrade scenarios with small crossing angles. At Fermilab, we are planning to install an electron lens in the Integrable Optics Test Accelerator (IOTA, a 40 m ring for 150 MeV electrons) as one of the proof-of-principle implementations of nonlinear integrable optics, to achieve large tune spreads and more stable beams without loss of dynamic aperture.

  20. Suppression versus enhancement of heavy quarkonia in pA collisions

    NASA Astrophysics Data System (ADS)

    Kopeliovich, B. Z.; Schmidt, Iván; Siddikov, M.

    2017-06-01

We describe the production of heavy quarkonia in pA collisions within the dipole approach, assuming the dominance of the perturbative color-singlet mechanism (CSM) in the pT-integrated cross section. Although accounting for a nonzero heavy Q-Q̄ separation is a higher-twist correction that is usually neglected, we find it to be the dominant source of nuclear effects, significantly exceeding the effects of leading-twist gluon shadowing and energy loss. Moreover, this contribution turns out to be the most reliably predicted, relying on the precise measurements of the dipole cross section at the Hadron-Electron Ring Accelerator (HERA) at DESY. The nuclear suppression of quarkonia has been anticipated to become stronger with energy because the dipole cross section rises steeply. However, the measured nuclear effects remain essentially unchanged over the energy range from that of the BNL Relativistic Heavy Ion Collider (RHIC) to that of the Large Hadron Collider (LHC). A production mechanism that enhances the charmonium yield is proposed. Nuclear effects for the production of J/ψ, ψ(2S), Υ(1S), and Υ(2S) are calculated and are in agreement with data from RHIC and the LHC. The dipole description offers a unique explanation for the observed significant nuclear suppression of the ψ(2S)-to-J/ψ ratio, which is related to the nontrivial features of the ψ(2S) wave function.

  1. Shedding light on neutrino masses with dark forces

    DOE PAGES

    Batell, Brian; Pospelov, Maxim; Shuve, Brian

    2016-08-08

Heavy right-handed neutrinos, N, provide the simplest explanation for the origin of light neutrino masses and mixings. If M_N is at or below the weak scale, direct experimental discovery of these states is possible at accelerator experiments such as the LHC or at new dedicated beam-dump experiments; in these experiments, N decays after traversing a macroscopic distance from the collision point. The experimental sensitivity to right-handed neutrinos is significantly enhanced if there is a new "dark" gauge force connecting them to the Standard Model (SM), and detection of N can be the primary discovery mode for the new dark force itself. We take the well-motivated example of a B-L gauge symmetry and analyze the sensitivity to displaced decays of N produced via the new gauge interaction in two experiments: the LHC and the proposed SHiP beam-dump experiment. In the most favorable case, in which the mediator can be produced on shell and decays to right-handed neutrinos (pp → X + V_{B-L} → X + NN), the sensitivity reach is controlled by the square of the B-L gauge coupling. We demonstrate that these experiments could access the neutrino parameters responsible for the observed SM neutrino masses and mixings in the most straightforward implementation of the see-saw mechanism.

  2. Anatomy of the inert two-Higgs-doublet model in the light of the LHC and non-LHC dark matter searches

    NASA Astrophysics Data System (ADS)

    Belyaev, Alexander; Cacciapaglia, Giacomo; Ivanov, Igor P.; Rojas-Abatte, Felipe; Thomas, Marc

    2018-02-01

The inert two-Higgs-doublet model (i2HDM) is a theoretically well-motivated example of a minimal consistent dark matter (DM) model, which provides monojet, mono-Z, mono-Higgs, and vector-boson-fusion + E_T^miss signatures at the LHC, complemented by signals in direct and indirect DM search experiments. In this paper we perform a detailed analysis of the constraints on the full five-dimensional parameter space of the i2HDM, coming from perturbativity, unitarity, electroweak precision data, Higgs data from the LHC, the DM relic density, direct/indirect DM detection, and the LHC monojet analysis, as well as the implications of experimental LHC studies of disappearing charged tracks, relevant to the high-DM-mass region. We demonstrate the complementarity of the above constraints and present projections for future LHC data and direct DM detection experiments to probe further regions of i2HDM parameter space. The model is implemented in the CalcHEP and micrOMEGAs packages, which are publicly available at the HEPMDB database, and it is ready for further exploration in the context of the LHC, the relic density, and DM direct detection.

  3. Status and progress of the RERTR program in the year 2003.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Travelli, A.; Nuclear Engineering Division

    2003-01-01

One of the most important events affecting the RERTR program during the past year was the decision by the U.S. Department of Energy to request that the U.S. Congress significantly increase RERTR program funding. This decision was prompted, at least in part, by the terrible events of September 11, 2001, and by a high-level U.S./Russian Joint Expert Group recommendation to immediately accelerate RERTR program activities in both countries, with the goal of converting all the world's research reactors to low-enriched fuel at the earliest possible time, including both Soviet-designed and United States-designed research reactors. The U.S. Congress is expected to approve this request very soon, and the RERTR program has prepared itself well for the intense activities that the 'Accelerated RERTR Program' will require. Promising results have been obtained in the development of a fabrication process for monolithic LEU U-Mo fuel. Most existing and future research reactors could be converted to LEU with this fuel, which has a uranium density between 15.4 and 16.4 g/cm^3 and yielded promising irradiation results in 2002. The most promising method hinges on producing the monolithic meat by cold-rolling a thin cast ingot. The aluminum cladding and the meat are bonded by friction stir welding, and the cladding surface is finished by a light cold roll. This method can be applied to the production of miniplates and appears to be extendable to the production of full-size plates, possibly with intermediate anneals. Other methods planned for investigation include high-temperature bonding and hot isostatic pressing. The progress achieved within the Russian RERTR program, both for the traditional tube-type elements and for the new 'universal' LEU U-Mo pin-type elements, promises to soon enable the conversion of many Russian-designed research and test reactors. Irradiation testing of both fuel types with LEU U-Mo dispersion fuels has begun.
Detailed studies are in progress to determine the feasibility of converting each Russian-designed research and test reactor to either fuel type. The plan for the Accelerated RERTR Program is structured to achieve LEU conversion of all HEU research reactors supplied by the United States and Russia during the next nine years. This effort will address, in addition to fuel development and qualification, the analyses and performance/economic/safety evaluations needed to implement the conversions. In combination with this overarching goal, the RERTR program plans to achieve, at the earliest possible date, qualification of LEU U-Mo dispersion fuels with uranium densities of 6 g/cm^3 and 7 g/cm^3. Reactors currently using or planning to use LEU silicide fuel will rely on this fuel after termination of the FRRSNFA program, because it is acceptable to COGEMA for reprocessing. Qualification of LEU U-Mo dispersion fuels has suffered some unavoidable delays; to accelerate it as much as possible, the RERTR program, the French CEA, and the Australian ANSTO have agreed to jointly pursue a two-element qualification test of LEU U-Mo dispersion fuel with a uranium density of 7.0 g/cm^3, to be performed in the Osiris reactor during 2004. The RERTR program also intends to eliminate all obstacles to the utilization of LEU in targets for isotope production, so that this important function can be performed without the need for weapons-grade materials. All of us, working together as we have for many years, can ensure that all these goals will be achieved. By promoting the efficiency and safety of research reactors while eliminating the traffic in weapons-grade uranium, we can prevent the possibility that some of this material might fall into the wrong hands. Few causes can be more deserving of our joint efforts.

  4. The High Luminosity LHC Project

    NASA Astrophysics Data System (ADS)

    Rossi, Lucio

The High Luminosity LHC is one of the major scientific projects of the next decade. It aims at increasing the luminosity reach of the LHC by a factor of five in peak luminosity and a factor of ten in integrated luminosity. The project, now fully approved and funded, will be completed in ten years and will prolong the life of the LHC until 2035-2040. It implies deep modifications of the LHC over about 1.2 km around the high-luminosity insertions of ATLAS and CMS, and relies on new cutting-edge technologies. We are developing new advanced superconducting magnets capable of reaching a 12 T field; superconducting RF crab cavities capable of rotating the beams with great accuracy; 100 kA, hundred-meter-long superconducting links for moving the power converters out of the tunnel; new collimator concepts; and more. Besides its important physics goals, the High Luminosity LHC project is an ideal test bed for new technologies for the next hadron collider in the post-LHC era.

  5. Progress Report to the U.S. Department of Energy, Grant DE-FG02-91ER40626: Neutrino Physics, Particle Theory and Cosmology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shafi, Qaisar; Barr, Stephen M; Gaisser, Thomas K

    2009-07-30

Research conducted under this grant over the past year has been driven by the impending operation of the Large Hadron Collider (LHC) and by ongoing developments in neutrino physics and cosmology. The recent launch of the Planck satellite should have far-reaching implications for cosmology in the coming years. Research topics include particle astrophysics, neutrino physics, grand unified theories, Higgs and sparticle spectroscopy, dark energy and dark matter, inflationary cosmology, and baryo/lepto-genesis. Faculty members on the grant are Stephen Barr, Thomas Gaisser, Qaisar Shafi, and Todor Stanev. Ilia Gogoladze and Hasan Yuksel are the two postdoctoral scientists supported by the DOE grant. There are currently several excellent students in our research program. One of them, Mansoor Rehman, has been awarded a competitive university fellowship on which he will be supported from September 1, 2009 to June 30, 2010. Another student, Joshua Wickman, has been awarded a fellowship by the Delaware Space Grant Consortium (in affiliation with NASA) and will be supported by this fellowship from September 1, 2009 to August 31, 2010. Both students also attended the TASI Summer School in June 2009, at which they each presented a talk on topics in inflationary cosmology.

  6. The Capabilities of the U.S. Government to Collect and Analyze Economic Intelligence

    DTIC Science & Technology

    1994-01-01

    by management? Some economic and business aspects might not be critically time sensitive; however, knowledge about a competitor's contract bid and ... such a relationship include: reduced product research and development (R&D) timelines, reduced R&D costs, accelerated time from R&D to product marketing ... intelligence that business cannot obtain on its own? Can the IC provide timely information to business in time for it to be used effectively

  7. Wind Power Technologies FY 2017 Budget At-A-Glance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    2016-03-01

The Wind Program accelerates U.S. deployment of clean, affordable, and reliable domestic wind power through research, development, and demonstration activities. By reducing the costs and increasing the performance of wind energy systems, these advanced technology investments directly contribute to the national goals of generating 80% of the nation's electricity from clean, carbon-free energy sources by 2035, reducing carbon emissions 26%-28% below 2005 levels by 2025, and reducing carbon emissions 80% by 2050.

  8. Exploiting microRNA Specificity and Selectivity: Paving a Sustainable Path Towards Precision Medicine.

    PubMed

    Santulli, Gaetano

    2015-01-01

    In his State of the Union address before both chambers of the US Congress, President Barack Obama called for increased investment in US infrastructure and research and announced the launch of a new Precision Medicine Initiative, aiming to accelerate biomedical discovery. Due to their well-established selectivity and specificity, microRNAs can represent a useful tool, both in diagnosis and therapy, in forging the path towards the achievement of precision medicine. This introductory chapter represents a guide for the Reader in examining the functional roles of microRNAs in the most diverse aspects of clinical practice, which will be explored in this third volume of the microRNA trilogy.

  9. Exploiting microRNA Specificity and Selectivity: Paving a Sustainable Path Towards Precision Medicine

    PubMed Central

    2016-01-01

    In his State of the Union address before both chambers of the US Congress, President Barack Obama called for increased investment in US infrastructure and research and announced the launch of a new Precision Medicine Initiative, aiming to accelerate biomedical discovery. Due to their well-established selectivity and specificity, microRNAs can represent a useful tool, both in diagnosis and therapy, in forging the path towards the achievement of precision medicine. This introductory chapter represents a guide for the Reader in examining the functional roles of microRNAs in the most diverse aspects of clinical practice, which will be explored in this third volume of the microRNA trilogy. PMID:26663175

  10. Reliability and Engineering of Thin-Film Photovoltaic Modules. Research forum proceedings

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr. (Editor); Royal, E. L. (Editor)

    1985-01-01

A Research Forum on Reliability and Engineering of Thin-Film Photovoltaic Modules, under the sponsorship of the Jet Propulsion Laboratory's Flat-Plate Solar Array (FSA) Project and the U.S. Department of Energy, was held in Washington, D.C., on March 20, 1985. Reliability-attribute investigations of amorphous silicon (a-Si) cells, submodules, and modules were the subjects addressed by most of the Forum presentations. The reliability research investigations reported included Arrhenius-modeled accelerated stress tests on a-Si cells, electrochemical corrosion, light-induced effects and their potential impact on stability and reliability measurement methods, laser-scribing considerations, and the determination of degradation rates and mechanisms from both laboratory and outdoor exposure tests.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

The U.S. Department of Energy's (DOE) Co-Optimization of Fuels & Engines (Co-Optima) initiative is accelerating the introduction of affordable, scalable, and sustainable fuels and high-efficiency, low-emission engines with a first-of-its-kind effort to simultaneously tackle fuel and engine research and development (R&D). This report summarizes accomplishments in the first year of the project. Co-Optima is conducting concurrent research to identify the fuel properties and engine design characteristics needed to maximize vehicle performance and affordability while deeply cutting emissions. Nine national laboratories - the National Renewable Energy Laboratory and Argonne, Idaho, Lawrence Berkeley, Lawrence Livermore, Los Alamos, Oak Ridge, Pacific Northwest, and Sandia National Laboratories - are collaborating with industry and academia on this groundbreaking research.

  12. The development of diamond tracking detectors for the LHC

    NASA Astrophysics Data System (ADS)

    Adam, W.; Berdermann, E.; Bergonzo, P.; de Boer, W.; Bogani, F.; Borchi, E.; Brambilla, A.; Bruzzi, M.; Colledani, C.; Conway, J.; D'Angelo, P.; Dabrowski, W.; Delpierre, P.; Doroshenko, J.; Dulinski, W.; van Eijk, B.; Fallou, A.; Fischer, P.; Fizzotti, F.; Furetta, C.; Gan, K. K.; Ghodbane, N.; Grigoriev, E.; Hallewell, G.; Han, S.; Hartjes, F.; Hrubec, J.; Husson, D.; Kagan, H.; Kaplon, J.; Karl, C.; Kass, R.; Keil, M.; Knöpfle, K. T.; Koeth, T.; Krammer, M.; Logiudice, A.; Lu, R.; mac Lynne, L.; Manfredotti, C.; Marshall, R. D.; Meier, D.; Menichelli, D.; Meuser, S.; Mishina, M.; Moroni, L.; Noomen, J.; Oh, A.; Perera, L.; Pernegger, H.; Pernicka, M.; Polesello, P.; Potenza, R.; Riester, J. L.; Roe, S.; Rudge, A.; Sala, S.; Sampietro, M.; Schnetzer, S.; Sciortino, S.; Stelzer, H.; Stone, R.; Sutera, C.; Trischuk, W.; Tromson, D.; Tuve, C.; Vincenzo, B.; Weilhammer, P.; Wermes, N.; Wetstein, M.; Zeuner, W.; Zoeller, M.; RD42 Collaboration

    2003-11-01

Chemical vapor deposition diamond has been discussed extensively as an alternative sensor material for use very close to the interaction region of the LHC, where extreme radiation conditions exist. During the last few years, diamond devices have been manufactured and tested with LHC electronics, with the goal of creating a detector usable by all LHC experiments. Extensive progress has been made on diamond quality, on the development of diamond trackers, and on radiation-hardness studies. Adapting the technology to the LHC-specific requirements is now underway. In this paper we present the recent progress achieved.

  13. Joint Center for Satellite Data Assimilation Overview and Research Activities

    NASA Astrophysics Data System (ADS)

    Auligne, T.

    2017-12-01

In 2001, NOAA/NESDIS, NOAA/NWS, NOAA/OAR, and NASA, subsequently joined by the US Navy and Air Force, came together to form the Joint Center for Satellite Data Assimilation (JCSDA) for the common purpose of accelerating the use of satellite data in environmental numerical prediction modeling by developing, using, and anticipating advances in numerical modeling, satellite-based remote sensing, and data assimilation methods. The primary focus was to bring these advances together to improve operational numerical-model-based forecasting, under the premise that these partners face common technical and logistical challenges in assimilating satellite observations into their modeling enterprises, challenges that could be better addressed through cooperative action and common solutions. Over the last 15 years, the JCSDA has made, and continues to make, major contributions to the operational assimilation of satellite data. The JCSDA is a multi-agency U.S. government-owned-and-operated organization conceived as a venue for its several agencies (NOAA, NASA, USAF, and USN) to collaborate on advancing the development and operational use of satellite observations in numerical-model-based environmental analysis and forecasting. The primary mission of the JCSDA is to "accelerate and improve the quantitative use of research and operational satellite data in weather, ocean, climate and environmental analysis and prediction systems." This mission is fulfilled through directed research targeting the following key science objectives: improved radiative transfer modeling; new instrument assimilation; assimilation of humidity, cloud, and precipitation observations; assimilation of land surface observations; assimilation of ocean surface observations; and atmospheric composition, chemistry, and aerosols. The goal of this presentation is to briefly introduce the JCSDA's mission and vision and to describe recent research activities across the various JCSDA partners.

  14. Physics through the 1990s: Atomic, molecular and optical physics

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The volume presents a program of research initiatives in atomic, molecular, and optical physics. The current state of atomic, molecular, and optical physics in the US is examined with respect to demographics, education patterns, applications, and the US economy. Recommendations are made for each field, with discussions of their histories and the relevance of the research to government agencies. The section on atomic physics includes atomic theory, structure, and dynamics; accelerator-based atomic physics; and large facilities. The section on molecular physics includes spectroscopy, scattering theory and experiment, and the dynamics of chemical reactions. The section on optical physics discusses lasers, laser spectroscopy, and quantum optics and coherence. A section elucidates interfaces between the three fields and astrophysics, condensed matter physics, surface science, plasma physics, atmospheric physics, and nuclear physics. Another section shows applications of the three fields in ultra-precise measurements, fusion, national security, materials, medicine, and other topics.

  15. Non-US electrodynamic launchers research and development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, J.V.; Batteh, J.H.; Greig, J.R.

    Electrodynamic launcher research and development work of scientists outside the United States is analyzed and assessed by six internationally recognized US experts in the field of electromagnetic and electrothermal launchers. The assessment covers five broad technology areas: (1) Experimental railguns; (2) Railgun theory and design; (3) Induction launchers; (4) Electrothermal guns; (5) Energy storage and power supplies. The overall conclusion is that non-US work on electrodynamic launchers is maturing rapidly after a relatively late start in many countries. No foreign program challenges the US efforts in scope, but it is evident that the United States may be surpassed in some technologies within the next few years. Until recently, published Russian work focused on hypervelocity for research purposes. Within the last two years, large facilities have been described where military-oriented development has been underway since the mid-1980s. Financial support for these large facilities appears to have collapsed, leaving no effective effort to develop practical launchers for military or civilian applications. Electrodynamic launcher research in Europe is making rapid progress by focusing on a single application, tactical launchers for the military. Four major laboratories, in Britain, France, Germany, and the Netherlands, are working on this problem. Though narrower in scope than the US effort, the European work enjoys a continuity of support that has accelerated its progress. The next decade will see the deployment of electrodynamic launcher technology, probably in the form of an electrothermal-chemical upgrade for an existing gun system. The time scale for deployment of electromagnetic launchers is entirely dependent on the level of research-and-development effort. If resources remain limited, the advantage will lie with cooperative efforts that have reasonably stable funding such as the present French-German program.

  16. Accelerating Biomedical Discoveries through Rigor and Transparency.

    PubMed

    Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D

    2017-07-01

    Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  17. PDF4LHC recommendations for LHC Run II

    DOE PAGES

    Butterworth, Jon; Carrazza, Stefano; Cooper-Sarkar, Amanda; ...

    2016-01-06

    We provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+αs uncertainties suitable for applications at the LHC Run II. We review developments since the previous PDF4LHC recommendation, and discuss and compare the new generation of PDFs, which include substantial information from experimental data from the Run I of the LHC. We then propose a new prescription for the combination of a suitable subset of the available PDF sets, which is presented in terms of a single combined PDF set. Finally, we discuss tools which allow for the delivery of this combined set in terms of optimized sets of Hessian eigenvectors or Monte Carlo replicas, and their usage, and provide some examples of their application to LHC phenomenology.
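    The two delivery formats mentioned in the abstract correspond to two standard uncertainty prescriptions. A minimal sketch of both, using invented observable values purely for illustration (real applications would evaluate an observable with each member of a combined PDF set, e.g. via the LHAPDF library):

    ```python
    import numpy as np

    # Hypothetical values of an observable (e.g. a cross section in pb)
    # evaluated once per Monte Carlo PDF replica; the numbers are invented.
    replicas = np.array([41.2, 40.8, 41.5, 40.9, 41.1, 41.7, 40.6, 41.0])

    # MC-replica prescription: central value = mean over replicas,
    # PDF uncertainty = standard deviation over replicas.
    central = replicas.mean()
    sigma_mc = replicas.std(ddof=1)

    # Hessian prescription: symmetric uncertainty combining eigenvector
    # variations, sigma^2 = sum_i ((F_i^+ - F_i^-) / 2)^2.
    f_plus = np.array([41.4, 41.3, 41.2])    # "+" eigenvector values (invented)
    f_minus = np.array([40.8, 40.9, 41.0])   # "-" eigenvector values (invented)
    sigma_hess = np.sqrt(np.sum(((f_plus - f_minus) / 2.0) ** 2))

    print(f"MC replicas: {central:.2f} +/- {sigma_mc:.2f} pb")
    print(f"Hessian:     41.10 +/- {sigma_hess:.2f} pb")
    ```

    Both prescriptions are designed to give comparable uncertainty estimates for the same underlying combined set; which delivery is more convenient depends on the downstream tool chain.
    
    
    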

  18. Big Mysteries: The Higgs Mass

    ScienceCinema

    Lincoln, Don

    2018-01-16

    With the discovery of what looks to be the Higgs boson, LHC researchers are turning their attention to the next big question: the predicted mass of the newly discovered particle. When the effects of quantum mechanics are taken into account, the mass of the Higgs boson should be incredibly high...perhaps upwards of a quadrillion times higher than what was observed. In this video, Fermilab's Dr. Don Lincoln explains why the theory predicts that the mass should be so large and gives at least one possible theoretical idea that might solve the problem. Whether the proposed idea is the answer or not, this question must be answered by experiments at the LHC or today's entire theoretical paradigm could be in jeopardy.
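    The "quadrillion times higher" claim comes from the standard hierarchy-problem argument. As a rough sketch (the dominant one-loop correction from the top quark, with $y_t$ the top Yukawa coupling and $\Lambda$ the cutoff scale up to which the Standard Model is assumed valid):

    ```latex
    \delta m_H^2 \;\approx\; -\frac{3 y_t^2}{8\pi^2}\,\Lambda^2
    ```

    Since the correction grows with $\Lambda^2$, taking $\Lambda$ anywhere near the Planck scale ($\sim 10^{19}$ GeV) drives the natural Higgs mass many orders of magnitude above the observed 125 GeV; the precise factor depends on where the cutoff is placed.
    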

  19. Big Mysteries: The Higgs Mass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lincoln, Don

    2014-04-28

    With the discovery of what looks to be the Higgs boson, LHC researchers are turning their attention to the next big question: the predicted mass of the newly discovered particle. When the effects of quantum mechanics are taken into account, the mass of the Higgs boson should be incredibly high...perhaps upwards of a quadrillion times higher than what was observed. In this video, Fermilab's Dr. Don Lincoln explains why the theory predicts that the mass should be so large and gives at least one possible theoretical idea that might solve the problem. Whether the proposed idea is the answer or not, this question must be answered by experiments at the LHC or today's entire theoretical paradigm could be in jeopardy.

  20. MoEDAL - a new light on the high-energy frontier

    NASA Astrophysics Data System (ADS)

    Fairbairn, Malcolm; Pinfold, James L.

    2017-01-01

    In 2010, the MoEDAL (MOnopole and Exotics Detector At the LHC) experiment at the Large Hadron Collider (LHC) was unanimously approved by the CERN Research Board to start data taking in 2015. MoEDAL is a pioneering experiment designed to search for highly ionising manifestations of new physics such as magnetic monopoles or massive (pseudo-)stable charged particles. Its groundbreaking physics programme defines a number of scenarios that yield potentially revolutionary insights into such foundational questions as: are there extra dimensions or new symmetries; does magnetic charge exist; what is the nature of dark matter; and how did the Big Bang develop. MoEDAL's purpose is to meet such far-reaching challenges at the frontier of the field. The innovative MoEDAL detector employs unconventional methodologies tuned to the prospect of discovery physics. The largely passive MoEDAL detector, deployed at Point 8 on the LHC ring, has a dual nature. First, it acts like a giant camera, composed of nuclear track detectors - analysed offline by ultra-fast scanning microscopes - sensitive only to new physics. Second, it is uniquely able to trap the particle messengers of physics beyond the Standard Model for further study. MoEDAL's radiation environment is monitored by a state-of-the-art real-time TimePix pixel detector array. A new MoEDAL sub-detector designed to extend MoEDAL's reach to milli-charged, minimally ionising particles is under study.
