Sample records for interface cerne refletor

  1. CERN and 60 years of science for peace

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heuer, Rolf-Dieter, E-mail: Rolf.Heuer@cern.ch

    2015-02-24

    This paper presents CERN as it celebrates the 60th anniversary of its founding. The presentation first discusses the mission of CERN and its role as an inter-governmental organization. The paper also reviews aspects of the particle physics research programme, looking at both current and future accelerator-based facilities at the high-energy and intensity frontiers. Finally, the paper considers issues beyond fundamental research, such as capacity-building and the interface between Art and Science.

  2. New directions in the CernVM file system

    NASA Astrophysics Data System (ADS)

    Blomer, Jakob; Buncic, Predrag; Ganis, Gerardo; Hardi, Nikola; Meusel, Rene; Popescu, Radu

    2017-10-01

    The CernVM File System today is commonly used to host and distribute application software stacks. In addition to this core task, recent developments expand the scope of the file system into two new areas. Firstly, CernVM-FS emerges as a good match for container engines to distribute the container image contents. Compared to native container image distribution (e.g. through the “Docker registry”), CernVM-FS massively reduces the network traffic for image distribution. This has been shown, for instance, by a prototype integration of CernVM-FS into Mesos developed by Mesosphere, Inc. We present a path for a smooth integration of CernVM-FS and Docker. Secondly, CernVM-FS recently raised new interest as an option for the distribution of experiment conditions data. Here, the focus is on improved versioning capabilities of CernVM-FS that allow linking the conditions data of a run period to the state of a CernVM-FS repository. Lastly, CernVM-FS has been extended to provide a name space for physics data for the LIGO and CMS collaborations. Searching through a data namespace is often done by a central, experiment-specific database service. A name space on CernVM-FS can particularly benefit from an existing, scalable infrastructure and from the POSIX file system interface.
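    A large part of why a CernVM-FS-style distribution channel moves so much less data than shipping whole images is content-addressed, deduplicated storage: identical chunks are published once and reused by every image that contains them. A minimal sketch of that idea (the class and method names are illustrative, not the CernVM-FS API):

```python
import hashlib

class ContentAddressedStore:
    """Toy content-addressed store: identical chunks are stored once,
    so publishing a second, similar image transfers only the new chunks."""

    def __init__(self):
        self.chunks = {}           # chunk hash -> chunk bytes
        self.bytes_transferred = 0

    def publish(self, data, chunk_size=4):
        """Split data into fixed-size chunks; upload only unseen chunks.
        Returns the list of chunk hashes that describes the file."""
        digests = []
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            h = hashlib.sha1(chunk).hexdigest()
            if h not in self.chunks:
                self.chunks[h] = chunk
                self.bytes_transferred += len(chunk)
            digests.append(h)
        return digests

store = ContentAddressedStore()
store.publish(b"base-image-layer!")      # first image: every chunk is new
before = store.bytes_transferred
store.publish(b"base-image-patch!")      # similar image: shared chunks reused
print(store.bytes_transferred - before)  # only the differing chunks travel
```

    The same mechanism is what makes the conditions-data and namespace use cases cheap: a new repository state references mostly existing content.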

  3. INTEGRATED OPERATIONAL DOSIMETRY SYSTEM AT CERN.

    PubMed

    Dumont, Gérald; Pedrosa, Fernando Baltasar Dos Santos; Carbonez, Pierre; Forkel-Wirth, Doris; Ninin, Pierre; Fuentes, Eloy Reguero; Roesler, Stefan; Vollaire, Joachim

    2017-04-01

    CERN, the European Organization for Nuclear Research, upgraded its operational dosimetry system in March 2013 to be prepared for the first Long Shutdown of CERN's facilities. The new system allows the immediate and automatic checking and recording of the dosimetry data before and after interventions in radiation areas. To facilitate the analysis of the data in the context of CERN's approach to As Low As Reasonably Achievable (ALARA), this new system is interfaced to the Intervention Management Planning and Coordination Tool (IMPACT). IMPACT is a web-based application widely used in all CERN's accelerators and their associated technical infrastructures for the planning, coordination and approval of interventions (work permit principle). The coupling of the operational dosimetry database with the IMPACT repository allows a direct and almost immediate comparison of the actual dose with the estimates, in addition to enabling the configuration of alarm levels in the dosemeter as a function of the intervention to be performed. © The Author 2016. Published by Oxford University Press. All rights reserved.
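    The dosimetry/IMPACT coupling described above amounts to two small operations: deriving a per-intervention dosemeter alarm threshold from the planned dose estimate, and comparing the dose actually received with that estimate. A hypothetical sketch (function names, fields and the safety-factor rule are illustrative, not CERN's actual policy or the IMPACT API):

```python
def alarm_level(estimated_dose_msv, safety_factor=1.5):
    """Dosemeter alarm threshold derived from the intervention's ALARA
    dose estimate (illustrative rule, not the real configuration logic)."""
    return estimated_dose_msv * safety_factor

def compare_dose(actual_msv, estimated_msv):
    """Compare the dose actually received with the planning estimate,
    as the dosimetry/IMPACT coupling makes possible."""
    return {
        "actual_msv": actual_msv,
        "estimated_msv": estimated_msv,
        "ratio": round(actual_msv / estimated_msv, 2),
        "over_estimate": actual_msv > estimated_msv,
    }

print(alarm_level(2.0))        # threshold for a job estimated at 2 mSv
print(compare_dose(3.0, 2.0))  # this intervention exceeded its estimate
```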

  4. FwWebViewPlus: integration of web technologies into WinCC OA based Human-Machine Interfaces at CERN

    NASA Astrophysics Data System (ADS)

    Golonka, Piotr; Fabian, Wojciech; Gonzalez-Berges, Manuel; Jasiun, Piotr; Varela-Rodriguez, Fernando

    2014-06-01

    The rapid growth in popularity of web applications gives rise to a plethora of reusable graphical components, such as Google Chart Tools and jQuery Sparklines, implemented in JavaScript and run inside a web browser. In this paper we describe a tool that allows for the seamless integration of web-based widgets into WinCC Open Architecture, the SCADA system commonly used at CERN to build complex Human-Machine Interfaces. Reuse of widely available widget libraries and pushing development efforts to a higher abstraction layer based on a scripting language allow for a significant reduction in code maintenance in multi-platform environments compared to the C++ visualization plugins currently used. Adequately designed interfaces allow for rapid integration of new web widgets into WinCC OA. At the same time, the mechanisms familiar to HMI developers are preserved, making the use of new widgets "native". Perspectives for further integration between the realms of WinCC OA and web development are also discussed.

  5. The Cortex project: A quasi-real-time information system to build control systems for high energy physics experiments

    NASA Astrophysics Data System (ADS)

    Barillere, R.; Cabel, H.; Chan, B.; Goulas, I.; Le Goff, J. M.; Vinot, L.; Willmott, C.; Milcent, H.; Huuskonen, P.

    1994-12-01

    The Cortex control information system framework is being developed at CERN. It offers basic functions to allow the sharing of information, control and analysis functions; it presents a uniform human interface for such information and functions; it permits upgrades and additions without code modification and it is sufficiently generic to allow its use by most of the existing or future control systems at CERN. Services will include standard interfaces to user-supplied functions, analysis, archive and event management. Cortex does not attempt to carry out the direct data acquisition or control of the devices; these are activities which are highly specific to the application and are best done by commercial systems or user-written programs. Instead, Cortex integrates these application-specific pieces and supports them by supplying other commonly needed facilities such as collaboration, analysis, diagnosis and user assistance.

  6. Data acquisition using the 168/E [CERN ISR]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carroll, J.T.; Cittolin, S.; Demoulin, M.

    1983-03-01

    Event sizes and data rates at the CERN antiproton-proton collider present a formidable environment for a high-level trigger. A system using three 168/E processors for experiment UA1 real-time event selection is described. With the 168/E data memory expanded to 512K bytes, each processor holds a complete event, allowing a FORTRAN trigger algorithm access to data from the entire detector. A smart CAMAC interface reads five Remus branches in parallel, transferring one word to the target processor every 0.5 μs. The NORD host computer can simultaneously read an accepted event from another processor.

  8. The evolution of the ISOLDE control system

    NASA Astrophysics Data System (ADS)

    Jonsson, O. C.; Catherall, R.; Deloose, I.; Evensen, A. H. M.; Gase, K.; Focker, G. J.; Fowler, A.; Kugler, E.; Lettry, J.; Olesen, G.; Ravn, H. L.; Drumm, P.

    1996-04-01

    The ISOLDE on-line mass separator facility has been operating on a Personal Computer-based control system since spring 1992. Front End Computers accessing the hardware are controlled from consoles running Microsoft Windows® through a Novell NetWare4® local area network. The control system is transparently integrated into the CERN-wide office network and makes heavy use of the CERN standard office application programs to control and document the running of the ISOLDE isotope separators. This paper recalls the architecture of the control system, shows its recent developments and gives some examples of its graphical user interface.

  9. Open access for ALICE analysis based on virtualization technology

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Gheata, M.; Schutz, Y.

    2015-12-01

    Open access is an important lever for long-term data preservation in a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime, it is crucial that third-party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE-specific contextualization. Once the virtual machine is launched, a graphical user interface is started automatically without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.

  10. CERN alerter—RSS based system for information broadcast to all CERN offices

    NASA Astrophysics Data System (ADS)

    Otto, R.

    2008-07-01

    Nearly every large organization uses a tool to broadcast messages and information across the internal campus (messages like alerts announcing interruptions in services, or just information about upcoming events). These tools typically allow administrators (operators) to send 'targeted' messages which are delivered only to specific groups of users or computers, e.g. only those located in a specified building or connected to a particular computing service. CERN has a long history of such tools: CERN VMS's SPM "MESSAGE" command, Zephyr [2] and, most recently, the NICE Alerter based on the NNTP protocol. The NICE Alerter, used on all Windows-based computers, had to be phased out as a consequence of phasing out NNTP at CERN. The new solution to broadcast information messages on the CERN campus continues to provide the service based on cross-platform technologies, hence minimizing custom developments and relying on commercial software as much as possible. The new system, called CERN Alerter, is based on RSS (Really Simple Syndication) [9] for the transport protocol and uses Microsoft SharePoint as the backend for the database and posting interface. The Windows-based client relies on Internet Explorer 7.0 with custom code to trigger the window pop-ups and notifications for new events. Linux and Mac OS X clients can rely on any RSS reader to subscribe to targeted notifications. The paper covers the architecture and implementation aspects of the new system.
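    At the client end, an RSS-based alerter of this kind boils down to polling a feed and keeping only the items targeted at the subscriber's groups. A minimal sketch using Python's standard-library XML parser (the feed layout and the use of `category` for targeting are assumptions for illustration, not the actual CERN Alerter schema):

```python
import xml.etree.ElementTree as ET

# A toy RSS 2.0 feed; each alert carries a <category> naming its target group.
FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Alerts</title>
  <item><title>Network maintenance</title><category>building-513</category></item>
  <item><title>Fire drill</title><category>all</category></item>
  <item><title>Service upgrade</title><category>building-40</category></item>
</channel></rss>"""

def targeted_alerts(feed_xml, my_groups):
    """Return the titles of alerts whose category matches one of the
    subscriber's groups (the 'targeted message' behaviour)."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title")
            for item in root.iter("item")
            if item.findtext("category") in my_groups]

# A client in building 513 sees its own alerts plus the site-wide ones.
print(targeted_alerts(FEED, {"building-513", "all"}))
```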

  11. The Shock and Vibration Digest. Volume 4. Number 7, July 1972.

    DTIC Science & Technology

    1972-07-01

    ...who are concerned with maximum reliability of missiles, aircraft, submarines... The structural analysis program NASTRAN will be discussed. ...within a designated epsilon at the interface between air and the first fluid. Trial solutions are made until the desired solution is bracketed and then...

  12. Deploying the ATLAS Metadata Interface (AMI) on the cloud with Jenkins

    NASA Astrophysics Data System (ADS)

    Lambert, F.; Odier, J.; Fulachier, J.; ATLAS Collaboration

    2017-10-01

    The ATLAS Metadata Interface (AMI) is a mature application with more than 15 years of existence. Mainly used by the ATLAS experiment at CERN, it consists of a very generic tool ecosystem for metadata aggregation and cataloguing. AMI is used by the ATLAS production system, therefore the service must guarantee a high level of availability. We describe our monitoring and administration systems, and the Jenkins-based strategy used to dynamically test and deploy cloud OpenStack nodes on demand.

  13. The keys to CERN conference rooms - Managing local collaboration facilities in large organisations

    NASA Astrophysics Data System (ADS)

    Baron, T.; Domaracky, M.; Duran, G.; Fernandes, J.; Ferreira, P.; Gonzalez Lopez, J. B.; Jouberjean, F.; Lavrut, L.; Tarocco, N.

    2014-06-01

    For a long time HEP has been ahead of the curve in its usage of remote collaboration tools, like videoconference and webcast, while the local CERN collaboration facilities were somewhat behind the expected quality standards for various reasons. That period ended in 2012, when the CERN IT department created an integrated conference room service which provides guidance and installation services for new rooms (whether equipped for videoconference or not), as well as maintenance and local support. Now managing nearly half of the 246 meeting rooms available on the CERN sites, this service has been built to cope with the management of all CERN rooms with limited human resources. This has been made possible by the intensive use of professional software to manage and monitor all the room equipment, maintenance and activity. This paper focuses on presenting these packages, either off-the-shelf commercial products (asset and maintenance management tools, remote audio-visual equipment monitoring systems, local automation devices, new-generation touch-screen interfaces for interacting with the room) when available, or locally developed integration and operational layers (a generic audio-visual control and monitoring framework), and how they help overcome the challenges presented by such a service. The aim is to minimise local human interventions while preserving the highest service quality and placing the end user back in the centre of this collaboration platform.

  15. COSMO 09

    ScienceCinema

    None

    2018-02-13

    This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute Particle Cosmology which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Link to last editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin) List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line.

  16. HPC in a HEP lab: lessons learned from setting up cost-effective HPC clusters

    NASA Astrophysics Data System (ADS)

    Husejko, Michal; Agtzidis, Ioannis; Baehler, Pierre; Dul, Tadeusz; Evans, John; Himyr, Nils; Meinhard, Helge

    2015-12-01

    In this paper we present our findings gathered during the evaluation and testing of Windows Server High-Performance Computing (Windows HPC) in view of potentially using it as a production HPC system for engineering applications. The Windows HPC package, an extension of Microsoft's Windows Server product, provides all essential interfaces, utilities and management functionality for creating, operating and monitoring a Windows-based HPC cluster infrastructure. The evaluation and test phase was focused on verifying the functionalities of Windows HPC, its performance, support of commercial tools and the integration with the users' work environment. We describe constraints imposed by the way the CERN Data Centre is operated, licensing for engineering tools, and the scalability and behaviour of the HPC engineering applications used at CERN. We will present an initial set of requirements, which were created based on the above constraints and requests from the CERN engineering user community. We will explain how we have configured Windows HPC clusters to provide the job scheduling functionalities required to support the CERN engineering user community: quality of service, user- and project-based priorities, and fair access to limited resources. Finally, we will present several performance tests we carried out to verify Windows HPC performance and scalability.

  17. Accelerator controls at CERN: Some converging trends

    NASA Astrophysics Data System (ADS)

    Kuiper, B.

    1990-08-01

    The growth of CERN's services to the high-energy physics community with frozen resources has led to the implementation of "Technical Boards", mandated to assist the management by making recommendations for rationalizations in various technological domains. The Board on Process Control and Electronics for Accelerators, TEBOCO, has emphasized four main lines which might yield economies in resources. First, a common architecture for accelerator controls has been agreed between the three accelerator divisions. Second, a common hardware/software kit has been defined, from which the large majority of future process interfacing may be composed. A support service for this kit is an essential part of the plan. Third, high-level protocols have been developed for standardizing access to process devices. They derive from agreed standard models of the devices and involve a standard control message. This should ease application development and the mobility of equipment. Fourth, a common software engineering methodology and a commercial package of application development tools have been adopted. Some rationalization in the field of the man-machine interface and in matters of synchronization is also under way.
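    The "standard control message" idea above, one agreed envelope plus a standard device model, so that every piece of equipment answers the same get/set shape, can be sketched as follows (all names and fields are illustrative, not the actual CERN protocol):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlMessage:
    """A single agreed envelope for process-device access, so applications
    need not know each piece of equipment's native protocol (sketch only)."""
    device: str                    # device name, e.g. a power converter
    prop: str                      # the device property being addressed
    action: str                    # "get" or "set"
    value: Optional[float] = None  # payload, used only for "set"

class PowerConverter:
    """Standard device model: every device type answers the same message shape."""

    def __init__(self):
        self.properties = {"current": 0.0}

    def handle(self, msg: ControlMessage):
        if msg.action == "set":
            self.properties[msg.prop] = msg.value
            return "ok"
        return self.properties[msg.prop]

pc = PowerConverter()
pc.handle(ControlMessage("PC.RING.01", "current", "set", 120.5))
print(pc.handle(ControlMessage("PC.RING.01", "current", "get")))
```

    The benefit named in the abstract follows directly: an application written against `ControlMessage` keeps working when equipment is moved or replaced, because only the device-side `handle` changes.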

  18. Commissioning the CERN IT Agile Infrastructure with experiment workloads

    NASA Astrophysics Data System (ADS)

    Medrano Llamas, Ramón; Harald Barreiro Megino, Fernando; Kucharczyk, Katarzyna; Kamil Denis, Marek; Cinquilli, Mattia

    2014-06-01

    In order to ease the management of their infrastructure, most of the WLCG sites are adopting cloud-based strategies. CERN, the Tier-0 of the WLCG, is completely restructuring the resource and configuration management of its computing centre under the codename Agile Infrastructure. Its goal is to manage 15,000 virtual machines by means of an OpenStack middleware in order to unify all the resources in CERN's two datacenters: the one placed in Meyrin and the new one in Wigner, Hungary. During the commissioning of this infrastructure, CERN IT is offering an attractive amount of computing resources to the experiments (800 cores for ATLAS and CMS) through a private cloud interface. ATLAS and CMS have joined forces to exploit them by running stress tests and simulation workloads since November 2012. This work will describe the experience of the first deployments of the current experiment workloads on the CERN private cloud testbed. The paper is organized as follows: the first section will explain the integration of the experiment workload management systems (WMS) with the cloud resources. The second section will revisit the performance and stress testing performed with HammerCloud in order to evaluate and compare the suitability for the experiment workloads. The third section will go deeper into the dynamic provisioning techniques, such as the use of the cloud APIs directly by the WMS. The paper finishes with a review of the conclusions and the challenges ahead.

  19. Integrating Containers in the CERN Private Cloud

    NASA Astrophysics Data System (ADS)

    Noel, Bertrand; Michelino, Davide; Velten, Mathieu; Rocha, Ricardo; Trigazis, Spyridon

    2017-10-01

    Containers remain a hot topic in computing, with new use cases and tools appearing every day. Basic functionality such as spawning containers seems to have settled, but topics like volume support or networking are still evolving. Solutions like Docker Swarm, Kubernetes or Mesos provide similar functionality but target different use cases, exposing distinct interfaces and APIs. The CERN private cloud is made of thousands of nodes and users, with many different use cases. A single solution for container deployment would not cover every one of them, and supporting multiple solutions involves repeating the same process multiple times for integration with authentication services, storage services or networking. In this paper we describe OpenStack Magnum as the solution to offer container management in the CERN cloud. We will cover its main functionality and some advanced use cases using Docker Swarm and Kubernetes, highlighting some relevant differences between the two. We will describe the most common use cases in HEP and how we integrated popular services like CVMFS or AFS in the most transparent way possible, along with some limitations found. Finally we will look into ongoing work on advanced scheduling for both Swarm and Kubernetes, support for running batch like workloads and integration of container networking technologies with the CERN infrastructure.

  20. Evolution of the architecture of the ATLAS Metadata Interface (AMI)

    NASA Astrophysics Data System (ADS)

    Odier, J.; Aidel, O.; Albrand, S.; Fulachier, J.; Lambert, F.

    2015-12-01

    The ATLAS Metadata Interface (AMI) is now a mature application. Over the years, the number of users and the number of provided functions has dramatically increased. It is necessary to adapt the hardware infrastructure in a seamless way so that the quality of service remains high. We describe the evolution of AMI from its beginnings, when it was served by a single MySQL backend database server, to its current state: a cluster of virtual machines at the French Tier-1, an Oracle database at Lyon with complementary replication to the Oracle DB at CERN, and an AMI back-up server.

  1. A MOdular System for Acquisition, Interface and Control (MOSAIC) of detectors and their related electronics for high energy physics experiment

    NASA Astrophysics Data System (ADS)

    Robertis, G. De; Fanizzi, G.; Loddo, F.; Manzari, V.; Rizzi, M.

    2018-02-01

    In this work the MOSAIC ("MOdular System for Acquisition, Interface and Control") board, designed for the readout and testing of the pixel modules for the silicon tracker upgrade of the ALICE (A Large Ion Collider Experiment) experiment at the CERN LHC, is described. It is based on an Artix-7 Field Programmable Gate Array device by Xilinx and is compliant with the six-unit "Versa Modular Eurocard" standard (6U-VME) for easy housing in a standard VMEbus crate, from which it takes only power supplies and cooling.

  2. The ALICE Software Release Validation cluster

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Krzewicki, M.

    2015-12-01

    One of the most important steps of the software lifecycle is Quality Assurance: this process comprises both automatic tests and manual reviews, and all of them must pass successfully before the software is approved for production. Some tests, such as source code static analysis, are executed on a single dedicated service: in High Energy Physics, a full simulation and reconstruction chain on a distributed computing environment, backed with a sample “golden” dataset, is also necessary for the quality sign-off. The ALICE experiment uses dedicated and virtualized computing infrastructures for the Release Validation in order not to taint the production environment (i.e. CVMFS and the Grid) with non-validated software and validation jobs: the ALICE Release Validation cluster is a disposable virtual cluster appliance based on CernVM and the Virtual Analysis Facility, capable of deploying on demand, and with a single command, a dedicated virtual HTCondor cluster with an automatically scalable number of virtual workers on any cloud supporting the standard EC2 interface. Input and output data are externally stored on EOS, and a dedicated CVMFS service is used to provide the software to be validated. We will show how the Release Validation cluster deployment and disposal are completely transparent for the Release Manager, who simply triggers the validation from the ALICE build system's web interface. CernVM 3, based entirely on CVMFS, makes it possible to boot any past snapshot of the operating system: we will show how this allows us to certify each ALICE software release for an exact CernVM snapshot, addressing the problem of Long-Term Data Preservation by ensuring a consistent environment for software execution and data reprocessing in the future.
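    The "automatically scalable number of virtual workers" reduces to a simple elasticity rule: size the worker pool from the depth of the validation queue, clamped to the cloud tenant's quota. A hypothetical sketch (the numbers and the policy itself are illustrative, not the actual Virtual Analysis Facility logic):

```python
import math

def workers_needed(queued_jobs, jobs_per_worker=8, min_workers=1, max_workers=50):
    """Elastic pool size: enough virtual workers to drain the validation
    queue, clamped between a warm minimum and the cloud quota (sketch)."""
    wanted = math.ceil(queued_jobs / jobs_per_worker)
    return max(min_workers, min(wanted, max_workers))

# Idle queue keeps one warm worker; a burst is capped by the quota.
for queue_depth in (0, 17, 1000):
    print(queue_depth, "->", workers_needed(queue_depth))
```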

  3. EOS developments

    NASA Astrophysics Data System (ADS)

    Sindrilaru, Elvin A.; Peters, Andreas J.; Adde, Geoffray M.; Duellmann, Dirk

    2017-10-01

    CERN has been developing and operating EOS as a disk storage solution successfully for over 6 years. The CERN deployment provides 135 PB and stores 1.2 billion replicas distributed over two computer centres. The deployment includes four LHC instances, a shared instance for smaller experiments and, since last year, an instance for individual user data as well. The user instance represents the backbone of the CERNBOX service for file sharing. New use cases like synchronisation and sharing, the planned migration to reduce AFS usage at CERN and the continuous growth have brought EOS to new challenges. Recent developments include the integration and evaluation of various technologies for the transition from a single active in-memory namespace to a scale-out implementation distributed over many meta-data servers. The new architecture aims to separate the data from the application logic and user interface code, thus providing flexibility and scalability to the namespace component. Another important goal is to provide EOS as a CERN-wide mounted filesystem with strong authentication, making it a single storage repository accessible via various services and front-ends (/eos initiative). This required new developments in the security infrastructure of the EOS FUSE implementation. Furthermore, there were a series of improvements targeting the end-user experience, like tighter consistency and latency optimisations. In collaboration with Seagate as an Openlab partner, EOS has a complete integration of the OpenKinetic object drive cluster as a high-throughput, high-availability, low-cost storage solution. This contribution will discuss these three main development projects and present new performance metrics.

  4. The third level trigger and output event unit of the UA1 data-acquisition system

    NASA Astrophysics Data System (ADS)

    Cittolin, S.; Demoulin, M.; Fucci, A.; Haynes, W.; Martin, B.; Porte, J. P.; Sphicas, P.

    1989-12-01

    The upgraded UA1 experiment utilizes twelve 3081/E emulators for its third-level trigger system. The system is interfaced to VME, and is controlled by 68000-microprocessor VME boards on the input and output. The output controller communicates with an IBM 9375 mainframe via the CERN-IBM developed VICI interface. The events selected by the emulators are output on IBM 3480 cassettes. The user interface to this system is based on a series of Macintosh personal computers connected to the VME bus. These Macs are also used for developing software for the emulators and for monitoring the entire system. The same configuration has also been used for offline event reconstruction. A description of the system, together with details of both the online and offline modes of operation and an evaluation of its performance, is presented.

  5. ATLAS EventIndex monitoring system using the Kibana analytics and visualization platform

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Cárdenas Zárate, S. E.; Favareto, A.; Fernandez Casani, A.; Gallas, E. J.; Garcia Montoro, C.; Gonzalez de la Hoz, S.; Hrivnac, J.; Malon, D.; Prokoshin, F.; Salt, J.; Sanchez, J.; Toebbicke, R.; Yuan, R.; ATLAS Collaboration

    2016-10-01

    The ATLAS EventIndex is a data catalogue system that stores event-related metadata for all (real and simulated) ATLAS events, on all processing stages. As it consists of different components that depend on other applications (such as distributed storage, and different sources of information) we need to monitor the conditions of many heterogeneous subsystems, to make sure everything is working correctly. This paper describes how we gather information about the EventIndex components and related subsystems: the Producer-Consumer architecture for data collection, health parameters from the servers that run EventIndex components, EventIndex web interface status, and the Hadoop infrastructure that stores EventIndex data. This information is collected, processed, and then displayed using CERN service monitoring software based on the Kibana analytic and visualization package, provided by CERN IT Department. EventIndex monitoring is used both by the EventIndex team and ATLAS Distributed Computing shifts crew.
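    The Producer-Consumer data collection described above can be sketched with a thread-safe queue: producers probe each heterogeneous subsystem and enqueue a status record, while a consumer drains the queue and aggregates the records for the dashboard. The subsystem names and record fields below are illustrative, not the actual EventIndex monitoring schema:

```python
import queue
import threading

metrics = queue.Queue()

def probe(subsystem, healthy):
    """Producer: check one subsystem and enqueue its health status."""
    metrics.put({"subsystem": subsystem, "healthy": healthy})

def collect(n):
    """Consumer: drain n reports and aggregate them for display."""
    report = {}
    for _ in range(n):
        m = metrics.get()
        report[m["subsystem"]] = m["healthy"]
    return report

# Probe a few heterogeneous subsystems concurrently, then aggregate.
subsystems = [("hadoop", True), ("web-interface", True), ("producers", False)]
threads = [threading.Thread(target=probe, args=s) for s in subsystems]
for t in threads:
    t.start()
for t in threads:
    t.join()

status = collect(len(subsystems))
print(status)  # subsystem -> health flag, independent of probe order
```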

  6. Tape SCSI monitoring and encryption at CERN

    NASA Astrophysics Data System (ADS)

    Laskaridis, Stefanos; Bahyl, V.; Cano, E.; Leduc, J.; Murray, S.; Cancio, G.; Kruse, D.

    2017-10-01

    CERN currently manages the largest data archive in the HEP domain; over 180 PB of custodial data is archived across 7 enterprise tape libraries containing more than 25,000 tapes and using over 100 tape drives. Archival storage at this scale requires a leading-edge monitoring infrastructure that acquires live and lifelong metrics from the hardware in order to assess and proactively identify potential drive- and media-level issues. In addition, protecting the privacy of sensitive archival data is becoming increasingly important and with it the need for a scalable, compute-efficient and cost-effective solution for data encryption. In this paper, we first describe the implementation of acquiring tape medium and drive related metrics reported by the SCSI interface and its integration with our monitoring system. We then address the incorporation of tape drive real-time encryption with dedicated drive hardware into the CASTOR [1] hierarchical mass storage system.
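
The metric-acquisition step described above amounts to querying SCSI log pages and turning counter lines into structured metrics. A minimal sketch of the parsing side, assuming text in the "name: value" style produced by tools such as `sg_logs` from sg3_utils (the sample field names are illustrative; actual counters vary by drive vendor):

```python
def parse_log_sense(text):
    """Turn 'name: value' counter lines into a metrics dict.
    Non-numeric and malformed lines are skipped."""
    metrics = {}
    for line in text.splitlines():
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        value = value.strip()
        if value.isdigit():
            metrics[key.strip().lower().replace(" ", "_")] = int(value)
    return metrics

# Illustrative sample in the style of sg_logs output, not real drive data.
sample = """\
Write errors corrected with delays: 0
Total uncorrected write errors: 2
Megabytes written: 1048576
"""
print(parse_log_sense(sample))
```

In a monitoring system these dicts would be timestamped per drive and cartridge and shipped to the archival metrics store.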

  7. Perfmon2: a leap forward in performance monitoring

    NASA Astrophysics Data System (ADS)

    Jarp, S.; Jurga, R.; Nowak, A.

    2008-07-01

    This paper describes the software component, perfmon2, that is about to be added to the Linux kernel as the standard interface to the Performance Monitoring Unit (PMU) on common processors, including x86 (AMD and Intel), Sun SPARC, MIPS, IBM Power and Intel Itanium. It also describes a set of tools for doing performance monitoring in practice and details how the CERN openlab team has participated in the testing and development of these tools.

  8. The Evolution of CERN EDMS

    NASA Astrophysics Data System (ADS)

    Wardzinska, Aleksandra; Petit, Stephan; Bray, Rachel; Delamare, Christophe; Garcia Arza, Griselda; Krastev, Tsvetelin; Pater, Krzysztof; Suwalska, Anna; Widegren, David

    2015-12-01

    Large-scale long-term projects such as the LHC require the ability to store, manage, organize and distribute large amounts of engineering information, covering a wide spectrum of fields. This information is a living material, evolving in time and following specific lifecycles. It has to reach the next generations of engineers so they understand how their predecessors designed, crafted, operated and maintained the most complex machines ever built. This is the role of CERN EDMS. The Engineering and Equipment Data Management Service has served the High Energy Physics community for over 15 years. It is CERN's official PLM (Product Lifecycle Management) system, supporting engineering communities in their collaborations inside and outside the laboratory. EDMS is integrated with the CAD (Computer-Aided Design) and CMMS (Computerized Maintenance Management System) systems used at CERN, providing tools for engineers who work in different domains and who are not PLM specialists. Over the years, human collaborations and machines grew in size and complexity. So did EDMS: it is currently home to more than 2 million files and documents, and has over 6,000 active users. In April 2014 we released a new major version of EDMS, featuring a complete makeover of the web interface, improved responsiveness and enhanced functionality. Following the results of user surveys and building upon feedback received from key user groups, we believe we have delivered a system that is more attractive and makes it easier to perform complex tasks. In this paper we describe the main functions and the architecture of EDMS. We discuss the available integration options, which enable further evolution and automation of engineering data management. We also present our plans for the future development of EDMS.

  9. Monitoring tools of COMPASS experiment at CERN

    NASA Astrophysics Data System (ADS)

    Bodlak, M.; Frolov, V.; Huber, S.; Jary, V.; Konorov, I.; Levit, D.; Novy, J.; Salac, R.; Tomsa, J.; Virius, M.

    2015-12-01

    This paper briefly introduces the data acquisition system of the COMPASS experiment and is mainly focused on the part responsible for monitoring the nodes in the newly developed data acquisition system of this experiment. COMPASS is a high-energy particle physics experiment with a fixed target, located at the SPS of the CERN laboratory in Geneva, Switzerland. The hardware of the data acquisition system has been upgraded to use FPGA cards that are responsible for data multiplexing and event building. The software counterpart of the system includes several processes deployed in a heterogeneous network environment. There are two processes, namely Message Logger and Message Browser, taking care of monitoring. These tools handle messages generated by nodes in the system. While Message Logger collects and saves messages to the database, the Message Browser serves as a graphical interface over the database containing these messages. For better performance, certain database optimizations have been used. Lastly, results of performance tests are presented.
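
The Message Logger / Message Browser split described above can be sketched with a small persistence layer: one side inserts node messages into a database, the other side queries them. The schema and node names are invented for illustration; COMPASS's actual database layout is not shown here:

```python
import sqlite3

# Logger side: persist messages emitted by DAQ nodes.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE messages (
    node TEXT, severity TEXT, body TEXT,
    ts DATETIME DEFAULT CURRENT_TIMESTAMP)""")

def log_message(node, severity, body):
    # Parameterized insert, as any production logger would use.
    conn.execute("INSERT INTO messages (node, severity, body) VALUES (?, ?, ?)",
                 (node, severity, body))
    conn.commit()

log_message("mux01", "INFO", "event building started")
log_message("mux02", "ERROR", "link desynchronised")

# Browser side: filter by severity, as the GUI would when an operator
# asks for errors only.
rows = conn.execute(
    "SELECT node, body FROM messages WHERE severity = 'ERROR'").fetchall()
print(rows)  # → [('mux02', 'link desynchronised')]
```

An index on the severity and timestamp columns is the kind of database optimization such a browser benefits from when message volumes grow.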

  10. Detector Control System for the AFP detector in ATLAS experiment at CERN

    NASA Astrophysics Data System (ADS)

    Banaś, E.; Caforio, D.; Czekierda, S.; Hajduk, Z.; Olszowska, J.; Seabra, L.; Šícho, P.

    2017-10-01

    The ATLAS Forward Proton (AFP) detector consists of two forward detectors located at 205 m and 217 m on either side of the ATLAS experiment. The aim is to measure the momenta and angles of diffractively scattered protons. In 2016, two detector stations on one side of the ATLAS interaction point were installed and commissioned. The detector infrastructure and necessary services were installed and are supervised by the Detector Control System (DCS), which is responsible for the coherent and safe operation of the detector. The large variety of equipment used represents a considerable challenge for the AFP DCS design. The industrial Supervisory Control and Data Acquisition (SCADA) product Siemens WinCC OA, together with the CERN Joint Control Project (JCOP) framework and standard industrial and custom-developed server applications and protocols, is used for reading, processing, monitoring and archiving of the detector parameters. Graphical user interfaces allow for overall detector operation and visualization of the detector status. Parameters important for the detector safety are used for alert generation and interlock mechanisms.
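
The alert-generation and interlock logic mentioned above can be sketched as threshold checks over monitored parameters: crossing a warning limit raises an alert, crossing a safety limit trips an interlock. All parameter names and limit values below are invented for illustration; the real system implements this inside WinCC OA/JCOP:

```python
# Illustrative safety limits per monitored parameter (hypothetical values).
LIMITS = {
    "sensor_temperature_C": {"warning": 40.0, "interlock": 55.0},
    "bias_voltage_V":       {"warning": 120.0, "interlock": 150.0},
}

def evaluate(readings):
    """Classify each reading: above the interlock limit trips the
    interlock; above only the warning limit raises an alert."""
    alerts, interlocks = [], []
    for name, value in readings.items():
        limits = LIMITS.get(name)
        if limits is None:
            continue  # parameter not safety-relevant
        if value >= limits["interlock"]:
            interlocks.append(name)
        elif value >= limits["warning"]:
            alerts.append(name)
    return alerts, interlocks

alerts, interlocks = evaluate({"sensor_temperature_C": 47.5,
                               "bias_voltage_V": 151.0})
print(alerts, interlocks)  # → ['sensor_temperature_C'] ['bias_voltage_V']
```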

  11. Data Aggregation System: A system for information retrieval on demand over relational and non-relational distributed data sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ball, G.; Kuznetsov, V.; Evans, D.

    We present the Data Aggregation System, a system for information retrieval and aggregation from heterogeneous sources of relational and non-relational data for the Compact Muon Solenoid experiment on the CERN Large Hadron Collider. The experiment currently has a number of organically-developed data sources, including front-ends to a number of different relational databases and non-database data services which do not share common data structures or APIs (Application Programming Interfaces), and cannot at this stage be readily converged. DAS provides a single interface for querying all these services, a caching layer to speed up access to expensive underlying calls and the ability to merge records from different data services pertaining to a single primary key.
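
The three DAS ingredients named above (a single query interface, a cache in front of expensive calls, and record merging by primary key) can be sketched in a few lines. The service names and payloads below are invented placeholders, not the real CMS data-service APIs:

```python
import functools

@functools.lru_cache(maxsize=None)
def query_service(service, key):
    """Stand-in for an expensive call to one backing data service;
    lru_cache plays the role of the DAS caching layer."""
    fake_backends = {
        "dbs":    {"dataset42": {"events": 1_000_000}},
        "phedex": {"dataset42": {"sites": ["T1_US_FNAL"]}},
    }
    # Return a hashable-args-keyed, cacheable snapshot of the record.
    return tuple(sorted(fake_backends[service].get(key, {}).items()))

def das_query(key, services=("dbs", "phedex")):
    """Single entry point: fan out to every service and merge the
    records that share the same primary key."""
    merged = {"key": key}
    for service in services:
        merged.update(dict(query_service(service, key)))
    return merged

print(das_query("dataset42"))
# → {'key': 'dataset42', 'events': 1000000, 'sites': ['T1_US_FNAL']}
```

Repeating the query hits the cache rather than the (simulated) backends, which is the behaviour the DAS caching layer provides for real service calls.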

  12. Engineering the CernVM-Filesystem as a High Bandwidth Distributed Filesystem for Auxiliary Physics Data

    NASA Astrophysics Data System (ADS)

    Dykstra, D.; Bockelman, B.; Blomer, J.; Herner, K.; Levshina, T.; Slyz, M.

    2015-12-01

    A common use pattern in the computing models of particle physics experiments is running many distributed applications that read from a shared set of data files. We refer to this data as auxiliary data, to distinguish it from (a) event data from the detector (which tends to be different for every job), and (b) conditions data about the detector (which tends to be the same for each job in a batch of jobs). Conditions data also tends to be relatively small per job, whereas both event data and auxiliary data are larger per job. Unlike event data, auxiliary data comes from a limited working set of shared files. Since there is spatial locality in the auxiliary data access, the use case appears to be identical to that of the CernVM-Filesystem (CVMFS). However, we show that distributing auxiliary data through CVMFS causes the existing CVMFS infrastructure to perform poorly. We utilize a CVMFS client feature called "alien cache" to cache data on existing local high-bandwidth data servers that were engineered for storing event data. This cache is shared between the worker nodes at a site and replaces caching CVMFS files on both the worker node local disks and on the site's local squids. We have tested this alien cache with the dCache NFSv4.1 interface, Lustre, and the Hadoop Distributed File System (HDFS) FUSE interface, and measured performance. In addition, we use high-bandwidth data servers at central sites to perform the CVMFS Stratum 1 function instead of the low-bandwidth web servers deployed for the CVMFS software distribution function. We have tested this using the dCache HTTP interface. As a result, we have a design for an end-to-end high-bandwidth distributed caching read-only filesystem, using existing client software already widely deployed to grid worker nodes and existing file servers already widely installed at grid sites. Files are published in a central place and are soon available on demand throughout the grid, cached locally on the site, with a convenient POSIX interface. This paper discusses the details of the architecture and reports performance measurements.
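
On the client side, the alien cache described above is enabled through the CVMFS client configuration. A minimal sketch, assuming a site-specific mount point for the shared data server (the path is an invented example; consult the CernVM-FS client documentation for the authoritative parameter semantics):

```
# /etc/cvmfs/default.local (illustrative fragment)
CVMFS_ALIEN_CACHE=/mnt/shared-data-server/cvmfs-cache   # cache lives on the high-bandwidth server
CVMFS_SHARED_CACHE=no     # the alien cache requires the shared local cache to be disabled
CVMFS_QUOTA_LIMIT=-1      # cache cleanup is managed externally, not by the CVMFS client
```

With this in place every worker node at the site reads through the same externally managed cache instead of its local disk and the site squids.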

  13. Engineering the CernVM-Filesystem as a High Bandwidth Distributed Filesystem for Auxiliary Physics Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykstra, D.; Bockelman, B.; Blomer, J.

    A common use pattern in the computing models of particle physics experiments is running many distributed applications that read from a shared set of data files. We refer to this data as auxiliary data, to distinguish it from (a) event data from the detector (which tends to be different for every job), and (b) conditions data about the detector (which tends to be the same for each job in a batch of jobs). Conditions data also tends to be relatively small per job, whereas both event data and auxiliary data are larger per job. Unlike event data, auxiliary data comes from a limited working set of shared files. Since there is spatial locality in the auxiliary data access, the use case appears to be identical to that of the CernVM-Filesystem (CVMFS). However, we show that distributing auxiliary data through CVMFS causes the existing CVMFS infrastructure to perform poorly. We utilize a CVMFS client feature called 'alien cache' to cache data on existing local high-bandwidth data servers that were engineered for storing event data. This cache is shared between the worker nodes at a site and replaces caching CVMFS files on both the worker node local disks and on the site's local squids. We have tested this alien cache with the dCache NFSv4.1 interface, Lustre, and the Hadoop Distributed File System (HDFS) FUSE interface, and measured performance. In addition, we use high-bandwidth data servers at central sites to perform the CVMFS Stratum 1 function instead of the low-bandwidth web servers deployed for the CVMFS software distribution function. We have tested this using the dCache HTTP interface. As a result, we have a design for an end-to-end high-bandwidth distributed caching read-only filesystem, using existing client software already widely deployed to grid worker nodes and existing file servers already widely installed at grid sites. Files are published in a central place and are soon available on demand throughout the grid, cached locally on the site, with a convenient POSIX interface. This paper discusses the details of the architecture and reports performance measurements.

  14. CernVM WebAPI - Controlling Virtual Machines from the Web

    NASA Astrophysics Data System (ADS)

    Charalampidis, I.; Berzano, D.; Blomer, J.; Buncic, P.; Ganis, G.; Meusel, R.; Segal, B.

    2015-12-01

    Lately, there is a trend in scientific projects to look for computing resources in the volunteering community. In addition, to reduce the development effort required to port the scientific software stack to all the known platforms, the use of Virtual Machines (VMs) is becoming increasingly popular. Unfortunately, their use further complicates software installation and operation, restricting the volunteer audience to sufficiently expert people. CernVM WebAPI is a software solution addressing this specific case in a way that opens up wide new application opportunities. It offers a very simple API for setting up, controlling and interfacing with a VM instance on the user's computer, while at the same time relieving the user of all the burden of downloading, installing and configuring the hypervisor. WebAPI comes with a lightweight JavaScript library that guides the user through the application installation process. Malicious usage is prevented by a per-domain PKI validation mechanism. In this contribution we will overview this new technology, discuss its security features and examine some test cases where it is already in use.

  15. Development of Network Interface Cards for TRIDAQ systems with the NaNet framework

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Biagioni, A.; Cretaro, P.; Di Lorenzo, S.; Fiorini, M.; Frezza, O.; Lamanna, G.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Neri, I.; Paolucci, P. S.; Pastorelli, E.; Piandani, R.; Pontisso, L.; Rossetti, D.; Simula, F.; Sozzi, M.; Valente, P.; Vicini, P.

    2017-03-01

    NaNet is a framework for the development of FPGA-based PCI Express (PCIe) Network Interface Cards (NICs) with real-time data transport architecture that can be effectively employed in TRIDAQ systems. Key features of the architecture are the flexibility in the configuration of the number and kind of the I/O channels, the hardware offloading of the network protocol stack, the stream processing capability, and the zero-copy CPU and GPU Remote Direct Memory Access (RDMA). Three NIC designs have been developed with the NaNet framework: NaNet-1 and NaNet-10 for the CERN NA62 low level trigger and NaNet3 for the KM3NeT-IT underwater neutrino telescope DAQ system. We will focus our description on the NaNet-10 design, as it is the most complete of the three in terms of capabilities and integrated IPs of the framework.

  16. Optimization on fixed low latency implementation of the GBT core in FPGA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, K.; Chen, H.; Wu, W.

    In the upgrade of the ATLAS experiment, the front-end electronics components are subjected to a large radiation background. Meanwhile, high-speed optical links are required for the data transmission between the on-detector and off-detector electronics. The GBT architecture and the Versatile Link (VL) project were designed by CERN to support the 4.8 Gbps line rate bidirectional high-speed data transmission known as the GBT link. In the ATLAS upgrade, besides the link with the on-detector electronics, the GBT link is also used between different off-detector systems. The GBTX ASIC is designed for the on-detector front-end; correspondingly, for the off-detector electronics, the GBT architecture is implemented in Field Programmable Gate Arrays (FPGA). CERN launched the GBT-FPGA project to provide examples for different types of FPGA. In the ATLAS upgrade framework, the Front-End LInk eXchange (FELIX) system is used to interface the front-end electronics of several ATLAS subsystems. The GBT link is used between them to transfer the detector data and the timing, trigger, control and monitoring information. The trigger signal distributed in the down-link from FELIX to the front-end requires a fixed and low latency. In this paper, several optimizations of the GBT-FPGA IP core are introduced to achieve a lower fixed latency. For FELIX, a common firmware will be used to interface different front-ends with support for both GBT modes: the forward error correction mode and the wide mode. The modified GBT-FPGA core has the ability to switch between the GBT modes without FPGA reprogramming. Finally, the system clock distribution of the multi-channel FELIX firmware is also discussed in this paper.

  17. Optimization on fixed low latency implementation of the GBT core in FPGA

    DOE PAGES

    Chen, K.; Chen, H.; Wu, W.; ...

    2017-07-11

    In the upgrade of the ATLAS experiment, the front-end electronics components are subjected to a large radiation background. Meanwhile, high-speed optical links are required for the data transmission between the on-detector and off-detector electronics. The GBT architecture and the Versatile Link (VL) project were designed by CERN to support the 4.8 Gbps line rate bidirectional high-speed data transmission known as the GBT link. In the ATLAS upgrade, besides the link with the on-detector electronics, the GBT link is also used between different off-detector systems. The GBTX ASIC is designed for the on-detector front-end; correspondingly, for the off-detector electronics, the GBT architecture is implemented in Field Programmable Gate Arrays (FPGA). CERN launched the GBT-FPGA project to provide examples for different types of FPGA. In the ATLAS upgrade framework, the Front-End LInk eXchange (FELIX) system is used to interface the front-end electronics of several ATLAS subsystems. The GBT link is used between them to transfer the detector data and the timing, trigger, control and monitoring information. The trigger signal distributed in the down-link from FELIX to the front-end requires a fixed and low latency. In this paper, several optimizations of the GBT-FPGA IP core are introduced to achieve a lower fixed latency. For FELIX, a common firmware will be used to interface different front-ends with support for both GBT modes: the forward error correction mode and the wide mode. The modified GBT-FPGA core has the ability to switch between the GBT modes without FPGA reprogramming. Finally, the system clock distribution of the multi-channel FELIX firmware is also discussed in this paper.

  18. Optimization on fixed low latency implementation of the GBT core in FPGA

    NASA Astrophysics Data System (ADS)

    Chen, K.; Chen, H.; Wu, W.; Xu, H.; Yao, L.

    2017-07-01

    In the upgrade of the ATLAS experiment [1], the front-end electronics components are subjected to a large radiation background. Meanwhile, high-speed optical links are required for the data transmission between the on-detector and off-detector electronics. The GBT architecture and the Versatile Link (VL) project were designed by CERN to support the 4.8 Gbps line rate bidirectional high-speed data transmission known as the GBT link [2]. In the ATLAS upgrade, besides the link with the on-detector electronics, the GBT link is also used between different off-detector systems. The GBTX ASIC is designed for the on-detector front-end; correspondingly, for the off-detector electronics, the GBT architecture is implemented in Field Programmable Gate Arrays (FPGA). CERN launched the GBT-FPGA project to provide examples for different types of FPGA [3]. In the ATLAS upgrade framework, the Front-End LInk eXchange (FELIX) system [4, 5] is used to interface the front-end electronics of several ATLAS subsystems. The GBT link is used between them to transfer the detector data and the timing, trigger, control and monitoring information. The trigger signal distributed in the down-link from FELIX to the front-end requires a fixed and low latency. In this paper, several optimizations of the GBT-FPGA IP core are introduced to achieve a lower fixed latency. For FELIX, a common firmware will be used to interface different front-ends with support for both GBT modes: the forward error correction mode and the wide mode. The modified GBT-FPGA core has the ability to switch between the GBT modes without FPGA reprogramming. The system clock distribution of the multi-channel FELIX firmware is also discussed in this paper.
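
The two GBT modes mentioned above trade error correction for payload: every 120-bit frame (one per 25 ns, i.e. 4.8 Gbps line rate) spends 4 bits on the header and 4 bits on slow control; the forward error correction mode spends a further 32 bits on the Reed-Solomon code, while the wide mode reclaims those bits as payload. The resulting user bandwidths follow directly:

```python
FRAME_BITS = 120
FRAME_RATE_HZ = 40_000_000  # one frame per LHC bunch-clock period (25 ns)

# Per-frame overhead: header + slow control, plus FEC bits in FEC mode.
OVERHEAD = {"fec": 4 + 4 + 32, "wide": 4 + 4}

def user_bandwidth_gbps(mode):
    """User payload bandwidth for a GBT mode, in Gbps."""
    payload_bits = FRAME_BITS - OVERHEAD[mode]
    return payload_bits * FRAME_RATE_HZ / 1e9

print(user_bandwidth_gbps("fec"))   # → 3.2
print(user_bandwidth_gbps("wide"))  # → 4.48
```

A mode switch therefore changes only how the fixed 120-bit frame is partitioned, which is why a core can support both modes without reprogramming the FPGA.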

  19. Back-end and interface implementation of the STS-XYTER2 prototype ASIC for the CBM experiment

    NASA Astrophysics Data System (ADS)

    Kasinski, K.; Szczygiel, R.; Zabolotny, W.

    2016-11-01

    Each front-end readout ASIC for High-Energy Physics experiments requires a robust and effective hit data streaming and control mechanism. The new STS-XYTER2 full-size prototype chip for the Silicon Tracking System and Muon Chamber detectors in the Compressed Baryonic Matter experiment at the Facility for Antiproton and Ion Research (FAIR, Germany) is a 128-channel time- and amplitude-measuring solution for silicon microstrip and gas detectors. It operates at a 250 kHit/s/channel hit rate, each hit producing 27 bits of information (5-bit amplitude, 14-bit timestamp, position and diagnostics data). The chip back-end implements fast front-end channel read-out, timestamp-wise hit sorting, and data streaming via a scalable interface implementing the dedicated protocol (STS-HCTSP) for chip control and hit transfer with data bandwidth from 9.7 MHit/s up to 47 MHit/s. It also includes multiple options for link diagnostics, failure detection, and throttling features. The back-end is designed to operate with the data acquisition architecture based on the CERN GBTx transceivers. This paper presents the details of the back-end and interface design and its implementation in the UMC 180 nm CMOS process.
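
The 27-bit hit word described above (5-bit amplitude, 14-bit timestamp, with the remaining 8 bits carrying position and diagnostics) can be packed and unpacked with plain bit arithmetic. The field ordering below is an assumption for illustration only; the actual word layout is defined by the STS-HCTSP protocol:

```python
AMP_BITS, TS_BITS, POS_BITS = 5, 14, 8  # 5 + 14 + 8 = 27 bits per hit

def pack_hit(amplitude, timestamp, position):
    """Pack the three fields into one 27-bit word (assumed ordering:
    amplitude in the high bits, position in the low bits)."""
    assert amplitude < (1 << AMP_BITS) and timestamp < (1 << TS_BITS) \
        and position < (1 << POS_BITS)
    return (amplitude << (TS_BITS + POS_BITS)) | (timestamp << POS_BITS) | position

def unpack_hit(word):
    """Recover (amplitude, timestamp, position) from a packed word."""
    position = word & ((1 << POS_BITS) - 1)
    timestamp = (word >> POS_BITS) & ((1 << TS_BITS) - 1)
    amplitude = word >> (TS_BITS + POS_BITS)
    return amplitude, timestamp, position

word = pack_hit(amplitude=21, timestamp=9000, position=77)
print(unpack_hit(word))  # → (21, 9000, 77)
```

The 14-bit timestamp gives the back-end the key it needs for timestamp-wise hit sorting before streaming.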

  20. Managing operational documentation in the ALICE Detector Control System

    NASA Astrophysics Data System (ADS)

    Lechman, M.; Augustinus, A.; Bond, P.; Chochula, P.; Kurepin, A.; Pinazza, O.; Rosinsky, P.

    2012-12-01

    ALICE (A Large Ion Collider Experiment) is one of the big LHC (Large Hadron Collider) experiments at CERN in Geneva, Switzerland. The experiment is composed of 18 sub-detectors controlled by an integrated Detector Control System (DCS) that is implemented using the commercial SCADA package PVSSII. The DCS includes over 1,200 network devices, over 1,000,000 monitored parameters and numerous custom-made software components that are prepared by over 100 developers from all around the world. This complex system is controlled by a single operator via a central user interface. One of his/her main tasks is the recovery of anomalies and errors that may occur during operation. Therefore, clear, complete and easily accessible documentation is essential to guide the shifter through the expert interfaces of the different subsystems. This paper describes the idea of the management of the operational documentation in ALICE using a generic repository that is built on a relational database and is integrated with the control system. The experience gained and the conclusions drawn from the project are also presented.

  1. The LHCb Run Control

    NASA Astrophysics Data System (ADS)

    Alessio, F.; Barandela, M. C.; Callot, O.; Duval, P.-Y.; Franek, B.; Frank, M.; Galli, D.; Gaspar, C.; Herwijnen, E. v.; Jacobsson, R.; Jost, B.; Neufeld, N.; Sambade, A.; Schwemmer, R.; Somogyi, P.

    2010-04-01

    LHCb has designed and implemented an integrated Experiment Control System. The Control System uses the same concepts and the same tools to control and monitor all parts of the experiment: the Data Acquisition System, the Timing and the Trigger Systems, the High Level Trigger Farm, the Detector Control System, the Experiment's Infrastructure and the interaction with the CERN Technical Services and the Accelerator. LHCb's Run Control, the main interface used by the experiment's operator, provides access in a hierarchical, coherent and homogeneous manner to all areas of the experiment and to all its sub-detectors. It allows for automated (or manual) configuration and control, including error recovery, of the full experiment in its different running modes. Different instances of the same Run Control interface are used by the various sub-detectors for their stand-alone activities: test runs, calibration runs, etc. The architecture and the tools used to build the control system, the guidelines and components provided to the developers, as well as the first experience with the usage of the Run Control, will be presented.
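
The hierarchical control idea above can be sketched as state aggregation: each sub-detector reports a state and the parent node derives its own state from its children, so the operator sees one coherent status. The state names and the "worst child wins" rule below are simplified placeholders for the real SMI++-based finite state machines:

```python
# Higher number = more severe; the parent adopts the worst child state.
SEVERITY = {"ERROR": 3, "NOT_READY": 2, "CONFIGURING": 1, "READY": 0}

def aggregate(children):
    """Parent state = the most severe state among the children."""
    return max(children.values(), key=lambda state: SEVERITY[state])

subdetectors = {"VELO": "READY", "RICH": "CONFIGURING", "MUON": "READY"}
print(aggregate(subdetectors))  # → CONFIGURING
```

Running the same aggregation at every level of the tree is what lets one Run Control instance summarize the whole experiment while identical instances drive individual sub-detectors stand-alone.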

  2. Data Mining as a Service (DMaaS)

    NASA Astrophysics Data System (ADS)

    Tejedor, E.; Piparo, D.; Mascetti, L.; Moscicki, J.; Lamanna, M.; Mato, P.

    2016-10-01

    Data Mining as a Service (DMaaS) is a software and computing infrastructure that allows interactive mining of scientific data in the cloud. It allows users to run advanced data analyses by leveraging the widely adopted Jupyter notebook interface. Furthermore, the system makes it easier to share results and scientific code, access scientific software, produce tutorials and demonstrations as well as preserve the analyses of scientists. This paper describes how a first pilot of the DMaaS service is being deployed at CERN, starting from the notebook interface that has been fully integrated with the ROOT analysis framework, in order to provide all the tools for scientists to run their analyses. Additionally, we characterise the service backend, which combines a set of IT services such as user authentication, virtual computing infrastructure, mass storage, file synchronisation, development portals or batch systems. The added value acquired by the combination of the aforementioned categories of services is discussed, focusing on the opportunities offered by the CERNBox synchronisation service and its massive storage backend, EOS.

  3. Section Editors

    NASA Astrophysics Data System (ADS)

    Groep, D. L.; Bonacorsi, D.

    2014-06-01

    1. Data Acquisition, Trigger and Controls: Niko Neufeld (CERN, niko.neufeld@cern.ch); Tassos Belias (Demokritos, belias@inp.demokritos.gr); Andrew Norman (FNAL, anorman@fnal.gov); Vivian O'Dell (FNAL, odell@fnal.gov)
    2. Event Processing, Simulation and Analysis: Rolf Seuster (TRIUMF, seuster@cern.ch); Florian Uhlig (GSI, f.uhlig@gsi.de); Lorenzo Moneta (CERN, Lorenzo.Moneta@cern.ch); Pete Elmer (Princeton, peter.elmer@cern.ch)
    3. Distributed Processing and Data Handling: Nurcan Ozturk (U Texas Arlington, nurcan@uta.edu); Stefan Roiser (CERN, stefan.roiser@cern.ch); Robert Illingworth (FNAL); Davide Salomoni (INFN CNAF, Davide.Salomoni@cnaf.infn.it); Jeff Templon (Nikhef, templon@nikhef.nl)
    4. Data Stores, Data Bases, and Storage Systems: David Lange (LLNL, lange6@llnl.gov); Wahid Bhimji (U Edinburgh, wbhimji@staffmail.ed.ac.uk); Dario Barberis (Genova, Dario.Barberis@cern.ch); Patrick Fuhrmann (DESY, patrick.fuhrmann@desy.de); Igor Mandrichenko (FNAL, ivm@fnal.gov); Mark van de Sanden (SURF SARA, sanden@sara.nl)
    5. Software Engineering, Parallelism & Multi-Core: Solveig Albrand (LPSC/IN2P3, solveig.albrand@lpsc.in2p3.fr); Francesco Giacomini (INFN CNAF, francesco.giacomini@cnaf.infn.it); Liz Sexton (FNAL, sexton@fnal.gov); Benedikt Hegner (CERN, benedikt.hegner@cern.ch); Simon Patton (LBNL, SJPatton@lbl.gov); Jim Kowalkowski (FNAL, jbk@fnal.gov)
    6. Facilities, Infrastructures, Networking and Collaborative Tools: Maria Girone (CERN, Maria.Girone@cern.ch); Ian Collier (STFC RAL, ian.collier@stfc.ac.uk); Burt Holzman (FNAL, burt@fnal.gov); Brian Bockelman (U Nebraska, bbockelm@cse.unl.edu); Alessandro de Salvo (Roma 1, Alessandro.DeSalvo@ROMA1.INFN.IT); Helge Meinhard (CERN, Helge.Meinhard@cern.ch); Ray Pasetes (FNAL, rayp@fnal.gov); Steven Goldfarb (U Michigan, Steven.Goldfarb@cern.ch)

  4. CERN and high energy physics, the grand picture

    ScienceCinema

    Heuer, Rolf-Dieter

    2018-05-24

    The lecture will touch on several topics, to illustrate the role of CERN in the present and future of high-energy physics: how does CERN work? What is the role of the scientific community, of bodies like Council and SPC, and of international cooperation, in the definition of CERN's scientific programme? What are the plans for the future of the LHC and of the non-LHC physics programme? What is the role of R&D and technology transfer at CERN?

  5. Data Acquisition Software for Experiments at the MAMI-C Tagged Photon Facility

    NASA Astrophysics Data System (ADS)

    Oussena, Baya; Annand, John

    2013-10-01

    Tagged-photon experiments at Mainz use the electron beam of the MAMI (Mainzer MIcrotron) accelerator, in combination with the Glasgow Tagged Photon Spectrometer. The AcquDAQ DAQ system is implemented in the C++ language and makes use of CERN ROOT software libraries and tools. Electronic hardware is characterized in C++ classes, based on a general purpose class TDAQmodule, and implementation in an object-oriented framework makes the system very flexible. The DAQ system provides slow control and event-by-event readout of the Photon Tagger, the Crystal Ball 4π electromagnetic calorimeter, central MWPC tracker and plastic-scintillator, particle-ID systems and the TAPS forward-angle calorimeter. A variety of front-end controllers running Linux are supported, reading data from VMEbus, FASTBUS and CAMAC systems. More specialist hardware, based on optical communication systems and developed for the COMPASS experiment at CERN, is also supported. AcquDAQ also provides an interface to configure and control the Mainz programmable trigger system, which uses FPGA-based hardware developed at GSI. Currently the DAQ system runs at data rates of up to 3 MB/s and, with upgrades to both hardware and software later this year, we anticipate a doubling of that rate. This work was supported in part by the U.S. DOE Grant No. DE-FG02-99ER41110.

  6. Dissemination of CERN's Technology Transfer: Added Value from Regional Transfer Agents

    ERIC Educational Resources Information Center

    Hofer, Franz

    2005-01-01

    Technologies developed at CERN, the European Organization for Nuclear Research, are disseminated via a network of external technology transfer officers. Each of CERN's 20 member states has appointed at least one technology transfer officer to help establish links with CERN. This network has been in place since 2001 and early experiences indicate…

  7. Scaling the CERN OpenStack cloud

    NASA Astrophysics Data System (ADS)

    Bell, T.; Bompastor, B.; Bukowiec, S.; Castro Leon, J.; Denis, M. K.; van Eldik, J.; Fermin Lobo, M.; Fernandez Alvarez, L.; Fernandez Rodriguez, D.; Marino, A.; Moreira, B.; Noel, B.; Oulevey, T.; Takase, W.; Wiebalck, A.; Zilli, S.

    2015-12-01

    CERN has been running a production OpenStack cloud since July 2013 to support physics computing and infrastructure services for the site. In the past year, CERN Cloud Infrastructure has seen a constant increase in nodes, virtual machines, users and projects. This paper will present what has been done in order to make the CERN cloud infrastructure scale out.

  8. The LHC timeline: a personal recollection (1980-2012)

    NASA Astrophysics Data System (ADS)

    Maiani, Luciano; Bonolis, Luisa

    2017-12-01

    The objective of this interview is to study the history of the Large Hadron Collider in the LEP tunnel at CERN, from first ideas to the discovery of the Brout-Englert-Higgs boson, seen from the point of view of a member of CERN scientific committees, of the CERN Council and a former Director General of CERN in the years of machine construction.

  9. QM2017: Status and Key open Questions in Ultra-Relativistic Heavy-Ion Physics

    NASA Astrophysics Data System (ADS)

    Schukraft, Jurgen

    2017-11-01

    Almost exactly 3 decades ago, in the fall of 1986, the era of experimental ultra-relativistic (E/m ≫ 1) heavy-ion physics started simultaneously at the SPS at CERN and the AGS at Brookhaven with first beams of light oxygen ions at fixed-target energies of 200 GeV/A and 14.6 GeV/A, respectively. The event was announced by CERN [CERN's subatomic particle accelerators: Set up world-record in energy and break new ground for physics (CERN-PR-86-11-EN) (1986) 4 p, issued on 29 September 1986. URL (http://cds.cern.ch/record/855571)].

  10. Update on CERN Search based on SharePoint 2013

    NASA Astrophysics Data System (ADS)

    Alvarez, E.; Fernandez, S.; Lossent, A.; Posada, I.; Silva, B.; Wagner, A.

    2017-10-01

    CERN’s enterprise search solution “CERN Search” provides a central search service for users and CERN service providers. A total of about 20 million public and protected documents from a wide range of document collections are indexed, including Indico, TWiki, Drupal, SharePoint, JACoW, E-group archives, EDMS, and CERN web pages. In spring 2015, CERN Search was migrated to a new infrastructure based on SharePoint 2013. In the context of this upgrade, the document pre-processing and indexing process was redesigned and generalised. The new data feeding framework makes it possible to benefit from new functionality and facilitates the long-term maintenance of the system.

  11. Intelligent FPGA Data Acquisition Framework

    NASA Astrophysics Data System (ADS)

    Bai, Yunpeng; Gaisbauer, Dominic; Huber, Stefan; Konorov, Igor; Levit, Dmytro; Steffen, Dominik; Paul, Stephan

    2017-06-01

    In this paper, we present the field-programmable gate array (FPGA)-based framework Intelligent FPGA Data Acquisition (IFDAQ), which is used for the development of DAQ systems for detectors in high-energy physics. The framework supports Xilinx FPGAs and provides a collection of IP cores written in VHDL (very high speed integrated circuit hardware description language) that share a common interconnect interface. The IP core library offers the functionality required for the development of the full DAQ chain. The library consists of Serializer/Deserializer (SERDES)-based time-to-digital conversion channels, an interface to a multichannel 80-MS/s 10-bit analog-to-digital converter, a data transmission and synchronization protocol between FPGAs, an event builder, and slow control. The functionality is distributed among FPGA modules built in the AMC form factor: front end and data concentrator. This modular design also helps to scale and adapt the DAQ system to the needs of a particular experiment. The first application of the IFDAQ framework is the upgrade of the readout electronics for the drift chambers and the electromagnetic calorimeters (ECALs) of the COMPASS experiment at CERN. The framework is presented and discussed in this paper.

  12. Status of the Space Radiation Monte Carlo Simulation Based on FLUKA and ROOT

    NASA Technical Reports Server (NTRS)

    Andersen, Victor; Carminati, Federico; Empl, Anton; Ferrari, Alfredo; Pinsky, Lawrence; Sala, Paola; Wilson, Thomas L.

    2002-01-01

    The NASA-funded project, reported on at the first IWSSRR in Arona, to develop a Monte Carlo simulation program for simulating the space radiation environment based on the FLUKA and ROOT codes is well into its second year of development, and considerable progress has been made. The general tasks required to achieve the final goals include the addition of heavy-ion interactions to the FLUKA code and the provision of a ROOT-based interface to FLUKA. The most significant progress to date includes the incorporation of the DPMJET event generator code within FLUKA to handle heavy-ion interactions for incident projectile energies greater than 3 GeV/A. The ongoing effort intends to extend the treatment of these interactions down to 10 MeV, and at present two alternative approaches are being explored. The ROOT interface is being pursued in conjunction with the CERN LHC ALICE software team through an adaptation of their existing AliROOT software. As a check on the validity of the code, a simulation of the recent data taken by the ATIC experiment is underway.

  13. Big Bang Day: The Making of CERN (Episode 1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2009-10-06

    A two-part history of the CERN project. Quentin Cooper explores the fifty-year history of CERN, the European particle physics laboratory in Switzerland. The institution was created to bring scientists together after WW2…

  14. Big Bang Day: The Making of CERN (Episode 1)

    ScienceCinema

    None

    2017-12-09

    A two-part history of the CERN project. Quentin Cooper explores the fifty-year history of CERN, the European particle physics laboratory in Switzerland. The institution was created to bring scientists together after WW2…

  15. CERN welcomes new members

    NASA Astrophysics Data System (ADS)

    2017-08-01

    Lithuania is on course to become an associate member of CERN, pending final approval by the Lithuanian parliament. Associate membership will allow representatives of the Baltic nation to take part in meetings of the CERN Council, which oversees the Geneva-based physics lab.

  16. Analysis of CERN computing infrastructure and monitoring data

    NASA Astrophysics Data System (ADS)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of the LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose, the CERN IT department and the LHC experiments collect a wide variety of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group was created with the goal of bringing together data sources from different services and different abstraction levels, and of implementing a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single-service boundaries and for the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting a storage format efficient for both MapReduce and external access, and describes the repository user interface. Using this infrastructure, we were able to quantitatively analyze the relationship between the CPU/wall-time fraction, the latency/throughput constraints of network and disk, and the effective job throughput. In this contribution, we first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.
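
    The CPU/wall-time fraction the abstract analyzes is simply the CPU time a job consumed divided by its wall-clock duration. A minimal illustrative sketch follows; the record layout and field names (`cpu_s`, `wall_s`) are hypothetical, not the actual CERN monitoring schema:

    ```python
    # Illustrative only: per-job CPU efficiency as CPU time / wall-clock time.
    # The dict keys below are assumed for this sketch, not CERN's real fields.

    def cpu_wall_fraction(jobs):
        """Return CPU efficiency per job, skipping zero-length jobs."""
        return [j["cpu_s"] / j["wall_s"] for j in jobs if j["wall_s"] > 0]

    jobs = [
        {"cpu_s": 3500.0, "wall_s": 4000.0},  # CPU-bound job
        {"cpu_s": 1200.0, "wall_s": 4800.0},  # latency/throughput-bound job
    ]

    fractions = cpu_wall_fraction(jobs)
    mean_eff = sum(fractions) / len(fractions)
    print(fractions, mean_eff)
    ```

    A low fraction flags jobs stalled on network or disk rather than computing, which is exactly the kind of correlation the shared repository is meant to expose.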

  17. Common Readout Unit (CRU) - A new readout architecture for the ALICE experiment

    NASA Astrophysics Data System (ADS)

    Mitra, J.; Khan, S. A.; Mukherjee, S.; Paul, R.

    2016-03-01

    The ALICE experiment at the CERN Large Hadron Collider (LHC) is currently undergoing a major upgrade in order to fully exploit the scientific potential of the upcoming high-luminosity run, scheduled to start in the year 2021. The high interaction rate and the large event size will result in an experimental data flow of about 1 TB/s from the detectors, which needs to be processed before being sent to the online computing system and data storage. This processing is done in a dedicated Common Readout Unit (CRU), proposed for data aggregation, trigger and timing distribution, and control moderation. It acts as a common interface between the sub-detector electronic systems, the computing system, and the trigger processors. The interface links include GBT, TTC-PON, and PCIe. GBT (Gigabit Transceiver) is used for detector data payload transmission and provides a fixed-latency path for trigger distribution between the CRU and the detector readout electronics. TTC-PON (Timing, Trigger and Control via Passive Optical Network) is employed for time-multiplexed trigger distribution between the CRU and the Central Trigger Processor (CTP). PCIe (Peripheral Component Interconnect Express) is the high-speed serial computer expansion bus standard used for bulk data transport between the CRU boards and the processors. In this article, we give an overview of the CRU architecture in ALICE and discuss the different interfaces, along with the firmware design and implementation of the CRU on the LHCb PCIe40 board.

  18. EFQPSK Versus CERN: A Comparative Study

    NASA Technical Reports Server (NTRS)

    Borah, Deva K.; Horan, Stephen

    2001-01-01

    This report presents a comparative study of Enhanced Feher's Quadrature Phase Shift Keying (EFQPSK) and Constrained Envelope Root Nyquist (CERN) techniques. These two techniques have been developed in recent times to provide high spectral and power efficiencies in nonlinear amplifier environments. The purpose of this study is to gain insight into these techniques and to provide system planners and designers with an appropriate set of guidelines for using them. The comparative study presented in this report relies on effective simulation models and procedures. Therefore, a significant part of this report is devoted to understanding the mathematical and simulation models of the techniques and their set-up procedures. In particular, mathematical models of EFQPSK and CERN, effects of the sampling rate in discrete-time signal representation, and the modeling of nonlinear amplifiers and predistorters have been considered in detail. The results of this study show that both EFQPSK and CERN signals provide spectrally efficient communications compared to filtered conventional linear modulation techniques when a nonlinear power amplifier is used. However, there are important differences. The spectral efficiency of CERN signals, with a small amount of input backoff, is significantly better than that of EFQPSK signals if the nonlinear amplifier is an ideal clipper. However, to achieve such spectral efficiencies with a practical nonlinear amplifier, CERN processing requires a predistorter that effectively translates the amplifier's characteristics close to those of an ideal clipper. Thus, the spectral performance of CERN signals strongly depends on the predistorter. EFQPSK signals, on the other hand, do not need such predistorters, since their spectra are almost unaffected by the nonlinear amplifier. This report also discusses several receiver structures for EFQPSK signals.
It is observed that optimal receiver structures can be realized for both coded and uncoded EFQPSK signals with only a modest increase in computational complexity. When a nonlinear amplifier is used, the bit error rate (BER) performance of CERN signals with a matched filter receiver is found to be more than one decibel (dB) worse than the bit error performance of EFQPSK signals. Although channel coding is found to provide BER performance improvement for both EFQPSK and CERN signals, the performance of EFQPSK signals remains better than that of CERN. Optimal receiver structures for CERN signals with nonlinear equalization are left as possible future work. Based on the numerical results, it is concluded that, in nonlinear channels, CERN processing leads toward better bandwidth efficiency with a compromise in power efficiency. Hence, for bandwidth-efficient communication needs, CERN is a good solution provided effective adaptive predistorters can be realized. EFQPSK signals, on the other hand, provide a power-efficient solution with a compromise in bandwidth efficiency.
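
The "ideal clipper" amplifier model referred to above is a hard amplitude limiter: samples below a saturation level pass unchanged, while larger samples are clipped to that level with their phase preserved. The following toy function is an assumption-level illustration of that model, not code from the report:

```python
# Hedged sketch of the "ideal clipper" nonlinear amplifier model:
# magnitudes are limited to a_max, phase is preserved.
import cmath

def ideal_clipper(samples, a_max):
    """Clip complex baseband samples to magnitude a_max, keeping phase."""
    out = []
    for s in samples:
        r, phi = cmath.polar(s)          # magnitude and phase of the sample
        out.append(cmath.rect(min(r, a_max), phi))  # limit magnitude only
    return out

# A sample inside the limit passes unchanged; one outside is clipped
# onto the unit circle along its original phase direction.
clipped = ideal_clipper([0.5 + 0.0j, 3 + 4j], a_max=1.0)
```

Because only the envelope is distorted, this model captures why CERN signals (whose information rides partly on the envelope) are sensitive to the amplifier while near-constant-envelope EFQPSK signals are not.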

  19. Sharing scientific discovery globally: toward a CERN virtual visit service

    NASA Astrophysics Data System (ADS)

    Goldfarb, S.; Hatzifotiadou, D.; Lapka, M.; Papanestis, A.

    2017-10-01

    The installation of virtual visit services by the LHC collaborations began shortly after the first high-energy collisions were provided by the CERN accelerator in 2010. The experiments ATLAS [1], CMS [2], LHCb [3], and ALICE [4] have all adopted this popular and effective method to bring the excitement of scientific exploration and discovery into classrooms and other public venues around the world. Their programmes, which use a combination of video conferencing, webcast, and video recording to communicate with remote audiences, have already reached tens of thousands of viewers, and the demand only continues to grow. Other venues, such as the CERN Control Centre, are also considering similar permanent installations. We present a summary of the development of the various systems in use around CERN today, including the technology deployed and a variety of use cases. We then lay out the arguments for the creation of a CERN-wide service that would support these programmes in a more coherent and effective manner. Potential services include a central booking system and operational management similar to what is currently provided for the common CERN video conference facilities. Certain technology choices could be made to support programmes based on popular tools including (but not limited to) Skype™ [5], Google Hangouts [6], Facebook Live [7], and Periscope [8]. Successful implementation of the project, which relies on a close partnership between the experiments, CERN IT CDA [9], and CERN IR ECO [10], has the potential to reach an even larger, global audience, more effectively than ever before.

  20. COSMO 09

    ScienceCinema

    None

    2018-06-20

    This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute "Particle Cosmology" which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Link to last editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin). List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line. Select "Preliminary programme" in the left menu and click on each plenary session to see details. Parallel sessions: Inflation, convenor: Andrew Liddle. Dark matter, convenor: Marco Cirelli. Dark energy and modified gravity, convenor: Kazuya Koyama. CMB, LSS and cosmological parameters/models, convenor: Licia Verde. String cosmology, convenor: Jim Cline. Baryogenesis and leptogenesis, convenor: Mariano Quiros. The submission of talk proposals is closed by now. The parallel session program is available on-line. 
Select "Preliminary programme" in the left menu and click on each parallel session title to see details. Posters. Participants willing to present a poster will be offered the opportunity to hang it in the hall, next to the main auditorium. The poster application is closed by now. The poster list is available on-line. Registration. On-line registration is open from January 16 till August 31 (click on the link in the left menu). There will be no registration fees. [Thanks to the generosity of EU's network "UniverseNet", we have some limited funds available for supporting the visit of a few young scientists who could not attend otherwise. The application for funding is closed by now. All applicants have already been informed of the success of their application.] Accommodation. Participants are expected to arrange their accommodation by themselves: some rooms with shower, wc and washbasin have been blocked in the CERN hostel for the conference (price: 58 CHF/night). Unfortunately, all these rooms have already been booked. You can book a hotel in Geneva or in the area surrounding CERN using this list. If you book a hotel on the French side, be sure to have a passport or a visa valid also in France. All participants are expected to be in possession of a passport or a visa valid in Switzerland (if relevant), and to be covered by their own health insurance during their visit. Sponsors. This conference is receiving support from the European Community's Marie Curie Research and Training Network UniverseNet.

  1. COSMO 09

    ScienceCinema

    Peiris, Hiranya

    2018-06-12

    This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute "Particle Cosmology" which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Link to last editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin) List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line. Select "Preliminary programme" in the left menu and click on each plenary session to see details. Parallel sessions: Inflation, convenor: Andrew Liddle Dark matter, convenor: Marco Cirelli Dark energy and modified gravity, convenor: Kazuya Koyama CMB, LSS and cosmological parameters/models, convenor: Licia Verde String cosmology, convenor: Jim Cline Baryogenesis and leptogenesis, convenor: Mariano Quiros The submission of talk proposals is closed by now. The parallel session program is available on-line. 
Select "Preliminary programme" in the left menu and click on each parallel session title to see details. Posters. Participants willing to present a poster will be offered the opportunity to hang it in the hall, next to the main auditorium. The poster application is closed by now. The poster list is available on-line. Registration. On-line registration is open from January 16 till August 31 (click on the link in the left menu). There will be no registration fees. [Thanks to the generosity of EU's network "UniverseNet", we have some limited funds available for supporting the visit of a few young scientists who could not attend otherwise. The application for funding is closed by now. All applicants have already been informed of the success of their application.] Accommodation. Participants are expected to arrange their accommodation by themselves: some rooms with shower, wc and washbasin have been blocked in the CERN hostel for the conference (price: 58 CHF/night). Unfortunately, all these rooms have already been booked. You can book a hotel in Geneva or in the area surrounding CERN using this list. If you book a hotel on the French side, be sure to have a passport or a visa valid also in France. All participants are expected to be in possession of a passport or a visa valid in Switzerland (if relevant), and to be covered by their own health insurance during their visit. Sponsors. This conference is receiving support from the European Community's Marie Curie Research and Training Network UniverseNet.

  2. COSMO 09

    ScienceCinema

    Knapp, Johannes

    2018-06-14

    This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute "Particle Cosmology" which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Link to last editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin) List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line. Select "Preliminary programme" in the left menu and click on each plenary session to see details. Parallel sessions: Inflation, convenor: Andrew Liddle Dark matter, convenor: Marco Cirelli Dark energy and modified gravity, convenor: Kazuya Koyama CMB, LSS and cosmological parameters/models, convenor: Licia Verde String cosmology, convenor: Jim Cline Baryogenesis and leptogenesis, convenor: Mariano Quiros The submission of talk proposals is closed by now. The parallel session program is available on-line. 
Select "Preliminary programme" in the left menu and click on each parallel session title to see details. Posters. Participants willing to present a poster will be offered the opportunity to hang it in the hall, next to the main auditorium. The poster application is closed by now. The poster list is available on-line. Registration. On-line registration is open from January 16 till August 31 (click on the link in the left menu). There will be no registration fees. [Thanks to the generosity of EU's network "UniverseNet", we have some limited funds available for supporting the visit of a few young scientists who could not attend otherwise. The application for funding is closed by now. All applicants have already been informed of the success of their application.] Accommodation. Participants are expected to arrange their accommodation by themselves: some rooms with shower, wc and washbasin have been blocked in the CERN hostel for the conference (price: 58 CHF/night). Unfortunately, all these rooms have already been booked. You can book a hotel in Geneva or in the area surrounding CERN using this list. If you book a hotel on the French side, be sure to have a passport or a visa valid also in France. All participants are expected to be in possession of a passport or a visa valid in Switzerland (if relevant), and to be covered by their own health insurance during their visit. Sponsors. This conference is receiving support from the European Community's Marie Curie Research and Training Network UniverseNet.

  3. COSMO 09

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute "Particle Cosmology" which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Link to last editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin). List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line. Select "Preliminary programme" in the left menu and click on each plenary session to see details. Parallel sessions: Inflation, convenor: Andrew Liddle. Dark matter, convenor: Marco Cirelli. Dark energy and modified gravity, convenor: Kazuya Koyama. CMB, LSS and cosmological parameters/models, convenor: Licia Verde. String cosmology, convenor: Jim Cline. Baryogenesis and leptogenesis, convenor: Mariano Quiros. The submission of talk proposals is closed by now. The parallel session program is available on-line.
Select "Preliminary programme" in the left menu and click on each parallel session title to see details. Posters. Participants willing to present a poster will be offered the opportunity to hang it in the hall, next to the main auditorium. The poster application is closed by now. The poster list is available on-line. Registration. On-line registration is open from January 16 till August 31 (click on the link in the left menu). There will be no registration fees. [Thanks to the generosity of EU's network "UniverseNet", we have some limited funds available for supporting the visit of a few young scientists who could not attend otherwise. The application for funding is closed by now. All applicants have already been informed of the success of their application.] Accommodation. Participants are expected to arrange their accommodation by themselves: some rooms with shower, wc and washbasin have been blocked in the CERN hostel for the conference (price: 58 CHF/night). Unfortunately, all these rooms have already been booked. You can book a hotel in Geneva or in the area surrounding CERN using this list. If you book a hotel on the French side, be sure to have a passport or a visa valid also in France. All participants are expected to be in possession of a passport or a visa valid in Switzerland (if relevant), and to be covered by their own health insurance during their visit. Sponsors. This conference is receiving support from the European Community's Marie Curie Research and Training Network UniverseNet.

  4. COSMO 09

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute "Particle Cosmology" which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Link to last editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin). List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line. Select "Preliminary programme" in the left menu and click on each plenary session to see details. Parallel sessions: Inflation, convenor: Andrew Liddle. Dark matter, convenor: Marco Cirelli. Dark energy and modified gravity, convenor: Kazuya Koyama. CMB, LSS and cosmological parameters/models, convenor: Licia Verde. String cosmology, convenor: Jim Cline. Baryogenesis and leptogenesis, convenor: Mariano Quiros. The submission of talk proposals is closed by now. The parallel session program is available on-line.
Select "Preliminary programme" in the left menu and click on each parallel session title to see details. Posters. Participants willing to present a poster will be offered the opportunity to hang it in the hall, next to the main auditorium. The poster application is closed by now. The poster list is available on-line. Registration. On-line registration is open from January 16 till August 31 (click on the link in the left menu). There will be no registration fees. [Thanks to the generosity of EU's network "UniverseNet", we have some limited funds available for supporting the visit of a few young scientists who could not attend otherwise. The application for funding is closed by now. All applicants have already been informed of the success of their application.] Accommodation. Participants are expected to arrange their accommodation by themselves: some rooms with shower, wc and washbasin have been blocked in the CERN hostel for the conference (price: 58 CHF/night). Unfortunately, all these rooms have already been booked. You can book a hotel in Geneva or in the area surrounding CERN using this list. If you book a hotel on the French side, be sure to have a passport or a visa valid also in France. All participants are expected to be in possession of a passport or a visa valid in Switzerland (if relevant), and to be covered by their own health insurance during their visit. Sponsors. This conference is receiving support from the European Community's Marie Curie Research and Training Network UniverseNet.

  5. COSMO 09

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peiris, Hiranya

    This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute "Particle Cosmology" which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Link to last editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin). List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line. Select "Preliminary programme" in the left menu and click on each plenary session to see details. Parallel sessions: Inflation, convenor: Andrew Liddle. Dark matter, convenor: Marco Cirelli. Dark energy and modified gravity, convenor: Kazuya Koyama. CMB, LSS and cosmological parameters/models, convenor: Licia Verde. String cosmology, convenor: Jim Cline. Baryogenesis and leptogenesis, convenor: Mariano Quiros. The submission of talk proposals is closed by now. The parallel session program is available on-line.
Select "Preliminary programme" in the left menu and click on each parallel session title to see details. Posters. Participants willing to present a poster will be offered the opportunity to hang it in the hall, next to the main auditorium. The poster application is closed by now. The poster list is available on-line. Registration. On-line registration is open from January 16 till August 31 (click on the link in the left menu). There will be no registration fees. [Thanks to the generosity of EU's network "UniverseNet", we have some limited funds available for supporting the visit of a few young scientists who could not attend otherwise.The application for funding is closed by now. All applicants have already been informed of the success of their application.] Accomodation. Participants are expected to arrange their accomodation by themselves: some rooms with shower, wc and washbasin have been blocked in the CERN hostel for the conference (price: 58CHF/night). Unfortunately, all these rooms have already been booked. You can book a hotel in Geneva or in the area surrounding CERN using this list. If you book a hotel on the French side, be sure to have a passport or a visa valid also in France. All participants are expected to be in possession of a passport or a visa valid in Swizerland (if relevant), and to be covered by their own health insurance during their visit. Sponsors. This conference is receiving support from the European Community's Marie Curie Research and Training Network UniverseNet.« less

  6. COSMO 09

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salati, Pierre

Part 5 lecture. Outline: 1) evidence for primary cosmic ray positrons; 2) DM species with quite special properties; 3) the effect of clumpiness on DM annihilation; 4) decaying dark matter; 5) perspectives more than conclusions. This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute "Particle Cosmology", which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Links to previous editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin). List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line: select "Preliminary programme" in the left menu and click on each plenary session to see details. Parallel sessions: Inflation (convenor: Andrew Liddle); Dark matter (convenor: Marco Cirelli); Dark energy and modified gravity (convenor: Kazuya Koyama); CMB, LSS and cosmological parameters/models (convenor: Licia Verde); String cosmology (convenor: Jim Cline); Baryogenesis and leptogenesis (convenor: Mariano Quiros). The submission of talk proposals is now closed. The parallel session program is available on-line: select "Preliminary programme" in the left menu and click on each parallel session title to see details. Posters: participants wishing to present a poster will be offered the opportunity to hang it in the hall next to the main auditorium. The poster application is now closed; the poster list is available on-line. Registration: on-line registration is open from January 16 till August 31 (click on the link in the left menu). There will be no registration fees. [Thanks to the generosity of the EU's network "UniverseNet", we have some limited funds available for supporting the visit of a few young scientists who could not attend otherwise. The application for funding is now closed, and all applicants have been informed of the outcome of their application.] Accommodation: participants are expected to arrange their accommodation by themselves. Some rooms with shower, WC and washbasin have been blocked in the CERN hostel for the conference (price: 58 CHF/night); unfortunately, all these rooms have already been booked. You can book a hotel in Geneva or in the area surrounding CERN using this list. If you book a hotel on the French side, be sure to have a passport or a visa valid also in France. All participants are expected to be in possession of a passport or a visa valid in Switzerland (if relevant), and to be covered by their own health insurance during their visit. Sponsors: this conference is receiving support from the European Community's Marie Curie Research and Training Network UniverseNet.

  7. COSMO 09

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knapp, Johannes

This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute "Particle Cosmology", which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Links to previous editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin). List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line: select "Preliminary programme" in the left menu and click on each plenary session to see details. Parallel sessions: Inflation (convenor: Andrew Liddle); Dark matter (convenor: Marco Cirelli); Dark energy and modified gravity (convenor: Kazuya Koyama); CMB, LSS and cosmological parameters/models (convenor: Licia Verde); String cosmology (convenor: Jim Cline); Baryogenesis and leptogenesis (convenor: Mariano Quiros). The submission of talk proposals is now closed. The parallel session program is available on-line: select "Preliminary programme" in the left menu and click on each parallel session title to see details. Posters: participants wishing to present a poster will be offered the opportunity to hang it in the hall next to the main auditorium. The poster application is now closed; the poster list is available on-line. Registration: on-line registration is open from January 16 till August 31 (click on the link in the left menu). There will be no registration fees. [Thanks to the generosity of the EU's network "UniverseNet", we have some limited funds available for supporting the visit of a few young scientists who could not attend otherwise. The application for funding is now closed, and all applicants have been informed of the outcome of their application.] Accommodation: participants are expected to arrange their accommodation by themselves. Some rooms with shower, WC and washbasin have been blocked in the CERN hostel for the conference (price: 58 CHF/night); unfortunately, all these rooms have already been booked. You can book a hotel in Geneva or in the area surrounding CERN using this list. If you book a hotel on the French side, be sure to have a passport or a visa valid also in France. All participants are expected to be in possession of a passport or a visa valid in Switzerland (if relevant), and to be covered by their own health insurance during their visit. Sponsors: this conference is receiving support from the European Community's Marie Curie Research and Training Network UniverseNet.

  8. COSMO 09

    ScienceCinema

    None

    2018-06-13

This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute "Particle Cosmology", which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Links to previous editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin). List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line: select "Preliminary programme" in the left menu and click on each plenary session to see details. Parallel sessions: Inflation (convenor: Andrew Liddle); Dark matter (convenor: Marco Cirelli); Dark energy and modified gravity (convenor: Kazuya Koyama); CMB, LSS and cosmological parameters/models (convenor: Licia Verde); String cosmology (convenor: Jim Cline); Baryogenesis and leptogenesis (convenor: Mariano Quiros). The submission of talk proposals is now closed. The parallel session program is available on-line: select "Preliminary programme" in the left menu and click on each parallel session title to see details. Posters: participants wishing to present a poster will be offered the opportunity to hang it in the hall next to the main auditorium. The poster application is now closed; the poster list is available on-line. Registration: on-line registration is open from January 16 till August 31 (click on the link in the left menu). There will be no registration fees. [Thanks to the generosity of the EU's network "UniverseNet", we have some limited funds available for supporting the visit of a few young scientists who could not attend otherwise. The application for funding is now closed, and all applicants have been informed of the outcome of their application.] Accommodation: participants are expected to arrange their accommodation by themselves. Some rooms with shower, WC and washbasin have been blocked in the CERN hostel for the conference (price: 58 CHF/night); unfortunately, all these rooms have already been booked. You can book a hotel in Geneva or in the area surrounding CERN using this list. If you book a hotel on the French side, be sure to have a passport or a visa valid also in France. All participants are expected to be in possession of a passport or a visa valid in Switzerland (if relevant), and to be covered by their own health insurance during their visit. Sponsors: this conference is receiving support from the European Community's Marie Curie Research and Training Network UniverseNet.

  9. COSMO 09

    ScienceCinema

    Salati, Pierre

    2018-05-24

Part 5 lecture. Outline: 1) evidence for primary cosmic ray positrons; 2) DM species with quite special properties; 3) the effect of clumpiness on DM annihilation; 4) decaying dark matter; 5) perspectives more than conclusions. This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute "Particle Cosmology", which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Links to previous editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin). List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line: select "Preliminary programme" in the left menu and click on each plenary session to see details. Parallel sessions: Inflation (convenor: Andrew Liddle); Dark matter (convenor: Marco Cirelli); Dark energy and modified gravity (convenor: Kazuya Koyama); CMB, LSS and cosmological parameters/models (convenor: Licia Verde); String cosmology (convenor: Jim Cline); Baryogenesis and leptogenesis (convenor: Mariano Quiros). The submission of talk proposals is now closed. The parallel session program is available on-line: select "Preliminary programme" in the left menu and click on each parallel session title to see details. Posters: participants wishing to present a poster will be offered the opportunity to hang it in the hall next to the main auditorium. The poster application is now closed; the poster list is available on-line. Registration: on-line registration is open from January 16 till August 31 (click on the link in the left menu). There will be no registration fees. [Thanks to the generosity of the EU's network "UniverseNet", we have some limited funds available for supporting the visit of a few young scientists who could not attend otherwise. The application for funding is now closed, and all applicants have been informed of the outcome of their application.] Accommodation: participants are expected to arrange their accommodation by themselves. Some rooms with shower, WC and washbasin have been blocked in the CERN hostel for the conference (price: 58 CHF/night); unfortunately, all these rooms have already been booked. You can book a hotel in Geneva or in the area surrounding CERN using this list. If you book a hotel on the French side, be sure to have a passport or a visa valid also in France. All participants are expected to be in possession of a passport or a visa valid in Switzerland (if relevant), and to be covered by their own health insurance during their visit. Sponsors: this conference is receiving support from the European Community's Marie Curie Research and Training Network UniverseNet.

  10. Learning with the ATLAS Experiment at CERN

    ERIC Educational Resources Information Center

    Barnett, R. M.; Johansson, K. E.; Kourkoumelis, C.; Long, L.; Pequenao, J.; Reimers, C.; Watkins, P.

    2012-01-01

    With the start of the LHC, the new particle collider at CERN, the ATLAS experiment is also providing high-energy particle collisions for educational purposes. Several education projects--education scenarios--have been developed and tested on students and teachers in several European countries within the Learning with ATLAS@CERN project. These…

  11. First experience with the new .cern Top Level Domain

    NASA Astrophysics Data System (ADS)

    Alvarez, E.; Malo de Molina, M.; Salwerowicz, M.; Silva De Sousa, B.; Smith, T.; Wagner, A.

    2017-10-01

In October 2015, CERN's core website was moved to a new address, http://home.cern, marking the launch of the brand-new top-level domain .cern. In combination with a formal governance and registration policy, the IT infrastructure needed to be extended to accommodate the hosting of websites in this new top-level domain. We present the technical implementation within the framework of the CERN Web Services, which provides virtual hosting and a reverse-proxy solution, and which also includes the provisioning of SSL server certificates for secure communications.

  12. Analysis of counting data: Development of the SATLAS Python package

    NASA Astrophysics Data System (ADS)

    Gins, W.; de Groote, R. P.; Bissell, M. L.; Granados Buitrago, C.; Ferrer, R.; Lynch, K. M.; Neyens, G.; Sels, S.

    2018-01-01

For the analysis of low-statistics counting experiments, a traditional nonlinear least-squares minimization routine may not always provide correct parameter and uncertainty estimates, due to the assumptions inherent in the algorithm(s). In response to this, a user-friendly Python package (SATLAS) was written to provide an easy interface between the data and a variety of minimization algorithms suited for analyzing low- as well as high-statistics data. The advantage of this package is that it allows the user to define their own model function and then compare different minimization routines to determine the optimal parameter values and their respective (correlated) errors. Experimental validation of the different approaches in the package is done through the analysis of hyperfine-structure data of 203Fr gathered by the CRIS experiment at ISOLDE, CERN.
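    The core issue the abstract describes can be sketched in a few lines: for sparsely populated spectra, a chi-square fit with naive sqrt(N) errors and a Poisson maximum-likelihood fit can pull parameters differently. The toy model, function names, and data below are illustrative only, not the actual SATLAS API.

    ```python
    # Hedged sketch: Poisson likelihood vs. least squares on low-count data.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)

    def model(x, amp, bg):
        """Toy peak on a flat background (stand-in for a hyperfine spectrum)."""
        return bg + amp * np.exp(-0.5 * ((x - 5.0) / 0.8) ** 2)

    x = np.linspace(0.0, 10.0, 60)
    counts = rng.poisson(model(x, 4.0, 0.5))       # low statistics: many 0/1 bins

    def poisson_nll(p):
        mu = np.clip(model(x, *p), 1e-9, None)     # keep log() well defined
        return np.sum(mu - counts * np.log(mu))    # Poisson negative log-likelihood

    def chi2(p):
        mu = model(x, *p)
        sigma = np.sqrt(np.maximum(counts, 1))     # naive sqrt(N) uncertainties
        return np.sum(((counts - mu) / sigma) ** 2)

    # Same user-defined model, two different minimization targets.
    mle = minimize(poisson_nll, x0=[3.0, 1.0], method="Nelder-Mead")
    lsq = minimize(chi2, x0=[3.0, 1.0], method="Nelder-Mead")
    print("Poisson MLE:", mle.x, " least squares:", lsq.x)
    ```

    Comparing the two estimates for the same model function is, in spirit, what the package lets the user do across its supported routines.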

  13. BnmrOffice: A Free Software for β-nmr Data Analysis

    NASA Astrophysics Data System (ADS)

    Saadaoui, Hassan

A data-analysis framework with a graphical user interface (GUI) has been developed to analyze β-NMR spectra in an automated and intuitive way. This program, named BnmrOffice, is written in C++ and employs the Qt libraries and tools for designing the GUI, and CERN's Minuit optimization routines for minimization. The program runs under multiple platforms and is available for free under the terms of the GNU GPL. The GUI is structured in tabs to search, plot and analyze data, along with other functionality. The user can tweak the minimization options and fit multiple data files (or runs) using single or global fitting routines with pre-defined or new models. Currently, BnmrOffice reads TRIUMF's MUD data and ASCII files, and can be extended to other formats.

  14. Hangout with CERN: a direct conversation with the public

    NASA Astrophysics Data System (ADS)

    Rao, Achintya; Goldfarb, Steven; Kahle, Kate

    2016-04-01

Hangout with CERN refers to a weekly, half-hour-long, topical webcast hosted at CERN. The aim of the programme is threefold: (i) to provide a virtual tour of various locations and facilities at CERN, (ii) to discuss the latest scientific results from the laboratory, and, most importantly, (iii) to engage in conversation with the public and answer their questions. For each "episode", scientists gather around webcam-enabled computers at CERN and partner institutes/universities, connecting to one another using the Google+ social network's "Hangouts" tool. The show is structured as a conversation mediated by a host, usually a scientist, and viewers can ask questions of the experts in real time through a Twitter hashtag or YouTube comments. The history of Hangout with CERN can be traced back to ICHEP 2012, where several physicists crowded in front of a laptop connected to Google+, using a "Hangout On Air" webcast to explain to the world the importance of the discovery of the Higgs-like boson, announced just two days before at the same conference. Hangout with CERN has also drawn inspiration from two existing outreach endeavours: (i) ATLAS Virtual Visits, which connected remote visitors with scientists in the ATLAS Control Room via video conference, and (ii) the Large Hangout Collider, in which CMS scientists gave underground tours via Hangouts to groups of schools and members of the public around the world. In this paper, we discuss the role of Hangout with CERN as a bi-directional outreach medium and an opportunity to train scientists in effective communication.

  15. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-15

The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva.

  16. PREFACE: International Conference on Computing in High Energy and Nuclear Physics (CHEP 2010)

    NASA Astrophysics Data System (ADS)

    Lin, Simon C.; Shen, Stella; Neufeld, Niko; Gutsche, Oliver; Cattaneo, Marco; Fisk, Ian; Panzer-Steindel, Bernd; Di Meglio, Alberto; Lokajicek, Milos

    2011-12-01

The International Conference on Computing in High Energy and Nuclear Physics (CHEP) was held at Academia Sinica in Taipei from 18-22 October 2010. CHEP is a major series of international conferences for physicists and computing professionals from the worldwide High Energy and Nuclear Physics community, Computer Science, and Information Technology. The CHEP conference provides an international forum to exchange information on computing progress and needs for the community, and to review recent, ongoing and future activities. CHEP conferences are held at roughly 18-month intervals, alternating between Europe, Asia, America and other parts of the world. Recent CHEP conferences have been held in Prague, Czech Republic (2009); Victoria, Canada (2007); Mumbai, India (2006); Interlaken, Switzerland (2004); San Diego, California (2003); Beijing, China (2001); and Padova, Italy (2000). CHEP 2010 was organized by Academia Sinica Grid Computing Centre. There was an International Advisory Committee (IAC) setting the overall themes of the conference, a Programme Committee (PC) responsible for the content, as well as a Conference Secretariat responsible for the conference infrastructure. There were over 500 attendees with a program that included plenary sessions of invited speakers, a number of parallel sessions comprising around 260 oral and 200 poster presentations, and industrial exhibitions. We thank all the presenters for the excellent scientific content of their contributions to the conference. Conference tracks covered Online Computing; Event Processing; Software Engineering, Data Stores, and Databases; Distributed Processing and Analysis; Computing Fabrics and Networking Technologies; Grid and Cloud Middleware; and Collaborative Tools.
The conference included excursions to various attractions in Northern Taiwan, including Sanhsia Tsu Shih Temple, Yingko, Chiufen Village, the Northeast Coast National Scenic Area, Keelung, Yehliu Geopark, and Wulai Aboriginal Village, as well as two banquets held at the Grand Hotel and Grand Formosa Regent in Taipei. The next CHEP conference will be held in New York, the United States, on 21-25 May 2012. We would like to thank the National Science Council of Taiwan, the EU ACEOLE project, commercial sponsors, and the International Advisory Committee and the Programme Committee members for all their support and help. Special thanks to the Programme Committee members for their careful choice of conference contributions and enormous effort in reviewing and editing about 340 post-conference proceedings papers.

Simon C Lin, CHEP 2010 Conference Chair and Proceedings Editor, Taipei, Taiwan, November 2011

Track Editors / Programme Committee
Chair: Simon C Lin, Academia Sinica, Taiwan
Online Computing Track: Y H Chang, National Central University, Taiwan; Harry Cheung, Fermilab, USA; Niko Neufeld, CERN, Switzerland
Event Processing Track: Fabio Cossutti, INFN Trieste, Italy; Oliver Gutsche, Fermilab, USA; Ryosuke Itoh, KEK, Japan
Software Engineering, Data Stores, and Databases Track: Marco Cattaneo, CERN, Switzerland; Gang Chen, Chinese Academy of Sciences, China; Stefan Roiser, CERN, Switzerland
Distributed Processing and Analysis Track: Kai-Feng Chen, National Taiwan University, Taiwan; Ulrik Egede, Imperial College London, UK; Ian Fisk, Fermilab, USA; Fons Rademakers, CERN, Switzerland; Torre Wenaus, BNL, USA
Computing Fabrics and Networking Technologies Track: Harvey Newman, Caltech, USA; Bernd Panzer-Steindel, CERN, Switzerland; Antonio Wong, BNL, USA; Ian Fisk, Fermilab, USA; Niko Neufeld, CERN, Switzerland
Grid and Cloud Middleware Track: Alberto Di Meglio, CERN, Switzerland; Markus Schulz, CERN, Switzerland
Collaborative Tools Track: Joao Correia Fernandes, CERN, Switzerland; Philippe Galvez, Caltech, USA; Milos Lokajicek, FZU Prague, Czech Republic

International Advisory Committee
Chair: Simon C Lin, Academia Sinica, Taiwan
Members: Mohammad Al-Turany, FAIR, Germany; Sunanda Banerjee, Fermilab, USA; Dario Barberis, CERN & Genoa University/INFN, Switzerland; Lothar Bauerdick, Fermilab, USA; Ian Bird, CERN, Switzerland; Amber Boehnlein, US Department of Energy, USA; Kors Bos, CERN, Switzerland; Federico Carminati, CERN, Switzerland; Philippe Charpentier, CERN, Switzerland; Gang Chen, Institute of High Energy Physics, China; Peter Clarke, University of Edinburgh, UK; Michael Ernst, Brookhaven National Laboratory, USA; David Foster, CERN, Switzerland; Merino Gonzalo, CIEMAT, Spain; John Gordon, STFC-RAL, UK; Volker Guelzow, Deutsches Elektronen-Synchrotron DESY, Hamburg, Germany; John Harvey, CERN, Switzerland; Frederic Hemmer, CERN, Switzerland; Hafeez Hoorani, NCP, Pakistan; Viatcheslav Ilyin, Moscow State University, Russia; Matthias Kasemann, DESY, Germany; Nobuhiko Katayama, KEK, Japan; Milos Lokajicek, FZU Prague, Czech Republic; David Malon, ANL, USA; Pere Mato Vila, CERN, Switzerland; Mirco Mazzucato, INFN CNAF, Italy; Richard Mount, SLAC, USA; Harvey Newman, Caltech, USA; Mitsuaki Nozaki, KEK, Japan; Farid Ould-Saada, University of Oslo, Norway; Ruth Pordes, Fermilab, USA; Hiroshi Sakamoto, The University of Tokyo, Japan; Alberto Santoro, UERJ, Brazil; Jim Shank, Boston University, USA; Alan Silverman, CERN, Switzerland; Randy Sobie, University of Victoria, Canada; Dongchul Son, Kyungpook National University, South Korea; Reda Tafirout, TRIUMF, Canada; Victoria White, Fermilab, USA; Guy Wormser, LAL, France; Frank Wuerthwein, UCSD, USA; Charles Young, SLAC, USA

  17. CERN@school: bringing CERN into the classroom

    NASA Astrophysics Data System (ADS)

    Whyntie, T.; Cook, J.; Coupe, A.; Fickling, R. L.; Parker, B.; Shearer, N.

    2016-04-01

CERN@school brings technology from CERN into the classroom to aid with the teaching of particle physics. It also aims to inspire the next generation of physicists and engineers by giving participants the opportunity to be part of a national collaboration of students, teachers and academics, analysing data obtained from detectors based on the ground and in space to make new, curiosity-driven discoveries at school. CERN@school is based around the Timepix hybrid silicon pixel detector developed by the Medipix 2 Collaboration, which features a 300 μm thick silicon sensor bump-bonded to a Timepix readout ASIC. This defines a 256-by-256 grid of pixels with a pitch of 55 μm, the data from which can be used to visualise ionising radiation in a very accessible way. Broadly speaking, CERN@school consists of: a web portal that allows access to data collected by the Langton Ultimate Cosmic ray Intensity Detector (LUCID) experiment in space and by the student-operated Timepix detectors on the ground; a number of Timepix detector kits for ground-based experiments, made available to schools for both teaching and research purposes; and educational resources for teachers to use with the LUCID data and detector kits in the classroom. By providing access to cutting-edge research equipment and raw data from ground- and space-based experiments, CERN@school hopes to provide the foundation for a programme that meets many of the aims and objectives of CERN and of the project's supporting academic and industrial partners. The work presented here provides an update on the status of the programme as supported by the UK Science and Technology Facilities Council (STFC) and the Royal Commission for the Exhibition of 1851. This includes recent results from work with the GridPP Collaboration on using grid resources with schools to run GEANT4 simulations of CERN@school experiments.
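    The detector geometry quoted above (a 256-by-256 pixel matrix with a 55 μm pitch) is easy to work with numerically. The sketch below uses only those two numbers from the abstract; the toy frame data and helper names are invented for illustration and are not part of any CERN@school software.

    ```python
    # Illustrative sketch of the Timepix geometry described in the abstract.
    import numpy as np

    N_PIXELS = 256          # pixels per side (from the abstract)
    PITCH_UM = 55.0         # pixel pitch in microns (from the abstract)

    def pixel_centre_um(col, row):
        """Physical centre of a pixel, in microns from the sensor corner."""
        return ((col + 0.5) * PITCH_UM, (row + 0.5) * PITCH_UM)

    # Active sensor area: 256 * 55 um = 14.08 mm per side.
    side_mm = N_PIXELS * PITCH_UM / 1000.0

    # A toy readout frame: counts per pixel, with a short 3-pixel "track".
    frame = np.zeros((N_PIXELS, N_PIXELS), dtype=int)
    frame[10:13, 40] = [7, 12, 5]
    occupancy = np.count_nonzero(frame) / frame.size
    print(side_mm, pixel_centre_um(0, 0), occupancy)
    ```

    Mapping pixel indices to physical positions in this way is the first step in visualising ionising-radiation tracks from such a frame.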

  18. News Conference: Serbia hosts teachers' seminar Resources: Teachers TV website closes for business Festival: Science takes to the stage in Denmark Research: How noise affects learning in secondary schools CERN: CERN visit inspires new teaching ideas Education: PLS aims to improve perception of science for school students Conference: Scientix conference discusses challenges in science education

    NASA Astrophysics Data System (ADS)

    2011-07-01

    Conference: Serbia hosts teachers' seminar Resources: Teachers TV website closes for business Festival: Science takes to the stage in Denmark Research: How noise affects learning in secondary schools CERN: CERN visit inspires new teaching ideas Education: PLS aims to improve perception of science for school students Conference: Scientix conference discusses challenges in science education

  19. News Particle Physics: ATLAS unveils mural at CERN Prize: Corti Trust invites essay entries Astrophysics: CERN holds cosmic-ray conference Researchers in Residence: Lord Winston returns to school Music: ATLAS scientists record physics music Conference: Champagne flows at Reims event Competition: Students triumph at physics olympiad Teaching: Physics proves popular in Japanese schools Forthcoming Events

    NASA Astrophysics Data System (ADS)

    2011-01-01

    Particle Physics: ATLAS unveils mural at CERN Prize: Corti Trust invites essay entries Astrophysics: CERN holds cosmic-ray conference Researchers in Residence: Lord Winston returns to school Music: ATLAS scientists record physics music Conference: Champagne flows at Reims event Competition: Students triumph at physics olympiad Teaching: Physics proves popular in Japanese schools Forthcoming Events

  20. Signature CERN-URSS

    ScienceCinema

    None

    2017-12-09

Director-General W. Jentschke welcomes the assembly and the guests to the signing of the protocol between CERN and the USSR, an important event. It was in 1955 that 55 Soviet visitors came to CERN for the first time. The first Director-General of CERN, F. Bloch, and Mr Amaldi are also present. While W. Jentschke's speech in English is translated into Russian, Mr Morozov's speech in Russian is translated into English.

  1. LEMON - LHC Era Monitoring for Large-Scale Infrastructures

    NASA Astrophysics Data System (ADS)

    Marian, Babik; Ivan, Fedorko; Nicholas, Hook; Hector, Lansdale Thomas; Daniel, Lenkes; Miroslav, Siket; Denis, Waldron

    2011-12-01

    Computer centres are currently facing a massive rise in virtualization and cloud computing, as these solutions bring advantages to service providers and consolidate computer centre resources. As a result, however, monitoring complexity is increasing. Computer centre management requires monitoring not only of servers, network equipment and associated software, but also the collection of additional environment and facilities data (e.g. temperature, power consumption, cooling efficiency) to give a good overview of infrastructure performance. The LHC Era Monitoring (Lemon) system addresses these requirements for a very large-scale infrastructure. The Lemon agent, which collects data on every client and forwards the samples to the central measurement repository, provides a flexible interface that allows rapid development of new sensors. The system can also report on behalf of remote devices such as switches and power supplies. Online and historical data can be visualized via a web-based interface or retrieved via command-line tools. The Lemon Alarm System component can be used to notify the operator of error situations. This article provides an overview of Lemon monitoring together with a description of the CERN LEMON production instance. No direct comparison is made with other monitoring tools.
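The agent-and-sensor architecture the abstract describes can be sketched as follows. This is a minimal illustration, not Lemon's actual API: the `Agent` class, sensor registration and the in-memory "repository" standing in for the central measurement store are all hypothetical.

```python
import time
from typing import Callable, Dict, List, Tuple

# Hypothetical sketch of a Lemon-style monitoring agent: pluggable
# sensors sample local metrics and the agent forwards timestamped
# samples to a central repository (modelled here as a plain list).
class Agent:
    def __init__(self, node: str) -> None:
        self.node = node
        self.sensors: Dict[str, Callable[[], float]] = {}
        # (node, metric, timestamp, value) tuples; stands in for the
        # central measurement repository.
        self.repository: List[Tuple[str, str, float, float]] = []

    def register_sensor(self, metric: str, sample: Callable[[], float]) -> None:
        # New sensors are plugged in without modifying the agent itself,
        # which is the flexibility the abstract attributes to Lemon.
        self.sensors[metric] = sample

    def collect(self) -> None:
        # One collection cycle: sample every sensor, forward all results.
        now = time.time()
        for metric, sample in self.sensors.items():
            self.repository.append((self.node, metric, now, sample()))

agent = Agent("lxplus001")
agent.register_sensor("temperature_c", lambda: 24.5)  # facilities data
agent.register_sensor("power_kw", lambda: 3.2)        # power consumption
agent.collect()
```

A real agent would of course sample on a schedule and ship samples over the network; the point here is only the pluggable-sensor interface.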

  2. Hands on CERN: A Well-Used Physics Education Project

    ERIC Educational Resources Information Center

    Johansson, K. E.

    2006-01-01

    The "Hands on CERN" education project makes it possible for students and teachers to get close to the forefront of scientific research. The project confronts the students with contemporary physics at its most fundamental level with the help of particle collisions from the DELPHI particle physics experiment at CERN. It now exists in 14 languages…

  3. Indico 1.0

    NASA Astrophysics Data System (ADS)

    Gonzalez Lopez, J. B.; Avilés, A.; Baron, T.; Ferreira, P.; Kolobara, B.; Pugh, M. A.; Resco, A.; Trzaskoma, J. P.

    2014-06-01

    Indico has evolved into the main event organization software, room booking tool and collaboration hub for CERN. Growth in its usage has only accelerated over the past 9 years, and today Indico holds more than 215,000 events and 1,100,000 files. Growth was also substantial in terms of functionality and improvements. In the last year alone, Indico has matured considerably in 3 key areas: enhanced usability, optimized performance and additional features, especially those related to meeting collaboration. Over the course of 2012, much activity centred on consolidating all this effort and investment into "version 1.0", released in 2013. Version 1.0 brings new features, such as Microsoft Exchange calendar synchronization for participants and many new, clean interfaces (badge and poster generation, lists of contributions, abstracts, etc.). Most importantly, it brings a message: Indico is now stable, consolidated and mature after more than 10 years of non-stop development. This message is addressed not only to CERN users but also to the many organisations, in or outside HEP, which have already installed the software, and to others who might soon join this community. In this document, we describe the current state of the art of Indico and how it was built. This does not mean that the Indico software is complete, far from it! We have plenty of new ideas and projects that we are working on and which we have shared during CHEP 2013.

  4. A New Concept of Controller for Accelerators' Magnet Power Supplies

    NASA Astrophysics Data System (ADS)

    Visintini, Roberto; Cleva, Stefano; Cautero, Marco; Ciesla, Tomasz

    2016-04-01

    The complexity of a particle accelerator implies the remote control of very large numbers of devices, of many different types, either distributed along the accelerator or concentrated in locations often far away from each other. Local and global control systems handle the devices through dedicated communication channels and interfaces. Each controlled device is in practice a "smart node" performing a specific task; very often, those tasks are managed in real time. The performance required of the control interface influences the cost of the distributed nodes as well as their hardware and software implementation. In large facilities (e.g. CERN) the "smart nodes" derive from specific in-house developments. Alternatively, commercial devices are available on the market, with performance (and prices) spread over a broad range, spanning from proprietary designs (customizable to the user's needs) to open source/open design. In this paper, we describe some applications of smart nodes in the particle accelerator field, with special focus on power supplies for magnets. In modern accelerators, in fact, magnets and their associated power supplies constitute systems distributed along the accelerator itself, strongly interfaced with the remote control system as well as with more specific (and often more demanding) orbit/trajectory feedback systems. We give examples of actual systems, installed and operational on two light sources, Elettra and FERMI, located at the Elettra Research Centre in Trieste, Italy.

  5. 25th Birthday Cern- Amphi

    ScienceCinema

    None

    2017-12-09

    Ceremony for CERN's 25th anniversary, with two speakers: Prof. Weisskopf speaks about the significance and role of CERN, and Prof. Casimir(?) gives a talk on the relations between pure science, applied science and "big science".

  6. 1987 Nuclear Science Symposium, 34th, and 1987 Symposium on Nuclear Power Systems, 19th, San Francisco, CA, Oct. 21-23, 1987, Proceedings

    NASA Astrophysics Data System (ADS)

    Armantrout, Guy A.

    1988-02-01

    The present conference considers topics in radiation detectors, advanced electronic circuits, data acquisition systems, radiation detector systems, high-energy and nuclear physics radiation detection, spaceborne instrumentation, health physics and environmental radiation detection, nuclear medicine, nuclear well logging, and nuclear reactor instrumentation. Attention is given to the response of scintillators to heavy ions, phonon-mediated particle detection, ballistic deficits in pulse-shaping amplifiers, fast analog ICs for particle physics, logic cell arrays, the CERN host interface, high-performance data buses, a novel scintillating glass for high-energy physics applications, background events in microchannel plates, a tritium accelerator mass spectrometer, a novel positron tomograph, advancements in PET, cylindrical positron tomography, nuclear techniques in subsurface geology, REE borehole neutron activation, and a continuous tritium monitor for aqueous process streams.

  7. A multi-port 10GbE PCIe NIC featuring UDP offload and GPUDirect capabilities.

    NASA Astrophysics Data System (ADS)

    Ammendola, Roberto; Biagioni, Andrea; Frezza, Ottorino; Lamanna, Gianluca; Lo Cicero, Francesca; Lonardo, Alessandro; Martinelli, Michele; Stanislao Paolucci, Pier; Pastorelli, Elena; Pontisso, Luca; Rossetti, Davide; Simula, Francesco; Sozzi, Marco; Tosoratto, Laura; Vicini, Piero

    2015-12-01

    NaNet-10 is a four-port 10GbE PCIe Network Interface Card designed for low-latency real-time operations with GPU systems. To this purpose, the design includes a UDP offload module for fast, clock-cycle-deterministic handling of the transport layer protocol, plus a GPUDirect P2P/RDMA engine for low-latency communication with NVIDIA Tesla GPU devices. A dedicated module (Multi-Stream) can optionally process input UDP streams before data is delivered through PCIe DMA to their destination devices, re-organizing data from the different streams to optimize the downstream computation. NaNet-10 is going to be integrated in the NA62 CERN experiment in order to assess the suitability of GPGPU systems as real-time triggers; results and lessons learned while performing this activity are reported herein.
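The Multi-Stream idea of re-organizing interleaved input streams before delivery can be illustrated in software. This is a toy sketch, not the NaNet-10 firmware: demultiplexing by source port and re-ordering by a sequence number are assumptions chosen for illustration.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# Toy illustration (not NaNet-10 firmware): datagrams arriving
# interleaved from several UDP streams are demultiplexed by source
# port and re-ordered by sequence number before being handed to
# their per-stream consumer.
def demux_and_order(datagrams: List[Tuple[int, int, bytes]]) -> Dict[int, List[bytes]]:
    """datagrams: (source_port, sequence_number, payload) triples."""
    streams: Dict[int, List[Tuple[int, bytes]]] = defaultdict(list)
    for port, seq, payload in datagrams:
        streams[port].append((seq, payload))
    # Deliver each stream's payloads in sequence order.
    return {port: [p for _, p in sorted(items)] for port, items in streams.items()}

# Two streams (ports 5000 and 5001) arriving interleaved, out of order.
arrived = [(5000, 2, b"b"), (5001, 1, b"x"), (5000, 1, b"a"), (5001, 2, b"y")]
ordered = demux_and_order(arrived)  # {5000: [b"a", b"b"], 5001: [b"x", b"y"]}
```

On the card itself this reorganization happens in hardware on the datapath, before the PCIe DMA to the GPU; the software sketch only shows the data-movement logic.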

  8. Global EOS: exploring the 300-ms-latency region

    NASA Astrophysics Data System (ADS)

    Mascetti, L.; Jericho, D.; Hsu, C.-Y.

    2017-10-01

    EOS, the CERN open-source distributed disk storage system, provides the high-performance storage solution for HEP analysis and the back-end for various workflows. Recently EOS became the back-end of CERNBox, the cloud synchronisation service for CERN users. EOS can take advantage of wide-area distributed installations: for the last few years CERN EOS has used a common deployment across two computer centres (Geneva-Meyrin and Budapest-Wigner) about 1,000 km apart (∼20 ms latency) with about 200 PB of disk (JBOD). In late 2015, the CERN-IT Storage group and AARNET (Australia) set up a challenging R&D project: a single EOS instance between CERN and AARNET with more than 300 ms latency (16,500 km apart). This paper reports on the successful deployment and operation of a distributed storage system between Europe (Geneva, Budapest), Australia (Melbourne) and later Asia (ASGC Taipei), allowing different types of data placement and data access across these four sites.
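One consequence of running a single instance across sites with latencies from sub-millisecond to 300 ms is that replica selection for reads becomes latency-sensitive. A minimal sketch of latency-aware replica selection, under the assumption of a static round-trip-time table (the site names come from the abstract; the RTT values and the scheduler itself are illustrative, not the real EOS placement logic):

```python
from typing import Dict, List

# Illustrative round-trip times from a client near Geneva, in ms.
# These numbers are examples matching the abstract's scale, not
# measurements; a real system would probe them dynamically.
SITE_RTT_MS: Dict[str, float] = {
    "Geneva-Meyrin": 0.3,
    "Budapest-Wigner": 20.0,
    "Melbourne": 300.0,
}

def pick_read_site(replicas: List[str]) -> str:
    """Prefer the replica site with the lowest round-trip time."""
    return min(replicas, key=lambda site: SITE_RTT_MS[site])

best = pick_read_site(["Budapest-Wigner", "Melbourne"])  # "Budapest-Wigner"
```

The same idea generalizes to geo-aware write placement: pick the set of sites minimizing client latency while still satisfying the replication policy.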

  9. Introduction to CERN

    ScienceCinema

    Heuer, R.-D.

    2018-02-19

    Summer Student Lecture Programme introduction. The mission of CERN: push back the frontiers of knowledge, e.g. the secrets of the Big Bang... what was matter like within the first moments of the Universe's existence? This requires developing new technologies for accelerators and detectors (also information technology, such as the Web and the GRID, and medicine, for diagnosis and therapy). There are three key technology areas at CERN: accelerating, particle detection, and large-scale computing.

  10. HIGH ENERGY PHYSICS: Bulgarians Sue CERN for Leniency.

    PubMed

    Koenig, R

    2000-10-13

    In cash-strapped Bulgaria, scientists are wondering whether a ticket for a front-row seat in high-energy physics is worth the price: Membership dues in CERN, the European particle physics lab, nearly equal the country's entire budget for competitive research grants. Faced with that grim statistic and a plea for leniency from Bulgaria's government, CERN's governing council is considering slashing the country's membership dues for the next 2 years.

  11. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    McAllister, Liam

    2018-05-14

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  12. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-22

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  13. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-06-28

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  14. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-23

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  15. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2017-12-09

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  16. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    McAllister, Liam

    2018-05-24

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  17. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    Sen, Ashoke

    2018-04-27

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  18. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-23

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  19. Service management at CERN with Service-Now

    NASA Astrophysics Data System (ADS)

    Toteva, Z.; Alvarez Alonso, R.; Alvarez Granda, E.; Cheimariou, M.-E.; Fedorko, I.; Hefferman, J.; Lemaitre, S.; Clavo, D. Martin; Martinez Pedreira, P.; Pera Mira, O.

    2012-12-01

    The Information Technology (IT) and General Services (GS) departments at CERN have decided to combine their extensive experience in support for IT and non-IT services towards a common goal: to bring the services closer to the end user, based on Information Technology Infrastructure Library (ITIL) best practice. The collaborative efforts have so far produced definitions for the incident and request fulfilment processes, based on a unique two-dimensional service catalogue that combines both the user and the support-team views of all services. After an extensive evaluation of the available industrial solutions, Service-now was selected as the tool to implement the CERN service-management processes. The initial release of the tool provided an attractive web portal for users and successfully implemented two basic ITIL processes: incident management and request fulfilment. It also integrated with the CERN personnel databases and the LHC GRID ticketing system. Subsequent releases continued to integrate with other third-party tools, such as the facility management systems of CERN, and to implement new processes such as change management. Independently of those new development activities, it was decided to simplify the request fulfilment process in order to achieve easier acceptance by the CERN user community. We believe that, thanks to the high modularity of the Service-now tool, the parallel design of ITIL processes (e.g. event management) and non-ITIL processes (e.g. computer centre hardware management) will be easily achieved. This presentation describes the experience we have acquired and the techniques followed to achieve the CERN customization of the Service-now tool.
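The "two-dimensional service catalogue" combining user and support-team views can be modelled as a mapping keyed on both dimensions at once. This is a hypothetical data model for illustration; the service names and team names below are invented, and Service-now's real schema is different.

```python
from typing import Dict, List, Tuple

# Hypothetical two-dimensional catalogue: each service element is
# indexed both by the user-facing service it belongs to and by the
# support team that handles it, so a ticket can be routed from
# either view. All names are illustrative.
catalogue: Dict[Tuple[str, str], str] = {
    ("Email", "Mail Team"): "mailbox quota increase",
    ("Email", "Identity Team"): "account alias change",
    ("Printing", "Desktop Team"): "printer driver install",
}

def elements_for_user_view(service: str) -> List[str]:
    # User dimension: everything a user can request under one service.
    return sorted(elem for (svc, _), elem in catalogue.items() if svc == service)

def teams_for_element(element: str) -> List[str]:
    # Support dimension: which team(s) fulfil a given element.
    return sorted(team for (_, team), elem in catalogue.items() if elem == element)
```

Routing a request then amounts to one lookup: the user picks an element from their view, and the support dimension of the same key yields the responsible team.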

  20. Vidyo@CERN: A Service Update

    NASA Astrophysics Data System (ADS)

    Fernandes, J.; Baron, T.

    2015-12-01

    We will present an overview of the current real-time video service offering for the LHC; in particular, the operation of the CERN Vidyo service will be described in terms of consolidated performance and scale. The service is an increasingly critical part of the daily activity of the LHC collaborations, recently topping more than 50 million minutes of communication in one year, with peaks of up to 852 simultaneous connections. We will elaborate on the improvement of key front-end features, such as the integration with CERN Indico and the enhancements of the Unified Client, and also on new ones, released or in the pipeline, such as a new WebRTC client and CERN SSO/Federated SSO integration. An overview of future infrastructure improvements, such as virtualization of Vidyo routers and geo-location mechanisms for load-balancing and optimum user distribution across the service infrastructure, will also be discussed. The work done by CERN to improve the monitoring of its Vidyo network will also be presented and demoed. As a last point, we will touch on the roadmap and strategy established by CERN and Vidyo, with a clear objective of optimizing the service both on the end client and the back-end infrastructure to make it truly universal, to serve global science. Achieving this requires introducing the multi-tenant concept to serve different communities. This is one of the consequences of CERN's decision to offer the Vidyo service, currently operated for the LHC, to other sciences, institutions and virtual organizations beyond HEP that might express interest in it.

  1. Public Lecture

    ScienceCinema

    None

    2017-12-09

    An outreach activity is being organized by the Turkish community at CERN, on 5 June 2010 in the CERN Main Auditorium. The activity consists of several talks that will take 1.5 h in total. The main goal of the activity is to describe the CERN-based activities and experiments as well as to stimulate the public's attention to science-related topics. We believe the wide communication of the event has certain advantages, especially for Turkey's ongoing membership process.

  2. Prospects for observation at CERN in NA62

    NASA Astrophysics Data System (ADS)

    Hahn, F.; NA62 Collaboration; Aglieri Rinella, G.; Aliberti, R.; Ambrosino, F.; Angelucci, B.; Antonelli, A.; Anzivino, G.; Arcidiacono, R.; Azhinenko, I.; Balev, S.; Bendotti, J.; Biagioni, A.; Biino, C.; Bizzeti, A.; Blazek, T.; Blik, A.; Bloch-Devaux, B.; Bolotov, V.; Bonaiuto, V.; Bragadireanu, M.; Britton, D.; Britvich, G.; Brook, N.; Bucci, F.; Butin, F.; Capitolo, E.; Capoccia, C.; Capussela, T.; Carassiti, V.; Cartiglia, N.; Cassese, A.; Catinaccio, A.; Cecchetti, A.; Ceccucci, A.; Cenci, P.; Cerny, V.; Cerri, C.; Chikilev, O.; Ciaranfi, R.; Collazuol, G.; Cooke, P.; Cooper, P.; Corradi, G.; Cortina Gil, E.; Costantini, F.; Cotta Ramusino, A.; Coward, D.; D'Agostini, G.; Dainton, J.; Dalpiaz, P.; Danielsson, H.; Degrange, J.; De Simone, N.; Di Filippo, D.; Di Lella, L.; Dixon, N.; Doble, N.; Duk, V.; Elsha, V.; Engelfried, J.; Enik, T.; Falaleev, V.; Fantechi, R.; Federici, L.; Fiorini, M.; Fry, J.; Fucci, A.; Fulton, L.; Gallorini, S.; Gatignon, L.; Gianoli, A.; Giudici, S.; Glonti, L.; Goncalves Martins, A.; Gonnella, F.; Goudzovski, E.; Guida, R.; Gushchin, E.; Hahn, F.; Hallgren, B.; Heath, H.; Herman, F.; Hutchcroft, D.; Iacopini, E.; Jamet, O.; Jarron, P.; Kampf, K.; Kaplon, J.; Karjavin, V.; Kekelidze, V.; Kholodenko, S.; Khoriauli, G.; Khudyakov, A.; Kiryushin, Yu; Kleinknecht, K.; Kluge, A.; Koval, M.; Kozhuharov, V.; Krivda, M.; Kudenko, Y.; Kunze, J.; Lamanna, G.; Lazzeroni, C.; Leitner, R.; Lenci, R.; Lenti, M.; Leonardi, E.; Lichard, P.; Lietava, R.; Litov, L.; Lomidze, D.; Lonardo, A.; Lurkin, N.; Madigozhin, D.; Maire, G.; Makarov, A.; Mannelli, I.; Mannocchi, G.; Mapelli, A.; Marchetto, F.; Massarotti, P.; Massri, K.; Matak, P.; Mazza, G.; Menichetti, E.; Mirra, M.; Misheva, M.; Molokanova, N.; Morant, J.; Morel, M.; Moulson, M.; Movchan, S.; Munday, D.; Napolitano, M.; Newson, F.; Norton, A.; Noy, M.; Nuessle, G.; Obraztsov, V.; Padolski, S.; Page, R.; Palladino, V.; Pardons, A.; Pedreschi, E.; Pepe, M.; Perez Gomez, F.; Perrin-Terrin, 
M.; Petrov, P.; Petrucci, F.; Piandani, R.; Piccini, M.; Pietreanu, D.; Pinzino, J.; Pivanti, M.; Polenkevich, I.; Popov, I.; Potrebenikov, Yu; Protopopescu, D.; Raffaelli, F.; Raggi, M.; Riedler, P.; Romano, A.; Rubin, P.; Ruggiero, G.; Russo, V.; Ryjov, V.; Salamon, A.; Salina, G.; Samsonov, V.; Santovetti, E.; Saracino, G.; Sargeni, F.; Schifano, S.; Semenov, V.; Sergi, A.; Serra, M.; Shkarovskiy, S.; Sotnikov, A.; Sougonyaev, V.; Sozzi, M.; Spadaro, T.; Spinella, F.; Staley, R.; Statera, M.; Sutcliffe, P.; Szilasi, N.; Tagnani, D.; Valdata-Nappi, M.; Valente, P.; Vasile, M.; Vassilieva, V.; Velghe, B.; Veltri, M.; Venditti, S.; Vormstein, M.; Wahl, H.; Wanke, R.; Wertelaers, P.; Winhart, A.; Winston, R.; Wrona, B.; Yushchenko, O.; Zamkovsky, M.; Zinchenko, A.

    2015-07-01

    The rare decays are excellent processes to probe the Standard Model and to search indirectly for new physics, complementary to the direct LHC searches. The NA62 experiment at the CERN SPS aims to collect and analyse O(10^13) kaon decays before the CERN Long Shutdown 2 (in 2018). This will allow the branching ratio to be measured with 10% accuracy. The experimental apparatus was commissioned during a first run in autumn 2014.

  3. The trigger system for K0→2 π0 decays of the NA48 experiment at CERN

    NASA Astrophysics Data System (ADS)

    Mikulec, I.

    1998-02-01

    A fully pipelined 40 MHz "dead-time-free" trigger system for neutral K0 decays for the NA48 experiment at CERN is described. The NA48 experiment studies CP violation using the high-intensity beam of the CERN SPS accelerator. The trigger system sums, digitises, filters and processes signals from 13 340 channels of the liquid-krypton electromagnetic calorimeter. In 1996 the calorimeter and part of the trigger electronics were installed and tested. In 1997 the system was completed and prepared for use in the first NA48 physics data-taking period. Cagliari, Cambridge, CERN, Dubna, Edinburgh, Ferrara, Firenze, Mainz, Orsay, Perugia, Pisa, Saclay, Siegen, Torino, Warszawa, Wien Collaboration.
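The sum-digitise-filter-process chain can be illustrated with a much-simplified software model: per time slice, sum the digitised channel energies and flag slices above an energy threshold, with no dead time between consecutive slices. This is an illustration of the pipelined-trigger concept only; the channel counts, threshold and numbers below are invented, not NA48 parameters.

```python
from typing import List

# Simplified illustration (not the NA48 electronics): each time slice
# carries one digitised sample per calorimeter channel. The trigger
# sums the channels of every slice and flags slices whose total
# energy exceeds a threshold; because each slice is decided
# independently, consecutive slices incur no dead time.
def trigger_decisions(slices: List[List[float]], threshold: float) -> List[bool]:
    return [sum(channels) > threshold for channels in slices]

# Invented sample data: energies in GeV for three channels per slice.
samples = [
    [0.1, 0.2, 0.1],    # quiet slice: noise only
    [20.0, 35.5, 10.0], # large electromagnetic energy deposit
]
decisions = trigger_decisions(samples, threshold=50.0)  # [False, True]
```

In hardware the per-slice sum and comparison are stages of a clocked pipeline, so a new decision emerges every clock cycle while later slices are still being summed.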

  4. Meeting Jentschke

    ScienceCinema

    None

    2018-05-18

    After an introduction about the latest research and news at CERN, the DG W. Jentschke speaks about future management of CERN with two new general managers, who will be in charge for the next 5 years: Dr. J.B. Adams who will focus on the administration of CERN and also the construction of buildings and equipment, and Dr. L. Van Hove who will be responsible for research activities. The DG speaks about expected changes, shared services, different divisions and their leaders, etc.

  5. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    Sen, Ashoke

    2017-12-18

    Part 7. The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  6. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-02-09

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  7. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-01-22

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher

  8. NA61/SHINE facility at the CERN SPS: beams and detector system

    NASA Astrophysics Data System (ADS)

    Abgrall, N.; Andreeva, O.; Aduszkiewicz, A.; Ali, Y.; Anticic, T.; Antoniou, N.; Baatar, B.; Bay, F.; Blondel, A.; Blumer, J.; Bogomilov, M.; Bogusz, M.; Bravar, A.; Brzychczyk, J.; Bunyatov, S. A.; Christakoglou, P.; Cirkovic, M.; Czopowicz, T.; Davis, N.; Debieux, S.; Dembinski, H.; Diakonos, F.; Di Luise, S.; Dominik, W.; Drozhzhova, T.; Dumarchez, J.; Dynowski, K.; Engel, R.; Efthymiopoulos, I.; Ereditato, A.; Fabich, A.; Feofilov, G. A.; Fodor, Z.; Fulop, A.; Gaździcki, M.; Golubeva, M.; Grebieszkow, K.; Grzeszczuk, A.; Guber, F.; Haesler, A.; Hasegawa, T.; Hierholzer, M.; Idczak, R.; Igolkin, S.; Ivashkin, A.; Jokovic, D.; Kadija, K.; Kapoyannis, A.; Kaptur, E.; Kielczewska, D.; Kirejczyk, M.; Kisiel, J.; Kiss, T.; Kleinfelder, S.; Kobayashi, T.; Kolesnikov, V. I.; Kolev, D.; Kondratiev, V. P.; Korzenev, A.; Koversarski, P.; Kowalski, S.; Krasnoperov, A.; Kurepin, A.; Larsen, D.; Laszlo, A.; Lyubushkin, V. V.; Maćkowiak-Pawłowska, M.; Majka, Z.; Maksiak, B.; Malakhov, A. I.; Maletic, D.; Manglunki, D.; Manic, D.; Marchionni, A.; Marcinek, A.; Marin, V.; Marton, K.; Mathes, H.-J.; Matulewicz, T.; Matveev, V.; Melkumov, G. L.; Messina, M.; Mrówczyński, St.; Murphy, S.; Nakadaira, T.; Nirkko, M.; Nishikawa, K.; Palczewski, T.; Palla, G.; Panagiotou, A. D.; Paul, T.; Peryt, W.; Petukhov, O.; Pistillo, C.; Płaneta, R.; Pluta, J.; Popov, B. A.; Posiadala, M.; Puławski, S.; Puzovic, J.; Rauch, W.; Ravonel, M.; Redij, A.; Renfordt, R.; Richter-Was, E.; Robert, A.; Röhrich, D.; Rondio, E.; Rossi, B.; Roth, M.; Rubbia, A.; Rustamov, A.; Rybczyński, M.; Sadovsky, A.; Sakashita, K.; Savic, M.; Schmidt, K.; Sekiguchi, T.; Seyboth, P.; Sgalaberna, D.; Shibata, M.; Sipos, R.; Skrzypczak, E.; Słodkowski, M.; Sosin, Z.; Staszel, P.; Stefanek, G.; Stepaniak, J.; Stroebele, H.; Susa, T.; Szuba, M.; Tada, M.; Tereshchenko, V.; Tolyhi, T.; Tsenov, R.; Turko, L.; Ulrich, R.; Unger, M.; Vassiliou, M.; Veberic, D.; Vechernin, V. 
V.; Vesztergombi, G.; Vinogradov, L.; Wilczek, A.; Włodarczyk, Z.; Wojtaszek-Szwarz, A.; Wyszyński, O.; Zambelli, L.; Zipper, W.

    2014-06-01

    NA61/SHINE (SPS Heavy Ion and Neutrino Experiment) is a multi-purpose experimental facility to study hadron production in hadron-proton, hadron-nucleus and nucleus-nucleus collisions at the CERN Super Proton Synchrotron. It recorded the first physics data with hadron beams in 2009 and with ion beams (secondary 7Be beams) in 2011. NA61/SHINE has greatly profited from the long development of the CERN proton and ion sources and the accelerator chain as well as the H2 beamline of the CERN North Area. The latter has recently been modified to also serve as a fragment separator as needed to produce the Be beams for NA61/SHINE. Numerous components of the NA61/SHINE set-up were inherited from its predecessors, in particular, the last one, the NA49 experiment. Important new detectors and upgrades of the legacy equipment were introduced by the NA61/SHINE Collaboration. This paper describes the state of the NA61/SHINE facility — the beams and the detector system — before the CERN Long Shutdown I, which started in March 2013.

  9. Reconfigurable PCI Express cards for low-latency data transport in HEP experiments

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Biagioni, A.; Cretaro, P.; Frezza, O.; Lamanna, G.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Paolucci, P. S.; Pastorelli, E.; Pontisso, L.; Simula, F.; Vicini, P.

    2017-01-01

    State-of-the-art technology supports the High Energy Physics community in addressing the problem of managing an overwhelming amount of experimental data. From the point of view of communication between the detectors' readout system and computing nodes, the critical issues are the following: latency, moving data in a deterministic and short amount of time; bandwidth, guaranteeing the maximum capability of the link and communication protocol adopted; and endpoint consolidation, tight aggregation of channels on a single board. This contribution describes the status and performance of the NaNet project, whose goal is the design of a family of FPGA-based PCIe network interface cards. The team's efforts are focused on implementing a low-latency, real-time data transport mechanism between the board's multi-channel network system and the memories of the CPUs and GPU accelerators on the host. Several technical solutions and scientific applications have been explored: NaNet-1, with a single GbE I/O interface, and NaNet-10, offering four 10GbE ports, for activities related to the GPU-based real-time trigger of the NA62 experiment at CERN; and NaNet³, with four 2.5 Gbit optical channels, developed for the KM3NeT-ITALIA underwater neutrino telescope.

  10. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network Constituents, Fundamental Forces and Symmetries of the Universe. The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva.

  11. CERN Collider, France-Switzerland

    NASA Image and Video Library

    2013-08-23

    This image, acquired by NASA's Terra spacecraft, shows the CERN Large Hadron Collider, the world's largest and highest-energy particle accelerator, lying beneath the French-Swiss border northwest of Geneva (yellow circle).

  12. CERN: A European laboratory for a global project

    NASA Astrophysics Data System (ADS)

    Voss, Rüdiger

    2015-06-01

    In the most important shift of paradigm of its membership rules in 60 years, CERN in 2010 introduced a policy of “Geographical Enlargement” which for the first time opened the door for membership of non-European States in the Organization. This short article reviews briefly the history of CERN's membership rules, discusses the rationale behind the new policy, its relationship with the emerging global roadmap of particle physics, and gives a short overview of the status of the enlargement process.

  13. Review of CERN Data Centre Infrastructure

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Bell, T.; van Eldik, J.; McCance, G.; Panzer-Steindel, B.; Coelho dos Santos, M.; Traylen, S.; Schwickerath, U.

    2012-12-01

    The CERN Data Centre is reviewing strategies for optimizing the use of the existing infrastructure and expanding to a new data centre by studying how other large sites are being operated. Over the past six months, CERN has been investigating modern and widely-used tools and procedures used for virtualisation, clouds and fabric management in order to reduce operational effort, increase agility and support unattended remote data centres. This paper gives the details on the project's motivations, current status and areas for future investigation.

  14. PARTICLE PHYSICS: CERN Gives Higgs Hunters Extra Month to Collect Data.

    PubMed

    Morton, O

    2000-09-22

    After 11 years of banging electrons and positrons together at higher energies than any other machine in the world, CERN, the European laboratory for particle physics, had decided to shut down the Large Electron-Positron collider (LEP) and install a new machine, the Large Hadron Collider (LHC), in its 27-kilometer tunnel. In 2005, the LHC will start bashing protons together at even higher energies. But tantalizing hints of a long-sought fundamental particle have forced CERN managers to grant LEP a month's reprieve.

  15. Assessment of thermal loads in the CERN SPS crab cavities cryomodule

    NASA Astrophysics Data System (ADS)

    Carra, F.; Apeland, J.; Calaga, R.; Capatina, O.; Capelli, T.; Verdú-Andrés, S.; Zanoni, C.

    2017-07-01

    As a part of the HL-LHC upgrade, a cryomodule is designed to host two crab cavities for a first test with protons in the SPS machine. The evaluation of the cryomodule heat loads is essential to dimension the cryogenic infrastructure of the system. The current design features two cryogenic circuits. The first circuit adopts superfluid helium at 2 K to maintain the cavities in the superconducting state. The second circuit, based on helium gas at a temperature between 50 K and 70 K, is connected to the thermal screen, also serving as heat intercept for all the interfaces between the cold mass and the external environment. An overview of the heat loads to both circuits, and the combined numerical and analytical estimations, is presented. The heat load of each element is detailed for the static and dynamic scenarios, with considerations on the design choices for the thermal optimization of the most critical components.

  16. A World Wide Web (WWW) server database engine for an organelle database, MitoDat.

    PubMed

    Lemkin, P F; Chipperfield, M; Merril, C; Zullo, S

    1996-03-01

    We describe a simple database search engine, "dbEngine", which may be used to quickly create a searchable database on a World Wide Web (WWW) server. Data may be prepared from spreadsheet programs (such as Excel) or from tables exported from relational database systems. This Common Gateway Interface (CGI-BIN) program is used with a WWW server such as those available commercially, or from the National Center for Supercomputing Applications (NCSA) or CERN. Its capabilities include: (i) searching records by combinations of terms connected with ANDs or ORs; (ii) returning search results as hypertext links to other WWW database servers; (iii) mapping lists of literature reference identifiers to the full references; (iv) creating bidirectional hypertext links between pictures and the database. DbEngine has been used to support the MitoDat database (Mendelian and non-Mendelian inheritance associated with the mitochondrion) on the WWW.
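    The AND/OR term search described in capability (i) can be sketched as follows. This is a minimal illustration of the idea, not dbEngine's actual code or API; the record layout and function names are assumptions.

    ```python
    # Minimal sketch of an AND/OR keyword search over flat records, in the
    # spirit of dbEngine's capability (i). Data layout is illustrative.

    def matches(record, terms, mode="AND"):
        """True if the record's text contains all (AND) or any (OR) of the terms."""
        text = " ".join(record.values()).lower()
        hits = [term.lower() in text for term in terms]
        return all(hits) if mode == "AND" else any(hits)

    def search(records, terms, mode="AND"):
        """Return the records matching the term combination."""
        return [r for r in records if matches(r, terms, mode)]

    # Hypothetical organelle-style records, loosely modelled on MitoDat fields.
    records = [
        {"gene": "ATP5A1", "location": "nuclear", "note": "ATP synthase subunit"},
        {"gene": "MT-CO1", "location": "mitochondrial", "note": "cytochrome oxidase"},
    ]

    print(len(search(records, ["mitochondrial", "oxidase"], mode="AND")))  # 1
    print(len(search(records, ["nuclear", "oxidase"], mode="OR")))         # 2
    ```
    
    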

  17. Basic concepts and architectural details of the Delphi trigger system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bocci, V.; Booth, P.S.L.; Bozzo, M.

    1995-08-01

    Delphi (DEtector with Lepton, Photon and Hadron Identification) is one of the four experiments of the LEP (Large Electron Positron) collider at CERN. The detector is laid out to provide nearly 4π coverage for charged-particle tracking, electromagnetic and hadronic calorimetry, and extended particle identification. The trigger system consists of four levels. The first two are synchronous with the BCO (Beam Cross Over) and rely on hardwired control units, while the last two are performed asynchronously with respect to the BCO and are driven by the Delphi host computers. The aim of this paper is to give a comprehensive global view of the trigger system architecture, presenting in detail the first two levels, their various hardware components, and the latest modifications introduced to improve their performance and make the whole software user interface more user-friendly.

  18. Réunion publique HR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-04-30

    Dear Colleagues, I should like to remind you that a public meeting organised by the HR Department will be held today: Friday 30 April 2010 at 9:30 a.m. in the Main Auditorium (coffee from 9:00 a.m.). During this meeting, general information will be given about: the CERN Admin e-guide, a new guide to the Organization's administrative procedures, drawn up to facilitate the retrieval of practical information and to offer a user-friendly format; the CERN Health Insurance System (presentation by Philippe Charpentier, President of the CHIS Board); and the Pension Fund (presentation by Theodore Economou, Administrator of the CERN Pension Fund). A simultaneous transmission of this meeting will be broadcast in the BE Auditorium at Prévessin and will also be available at the following address: http://webcast.cern.ch. I look forward to your participation! Best regards, Anne-Sylvie Catherin, Head, Human Resources Department

  19. Réunion publique HR

    ScienceCinema

    None

    2017-12-09

    Dear Colleagues, I should like to remind you that a public meeting organised by the HR Department will be held today: Friday 30 April 2010 at 9:30 a.m. in the Main Auditorium (coffee from 9:00 a.m.). During this meeting, general information will be given about: the CERN Admin e-guide, a new guide to the Organization's administrative procedures, drawn up to facilitate the retrieval of practical information and to offer a user-friendly format; the CERN Health Insurance System (presentation by Philippe Charpentier, President of the CHIS Board); and the Pension Fund (presentation by Theodore Economou, Administrator of the CERN Pension Fund). A simultaneous transmission of this meeting will be broadcast in the BE Auditorium at Prévessin and will also be available at the following address: http://webcast.cern.ch. I look forward to your participation! Best regards, Anne-Sylvie Catherin, Head, Human Resources Department

  20. CERN launches high-school internship programme

    NASA Astrophysics Data System (ADS)

    Johnston, Hamish

    2017-07-01

    The CERN particle-physics lab has hosted 22 high-school students from Hungary in a pilot programme designed to show teenagers how science, technology, engineering and mathematics are used at the lab.

  1. Commissioning of a CERN Production and Analysis Facility Based on xrootd

    NASA Astrophysics Data System (ADS)

    Campana, Simone; van der Ster, Daniel C.; Di Girolamo, Alessandro; Peters, Andreas J.; Duellmann, Dirk; Coelho Dos Santos, Miguel; Iven, Jan; Bell, Tim

    2011-12-01

    The CERN facility hosts the Tier-0 of the four LHC experiments, but as part of WLCG it also offers a platform for production activities and user analysis. The CERN CASTOR storage technology has been extensively tested and utilized for LHC data recording and for exporting to external sites according to the experiments' computing models. On the other hand, to accommodate Grid data-processing activities and, more importantly, chaotic user analysis, it was realized that additional functionality was needed, including a different throttling mechanism for file access. This paper describes the xrootd-based CERN production and analysis facility for the ATLAS experiment, and in particular the experiment's use case and data-access scenario, the xrootd redirector setup on top of the CASTOR storage system, the commissioning of the system, and real-life experience with data processing and data analysis.

  2. OBITUARY: Maurice Jacob (1933-2007)

    NASA Astrophysics Data System (ADS)

    Quercigh, Emanuele; Šándor, Ladislav

    2008-04-01

    Maurice Jacob passed away on 2 May 2007. With his death, we have lost one of the founding fathers of the ultra-relativistic heavy-ion programme. His interest in high-energy nuclear physics started in 1981, when alpha-alpha collisions could first be studied at the CERN ISR. An enthusiastic supporter of ion-beam experiments at CERN, Maurice was at the origin of the 1982 Quark Matter meeting in Bielefeld [1], which brought together more than 100 participants from both sides of the Atlantic, showing an enthusiastic constituency for such research. There were twice as many the following year at Brookhaven. Finally, in the mid-eighties, a heavy-ion programme was approved both at CERN and at Brookhaven, involving as many nuclear as particle physicists. It was the start of a fruitful interdisciplinary collaboration which continues today both at RHIC and at the LHC. Maurice actively followed the development of this field, reporting at a number of conferences and meetings (Les Arcs, Bielefeld, Beijing, Brookhaven, Lenox, Singapore, Taormina, ...). This activity culminated in 2000, when Maurice, together with Ulrich Heinz, summarized the main results of the CERN SPS heavy-ion experiments and the evidence obtained for a new state of matter [2]. Maurice was a brilliant theoretical physicist. His many contributions have been summarized in a recent article in the CERN Courier by two leading CERN theorists, John Ellis and Andre Martin [3]. The following is an excerpt from their article: 'He began his research career at Saclay and, while still a PhD student, he continued brilliantly during a stay at Brookhaven. It was there in 1959 that Maurice, together with Giancarlo Wick, developed the helicity amplitude formalism that is the basis of many modern theoretical calculations. Maurice obtained his PhD in 1961 and, after a stay at Caltech, returned to Saclay. 
    A second American foray was to SLAC, where he and Sam Berman made the crucial observation that the point-like structures (partons) seen in deep-inelastic scattering implied the existence of high-transverse-momentum processes in proton-proton collisions, as the ISR at CERN subsequently discovered. In 1967 Maurice joined CERN, where he remained, apart from influential visits to Yale, Fermilab and elsewhere, until his retirement in 1998. He became one of the most respected international experts on the phenomenology of strong interactions, including diffraction, scaling, high-transverse-momentum processes and the formation of quark-gluon plasma. In particular, he pioneered the studies of inclusive hadron-production processes, including scaling and its violations. Also, working with Ron Horgan, he made detailed predictions for the production of jets at CERN's proton-antiproton collider. The UA2 and UA1 experiments subsequently discovered these. He was also interested in electron-positron colliders, making pioneering calculations, together with Tai Wu, of radiation in high-energy collisions. Maurice was one of the scientific pillars of CERN, working closely with experimental colleagues in predicting and interpreting results from successive CERN colliders. He was indefatigable in organizing regular meetings on ISR physics, bringing together theorists and experimentalists to debate the meaning of new results and propose new measurements. He was one of the strongest advocates of Carlo Rubbia's proposal for a proton-antiproton collider at CERN, and was influential in preparing and advertising its physics. In 1978 he organized the Les Houches workshop that brought the LEP project to the attention of the wider European particle physics community. He also organized the ECFA workshop at Lausanne in 1984 that made the first exploration of the possible physics of the LHC. It is a tragedy that Maurice has not lived to enjoy data from the LHC.' 
    References [1] Maurice Jacob and Helmut Satz (eds) 1982 Proc. Workshop on Quark Matter Formation and Heavy Ion Collisions, Bielefeld, 10-14 May 1982 (Singapore: World Scientific Publishing) [2] Heinz Ulrich W and Jacob Maurice 2000 Evidence for a new state of matter: an assessment of the results from the CERN lead beam program Preprint nucl-th/0002042 [3] Ellis J and Martin A 2007 CERN Courier 47 issue 6

  3. CERN automatic audio-conference service

    NASA Astrophysics Data System (ADS)

    Sierra Moral, Rodrigo

    2010-04-01

    Scientists from all over the world need to collaborate with CERN on a daily basis. They must be able to communicate effectively on their joint projects at any time; as a result, telephone conferences have become indispensable and widely used. Managed by six operators, CERN already runs more than 20,000 hours and 5,700 audio-conferences per year. However, the traditional telephone-based audio-conference system needed to be modernized in three ways: firstly, to provide the participants with more autonomy in the organization of their conferences; secondly, to eliminate the constraints of manual intervention by operators; and thirdly, to integrate the audio-conferences into a collaborative working framework. The large number, and hence cost, of the conferences prohibited externalization, so the CERN telecommunications team drew up a specification to implement a new system. It was decided to use a new commercial collaborative audio-conference solution based on the SIP protocol. The system was tested as the first European pilot, and several improvements (such as billing, security, redundancy...) were implemented based on CERN's recommendations. The new automatic conference system has been operational since the second half of 2006. It is very popular with users and has doubled the number of conferences in the past two years.

  4. CERN openlab: Engaging industry for innovation in the LHC Run 3-4 R&D programme

    NASA Astrophysics Data System (ADS)

    Girone, M.; Purcell, A.; Di Meglio, A.; Rademakers, F.; Gunne, K.; Pachou, M.; Pavlou, S.

    2017-10-01

    LHC Run 3 and Run 4 represent an unprecedented challenge for HEP computing in terms of both data volume and complexity. New approaches are needed for how data is collected and filtered, processed, moved, stored and analysed if these challenges are to be met with a realistic budget. To develop innovative techniques, we are fostering relationships with industry leaders. CERN openlab is a unique resource for public-private partnership between CERN and leading Information and Communication Technology (ICT) companies. Its mission is to accelerate the development of cutting-edge solutions to be used by the worldwide HEP community. In 2015, CERN openlab started its phase V with a strong focus on tackling the upcoming LHC challenges. Several R&D programmes are ongoing in the areas of data acquisition, networks and connectivity, data storage architectures, computing provisioning, computing platforms and code optimisation, and data analytics. This paper gives an overview of the various innovative technologies that are currently being explored by CERN openlab V and discusses the long-term strategies pursued by the LHC communities, with the help of industry, to close the technological gap in processing and storage needs expected in Run 3 and Run 4.

  5. Measurements and FLUKA simulations of bismuth and aluminium activation at the CERN Shielding Benchmark Facility (CSBF)

    NASA Astrophysics Data System (ADS)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.

    2018-03-01

    The CERN High Energy AcceleRator Mixed field facility (CHARM) is located in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, 5·10^11 protons per pulse, a pulse length of 350 ms, and a maximum average beam intensity of 6.7·10^10 p/s, which then impacts on the CHARM target. The shielding of the CHARM facility also includes the CERN Shielding Benchmark Facility (CSBF), situated laterally above the target. This facility consists of 80 cm of cast iron and 360 cm of concrete, with barite concrete in some places. Activation samples of bismuth and aluminium were placed in the CSBF and in the CHARM access corridor in July 2015. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields for these samples. The results estimated by FLUKA Monte Carlo simulations are compared to activation measurements of these samples. The comparison between FLUKA simulations and the measured values from γ-spectrometry gives an agreement better than a factor of 2.
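    The quoted "agreement better than a factor of 2" can be checked mechanically: for each sample, take the ratio of simulated to measured specific activity and verify that its worst-case value (folded so it is always ≥ 1) stays below 2. A minimal sketch with made-up numbers; the actual measured and simulated yields are in the paper and are not reproduced here.

    ```python
    def agreement_factor(simulated, measured):
        """Worst-case simulated/measured ratio, folded so ratios below 1
        count as their reciprocal (a factor-of-2 deviation either way)."""
        worst = 1.0
        for s, m in zip(simulated, measured):
            ratio = s / m
            worst = max(worst, ratio if ratio >= 1.0 else 1.0 / ratio)
        return worst

    # Hypothetical activity values (arbitrary units), NOT the paper's data.
    sim  = [1.2, 0.8, 1.5]
    meas = [1.0, 1.1, 1.0]

    f = agreement_factor(sim, meas)
    print(f)        # 1.5
    print(f < 2.0)  # True: agreement better than a factor of 2
    ```
    
    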

  6. Monte Carlo Methods in Materials Science Based on FLUKA and ROOT

    NASA Technical Reports Server (NTRS)

    Pinsky, Lawrence; Wilson, Thomas; Empl, Anton; Andersen, Victor

    2003-01-01

    A comprehensive understanding of mitigation measures for space radiation protection necessarily involves the relevant fields of nuclear physics and particle transport modeling. One method of modeling the interaction of radiation traversing matter is Monte Carlo analysis, a subject that has been evolving since the very advent of nuclear reactors and particle accelerators in experimental physics. Countermeasures for radiation protection from neutrons near nuclear reactors, for example, were an early application, and Monte Carlo methods were quickly adapted to this general field of investigation. The project discussed here is concerned with taking the latest tools and technology in Monte Carlo analysis and adapting them to space applications such as radiation shielding design for spacecraft, as well as investigating how next-generation Monte Carlos can complement the existing analytical methods currently used by NASA. We have chosen to employ the Monte Carlo program known as FLUKA (a legacy acronym based on the German for FLUctuating KAscade) to simulate all of the particle transport, and the CERN-developed graphical-interface object-oriented analysis software called ROOT. One aspect of space radiation analysis for which the Monte Carlos are particularly suited is the study of secondary radiation produced as albedos in the vicinity of the structural geometry involved. This broad goal of simulating space radiation transport through the relevant materials employing the FLUKA code necessarily requires the addition of the capability to simulate all heavy-ion interactions from 10 MeV/A up to the highest conceivable energies. For all energies above 3 GeV/A the Dual Parton Model (DPM) is currently used, although the possible improvement of the DPMJET event generator for energies of 3-30 GeV/A is being considered. One of the major tasks still facing us is the provision for heavy-ion interactions below 3 GeV/A. 
The ROOT interface is being developed in conjunction with the CERN ALICE (A Large Ion Collider Experiment) software team through an adaptation of their existing AliROOT (ALICE Using ROOT) architecture. In order to check our progress against actual data, we have chosen to simulate the ATIC14 (Advanced Thin Ionization Calorimeter) cosmic-ray astrophysics balloon payload as well as neutron fluences in the Mir spacecraft. This paper contains a summary of the status of this project and a roadmap to its successful completion.
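The core Monte Carlo idea the project builds on, sampling individual particle histories stochastically, can be illustrated with a toy one-dimensional attenuation model (a generic sketch in Python, not FLUKA code; the function name and parameters are illustrative):

```python
import math
import random

def transmitted_fraction(thickness_cm, mean_free_path_cm, n_particles=100_000, seed=42):
    """Estimate the fraction of particles crossing a slab without interacting.

    Each particle's free path is sampled from an exponential distribution,
    the textbook Monte Carlo treatment of attenuation; the analytic answer
    is exp(-thickness / mean_free_path).
    """
    rng = random.Random(seed)
    survived = sum(
        1 for _ in range(n_particles)
        # Inverse-transform sampling of an exponential free path.
        if -mean_free_path_cm * math.log(rng.random()) > thickness_cm
    )
    return survived / n_particles

# Compare the stochastic estimate with the analytic attenuation law.
est = transmitted_fraction(thickness_cm=5.0, mean_free_path_cm=10.0)
print(est, math.exp(-0.5))  # the two values agree to within ~1%
```

Full transport codes such as FLUKA generalize this sampling to three dimensions, many interaction channels, and secondary-particle production, but the random-sampling kernel is the same.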

  7. Memorial W.Gentner

    ScienceCinema

    None

    2018-05-25

The DG H. Schopper gives an introduction to the commemoration and ceremony of the life and work of Professor Wolfgang Gentner. W. Gentner, a German physicist, born in 1906 in Frankfurt and died in September 1980 in Heidelberg, was a director of CERN from 1955 to 1960, president of the Scientific Policy Committee from 1968 to 1971, and president of the Council of CERN from 1972 to 1974. He was one of the founders of CERN, and four people who knew him well pay tribute to him: among others, one of his students, as well as J.B. Adams and O. Sheffard.

  9. OPERA - First Beam Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakamura, M.

    2008-02-21

OPERA is a long base-line neutrino oscillation experiment designed to detect tau-neutrino appearance and to prove that neutrino oscillation is the origin of the atmospheric muon-neutrino deficit observed by Kamiokande. A hybrid emulsion detector, whose weight is about 1.3 kton, has been installed in the Gran Sasso laboratory. A new muon-neutrino beam line, CNGS, has been constructed at CERN to send neutrinos to Gran Sasso, 730 km away from CERN. In 2006, the first neutrinos were sent from CERN to LNGS and were successfully detected by the OPERA detector as planned.

  10. Membership Finland

    ScienceCinema

    None

    2018-05-18

The DG C. Rubbia and the vice-president of the CERN Council give a warm welcome to Finland on its membership, as the 15th member of CERN since January 1, 1991, in the presence of the Secretary-General and the ambassador.

  11. Visit CD

    ScienceCinema

    None

    2017-12-09

The DG H. Schopper welcomes the ambassadors of the member states and the representatives of countries with which CERN maintains close relations, and gives a presentation on the activities at CERN.

  12. Terbium Radionuclides for Theranostics Applications: A Focus On MEDICIS-PROMED

    NASA Astrophysics Data System (ADS)

    Cavaier, R. Formento; Haddad, F.; Sounalet, T.; Stora, T.; Zahi, I.

A new facility, named CERN-MEDICIS, is under construction at CERN to produce radionuclides for medical applications. In parallel, MEDICIS-PROMED, a Marie Sklodowska-Curie innovative training network of the European Commission's Horizon 2020 programme, is being coordinated by CERN to train young scientists in the production and use of innovative radionuclides and to develop a network of experts within Europe. One program within MEDICIS-PROMED is to determine the feasibility of producing innovative radioisotopes for theranostics using a commercial middle-sized high-current cyclotron and the mass separation technology developed at CERN-MEDICIS. This will allow the production of high specific activity radioisotopes not achievable with the common post-processing by chemical separation. Radioisotopes of scandium, copper, arsenic and terbium have been identified. Preliminary studies of activation yields and of irradiation-parameter optimization for the production of Tb-149 will be described.
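The activation yields mentioned above follow the standard growth-and-decay relation for a nuclide produced at a constant rate, A(t) = R(1 - e^(-lambda t)). A minimal sketch of this textbook formula (the rate and half-life values are illustrative, not MEDICIS data):

```python
import math

def activity_at_end_of_bombardment(production_rate_per_s, half_life_s, irradiation_s):
    """Standard activation formula: A(t) = R * (1 - exp(-lambda * t)).

    R is the constant production rate of the radionuclide and lambda the
    decay constant ln(2)/T_half; the activity saturates at R for long
    irradiations, which is why irradiating much longer than a few
    half-lives yields diminishing returns.
    """
    decay_const = math.log(2) / half_life_s
    return production_rate_per_s * (1.0 - math.exp(-decay_const * irradiation_s))

# Irradiating for exactly one half-life yields half the saturation activity.
a = activity_at_end_of_bombardment(
    production_rate_per_s=1e6, half_life_s=3600.0, irradiation_s=3600.0
)
print(a)  # ~5e5 decays/s, i.e. half of the 1e6/s saturation value
```

Optimizing irradiation parameters for a short-lived isotope such as Tb-149 amounts to trading off this saturation behaviour against beam time and co-produced contaminants.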

  13. LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN

    NASA Astrophysics Data System (ADS)

    Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor

    2017-12-01

The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and has since 2011 been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continued volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.

  14. Cryogenic Control System Migration and Developments towards the UNICOS CERN Standard at INFN

    NASA Astrophysics Data System (ADS)

    Modanese, Paolo; Calore, Andrea; Contran, Tiziano; Friso, Alessandro; Pengo, Marco; Canella, Stefania; Burioli, Sergio; Gallese, Benedetto; Inglese, Vitaliano; Pezzetti, Marco; Pengo, Ruggero

The cryogenic control systems at Laboratori Nazionali di Legnaro (LNL) are undergoing an important and radical modernization, allowing all the plants' control and supervision systems to be renewed in a homogeneous way, converging on the CERN-UNICOS standard. Before the UNICOS migration project started there were as many as 7 different types of PLC and 7 different types of SCADA, each requiring its own particular programming language. Under these conditions, even a simple modification and/or integration of the program or of the supervision layer required the intervention of a system-integrator company specialized in that specific control system. Furthermore, it implied that the operators had to be trained on the different types of control systems. The CERN-UNICOS framework, developed for the LHC [1], has been chosen for its reliability and is planned to run and be maintained for decades to come. The complete migration is part of an agreement between CERN and INFN.

  15. PREFACE: Lectures from the CERN Winter School on Strings, Supergravity and Gauge Theories, CERN, 9-13 February 2009 Lectures from the CERN Winter School on Strings, Supergravity and Gauge Theories, CERN, 9-13 February 2009

    NASA Astrophysics Data System (ADS)

    Uranga, A. M.

    2009-11-01

This special section is devoted to the proceedings of the conference `Winter School on Strings, Supergravity and Gauge Theories', which took place at CERN, the European Organization for Nuclear Research, in Geneva, Switzerland, 9-13 February 2009. This event is part of a yearly series of scientific schools, which represents a well-established tradition. Previous events have been held at SISSA, in Trieste, Italy, in February 2005 and at CERN in January 2006, January 2007 and January 2008, and were funded by the European Mobility Research and Training Network `Constituents, Fundamental Forces and Symmetries of the Universe'. The next event will take place again at CERN, in January 2010. The school was primarily meant for young doctoral students and postdoctoral researchers working in the area of string theory. It consisted of several general lectures of four hours each, whose notes are published in this special section, and six working group discussion sessions, focused on specific topics of the network research program. It was well attended by over 200 participants. The topics of the lectures were chosen to provide an introduction to some of the areas of recent progress, and to the open problems, in string theory. One of the most active areas in string theory in recent years has been the AdS/CFT or gauge/gravity correspondence, which proposes the complete equivalence of string theory on (asymptotically) anti de Sitter spacetimes with certain quantum (gauge) field theories. The duality has recently been applied to understanding the hydrodynamical properties of a hot plasma in gauge theories (like the quark-gluon plasma created in heavy-ion collisions at the RHIC experiment at Brookhaven, and soon at the LHC at CERN) in terms of a dual gravitational AdS theory in the presence of a black hole. These developments were reviewed in the lecture notes by M Rangamani.
In addition, the AdS/CFT duality has been proposed as a tool to study interesting physical properties in other physical systems described by quantum field theory, for instance in the context of a condensed matter system. The lectures by S Hartnoll provided an introduction to this recent development with an emphasis on the dual holographic description of superconductivity. Finally, ideas inspired by the AdS/CFT correspondence are yielding deep insights into fundamental questions of quantum gravity, like the entropy of black holes and its interpretation in terms of microstates. The lectures by S Mathur reviewed the black hole entropy and information paradox, and the proposal for its resolution in terms of `fuzzball' microstates. Further sets of lectures, not included in this special section, by F Zwirner and V Mukhanov, covered phenomenological aspects of high energy physics beyond the Standard Model and of cosmology. The coming experimental data in these two fields are expected to foster new developments in connecting string theory to the real world. The conference was financially supported by CERN and partially by the Arnold Sommerfeld Center for Theoretical Physics of the Ludwig Maximilians University of Munich. It is a great pleasure for us to warmly thank the Theory Unit of CERN for its very kind hospitality and for the high quality of the assistance and the infrastructures that it has provided. A M Uranga CERN, Switzerland Guest Editor

  16. Helix Nebula and CERN: A Symbiotic approach to exploiting commercial clouds

    NASA Astrophysics Data System (ADS)

    Barreiro Megino, Fernando H.; Jones, Robert; Kucharczyk, Katarzyna; Medrano Llamas, Ramón; van der Ster, Daniel

    2014-06-01

The recent paradigm shift toward cloud computing in IT, and general interest in "Big Data" in particular, have demonstrated that the computing requirements of HEP are no longer globally unique. Indeed, the CERN IT department and LHC experiments have already made significant R&D investments in delivering and exploiting cloud computing resources. While a number of technical evaluations of interesting commercial offerings from global IT enterprises have been performed by various physics labs, further technical, security, sociological, and legal issues need to be addressed before their large-scale adoption by the research community can be envisaged. Helix Nebula - the Science Cloud is an initiative that explores these questions by joining the forces of three European research institutes (CERN, ESA and EMBL) with leading European commercial IT enterprises. The goals of Helix Nebula are to establish a cloud platform federating multiple commercial cloud providers, along with new business models, which can sustain the cloud marketplace for years to come. This contribution summarizes the participation of CERN in Helix Nebula. We explain CERN's flagship use-case and the model used to integrate several cloud providers with an LHC experiment's workload management system. During the first proof of concept, this project contributed over 40,000 CPU-days of Monte Carlo production throughput to the ATLAS experiment with marginal manpower required. CERN's experience, together with that of ESA and EMBL, is providing great insight into the cloud computing industry and has highlighted several challenges that are being tackled in order to ease the export of scientific workloads to cloud environments.

  17. Offering Global Collaboration Services beyond CERN and HEP

    NASA Astrophysics Data System (ADS)

    Fernandes, J.; Ferreira, P.; Baron, T.

    2015-12-01

The CERN IT department has built over the years a performant and integrated ecosystem of collaboration tools, from videoconference and webcast services to event management software. These services have been designed and evolved in very close collaboration with the various communities surrounding the laboratory and have been massively adopted by CERN users. To cope with this very heavy usage, global infrastructures have been deployed which take full advantage of CERN's international and global nature. If these services and tools are instrumental in enabling the worldwide collaboration which generates major HEP breakthroughs, they would certainly also benefit other sectors of science in which globalization has already taken place. Some of these services are driven by commercial software (Vidyo or Wowza for example), some others have been developed internally and have already been made available to the world as Open Source Software in line with CERN's spirit and mission. Indico, for example, is now installed in 100+ institutes worldwide. But providing the software is often not enough, and institutes, collaborations and project teams do not always possess the expertise, or the human or material resources, needed to set up and maintain such services. Regional and national institutions have to answer needs which are increasingly global and often contradict their operational capabilities or organizational mandate, and so are looking at existing worldwide service offers such as CERN's. We believe that the accumulated experience obtained through the operation of a large-scale worldwide collaboration service, combined with CERN's global network and its recently-deployed Agile Infrastructure, would allow the Organization to set up and operate collaborative services, such as Indico and Vidyo, at a much larger scale and on behalf of worldwide research and education institutions, and thus answer these pressing demands while optimizing resources at a global level.
Such services would be built over a robust and massively scalable Indico server to which the concept of communities would be added, and which would then serve as a hub for accessing other collaboration services such as Vidyo, on the same simple and successful model currently in place for CERN users. This talk will describe this vision, its benefits and the steps that have already been taken to make it come to life.

  18. The 11 T dipole for HL-LHC: Status and plan

    DOE PAGES

    Savary, F.; Barzi, E.; Bordini, B.; ...

    2016-06-01

The upgrade of the Large Hadron Collider (LHC) collimation system includes additional collimators in the LHC lattice. The longitudinal space for these collimators will be created by replacing some of the LHC main dipoles with shorter but stronger dipoles compatible with the LHC lattice and main systems. The project plan comprises the construction of two cryoassemblies, each containing two 11-T dipoles of 5.5-m length, for possible installation on either side of interaction point 2 of LHC in the years 2018-2019 for ion operation, and the installation of two cryoassemblies on either side of interaction point 7 of LHC in the years 2023-2024 for proton operation. The development program conducted jointly by the Fermilab and CERN magnet groups is progressing well. The development activities carried out on the Fermilab side were concluded in the middle of 2015 with the fabrication and test of a 1-m-long two-in-one model, and those on the CERN side are ramping up with the construction of 2-m-long models and the preparation of the tooling for the fabrication of the first full-length prototype. The engineering design of the cryomagnet is well advanced, including the definition of the various interfaces, e.g., with the collimator, powering, protection, and vacuum systems. Several practice coils of 5.5-m length have already been fabricated. This paper describes the overall progress of the project, the final design of the cryomagnet, and the performance of the most recent models. Furthermore, the overall plan toward the fabrication of the series magnets for the two phases of the upgrade of the LHC collimation system is also presented.

  19. H4DAQ: a modern and versatile data-acquisition package for calorimeter prototypes test-beams

    NASA Astrophysics Data System (ADS)

    Marini, A. C.

    2018-02-01

The upgrade of the particle detectors for the HL-LHC or for future colliders requires an extensive program of tests to qualify different detector prototypes with dedicated test beams. A common data-acquisition system, H4DAQ, was developed for the H4 test beam line at the North Area of the CERN SPS in 2014, and it has since been adopted in various applications for the CMS experiment and AIDA project. Several calorimeter prototypes and precision timing detectors have used our system from 2014 to 2017. H4DAQ has proven to be a versatile application and has been ported to many other beam test environments. H4DAQ is fast, simple, modular and can be configured to support various kinds of setups. The functionalities of the DAQ core software are split into three configurable finite state machines: data readout, run control, and event builder. The distribution of information and data between the various computers is performed using ZEROMQ (0MQ) sockets. Plugins are available to read different types of hardware, including VME crates with many types of boards, PADE boards, custom front-end boards and beam instrumentation devices. The raw data are saved as ROOT files, using the CERN C++ ROOT libraries. A Graphical User Interface, based on the Python GTK libraries, is used to operate the H4DAQ, and an integrated data quality monitoring (DQM), written in C++, allows for fast processing of the events for quick feedback to the user. As the 0MQ libraries are also available for the National Instruments LabVIEW program, this environment can easily be integrated within H4DAQ applications.
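The split of the DAQ core into run-control, readout and event-builder state machines can be sketched with a generic finite-state machine (hypothetical states and commands for illustration; this is not the actual H4DAQ API):

```python
class FiniteStateMachine:
    """Minimal FSM of the kind a DAQ core might use (illustrative only)."""

    def __init__(self, initial, transitions):
        # transitions maps (state, command) -> next state
        self.state = initial
        self.transitions = transitions

    def handle(self, command):
        key = (self.state, command)
        if key not in self.transitions:
            # Rejecting out-of-order commands keeps the DAQ in a known state.
            raise ValueError(f"command {command!r} not allowed in state {self.state!r}")
        self.state = self.transitions[key]
        return self.state

# A run-control-like machine: configure, then start/stop a run.
run_control = FiniteStateMachine(
    initial="idle",
    transitions={
        ("idle", "configure"): "configured",
        ("configured", "start"): "running",
        ("running", "stop"): "configured",
    },
)
run_control.handle("configure")
run_control.handle("start")
print(run_control.state)  # running
```

In a distributed DAQ the commands would arrive over the 0MQ sockets mentioned above rather than as local method calls, but the state-machine discipline is the same.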

  20. An on-line acoustic fluorocarbon coolant mixture analyzer for the ATLAS silicon tracker

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bates, R.; Battistin, M.; Berry, S.

    2011-07-01

The ATLAS silicon tracker community foresees an upgrade from the present octafluoro-propane (C3F8) evaporative cooling fluid to a composite fluid with a probable 10-20% admixture of hexafluoro-ethane (C2F6). Such a fluid will allow a lower evaporation temperature and will afford the tracker silicon substrates a better safety margin against leakage current-induced thermal runaway caused by cumulative radiation damage as the luminosity profile at the CERN Large Hadron Collider increases. Central to the use of this new fluid is a new custom-developed speed-of-sound instrument for continuous real-time measurement of the C3F8/C2F6 mixture ratio and flow. An acoustic vapour mixture analyzer/flow meter with new custom electronics allowing ultrasonic frequency transmission through gas mixtures has been developed for this application. Synchronous with the emission of an ultrasound 'chirp' from an acoustic transmitter, a fast readout clock (40 MHz) is started. The clock is stopped on receipt of an above threshold sound pulse at the receiver. Sound is alternately transmitted parallel and anti-parallel with the vapour flow for volume flow measurement from transducers that can serve as acoustic transmitters or receivers. In the development version, continuous real-time measurement of C3F8/C2F6 flow and calculation of the mixture ratio is performed within a graphical user interface developed in PVSS-II, the Supervisory, Control and Data Acquisition standard chosen for LHC and its experiments at CERN. The described instrument has numerous potential applications - including refrigerant leak detection, the analysis of hydrocarbons, vapour mixtures for semiconductor manufacture and anesthetic gas mixtures.
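The parallel/anti-parallel transmission scheme described above yields both the speed of sound and the flow velocity from the two measured transit times, via the standard transit-time flow-meter relations. A sketch, under the simplifying assumption that the acoustic path is aligned with the flow (the path length and transit times below are illustrative, not instrument values):

```python
def sound_speed_and_flow(t_down_s, t_up_s, path_m):
    """Recover speed of sound c and flow velocity v from two transit times.

    Along the flow the pulse covers the path at speed c + v, against it
    at c - v, so path = (c + v) * t_down and path = (c - v) * t_up.
    Solving the pair gives the standard transit-time relations below.
    """
    c = 0.5 * path_m * (1.0 / t_down_s + 1.0 / t_up_s)
    v = 0.5 * path_m * (1.0 / t_down_s - 1.0 / t_up_s)
    return c, v

# Illustrative numbers: a sound speed of order 120 m/s (heavy fluorocarbon
# vapour) and a 2 m/s flow over a 0.3 m acoustic path.
c, v = sound_speed_and_flow(t_down_s=0.3 / 122.0, t_up_s=0.3 / 118.0, path_m=0.3)
print(round(c, 1), round(v, 1))  # 120.0 2.0
```

The mixture ratio then follows from c, since the speed of sound in a C3F8/C2F6 blend varies with the composition-dependent molar mass and heat-capacity ratio of the vapour.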

  1. An On-Line Acoustic Fluorocarbon Coolant Mixture Analyzer for the ATLAS Silicon Tracker

    NASA Astrophysics Data System (ADS)

    Bates, R.; Battistin, M.; Berry, S.; Bitadze, A.; Bonneau, P.; Bousson, N.; Boyd, G.; Botelho-Direito, J.; DiGirolamo, B.; Doubek, M.; Egorov, K.; Godlewski, J.; Hallewell, G.; Katunin, S.; Mathieu, M.; McMahon, S.; Nagai, K.; Perez-Rodriguez, E.; Rozanov, A.; Vacek, V.; Vitek, M.

    2012-10-01

    The ATLAS silicon tracker community foresees an upgrade from the present octafluoropropane (C3F8) evaporative cooling fluid to a composite fluid with a probable 10-20% admixture of hexafluoroethane (C2F6). Such a fluid will allow a lower evaporation temperature and will afford the tracker silicon substrates a better safety margin against leakage current-induced thermal runaway caused by cumulative radiation damage as the luminosity profile at the CERN Large Hadron Collider increases. Central to the use of this new fluid is a new custom-developed speed-of-sound instrument for continuous real-time measurement of the C3F8/C2F6 mixture ratio and flow. An acoustic vapour mixture analyzer/flow meter with new custom electronics allowing ultrasonic frequency transmission through gas mixtures has been developed for this application. Synchronous with the emission of an ultrasound `chirp' from an acoustic transmitter, a fast readout clock (40 MHz) is started. The clock is stopped on receipt of an above threshold sound pulse at the receiver. Sound is alternately transmitted parallel and anti-parallel with the vapour flow for volume flow measurement from transducers that can serve as acoustic transmitters or receivers. In the development version, continuous real-time measurement of C3F8/C2F6 flow and calculation of the mixture ratio is performed within a graphical user interface developed in PVSS-II, the Supervisory, Control and Data Acquisition standard chosen for LHC and its experiments at CERN. The described instrument has numerous potential applications - including refrigerant leak detection, the analysis of hydrocarbons, vapour mixtures for semi-conductor manufacture and anesthetic gas mixtures.

  2. The ATLAS Experiment at the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    ATLAS Collaboration; Aad, G.; Abat, E.; Abdallah, J.; Abdelalim, A. A.; Abdesselam, A.; Abdinov, O.; Abi, B. A.; Abolins, M.; Abramowicz, H.; Acerbi, E.; Acharya, B. S.; Achenbach, R.; Ackers, M.; Adams, D. L.; Adamyan, F.; Addy, T. N.; Aderholz, M.; Adorisio, C.; Adragna, P.; Aharrouche, M.; Ahlen, S. P.; Ahles, F.; Ahmad, A.; Ahmed, H.; Aielli, G.; Åkesson, P. F.; Åkesson, T. P. A.; Akimov, A. V.; Alam, S. M.; Albert, J.; Albrand, S.; Aleksa, M.; Aleksandrov, I. N.; Aleppo, M.; Alessandria, F.; Alexa, C.; Alexander, G.; Alexopoulos, T.; Alimonti, G.; Aliyev, M.; Allport, P. P.; Allwood-Spiers, S. E.; Aloisio, A.; Alonso, J.; Alves, R.; Alviggi, M. G.; Amako, K.; Amaral, P.; Amaral, S. P.; Ambrosini, G.; Ambrosio, G.; Amelung, C.; Ammosov, V. V.; Amorim, A.; Amram, N.; Anastopoulos, C.; Anderson, B.; Anderson, K. J.; Anderssen, E. C.; Andreazza, A.; Andrei, V.; Andricek, L.; Andrieux, M.-L.; Anduaga, X. S.; Anghinolfi, F.; Antonaki, A.; Antonelli, M.; Antonelli, S.; Apsimon, R.; Arabidze, G.; Aracena, I.; Arai, Y.; Arce, A. T. H.; Archambault, J. P.; Arguin, J.-F.; Arik, E.; Arik, M.; Arms, K. E.; Armstrong, S. R.; Arnaud, M.; Arnault, C.; Artamonov, A.; Asai, S.; Ask, S.; Åsman, B.; Asner, D.; Asquith, L.; Assamagan, K.; Astbury, A.; Athar, B.; Atkinson, T.; Aubert, B.; Auerbach, B.; Auge, E.; Augsten, K.; Aulchenko, V. M.; Austin, N.; Avolio, G.; Avramidou, R.; Axen, A.; Ay, C.; Azuelos, G.; Baccaglioni, G.; Bacci, C.; Bachacou, H.; Bachas, K.; Bachy, G.; Badescu, E.; Bagnaia, P.; Bailey, D. C.; Baines, J. T.; Baker, O. K.; Ballester, F.; Baltasar Dos Santos Pedrosa, F.; Banas, E.; Banfi, D.; Bangert, A.; Bansal, V.; Baranov, S. P.; Baranov, S.; Barashkou, A.; Barberio, E. L.; Barberis, D.; Barbier, G.; Barclay, P.; Bardin, D. Y.; Bargassa, P.; Barillari, T.; Barisonzi, M.; Barnett, B. M.; Barnett, R. M.; Baron, S.; Baroncelli, A.; Barone, M.; Barr, A. 
J.; Barreiro, F.; Barreiro Guimarães da Costa, J.; Barrillon, P.; Barriuso Poy, A.; Barros, N.; Bartheld, V.; Bartko, H.; Bartoldus, R.; Basiladze, S.; Bastos, J.; Batchelor, L. E.; Bates, R. L.; Batley, J. R.; Batraneanu, S.; Battistin, M.; Battistoni, G.; Batusov, V.; Bauer, F.; Bauss, B.; Baynham, D. E.; Bazalova, M.; Bazan, A.; Beauchemin, P. H.; Beaugiraud, B.; Beccherle, R. B.; Beck, G. A.; Beck, H. P.; Becks, K. H.; Bedajanek, I.; Beddall, A. J.; Beddall, A.; Bednár, P.; Bednyakov, V. A.; Bee, C.; Behar Harpaz, S.; Belanger, G. A. N.; Belanger-Champagne, C.; Belhorma, B.; Bell, P. J.; Bell, W. H.; Bella, G.; Bellachia, F.; Bellagamba, L.; Bellina, F.; Bellomo, G.; Bellomo, M.; Beltramello, O.; Belymam, A.; Ben Ami, S.; Ben Moshe, M.; Benary, O.; Benchekroun, D.; Benchouk, C.; Bendel, M.; Benedict, B. H.; Benekos, N.; Benes, J.; Benhammou, Y.; Benincasa, G. P.; Benjamin, D. P.; Bensinger, J. R.; Benslama, K.; Bentvelsen, S.; Beretta, M.; Berge, D.; Bergeaas, E.; Berger, N.; Berghaus, F.; Berglund, S.; Bergsma, F.; Beringer, J.; Bernabéu, J.; Bernardet, K.; Berriaud, C.; Berry, T.; Bertelsen, H.; Bertin, A.; Bertinelli, F.; Bertolucci, S.; Besson, N.; Beteille, A.; Bethke, S.; Bialas, W.; Bianchi, R. M.; Bianco, M.; Biebel, O.; Bieri, M.; Biglietti, M.; Bilokon, H.; Binder, M.; Binet, S.; Bingefors, N.; Bingul, A.; Bini, C.; Biscarat, C.; Bischof, R.; Bischofberger, M.; Bitadze, A.; Bizzell, J. P.; Black, K. M.; Blair, R. E.; Blaising, J. J.; Blanch, O.; Blanchot, G.; Blocker, C.; Blocki, J.; Blondel, A.; Blum, W.; Blumenschein, U.; Boaretto, C.; Bobbink, G. J.; Bocci, A.; Bocian, D.; Bock, R.; Boehm, M.; Boek, J.; Bogaerts, J. A.; Bogouch, A.; Bohm, C.; Bohm, J.; Boisvert, V.; Bold, T.; Boldea, V.; Bondarenko, V. G.; Bonino, R.; Bonis, J.; Bonivento, W.; Bonneau, P.; Boonekamp, M.; Boorman, G.; Boosten, M.; Booth, C. N.; Booth, P. S. L.; Booth, P.; Booth, J. R. 
A.; Borer, K.; Borisov, A.; Borjanovic, I.; Bos, K.; Boscherini, D.; Bosi, F.; Bosman, M.; Bosteels, M.; Botchev, B.; Boterenbrood, H.; Botterill, D.; Boudreau, J.; Bouhova-Thacker, E. V.; Boulahouache, C.; Bourdarios, C.; Boutemeur, M.; Bouzakis, K.; Boyd, G. R.; Boyd, J.; Boyer, B. H.; Boyko, I. R.; Bozhko, N. I.; Braccini, S.; Braem, A.; Branchini, P.; Brandenburg, G. W.; Brandt, A.; Brandt, O.; Bratzler, U.; Braun, H. M.; Bravo, S.; Brawn, I. P.; Brelier, B.; Bremer, J.; Brenner, R.; Bressler, S.; Breton, D.; Brett, N. D.; Breugnon, P.; Bright-Thomas, P. G.; Brochu, F. M.; Brock, I.; Brock, R.; Brodbeck, T. J.; Brodet, E.; Broggi, F.; Broklova, Z.; Bromberg, C.; Brooijmans, G.; Brouwer, G.; Broz, J.; Brubaker, E.; Bruckman de Renstrom, P. A.; Bruncko, D.; Bruni, A.; Bruni, G.; Bruschi, M.; Buanes, T.; Buchanan, N. J.; Buchholz, P.; Budagov, I. A.; Büscher, V.; Bugge, L.; Buira-Clark, D.; Buis, E. J.; Bujor, F.; Buran, T.; Burckhart, H.; Burckhart-Chromek, D.; Burdin, S.; Burns, R.; Busato, E.; Buskop, J. J. F.; Buszello, K. P.; Butin, F.; Butler, J. M.; Buttar, C. M.; Butterworth, J.; Butterworth, J. M.; Byatt, T.; Cabrera Urbán, S.; Cabruja Casas, E.; Caccia, M.; Caforio, D.; Cakir, O.; Calafiura, P.; Calderini, G.; Calderón Terol, D.; Callahan, J.; Caloba, L. P.; Caloi, R.; Calvet, D.; Camard, A.; Camarena, F.; Camarri, P.; Cambiaghi, M.; Cameron, D.; Cammin, J.; Campabadal Segura, F.; Campana, S.; Canale, V.; Cantero, J.; Capeans Garrido, M. D. M.; Caprini, I.; Caprini, M.; Caprio, M.; Caracinha, D.; Caramarcu, C.; Carcagno, Y.; Cardarelli, R.; Cardeira, C.; Cardiel Sas, L.; Cardini, A.; Carli, T.; Carlino, G.; Carminati, L.; Caron, B.; Caron, S.; Carpentieri, C.; Carr, F. S.; Carter, A. A.; Carter, J. R.; Carvalho, J.; Casadei, D.; Casado, M. P.; Cascella, M.; Caso, C.; Castelo, J.; Castillo Gimenez, V.; Castro, N.; Castrovillari, F.; Cataldi, G.; Cataneo, F.; Catinaccio, A.; Catmore, J. 
R.; Cattai, A.; Caughron, S.; Cauz, D.; Cavallari, A.; Cavalleri, P.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Ceradini, F.; Cerna, C.; Cernoch, C.; Cerqueira, A. S.; Cerri, A.; Cerutti, F.; Cervetto, M.; Cetin, S. A.; Cevenini, F.; Chalifour, M.; Chamizo llatas, M.; Chan, A.; Chapman, J. W.; Charlton, D. G.; Charron, S.; Chekulaev, S. V.; Chelkov, G. A.; Chen, H.; Chen, L.; Chen, T.; Chen, X.; Cheng, S.; Cheng, T. L.; Cheplakov, A.; Chepurnov, V. F.; Cherkaoui El Moursli, R.; Chesneanu, D.; Cheu, E.; Chevalier, L.; Chevalley, J. L.; Chevallier, F.; Chiarella, V.; Chiefari, G.; Chikovani, L.; Chilingarov, A.; Chiodini, G.; Chouridou, S.; Chren, D.; Christiansen, T.; Christidi, I. A.; Christov, A.; Chu, M. L.; Chudoba, J.; Chuguev, A. G.; Ciapetti, G.; Cicalini, E.; Ciftci, A. K.; Cindro, V.; Ciobotaru, M. D.; Ciocio, A.; Cirilli, M.; Citterio, M.; Ciubancan, M.; Civera, J. V.; Clark, A.; Cleland, W.; Clemens, J. C.; Clement, B. C.; Clément, C.; Clements, D.; Clifft, R. W.; Cobal, M.; Coccaro, A.; Cochran, J.; Coco, R.; Coe, P.; Coelli, S.; Cogneras, E.; Cojocaru, C. D.; Colas, J.; Colijn, A. P.; Collard, C.; Collins-Tooth, C.; Collot, J.; Coluccia, R.; Comune, G.; Conde Muiño, P.; Coniavitis, E.; Consonni, M.; Constantinescu, S.; Conta, C.; Conventi, F. A.; Cook, J.; Cooke, M.; Cooper-Smith, N. J.; Cornelissen, T.; Corradi, M.; Correard, S.; Corso-Radu, A.; Coss, J.; Costa, G.; Costa, M. J.; Costanzo, D.; Costin, T.; Coura Torres, R.; Courneyea, L.; Couyoumtzelis, C.; Cowan, G.; Cox, B. E.; Cox, J.; Cragg, D. A.; Cranmer, K.; Cranshaw, J.; Cristinziani, M.; Crosetti, G.; Cuenca Almenar, C.; Cuneo, S.; Cunha, A.; Curatolo, M.; Curtis, C. J.; Cwetanski, P.; Czyczula, Z.; D'Auria, S.; D'Onofrio, M.; Da Rocha Gesualdi Mello, A.; Da Silva, P. V. M.; Da Silva, R.; Dabrowski, W.; Dael, A.; Dahlhoff, A.; Dai, T.; Dallapiccola, C.; Dallison, S. J.; Dalmau, J.; Daly, C. H.; Dam, M.; Damazio, D.; Dameri, M.; Danielsen, K. M.; Danielsson, H. 
O.; Dankers, R.; Dannheim, D.; Darbo, G.; Dargent, P.; Daum, C.; Dauvergne, J. P.; David, M.; Davidek, T.; Davidson, N.; Davidson, R.; Dawson, I.; Dawson, J. W.; Daya, R. K.; De, K.; de Asmundis, R.; de Boer, R.; DeCastro, S.; DeGroot, N.; de Jong, P.; de La Broise, X.; DeLa Cruz-Burelo, E.; DeLa Taille, C.; DeLotto, B.; DeOliveira Branco, M.; DePedis, D.; de Saintignon, P.; DeSalvo, A.; DeSanctis, U.; DeSanto, A.; DeVivie DeRegie, J. B.; DeZorzi, G.; Dean, S.; Dedes, G.; Dedovich, D. V.; Defay, P. O.; Degele, R.; Dehchar, M.; Deile, M.; DelPapa, C.; DelPeso, J.; DelPrete, T.; Delagnes, E.; Delebecque, P.; Dell'Acqua, A.; Della Pietra, M.; della Volpe, D.; Delmastro, M.; Delpierre, P.; Delruelle, N.; Delsart, P. A.; Deluca Silberberg, C.; Demers, S.; Demichev, M.; Demierre, P.; Demirköz, B.; Deng, W.; Denisov, S. P.; Dennis, C.; Densham, C. J.; Dentan, M.; Derkaoui, J. E.; Derue, F.; Dervan, P.; Desch, K. K.; Dewhurst, A.; Di Ciaccio, A.; Di Ciaccio, L.; Di Domenico, A.; Di Girolamo, A.; Di Girolamo, B.; Di Luise, S.; Di Mattia, A.; Di Simone, A.; Diaz Gomez, M. M.; Diehl, E. B.; Dietl, H.; Dietrich, J.; Dietsche, W.; Diglio, S.; Dima, M.; Dindar, K.; Dinkespiler, B.; Dionisi, C.; Dipanjan, R.; Dita, P.; Dita, S.; Dittus, F.; Dixon, S. D.; Djama, F.; Djilkibaev, R.; Djobava, T.; do Vale, M. A. B.; Dobbs, M.; Dobinson, R.; Dobos, D.; Dobson, E.; Dobson, M.; Dodd, J.; Dogan, O. B.; Doherty, T.; Doi, Y.; Dolejsi, J.; Dolenc, I.; Dolezal, Z.; Dolgoshein, B. A.; Domingo, E.; Donega, M.; Dopke, J.; Dorfan, D. E.; Dorholt, O.; Doria, A.; Dos Anjos, A.; Dosil, M.; Dotti, A.; Dova, M. T.; Dowell, J. D.; Doyle, A. T.; Drake, G.; Drakoulakos, D.; Drasal, Z.; Drees, J.; Dressnandt, N.; Drevermann, H.; Driouichi, C.; Dris, M.; Drohan, J. G.; Dubbert, J.; Dubbs, T.; Duchovni, E.; Duckeck, G.; Dudarev, A.; Dührssen, M.; Dür, H.; Duerdoth, I. 
P.; Duffin, S.; Duflot, L.; Dufour, M.-A.; Dumont Dayot, N.; Duran Yildiz, H.; Durand, D.; Dushkin, A.; Duxfield, R.; Dwuznik, M.; Dydak, F.; Dzahini, D.; Díez Cornell, S.; Düren, M.; Ebenstein, W. L.; Eckert, S.; Eckweiler, S.; Eerola, P.; Efthymiopoulos, I.; Egede, U.; Egorov, K.; Ehrenfeld, W.; Eifert, T.; Eigen, G.; Einsweiler, K.; Eisenhandler, E.; Ekelof, T.; Eklund, L. M.; El Kacimi, M.; Ellert, M.; Elles, S.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Ely, R.; Emeliyanov, D.; Engelmann, R.; Engström, M.; Ennes, P.; Epp, B.; Eppig, A.; Epshteyn, V. S.; Ereditato, A.; Eremin, V.; Eriksson, D.; Ermoline, I.; Ernwein, J.; Errede, D.; Errede, S.; Escalier, M.; Escobar, C.; Espinal Curull, X.; Esposito, B.; Esteves, F.; Etienne, F.; Etienvre, A. I.; Etzion, E.; Evans, H.; Evdokimov, V. N.; Evtoukhovitch, P.; Eyring, A.; Fabbri, L.; Fabjan, C. W.; Fabre, C.; Faccioli, P.; Facius, K.; Fadeyev, V.; Fakhrutdinov, R. M.; Falciano, S.; Falleau, I.; Falou, A. C.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farrell, J.; Farthouat, P.; Fasching, D.; Fassi, F.; Fassnacht, P.; Fassouliotis, D.; Fawzi, F.; Fayard, L.; Fayette, F.; Febbraro, R.; Fedin, O. L.; Fedorko, I.; Feld, L.; Feldman, G.; Feligioni, L.; Feng, C.; Feng, E. J.; Fent, J.; Fenyuk, A. B.; Ferencei, J.; Ferguson, D.; Ferland, J.; Fernando, W.; Ferrag, S.; Ferrari, A.; Ferrari, P.; Ferrari, R.; Ferrer, A.; Ferrer, M. L.; Ferrere, D.; Ferretti, C.; Ferro, F.; Fiascaris, M.; Fichet, S.; Fiedler, F.; Filimonov, V.; Filipčič, A.; Filippas, A.; Filthaut, F.; Fincke-Keeler, M.; Finocchiaro, G.; Fiorini, L.; Firan, A.; Fischer, P.; Fisher, M. J.; Fisher, S. M.; Flaminio, V.; Flammer, J.; Flechl, M.; Fleck, I.; Flegel, W.; Fleischmann, P.; Fleischmann, S.; Fleta Corral, C. M.; Fleuret, F.; Flick, T.; Flix, J.; Flores Castillo, L. R.; Flowerdew, M. J.; Föhlisch, F.; Fokitis, M.; Fonseca Martin, T. M.; Fopma, J.; Forbush, D. A.; Formica, A.; Foster, J. M.; Fournier, D.; Foussat, A.; Fowler, A. 
J.; Fox, H.; Francavilla, P.; Francis, D.; Franz, S.; Fraser, J. T.; Fraternali, M.; Fratianni, S.; Freestone, J.; French, R. S.; Fritsch, K.; Froidevaux, D.; Frost, J. A.; Fukunaga, C.; Fulachier, J.; Fullana Torregrosa, E.; Fuster, J.; Gabaldon, C.; Gadomski, S.; Gagliardi, G.; Gagnon, P.; Gallas, E. J.; Gallas, M. V.; Gallop, B. J.; Gan, K. K.; Gannaway, F. C.; Gao, Y. S.; Gapienko, V. A.; Gaponenko, A.; Garciá, C.; Garcia-Sciveres, M.; Garcìa Navarro, J. E.; Garde, V.; Gardner, R. W.; Garelli, N.; Garitaonandia, H.; Garonne, V. G.; Garvey, J.; Gatti, C.; Gaudio, G.; Gaumer, O.; Gautard, V.; Gauzzi, P.; Gavrilenko, I. L.; Gay, C.; Gayde, J.-C.; Gazis, E. N.; Gazo, E.; Gee, C. N. P.; Geich-Gimbel, C.; Gellerstedt, K.; Gemme, C.; Genest, M. H.; Gentile, S.; George, M. A.; George, S.; Gerlach, P.; Gernizky, Y.; Geweniger, C.; Ghazlane, H.; Ghete, V. M.; Ghez, P.; Ghodbane, N.; Giacobbe, B.; Giagu, S.; Giakoumopoulou, V.; Giangiobbe, V.; Gianotti, F.; Gibbard, B.; Gibson, A.; Gibson, M. D.; Gibson, S. M.; Gieraltowski, G. F.; Gil Botella, I.; Gilbert, L. M.; Gilchriese, M.; Gildemeister, O.; Gilewsky, V.; Gillman, A. R.; Gingrich, D. M.; Ginzburg, J.; Giokaris, N.; Giordani, M. P.; Girard, C. G.; Giraud, P. F.; Girtler, P.; Giugni, D.; Giusti, P.; Gjelsten, B. K.; Glasman, C.; Glazov, A.; Glitza, K. W.; Glonti, G. L.; Gnanvo, K. G.; Godlewski, J.; Göpfert, T.; Gössling, C.; Göttfert, T.; Goldfarb, S.; Goldin, D.; Goldschmidt, N.; Golling, T.; Gollub, N. P.; Golonka, P. J.; Golovnia, S. N.; Gomes, A.; Gomes, J.; Gonçalo, R.; Gongadze, A.; Gonidec, A.; Gonzalez, S.; González de la Hoz, S.; González Millán, V.; Gonzalez Silva, M. L.; Gonzalez-Pineiro, B.; González-Sevilla, S.; Goodrick, M. J.; Goodson, J. J.; Goossens, L.; Gorbounov, P. A.; Gordeev, A.; Gordon, H.; Gorelov, I.; Gorfine, G.; Gorini, B.; Gorini, E.; Gorišek, A.; Gornicki, E.; Gorokhov, S. A.; Gorski, B. T.; Goryachev, S. V.; Goryachev, V. N.; Gosselink, M.; Gostkin, M. 
I.; Gouanère, M.; Gough Eschrich, I.; Goujdami, D.; Goulette, M.; Gousakov, I.; Gouveia, J.; Gowdy, S.; Goy, C.; Grabowska-Bold, I.; Grabski, V.; Grafström, P.; Grah, C.; Grahn, K.-J.; Grancagnolo, F.; Grancagnolo, S.; Grassmann, H.; Gratchev, V.; Gray, H. M.; Graziani, E.; Green, B.; Greenall, A.; Greenfield, D.; Greenwood, D.; Gregor, I. M.; Grewal, A.; Griesmayer, E.; Grigalashvili, N.; Grigson, C.; Grillo, A. A.; Grimaldi, F.; Grimm, K.; Gris, P. L. Y.; Grishkevich, Y.; Groenstege, H.; Groer, L. S.; Grognuz, J.; Groh, M.; Gross, E.; Grosse-Knetter, J.; Grothe, M. E. M.; Grudzinski, J.; Gruse, C.; Gruwe, M.; Grybel, K.; Grybos, P.; Gschwendtner, E. M.; Guarino, V. J.; Guicheney, C. J.; Guilhem, G.; Guillemin, T.; Gunther, J.; Guo, B.; Gupta, A.; Gurriana, L.; Gushchin, V. N.; Gutierrez, P.; Guy, L.; Guyot, C.; Gwenlan, C.; Gwilliam, C. B.; Haas, A.; Haas, S.; Haber, C.; Haboubi, G.; Hackenburg, R.; Hadash, E.; Hadavand, H. K.; Haeberli, C.; Härtel, R.; Haggerty, R.; Hahn, F.; Haider, S.; Hajduk, Z.; Hakimi, M.; Hakobyan, H.; Hakobyan, H.; Haller, J.; Hallewell, G. D.; Hallgren, B.; Hamacher, K.; Hamilton, A.; Han, H.; Han, L.; Hanagaki, K.; Hance, M.; Hanke, P.; Hansen, C. J.; Hansen, F. H.; Hansen, J. R.; Hansen, J. B.; Hansen, J. D.; Hansen, P. H.; Hansl-Kozanecka, T.; Hanson, G.; Hansson, P.; Hara, K.; Harder, S.; Harel, A.; Harenberg, T.; Harper, R.; Hart, J. C.; Hart, R. G. G.; Hartjes, F.; Hartman, N.; Haruyama, T.; Harvey, A.; Hasegawa, Y.; Hashemi, K.; Hassani, S.; Hatch, M.; Hatley, R. W.; Haubold, T. G.; Hauff, D.; Haug, F.; Haug, S.; Hauschild, M.; Hauser, R.; Hauviller, C.; Havranek, M.; Hawes, B. M.; Hawkings, R. J.; Hawkins, D.; Hayler, T.; Hayward, H. S.; Haywood, S. J.; Hazen, E.; He, M.; He, Y. P.; Head, S. J.; Hedberg, V.; Heelan, L.; Heinemann, F. E. W.; Heldmann, M.; Hellman, S.; Helsens, C.; Henderson, R. C. W.; Hendriks, P. J.; Henriques Correia, A. 
M.; Henrot-Versille, S.; Henry-Couannier, F.; Henß, T.; Herten, G.; Hertenberger, R.; Hervas, L.; Hess, M.; Hessey, N. P.; Hicheur, A.; Hidvegi, A.; Higón-Rodriguez, E.; Hill, D.; Hill, J.; Hill, J. C.; Hill, N.; Hillier, S. J.; Hinchliffe, I.; Hindson, D.; Hinkelbein, C.; Hodges, T. A.; Hodgkinson, M. C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M. R.; Hoffman, J.; Hoffmann, A. E.; Hoffmann, D.; Hoffmann, H. F.; Holder, M.; Hollins, T. I.; Hollyman, G.; Holmes, A.; Holmgren, S. O.; Holt, R.; Holtom, E.; Holy, T.; Homer, R. J.; Homma, Y.; Homola, P.; Honerbach, W.; Honma, A.; Hooton, I.; Horazdovsky, T.; Horn, C.; Horvat, S.; Hostachy, J.-Y.; Hott, T.; Hou, S.; Houlden, M. A.; Hoummada, A.; Hover, J.; Howell, D. F.; Hrivnac, J.; Hruska, I.; Hryn'ova, T.; Huang, G. S.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Huffman, B. T.; Hughes, E.; Hughes, G.; Hughes-Jones, R. E.; Hulsbergen, W.; Hurst, P.; Hurwitz, M.; Huse, T.; Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Ibbotson, M.; Ibragimov, I.; Ichimiya, R.; Iconomidou-Fayard, L.; Idarraga, J.; Idzik, M.; Iengo, P.; Iglesias Escudero, M. C.; Igonkina, O.; Ikegami, Y.; Ikeno, M.; Ilchenko, Y.; Ilyushenka, Y.; Imbault, D.; Imbert, P.; Imhaeuser, M.; Imori, M.; Ince, T.; Inigo-Golfin, J.; Inoue, K.; Ioannou, P.; Iodice, M.; Ionescu, G.; Ishii, K.; Ishino, M.; Ishizawa, Y.; Ishmukhametov, R.; Issever, C.; Ito, H.; Ivashin, A. V.; Iwanski, W.; Iwasaki, H.; Izen, J. M.; Izzo, V.; Jackson, J.; Jackson, J. N.; Jaekel, M.; Jagielski, S.; Jahoda, M.; Jain, V.; Jakobs, K.; Jakubek, J.; Jansen, E.; Jansweijer, P. P. M.; Jared, R. C.; Jarlskog, G.; Jarp, S.; Jarron, P.; Jelen, K.; Jen-La Plante, I.; Jenni, P.; Jeremie, A.; Jez, P.; Jézéquel, S.; Jiang, Y.; Jin, G.; Jin, S.; Jinnouchi, O.; Joffe, D.; Johansen, L. G.; Johansen, M.; Johansson, K. E.; Johansson, P.; Johns, K. A.; Jon-And, K.; Jones, M.; Jones, R.; Jones, R. W. L.; Jones, T. W.; Jones, T. J.; Jones, A.; Jonsson, O.; Joo, K. 
K.; Joos, D.; Joos, M.; Joram, C.; Jorgensen, S.; Joseph, J.; Jovanovic, P.; Junnarkar, S. S.; Juranek, V.; Jussel, P.; Kabachenko, V. V.; Kabana, S.; Kaci, M.; Kaczmarska, A.; Kado, M.; Kagan, H.; Kagawa, S.; Kaiser, S.; Kajomovitz, E.; Kakurin, S.; Kalinovskaya, L. V.; Kama, S.; Kambara, H.; Kanaya, N.; Kandasamy, A.; Kandasamy, S.; Kaneda, M.; Kantserov, V. A.; Kanzaki, J.; Kaplan, B.; Kapliy, A.; Kaplon, J.; Karagounis, M.; Karagoz Unel, M.; Karr, K.; Karst, P.; Kartvelishvili, V.; Karyukhin, A. N.; Kashif, L.; Kasmi, A.; Kass, R. D.; Kastanas, A.; Kataoka, M.; Kataoka, Y.; Katsoufis, E.; Katunin, S.; Kawagoe, K.; Kawai, M.; Kawamoto, T.; Kayumov, F.; Kazanin, V. A.; Kazarinov, M. Y.; Kazarov, A.; Kazi, S. I.; Keates, J. R.; Keeler, R.; Keener, P. T.; Kehoe, R.; Keil, M.; Kekelidze, G. D.; Kelly, M.; Kennedy, J.; Kenyon, M.; Kepka, O.; Kerschen, N.; Kerševan, B. P.; Kersten, S.; Ketterer, C.; Khakzad, M.; Khalilzade, F.; Khandanyan, H.; Khanov, A.; Kharchenko, D.; Khodinov, A.; Kholodenko, A. G.; Khomich, A.; Khomutnikov, V. P.; Khoriauli, G.; Khovanskiy, N.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kieft, G.; Kierstead, J. A.; Kilvington, G.; Kim, H.; Kim, H.; Kim, S. H.; Kind, P.; King, B. T.; Kirk, J.; Kirsch, G. P.; Kirsch, L. E.; Kiryunin, A. E.; Kisielewska, D.; Kisielewski, B.; Kittelmann, T.; Kiver, A. M.; Kiyamura, H.; Kladiva, E.; Klaiber-Lodewigs, J.; Kleinknecht, K.; Klier, A.; Klimentov, A.; Kline, C. R.; Klingenberg, R.; Klinkby, E. B.; Klioutchnikova, T.; Klok, P. F.; Klous, S.; Kluge, E.-E.; Kluit, P.; Klute, M.; Kluth, S.; Knecht, N. K.; Kneringer, E.; Knezo, E.; Knobloch, J.; Ko, B. R.; Kobayashi, T.; Kobel, M.; Kodys, P.; König, A. C.; König, S.; Köpke, L.; Koetsveld, F.; Koffas, T.; Koffeman, E.; Kohout, Z.; Kohriki, T.; Kokott, T.; Kolachev, G. M.; Kolanoski, H.; Kolesnikov, V.; Koletsou, I.; Kollefrath, M.; Kolos, S.; Kolya, S. D.; Komar, A. A.; Komaragiri, J. R.; Kondo, T.; Kondo, Y.; Kondratyeva, N. V.; Kono, T.; Kononov, A. 
I.; Konoplich, R.; Konovalov, S. P.; Konstantinidis, N.; Kootz, A.; Koperny, S.; Kopikov, S. V.; Korcyl, K.; Kordas, K.; Koreshev, V.; Korn, A.; Korolkov, I.; Korotkov, V. A.; Korsmo, H.; Kortner, O.; Kostrikov, M. E.; Kostyukhin, V. V.; Kotamäki, M. J.; Kotchetkov, D.; Kotov, S.; Kotov, V. M.; Kotov, K. Y.; Kourkoumelis, C.; Koutsman, A.; Kovalenko, S.; Kowalewski, R.; Kowalski, H.; Kowalski, T. Z.; Kozanecki, W.; Kozhin, A. S.; Kral, V.; Kramarenko, V.; Kramberger, G.; Kramer, A.; Krasel, O.; Krasny, M. W.; Krasznahorkay, A.; Krepouri, A.; Krieger, P.; Krivkova, P.; Krobath, G.; Kroha, H.; Krstic, J.; Kruchonak, U.; Krüger, H.; Kruger, K.; Krumshteyn, Z. V.; Kubik, P.; Kubischta, W.; Kubota, T.; Kudin, L. G.; Kudlaty, J.; Kugel, A.; Kuhl, T.; Kuhn, D.; Kukhtin, V.; Kulchitsky, Y.; Kundu, N.; Kupco, A.; Kupper, M.; Kurashige, H.; Kurchaninov, L. L.; Kurochkin, Y. A.; Kus, V.; Kuykendall, W.; Kuzhir, P.; Kuznetsova, E. K.; Kvasnicka, O.; Kwee, R.; La Marra, D.; La Rosa, M.; La Rotonda, L.; Labarga, L.; Labbe, J. A.; Lacasta, C.; Lacava, F.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Lamanna, E.; Lambacher, M.; Lambert, F.; Lampl, W.; Lancon, E.; Landgraf, U.; Landon, M. P. J.; Landsman, H.; Langstaff, R. R.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Lanza, A.; Lapin, V. V.; Laplace, S.; Laporte, J. F.; Lara, V.; Lari, T.; Larionov, A. V.; Lasseur, C.; Lau, W.; Laurelli, P.; Lavorato, A.; Lavrijsen, W.; Lazarev, A. B.; LeBihan, A.-C.; LeDortz, O.; LeManer, C.; LeVine, M.; Leahu, L.; Leahu, M.; Lebel, C.; Lechowski, M.; LeCompte, T.; Ledroit-Guillon, F.; Lee, H.; Lee, J. S. H.; Lee, S. C.; Lefebvre, M.; Lefevre, R. P.; Legendre, M.; Leger, A.; LeGeyt, B. C.; Leggett, C.; Lehmacher, M.; Lehmann Miotto, G.; Lehto, M.; Leitner, R.; Lelas, D.; Lellouch, D.; Leltchouk, M.; Lendermann, V.; Leney, K. J. C.; Lenz, T.; Lenzen, G.; Lepidis, J.; Leroy, C.; Lessard, J.-R.; Lesser, J.; Lester, C. 
G.; Letheren, M.; Fook Cheong, A. Leung; Levêque, J.; Levin, D.; Levinson, L. J.; Levitski, M. S.; Lewandowska, M.; Leyton, M.; Li, J.; Li, W.; Liabline, M.; Liang, Z.; Liang, Z.; Liberti, B.; Lichard, P.; Liebig, W.; Lifshitz, R.; Liko, D.; Lim, H.; Limper, M.; Lin, S. C.; Lindahl, A.; Linde, F.; Lindquist, L.; Lindsay, S. W.; Linhart, V.; Lintern, A. J.; Liolios, A.; Lipniacka, A.; Liss, T. M.; Lissauer, A.; List, J.; Litke, A. M.; Liu, S.; Liu, T.; Liu, Y.; Livan, M.; Lleres, A.; Llosá Llácer, G.; Lloyd, S. L.; Lobkowicz, F.; Loch, P.; Lockman, W. S.; Loddenkoetter, T.; Loebinger, F. K.; Loginov, A.; Loh, C. W.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Loken, J.; Lokwitz, S.; Long, M. C.; Lopes, L.; Lopez Mateos, D.; Losty, M. J.; Lou, X.; Loureiro, K. F.; Lovas, L.; Love, J.; Lowe, A.; Lozano Fantoba, M.; Lu, F.; Lu, J.; Lu, L.; Lubatti, H. J.; Lucas, S.; Luci, C.; Lucotte, A.; Ludwig, A.; Ludwig, I.; Ludwig, J.; Luehring, F.; Lüke, D.; Luijckx, G.; Luisa, L.; Lumb, D.; Luminari, L.; Lund, E.; Lund-Jensen, B.; Lundberg, B.; Lundquist, J.; Lupi, A.; Lupu, N.; Lutz, G.; Lynn, D.; Lynn, J.; Lys, J.; Lysan, V.; Lytken, E.; López-Amengual, J. M.; Ma, H.; Ma, L. L.; Maaß en, M.; Maccarrone, G.; Mace, G. G. R.; Macina, D.; Mackeprang, R.; Macpherson, A.; MacQueen, D.; Macwaters, C.; Madaras, R. J.; Mader, W. F.; Maenner, R.; Maeno, T.; Mättig, P.; Mättig, S.; Magrath, C. A.; Mahalalel, Y.; Mahboubi, K.; Mahout, G.; Maidantchik, C.; Maio, A.; Mair, G. M.; Mair, K.; Makida, Y.; Makowiecki, D.; Malecki, P.; Maleev, V. P.; Malek, F.; Malon, D.; Maltezos, S.; Malychev, V.; Malyukov, S.; Mambelli, M.; Mameghani, R.; Mamuzic, J.; Manabe, A.; Manara, A.; Manca, G.; Mandelli, L.; Mandić, I.; Mandl, M.; Maneira, J.; Maneira, M.; Mangeard, P. S.; Mangin-Brinet, M.; Manjavidze, I. D.; Mann, W. A.; Manolopoulos, S.; Manousakis-Katsikakis, A.; Mansoulie, B.; Manz, A.; Mapelli, A.; Mapelli, L.; March, L.; Marchand, J. F.; Marchesotti, M.; Marcisovsky, M.; Marin, A.; Marques, C. 
N.; Marroquim, F.; Marshall, R.; Marshall, Z.; Martens, F. K.; Garcia, S. Marti i.; Martin, A. J.; Martin, B.; Martin, B.; Martin, F. F.; Martin, J. P.; Martin, Ph; Martinez, G.; Martínez Lacambra, C.; Martinez Outschoorn, V.; Martini, A.; Martins, J.; Maruyama, T.; Marzano, F.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. L.; Maß, M.; Massa, I.; Massaro, G.; Massol, N.; Mathes, M.; Matheson, J.; Matricon, P.; Matsumoto, H.; Matsunaga, H.; Maugain, J. M.; Maxfield, S. J.; May, E. N.; Mayer, J. K.; Mayri, C.; Mazini, R.; Mazzanti, M.; Mazzanti, P.; Mazzoni, E.; Mazzucato, F.; McKee, S. P.; McCarthy, R. L.; McCormick, C.; McCubbin, N. A.; McDonald, J.; McFarlane, K. W.; McGarvie, S.; McGlone, H.; McLaren, R. A.; McMahon, S. J.; McMahon, T. R.; McMahon, T. J.; McPherson, R. A.; Mechtel, M.; Meder-Marouelli, D.; Medinnis, M.; Meera-Lebbai, R.; Meessen, C.; Mehdiyev, R.; Mehta, A.; Meier, K.; Meinhard, H.; Meinhardt, J.; Meirosu, C.; Meisel, F.; Melamed-Katz, A.; Mellado Garcia, B. R.; Mendes Jorge, P.; Mendez, P.; Menke, S.; Menot, C.; Meoni, E.; Merkl, D.; Merola, L.; Meroni, C.; Merritt, F. S.; Messmer, I.; Metcalfe, J.; Meuser, S.; Meyer, J.-P.; Meyer, T. C.; Meyer, W. T.; Mialkovski, V.; Michelotto, M.; Micu, L.; Middleton, R.; Miele, P.; Migliaccio, A.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikestikova, M.; Mikulec, B.; Mikuž, M.; Miller, D. W.; Miller, R. J.; Miller, W.; Milosavljevic, M.; Milstead, D. A.; Mima, S.; Minaenko, A. A.; Minano, M.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Mir, L. M.; Mirabelli, G.; Miralles Verge, L.; Misawa, S.; Miscetti, S.; Misiejuk, A.; Mitra, A.; Mitrofanov, G. Y.; Mitsou, V. A.; Miyagawa, P. S.; Miyazaki, Y.; Mjörnmark, J. U.; Mkrtchyan, S.; Mladenov, D.; Moa, T.; Moch, M.; Mochizuki, A.; Mockett, P.; Modesto, P.; Moed, S.; Mönig, K.; Möser, N.; Mohn, B.; Mohr, W.; Mohrdieck-Möck, S.; Moisseev, A. M.; Moles Valls, R. 
M.; Molina-Perez, J.; Moll, A.; Moloney, G.; Mommsen, R.; Moneta, L.; Monnier, E.; Montarou, G.; Montesano, S.; Monticelli, F.; Moore, R. W.; Moore, T. B.; Moorhead, G. F.; Moraes, A.; Morel, J.; Moreno, A.; Moreno, D.; Morettini, P.; Morgan, D.; Morii, M.; Morin, J.; Morley, A. K.; Mornacchi, G.; Morone, M.-C.; Morozov, S. V.; Morris, E. J.; Morris, J.; Morrissey, M. C.; Moser, H. G.; Mosidze, M.; Moszczynski, A.; Mouraviev, S. V.; Mouthuy, T.; Moye, T. H.; Moyse, E. J. W.; Mueller, J.; Müller, M.; Muijs, A.; Muller, T. R.; Munar, A.; Munday, D. J.; Murakami, K.; Murillo Garcia, R.; Murray, W. J.; Myagkov, A. G.; Myska, M.; Nagai, K.; Nagai, Y.; Nagano, K.; Nagasaka, Y.; Nairz, A. M.; Naito, D.; Nakamura, K.; Nakamura, Y.; Nakano, I.; Nanava, G.; Napier, A.; Nassiakou, M.; Nasteva, I.; Nation, N. R.; Naumann, T.; Nauyock, F.; Nderitu, S. K.; Neal, H. A.; Nebot, E.; Nechaeva, P.; Neganov, A.; Negri, A.; Negroni, S.; Nelson, C.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Nesterov, S. Y.; Neukermans, L.; Nevski, P.; Newcomer, F. M.; Nichols, A.; Nicholson, C.; Nicholson, R.; Nickerson, R. B.; Nicolaidou, R.; Nicoletti, G.; Nicquevert, B.; Niculescu, M.; Nielsen, J.; Niinikoski, T.; Niinimaki, M. J.; Nikitin, N.; Nikolaev, K.; Nikolic-Audit, I.; Nikolopoulos, K.; Nilsen, H.; Nilsson, B. S.; Nilsson, P.; Nisati, A.; Nisius, R.; Nodulman, L. J.; Nomachi, M.; Nomoto, H.; Noppe, J.-M.; Nordberg, M.; Norniella Francisco, O.; Norton, P. R.; Novakova, J.; Nowak, M.; Nozaki, M.; Nunes, R.; Nunes Hanninger, G.; Nunnemann, T.; Nyman, T.; O'Connor, P.; O'Neale, S. W.; O'Neil, D. C.; O'Neill, M.; O'Shea, V.; Oakham, F. G.; Oberlack, H.; Obermaier, M.; Oberson, P.; Ochi, A.; Ockenfels, W.; Odaka, S.; Odenthal, I.; Odino, G. A.; Ogren, H.; Oh, S. H.; Ohshima, T.; Ohshita, H.; Okawa, H.; Olcese, M.; Olchevski, A. G.; Oliver, C.; Oliver, J.; Olivo Gomez, M.; Olszewski, A.; Olszowska, J.; Omachi, C.; Onea, A.; Onofre, A.; Oram, C. J.; Ordonez, G.; Oreglia, M. 
J.; Orellana, F.; Oren, Y.; Orestano, D.; Orlov, I. O.; Orr, R. S.; Orsini, F.; Osborne, L. S.; Osculati, B.; Osuna, C.; Otec, R.; Othegraven, R.; Ottewell, B.; Ould-Saada, F.; Ouraou, A.; Ouyang, Q.; Øye, O. K.; Ozcan, V. E.; Ozone, K.; Ozturk, N.; Pacheco Pages, A.; Padhi, S.; Padilla Aranda, C.; Paganis, E.; Paige, F.; Pailler, P. M.; Pajchel, K.; Palestini, S.; Palla, J.; Pallin, D.; Palmer, M. J.; Pan, Y. B.; Panikashvili, N.; Panin, V. N.; Panitkin, S.; Pantea, D.; Panuskova, M.; Paolone, V.; Paoloni, A.; Papadopoulos, I.; Papadopoulou, T.; Park, I.; Park, W.; Parker, M. A.; Parker, S.; Parkman, C.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pasqualucci, E.; Passardi, G.; Passeri, A.; Passmore, M. S.; Pastore, F.; Pastore, Fr; Pataraia, S.; Pate, D.; Pater, J. R.; Patricelli, S.; Pauly, T.; Pauna, E.; Peak, L. S.; Peeters, S. J. M.; Peez, M.; Pei, E.; Peleganchuk, S. V.; Pellegrini, G.; Pengo, R.; Pequenao, J.; Perantoni, M.; Perazzo, A.; Pereira, A.; Perepelkin, E.; Perera, V. J. O.; Perez Codina, E.; Perez Reale, V.; Peric, I.; Perini, L.; Pernegger, H.; Perrin, E.; Perrino, R.; Perrodo, P.; Perrot, G.; Perus, P.; Peshekhonov, V. D.; Petereit, E.; Petersen, J.; Petersen, T. C.; Petit, P. J. F.; Petridou, C.; Petrolo, E.; Petrucci, F.; Petti, R.; Pezzetti, M.; Pfeifer, B.; Phan, A.; Phillips, A. W.; Phillips, P. W.; Piacquadio, G.; Piccinini, M.; Pickford, A.; Piegaia, R.; Pier, S.; Pilcher, J. E.; Pilkington, A. D.; Pimenta Dos Santos, M. A.; Pina, J.; Pinfold, J. L.; Ping, J.; Pinhão, J.; Pinto, B.; Pirotte, O.; Placakyte, R.; Placci, A.; Plamondon, M.; Plano, W. G.; Pleier, M.-A.; Pleskach, A. V.; Podkladkin, S.; Podlyski, F.; Poffenberger, P.; Poggioli, L.; Pohl, M.; Polak, I.; Polesello, G.; Policicchio, A.; Polini, A.; Polychronakos, V.; Pomarede, D. M.; Pommès, K.; Ponsot, P.; Pontecorvo, L.; Pope, B. G.; Popescu, R.; Popovic, D. S.; Poppleton, A.; Popule, J.; Portell Bueso, X.; Posch, C.; Pospelov, G. 
E.; Pospichal, P.; Pospisil, S.; Postranecky, M.; Potrap, I. N.; Potter, C. J.; Poulard, G.; Pousada, A.; Poveda, J.; Prabhu, R.; Pralavorio, P.; Prasad, S.; Prast, J.; Prat, S.; Prata, M.; Pravahan, R.; Preda, T.; Pretzl, K.; Pribyl, L.; Price, D.; Price, L. E.; Price, M. J.; Prichard, P. M.; Prieur, D.; Primavera, M.; Primor, D.; Prokofiev, K.; Prosso, E.; Proudfoot, J.; Przysiezniak, H.; Puigdengoles, C.; Purdham, J.; Purohit, M.; Puzo, P.; Pylaev, A. N.; Pylypchenko, Y.; Qi, M.; Qian, J.; Qian, W.; Qian, Z.; Qing, D.; Quadt, A.; Quarrie, D. R.; Quayle, W. B.; Rabbers, J. J.; Radeka, V.; Rafi, J. M.; Ragusa, F.; Rahimi, A. M.; Rahm, D.; Raine, C.; Raith, B.; Rajagopalan, S.; Rajek, S.; Rammer, H.; Ramstedt, M.; Rangod, S.; Ratoff, P. N.; Raufer, T.; Rauscher, F.; Rauter, E.; Raymond, M.; Reads, A. L.; Rebuzzi, D.; Redlinger, G. R.; Reeves, K.; Rehak, M.; Reichold, A.; Reinherz-Aronis, E.; Reisinger, I.; Reljic, D.; Rembser, C.; Ren, Z.; Renaudin-Crepe, S. R. C.; Renkel, P.; Rensch, B.; Rescia, S.; Rescigno, M.; Resconi, S.; Resende, B.; Rewiersma, P.; Rey, J.; Rey-Campagnolle, M.; Rezaie, E.; Reznicek, P.; Richards, R. A.; Richer, J.-P.; Richter, R. H.; Richter, R.; Richter-Was, E.; Ridel, M.; Riegler, W.; Rieke, S.; Rijpstra, M.; Rijssenbeek, M.; Rimoldi, A.; Rios, R. R.; Riu Dachs, I.; Rivline, M.; Rivoltella, G.; Rizatdinova, F.; Robertson, S. H.; Robichaud-Veronneau, A.; Robins, S.; Robinson, D.; Robson, A.; Rochford, J. H.; Roda, C.; Rodier, S.; Roe, S.; Røhne, O.; Rohrbach, F.; Roldán, J.; Rolli, S.; Romance, J. B.; Romaniouk, A.; Romanov, V. M.; Romeo, G.; Roos, L.; Ros, E.; Rosati, S.; Rosenbaum, F.; Rosenbaum, G. A.; Rosenberg, E. I.; Rosselet, L.; Rossi, L. P.; Rossi, L.; Rotaru, M.; Rothberg, J.; Rottländer, I.; Rousseau, D.; Rozanov, A.; Rozen, Y.; Ruber, R.; Ruckert, B.; Rudolph, G.; Rühr, F.; Ruggieri, F.; Ruggiero, G.; Ruiz, H.; Ruiz-Martinez, A.; Rulikowska-Zarebska, E.; Rumiantsev, V.; Rumyantsev, L.; Runge, K.; Runolfsson, O.; Rusakovich, N. 
A.; Rust, D. R.; Rutherfoord, J. P.; Ruwiedel, C.; Ryabov, Y. F.; Ryadovikov, V.; Ryan, P.; Rybkine, G.; da Costa, J. Sá; Saavedra, A. F.; Saboumazrag, S.; F-W Sadrozinski, H.; Sadykov, R.; Sakamoto, H.; Sala, P.; Salamon, A.; Saleem, M.; Salihagic, D.; Salt, J.; Saltó Bauza, O.; Salvachúa Ferrando, B. M.; Salvatore, D.; Salzburger, A.; Sampsonidis, D.; Samset, B. H.; Sánchez Sánchez, C. A.; Sanchis Lozano, M. A.; Sanchis Peris, E.; Sandaker, H.; Sander, H. G.; Sandhoff, M.; Sandvoss, S.; Sankey, D. P. C.; Sanny, B.; Sansone, S.; Sansoni, A.; Santamarina Rios, C.; Santander, J.; Santi, L.; Santoni, C.; Santonico, R.; Santos, J.; Sapinski, M.; Saraiva, J. G.; Sarri, F.; Sasaki, O.; Sasaki, T.; Sasao, N.; Satsounkevitch, I.; Sauvage, D.; Sauvage, G.; Savard, P.; Savine, A. Y.; Savinov, V.; Savoy-Navarro, A.; Savva, P.; Saxon, D. H.; Says, L. P.; Sbarra, C.; Sbrissa, E.; Sbrizzi, A.; Scannicchio, D. A.; Schaarschmidt, J.; Schacht, P.; Schäfer, U.; Schaffer, A. C.; Schaile, D.; Schaller, M.; Schamov, A. G.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Scherzer, M. I.; Schiavi, C.; Schick, H.; Schieck, J.; Schieferdecker, P.; Schioppa, M.; Schlager, G.; Schlenker, S.; Schlereth, J. L.; Schmid, P.; Schmidt, M. P.; Schmitt, C.; Schmitt, K.; Schmitz, M.; Schmücker, H.; Schoerner, T.; Scholte, R. C.; Schott, M.; Schouten, D.; Schram, M.; Schricker, A.; Schroff, D.; Schuh, S.; Schuijlenburg, H. W.; Schuler, G.; Schultes, J.; Schultz-Coulon, H.-C.; Schumacher, J.; Schumacher, M.; Schune, Ph; Schwartzman, A.; Schweiger, D.; Schwemling, Ph; Schwick, C.; Schwienhorst, R.; Schwierz, R.; Schwindling, J.; Scott, W. G.; Secker, H.; Sedykh, E.; Seguin-Moreau, N.; Segura, E.; Seidel, S. C.; Seiden, A.; Seixas, J. M.; Sekhniaidze, G.; Seliverstov, D. M.; Selldén, B.; Seman, M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Seuster, R.; Severini, H.; Sevior, M. E.; Sexton, K. A.; Sfyrla, A.; Shah, T. P.; Shan, L.; Shank, J. T.; Shapiro, M.; Shatalov, P. 
B.; Shaver, L.; Shaw, C.; Shears, T. G.; Sherwood, P.; Shibata, A.; Shield, P.; Shilov, S.; Shimojima, M.; Shin, T.; Shiyakova, M.; Shmeleva, A.; Shoa, M.; Shochet, M. J.; Shupe, M. A.; Sicho, P.; Sidoti, A.; Siebel, A.; Siebel, M.; Siegrist, J.; Sijacki, D.; Silva, J.; Silverstein, S. B.; Simak, V.; Simic, Lj; Simion, S.; Simmons, B.; Simonyan, M.; Sinervo, P.; Sipica, V.; Siragusa, G.; Sisakyan, A. N.; Sivoklokov, S.; Sjölin, J.; Skubic, P.; Skvorodnev, N.; Slattery, P.; Slavicek, T.; Sliwa, K.; Sloan, T. J.; Sloper, J.; Smakhtin, V.; Small, A.; Smirnov, S. Yu; Smirnov, Y.; Smirnova, L.; Smirnova, O.; Smith, N. A.; Smith, B. C.; Smith, D. S.; Smith, J.; Smith, K. M.; Smith, B.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snow, S. W.; Snow, J.; Snuverink, J.; Snyder, S.; Soares, M.; Soares, S.; Sobie, R.; Sodomka, J.; Söderberg, M.; Soffer, A.; Solans, C. A.; Solar, M.; Sole, D.; Solfaroli Camillocci, E.; Solodkov, A. A.; Solov'yanov, O. V.; Soloviev, I.; Soluk, R.; Sondericker, J.; Sopko, V.; Sopko, B.; Sorbi, M.; Soret Medel, J.; Sosebee, M.; Sosnovtsev, V. V.; Sospedra Suay, L.; Soukharev, A.; Soukup, J.; Spagnolo, S.; Spano, F.; Speckmayer, P.; Spegel, M.; Spencer, E.; Spighi, R.; Spigo, G.; Spila, F.; Spiriti, E.; Spiwoks, R.; Spogli, L.; Spousta, M.; Sprachmann, G.; Spurlock, B.; St. Denis, R. D.; Stahl, T.; Staley, R. J.; Stamen, R.; Stancu, S. N.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stapnes, S.; Starchenko, E. A.; Staroba, P.; Stastny, J.; Staude, A.; Stavina, P.; Stavrianakou, M.; Stavropoulos, G.; Stefanidis, E.; Steffens, J. L.; Stekl, I.; Stelzer, H. J.; Stenzel, H.; Stewart, G.; Stewart, T. D.; Stiller, W.; Stockmanns, T.; Stodulski, M.; Stonjek, S.; Stradling, A.; Straessner, A.; Strandberg, J.; Strandlie, A.; Strauss, M.; Strickland, V.; Striegel, D.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Strong, J. A.; Stroynowski, R.; Stugu, B.; Stumer, I.; Su, D.; Subramania, S.; Suchkov, S. I.; Sugaya, Y.; Sugimoto, T.; Suk, M.; Sulin, V. 
V.; Sultanov, S.; Sun, Z.; Sundal, B.; Sushkov, S.; Susinno, G.; Sutcliffe, P.; Sutton, M. R.; Sviridov, Yu M.; Sykora, I.; Szczygiel, R. R.; Szeless, B.; Szymocha, T.; Sánchez, J.; Ta, D.; Taboada Gameiro, S.; Tadel, M.; Tafirout, R.; Taga, A.; Takai, H.; Takashima, R.; Takeda, H.; Takeshita, T.; Talby, M.; Talyshev, A.; Tamsett, M. C.; Tanaka, J.; Tanaka, K.; Tanaka, R.; Tanaka, S.; Tanaka, S.; Tanaka, Y.; Tappern, G. P.; Tapprogge, S.; Tarem, S.; Tarrade, F.; Tarrant, J.; Tartarelli, G.; Tas, P.; Tasevsky, M.; Tayalati, Y.; Taylor, F. E.; Taylor, G.; Taylor, G. N.; Taylor, R. P.; Tcherniatine, V.; Tegenfeldt, F.; Teixeira-Dias, P.; Ten Kate, H.; Teng, P. K.; Ter-Antonyan, R.; Terada, S.; Terron, J.; Terwort, M.; Teuscher, R. J.; Tevlin, C. M.; Thadome, J.; Thion, J.; Thioye, M.; Thomas, A.; Thomas, J. P.; Thomas, T. L.; Thomas, E.; Thompson, R. J.; Thompson, A. S.; Thun, R. P.; Tic, T.; Tikhomirov, V. O.; Tikhonov, Y. A.; Timm, S.; Timmermans, C. J. W. P.; Tipton, P.; Tique Aires Viegas, F. J.; Tisserant, S.; Titov, M.; Tobias, J.; Tocut, V. M.; Toczek, B.; Todorova-Nova, S.; Tojo, J.; Tokár, S.; Tokushuku, K.; Tomasek, L.; Tomasek, M.; Tomasz, F.; Tomoto, M.; Tompkins, D.; Tompkins, L.; Toms, K.; Tonazzo, A.; Tong, G.; Tonoyan, A.; Topfel, C.; Topilin, N. D.; Torrence, E.; Torres Pais, J. G.; Toth, J.; Touchard, F.; Tovey, D. R.; Tovey, S. N.; Towndrow, E. F.; Trefzger, T.; Treichel, M.; Treis, J.; Tremblet, L.; Tribanek, W.; Tricoli, A.; Trigger, I. M.; Trilling, G.; Trincaz-Duvoid, S.; Tripiana, M. F.; Trischuk, W.; Trka, Z.; Trocmé, B.; Troncon, C.; C-L Tseng, J.; Tsiafis, I.; Tsiareshka, P. V.; Tsipolitis, G.; Tskhadadze, E. G.; Tsukerman, I. I.; Tsulaia, V.; Tsuno, S.; Turala, M.; Turk Cakir, I.; Turlay, E.; Tuts, P. M.; Twomey, M. S.; Tyndel, M.; Typaldos, D.; Tyrvainen, H.; Tzamarioudaki, E.; Tzanakos, G.; Ueda, I.; Uhrmacher, M.; Ukegawa, F.; Ullán Comes, M.; Unal, G.; Underwood, D. 
G.; Undrus, A.; Unel, G.; Unno, Y.; Urkovsky, E.; Usai, G.; Usov, Y.; Vacavant, L.; Vacek, V.; Vachon, B.; Vahsen, S.; Valderanis, C.; Valenta, J.; Valente, P.; Valero, A.; Valkar, S.; Valls Ferrer, J. A.; Van der Bij, H.; van der Graaf, H.; van der Kraaij, E.; Van Eijk, B.; van Eldik, N.; van Gemmeren, P.; van Kesteren, Z.; van Vulpen, I.; Van Berg, R.; Vandelli, W.; Vandoni, G.; Vaniachine, A.; Vannucci, F.; Varanda, M.; Varela Rodriguez, F.; Vari, R.; Varnes, E. W.; Varouchas, D.; Vartapetian, A.; Varvell, K. E.; Vassilakopoulos, V. I.; Vassilieva, L.; Vataga, E.; Vaz, L.; Vazeille, F.; Vedrine, P.; Vegni, G.; Veillet, J. J.; Vellidis, C.; Veloso, F.; Veness, R.; Veneziano, S.; Ventura, A.; Ventura, S.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vertogardov, L.; Vetterli, M. C.; Vichou, I.; Vickey, T.; Viehhauser, G. H. A.; Vigeolas, E.; Villa, M.; Villani, E. G.; Villate, J.; Villella, I.; Vilucchi, E.; Vincent, P.; Vincke, H.; Vincter, M. G.; Vinogradov, V. B.; Virchaux, M.; Viret, S.; Virzi, J.; Vitale, A.; Vivarelli, I.; Vives, R.; Vives Vaques, F.; Vlachos, S.; Vogt, H.; Vokac, P.; Vollmer, C. F.; Volpi, M.; Volpini, G.; von Boehn-Buchholz, R.; von der Schmitt, H.; von Toerne, E.; Vorobel, V.; Vorobiev, A. P.; Vorozhtsov, A. S.; Vorozhtsov, S. B.; Vos, M.; Voss, K. C.; Voss, R.; Vossebeld, J. H.; Vovenko, A. S.; Vranjes, N.; Vrba, V.; Vreeswijk, M.; Anh, T. Vu; Vuaridel, B.; Vudragovic, M.; Vuillemin, V.; Vuillermet, R.; Wänanen, A.; Wahlen, H.; Walbersloh, J.; Walker, R.; Walkowiak, W.; Wall, R.; Wallny, R. S.; Walsh, S.; Wang, C.; Wang, J. C.; Wappler, F.; Warburton, A.; Ward, C. P.; Warner, G. P.; Warren, M.; Warsinsky, M.; Wastie, R.; Watkins, P. M.; Watson, A. T.; Watts, G.; Waugh, A. T.; Waugh, B. M.; Weaverdyck, C.; Webel, M.; Weber, G.; Weber, J.; Weber, M.; Weber, P.; Weidberg, A. R.; Weilhammer, P. M.; Weingarten, J.; Weiser, C.; Wellenstein, H.; Wellisch, H. P.; Wells, P. 
S.; Wemans, A.; Wen, M.; Wenaus, T.; Wendler, S.; Wengler, T.; Wenig, S.; Wermes, N.; Werneke, P.; Werner, P.; Werthenbach, U.; Wheeler-Ellis, S. J.; Whitaker, S. P.; White, A.; White, M. J.; White, S.; Whittington, D.; Wicek, F.; Wicke, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiesmann, M.; Wiesmann, M.; Wijnen, T.; Wildauer, A.; Wilhelm, I.; Wilkens, H. G.; Williams, H. H.; Willis, W.; Willocq, S.; Wilmut, I.; Wilson, J. A.; Wilson, A.; Wingerter-Seez, I.; Winton, L.; Witzeling, W.; Wlodek, T.; Woehrling, E.; Wolter, M. W.; Wolters, H.; Wosiek, B.; Wotschack, J.; Woudstra, M. J.; Wright, C.; Wu, S. L.; Wu, X.; Wuestenfeld, J.; Wunstorf, R.; Xella-Hansen, S.; Xiang, A.; Xie, S.; Xie, Y.; Xu, G.; Xu, N.; Yamamoto, A.; Yamamoto, S.; Yamaoka, H.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, J. C.; Yang, S.; Yang, U. K.; Yang, Y.; Yang, Z.; Yao, W.-M.; Yao, Y.; Yarradoddi, K.; Yasu, Y.; Ye, J.; Yilmaz, M.; Yoosoofmiya, R.; Yorita, K.; Yoshida, H.; Yoshida, R.; Young, C.; Youssef, S. P.; Yu, D.; Yu, J.; Yu, M.; Yu, X.; Yuan, J.; Yurkewicz, A.; Zaets, V. G.; Zaidan, R.; Zaitsev, A. M.; Zajac, J.; Zajacova, Z.; Zalite, A. Yu; Zalite, Yo K.; Zanello, L.; Zarzhitsky, P.; Zaytsev, A.; Zdrazil, M.; Zeitnitz, C.; Zeller, M.; Zema, P. F.; Zendler, C.; Zenin, A. V.; Zenis, T.; Zenonos, Z.; Zenz, S.; Zerwas, D.; Zhang, H.; Zhang, J.; Zheng, W.; Zhang, X.; Zhao, L.; Zhao, T.; Zhao, X.; Zhao, Z.; Zhelezko, A.; Zhemchugov, A.; Zheng, S.; Zhichao, L.; Zhou, B.; Zhou, N.; Zhou, S.; Zhou, Y.; Zhu, C. G.; Zhu, H. Z.; Zhuang, X. A.; Zhuravlov, V.; Zilka, B.; Zimin, N. I.; Zimmermann, S.; Ziolkowski, M.; Zitoun, R.; Zivkovic, L.; Zmouchko, V. V.; Zobernig, G.; Zoccoli, A.; Zoeller, M. M.; Zolnierowski, Y.; Zsenei, A.; zur Nedden, M.; Zychacek, V.

    2008-08-01

    The ATLAS detector as installed in its experimental cavern at point 1 at CERN is described in this paper. A brief overview of the expected performance of the detector when the Large Hadron Collider begins operation is also presented.

  3. CERN irradiation facilities

    PubMed

    Pozzi, Fabio; Garcia Alia, Ruben; Brugger, Markus; Carbonez, Pierre; Danzeca, Salvatore; Gkotse, Blerina; Richard Jaekel, Martin; Ravotti, Federico; Silari, Marco; Tali, Maris

    2017-09-28

    CERN provides unique irradiation facilities for applications in dosimetry, metrology, intercomparison of radiation protection devices, benchmarking of Monte Carlo codes, and radiation damage studies of electronics.

  4. Towards a 21st century telephone exchange at CERN

    NASA Astrophysics Data System (ADS)

    Valentín, F.; Hesnaux, A.; Sierra, R.; Chapron, F.

    2015-12-01

    The advent of mobile telephony and Voice over IP (VoIP) has significantly impacted the traditional telephone exchange industry, to such an extent that private branch exchanges are likely to disappear completely in the near future. For large organisations such as CERN, it is important to smooth this transition by implementing new multimedia platforms that protect past investments while providing the flexibility needed to securely interconnect emerging VoIP solutions and forthcoming developments such as Voice over LTE (VoLTE). We present the results of ongoing studies and tests at CERN of the latest technologies in this area.

  5. Ageing Studies on the First Resistive-MicroMeGaS Quadruplet at GIF++ Preliminary Results

    NASA Astrophysics Data System (ADS)

    Alvarez Gonzalez, B.; Bianco, M.; Farina, E.; Iengo, P.; Kuger, F.; Lin, T.; Longo, L.; Sekhniaidze, G.; Sidiropoulou, O.; Schott, M.; Valderanis, C.; Wotschack, J.

    2018-02-01

    A resistive-MicroMeGaS quadruplet built at CERN has been installed at the new CERN Gamma Irradiation Facility (GIF++) with the aim of carrying out a long-term ageing study. Two smaller resistive bulk-MicroMeGaS produced at the CERN PCB workshop have also been installed at GIF++ in order to provide a comparison of the ageing behavior with the MicroMeGaS quadruplet. We give an overview of the ongoing tests at GIF++ in terms of particle rate, integrated charge and spatial resolution of the MicroMeGaS detectors.

  6. Media Training

    ScienceCinema

    None

    2017-12-09

    With the LHC starting up soon, the world's media are again turning their attention to CERN. We're all likely to be called upon to explain what is happening at CERN to the media, friends and neighbours. The seminar will be given by BBC television news journalists Liz Pike and Nadia Marchant, and will deal with the kind of questions we're likely to be confronted with during the restart period. The training is open to everybody. Make sure you arrive early enough to get a seat: there are only 200 seats in the Globe. The session will also be webcast: http://webcast.cern.ch/

  7. The significance of Cern

    ScienceCinema

    None

    2017-12-09

    Prof. V. Weisskopf, Director-General of CERN from 1961 to 1965, was born in Vienna, studied at Göttingen and had a particularly rich academic career. He worked in Berlin and Copenhagen, then left for the United States to take part in the Manhattan Project, and was a professor at MIT until 1960. Back in Europe, he became Director-General of CERN and gave the Organization the impetus we know.

  8. HIGH ENERGY PHYSICS: CERN Link Breathes Life Into Russian Physics.

    PubMed

    Stone, R

    2000-10-13

    Without fanfare, 600 Russian scientists here at CERN, the European particle physics laboratory, are playing key roles in building the Large Hadron Collider (LHC), a machine that will explore fundamental questions such as why particles have mass, as well as search for exotic new particles whose existence would confirm supersymmetry, a popular theory that aims to unify the four forces of nature. In fact, even though Russia is not one of CERN's 20 member states, most top high-energy physicists in Russia are working on the LHC. Some say their work could prove the salvation of high-energy physics back home.

  9. Experience with procuring, deploying and maintaining hardware at remote co-location centre

    NASA Astrophysics Data System (ADS)

    Bärring, O.; Bonfillou, E.; Clement, B.; Coelho Dos Santos, M.; Dore, V.; Gentit, A.; Grossir, A.; Salter, W.; Valsan, L.; Xafi, A.

    2014-05-01

    In May 2012 CERN signed a contract with the Wigner Data Centre in Budapest for an extension of CERN's central computing facility beyond its current boundaries, which are set by the electrical power and cooling available for computing. The centre is operated as a remote co-location site providing rack space, electrical power and cooling for server, storage and networking equipment acquired by CERN. The contract includes a 'remote-hands' service for physical handling of hardware (rack mounting, cabling, pushing power buttons, ...) and maintenance repairs (swapping disks, memory modules, ...). However, only CERN personnel have network and console access to the equipment for system administration. This report gives insight into the adaptations of hardware architecture, procurement and delivery procedures undertaken to enable remote physical handling of the hardware. We also describe tools and procedures developed for automating the registration, burn-in testing, acceptance and maintenance of the equipment, as well as an independent but important change to IT asset management (ITAM) developed in parallel as part of the CERN IT Agile Infrastructure project. Finally, we report on experience from the first large delivery of 400 servers and 80 SAS JBOD expansion units (24 drive bays) to Wigner in March 2013. Changes were made to the abstract file on 13/06/2014 to correct errors; the PDF file was unchanged.

  10. Building an organic block storage service at CERN with Ceph

    NASA Astrophysics Data System (ADS)

    van der Ster, Daniel; Wiebalck, Arne

    2014-06-01

    Emerging storage requirements, such as the need for block storage for both OpenStack VMs and file services like AFS and NFS, have motivated the development of a generic backend storage service for CERN IT. The goals for such a service include (a) vendor neutrality, (b) horizontal scalability with commodity hardware, (c) fault tolerance at the disk, host, and network levels, and (d) support for geo-replication. Ceph is an attractive option due to its native block device layer RBD which is built upon its scalable, reliable, and performant object storage system, RADOS. It can be considered an "organic" storage solution because of its ability to balance and heal itself while living on an ever-changing set of heterogeneous disk servers. This work will present the outcome of a petabyte-scale test deployment of Ceph by CERN IT. We will first present the architecture and configuration of our cluster, including a summary of best practices learned from the community and discovered internally. Next the results of various functionality and performance tests will be shown: the cluster has been used as a backend block storage system for AFS and NFS servers as well as a large OpenStack cluster at CERN. Finally, we will discuss the next steps and future possibilities for Ceph at CERN.
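    The "organic" self-balancing described above follows from the fact that object placement in RADOS is computed rather than looked up: clients derive an object's location from its name and the cluster map via the CRUSH algorithm. The toy sketch below (not the real CRUSH algorithm; the OSD names and cluster layout are invented for illustration) uses rendezvous hashing to show the key property that placement needs no central table and changes minimally when a disk server disappears:

```python
import hashlib

def place(obj_name: str, osds: list, replicas: int = 3) -> list:
    """Toy placement: rank OSDs by a per-object hash (rendezvous hashing).

    This is NOT the real CRUSH algorithm, but it shares the property that
    matters here: placement is a pure function of the object name and the
    set of OSDs, so no central lookup table is needed.
    """
    scored = sorted(
        osds,
        key=lambda osd: hashlib.sha256(f"{obj_name}:{osd}".encode()).hexdigest(),
    )
    return scored[:replicas]

# Hypothetical 6-OSD cluster and one RBD object name.
osds = [f"osd.{i}" for i in range(6)]
before = place("rbd_data.1234", osds)
# Removing an OSD only remaps the objects that actually lived on it; all
# other objects keep their placement, which keeps rebalancing traffic
# proportional to the lost capacity.
after = place("rbd_data.1234", [o for o in osds if o != before[0]])
```

    Because each OSD's score depends only on the object name and that OSD's identity, dropping a server that does not hold a given object leaves that object's placement untouched; this is the behaviour that lets a Ceph-like cluster heal itself on an ever-changing set of heterogeneous disk servers.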

  11. Self-service for software development projects and HPC activities

    NASA Astrophysics Data System (ADS)

    Husejko, M.; Høimyr, N.; Gonzalez, A.; Koloventzos, G.; Asbury, D.; Trzcinska, A.; Agtzidis, I.; Botrel, G.; Otto, J.

    2014-05-01

    This contribution describes how CERN has implemented several essential tools for agile software development processes, ranging from version control (Git) to issue tracking (Jira) and documentation (Wikis). Running such services in a large organisation like CERN requires many administrative actions both by users and service providers, such as creating software projects, managing access rights, users and groups, and performing tool-specific customisation. Dealing with these requests manually would be a time-consuming task. Another area of our CERN computing services that has required dedicated manual support has been clusters for specific user communities with special needs. Our aim is to move all our services to a layered approach, with server infrastructure running on the internal cloud computing infrastructure at CERN. This contribution illustrates how we plan to optimise the management of our services by means of an end-user facing platform acting as a portal into all the related services for software projects, inspired by popular portals for open-source development such as Sourceforge, GitHub and others. Furthermore, the contribution will discuss recent activities with tests and evaluations of High Performance Computing (HPC) applications on different hardware and software stacks, and plans to offer a dynamically scalable HPC service at CERN, based on affordable hardware.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.


  20. PREFACE: International Conference on Computing in High Energy and Nuclear Physics (CHEP 2012)

    NASA Astrophysics Data System (ADS)

    Ernst, Michael; Düllmann, Dirk; Rind, Ofer; Wong, Tony

    2012-12-01

    The International Conference on Computing in High Energy and Nuclear Physics (CHEP) was held at New York University on 21-25 May 2012. CHEP is a major series of international conferences for physicists and computing professionals from the High Energy and Nuclear Physics community and related scientific and technical fields. The CHEP conference provides a forum to exchange information on computing progress and needs for the community, and to review recent, ongoing and future activities. CHEP conferences are held at roughly 18-month intervals, alternating between Europe, Asia, the Americas and other parts of the world. Recent CHEP conferences have been held in Taipei, Taiwan (2010); Prague, Czech Republic (2009); Victoria, Canada (2007); Mumbai, India (2006); Interlaken, Switzerland (2004); San Diego, United States (2003); Beijing, China (2001); Padova, Italy (2000). CHEP 2012 was organized by Brookhaven National Laboratory (BNL) and co-sponsored by New York University. The organizational structure for CHEP consists of an International Advisory Committee (IAC), which sets the overall themes of the conference; a Program Organizing Committee (POC), which oversees the program content; and a Local Organizing Committee (LOC), which is responsible for local arrangements (lodging, transportation and social events) and conference logistics (registration, program scheduling, conference site selection and conference proceedings). There were over 500 attendees, with a program that included plenary sessions of invited speakers, a number of parallel sessions comprising around 125 oral and 425 poster presentations, and industrial exhibitions. We thank all the presenters for the excellent scientific content of their contributions to the conference.
    Conference tracks covered topics on Online Computing, Event Processing, Distributed Processing and Analysis on Grids and Clouds, Computer Facilities, Production Grids and Networking, Software Engineering, Data Stores and Databases, and Collaborative Tools. We would like to thank Brookhaven Science Associates, New York University, Blue Nest Events, the International Advisory Committee, the Program Committee and the Local Organizing Committee members for all their support and assistance. We also would like to acknowledge the support provided by the following sponsors: ACEOLE, Data Direct Networks, Dell, the European Middleware Initiative and Nexsan. Special thanks to the Program Committee members for their careful choice of conference contributions and enormous effort in reviewing and editing the conference proceedings. The next CHEP conference will be held in Amsterdam, the Netherlands on 14-18 October 2013.

    Conference Chair: Michael Ernst (BNL)

    Program Committee: Daniele Bonacorsi (University of Bologna, Italy), Simone Campana (CERN, Switzerland), Philippe Canal (Fermilab, United States), Sylvain Chapeland (CERN, Switzerland), Dirk Düllmann (CERN, Switzerland), Johannes Elmsheuser (Ludwig Maximilian University of Munich, Germany), Maria Girone (CERN, Switzerland), Steven Goldfarb (University of Michigan, United States), Oliver Gutsche (Fermilab, United States), Benedikt Hegner (CERN, Switzerland), Andreas Heiss (Karlsruhe Institute of Technology, Germany), Peter Hristov (CERN, Switzerland), Tony Johnson (SLAC, United States), David Lange (LLNL, United States), Adam Lyon (Fermilab, United States), Remigius Mommsen (Fermilab, United States), Axel Naumann (CERN, Switzerland), Niko Neufeld (CERN, Switzerland), Rolf Seuster (TRIUMF, Canada)

    Local Organizing Committee: Maureen Anderson, John De Stefano, Mariette Faulkner, Ognian Novakov, Ofer Rind, Tony Wong (BNL); Kyle Cranmer (NYU)

    International Advisory Committee: Mohammad Al-Turany (GSI, Germany), Lothar Bauerdick (Fermilab, United States), Ian Bird (CERN, Switzerland), Dominique Boutigny (IN2P3, France), Federico Carminati (CERN, Switzerland), Marco Cattaneo (CERN, Switzerland), Gang Chen (Institute of High Energy Physics, China), Peter Clarke (University of Edinburgh, United Kingdom), Sridhara Dasu (University of Wisconsin-Madison, United States), Günter Duckeck (Ludwig Maximilian University of Munich, Germany), Richard Dubois (SLAC, United States), Michael Ernst (BNL, United States), Ian Fisk (Fermilab, United States), Gonzalo Merino (PIC, Spain), John Gordon (STFC-RAL, United Kingdom), Volker Gülzow (DESY, Germany), Frederic Hemmer (CERN, Switzerland), Viatcheslav Ilyin (Moscow State University, Russia), Nobuhiko Katayama (KEK, Japan), Alexei Klimentov (BNL, United States), Simon C. Lin (Academia Sinica, Taiwan), Milos Lokajícek (FZU Prague, Czech Republic), David Malon (ANL, United States), Pere Mato Vila (CERN, Switzerland), Mauro Morandin (INFN CNAF, Italy), Harvey Newman (Caltech, United States), Farid Ould-Saada (University of Oslo, Norway), Ruth Pordes (Fermilab, United States), Hiroshi Sakamoto (University of Tokyo, Japan), Alberto Santoro (UERJ, Brazil), Jim Shank (Boston University, United States), Dongchul Son (Kyungpook National University, South Korea), Reda Tafirout (TRIUMF, Canada), Stephen Wolbers (Fermilab, United States), Frank Wuerthwein (UCSD, United States)

  1. Dcs Data Viewer, an Application that Accesses ATLAS DCS Historical Data

    NASA Astrophysics Data System (ADS)

    Tsarouchas, C.; Schlenker, S.; Dimitrov, G.; Jahn, G.

    2014-06-01

    The ATLAS experiment at CERN is one of the four Large Hadron Collider experiments. The Detector Control System (DCS) of ATLAS is responsible for the supervision of the detector equipment, the reading of operational parameters, the propagation of alarms and the archiving of important operational data in a relational database (DB). DCS Data Viewer (DDV) is an application that provides access to the ATLAS DCS historical data through a web interface, structured as a client-server architecture. The Python server connects to the DB and fetches the data using optimized SQL requests; it communicates with the outside world by accepting HTTP requests and can also be used standalone. The client is an AJAX (Asynchronous JavaScript and XML) interactive web application developed with the Google Web Toolkit (GWT) framework. Its web interface is user-friendly, platform- and browser-independent. The selection of metadata is done via a column-tree view or with a powerful search engine. The final visualization of the data is done using Java applets or JavaScript applications as plugins. The default output is a value-over-time chart, but other types of output such as tables, ASCII or ROOT files are supported too. Excessive access or malicious use of the database is prevented by a dedicated protection mechanism, and protection against web security attacks and authentication constraints have also been taken into account, allowing the exposure of the tool to hundreds of users worldwide. The current configuration of the client and of the outputs can be saved in an XML file. Due to its flexible interface and its generic and modular approach, DDV could easily be used for other experiment control systems.
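    The server-side pattern, fetching archived values with parameterized SQL and returning them as a value-over-time series, can be sketched as follows. This is a hypothetical miniature using SQLite; the actual ATLAS DCS schema, and the table and column names used here, are invented for illustration:

```python
import json
import sqlite3

def fetch_history(conn: sqlite3.Connection, element: str, since: float) -> str:
    """Return the value-over-time history of one DCS element as JSON.

    The schema is illustrative, not the real ATLAS DCS archive; the
    parameterized query mirrors the 'optimized SQL requests' idea and also
    keeps user-supplied element names out of the SQL text itself.
    """
    rows = conn.execute(
        "SELECT ts, value FROM dcs_history"
        " WHERE element = ? AND ts >= ? ORDER BY ts",
        (element, since),
    ).fetchall()
    return json.dumps({"element": element, "points": rows})

# Hypothetical usage with an in-memory database standing in for the DB:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dcs_history (element TEXT, ts REAL, value REAL)")
conn.executemany(
    "INSERT INTO dcs_history VALUES (?, ?, ?)",
    [("pixel_temp", 1.0, 21.5), ("pixel_temp", 2.0, 21.7), ("sct_hv", 1.0, 150.0)],
)
print(fetch_history(conn, "pixel_temp", 0.0))
# → {"element": "pixel_temp", "points": [[1.0, 21.5], [2.0, 21.7]]}
```

    A real server would wrap such a function behind an HTTP handler and serialize to whichever output the client requests (chart data, table, ASCII); the JSON shape here is simply the most direct fit for a value-over-time plot.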

  2. CERN goes iconic

    NASA Astrophysics Data System (ADS)

    2017-06-01

    There are more than 1800 emoji that can be sent and received in text messages and e-mails. Now, the CERN particle-physics lab near Geneva has got in on the act and released its own collection of 35 images that can be used by anyone with an Apple device.

  3. Neutrino Factory Plans at CERN

    NASA Astrophysics Data System (ADS)

    Riche, J. A.

    2002-10-01

    The considerable interest raised by the discovery of neutrino oscillations, together with recent progress in studies of muon colliders, has prompted consideration of a neutrino factory at CERN. This paper explains the reference scenario, indicates other possible choices and mentions the R&D that is foreseen.

  4. Wi-Fi Service enhancement at CERN

    NASA Astrophysics Data System (ADS)

    Ducret, V.; Sosnowski, A.; Gonzalez Caballero, B.; Barrand, Q.

    2017-10-01

    Since the early 2000s, the number of mobile devices connected to CERN's internal network has increased from just a handful to well over 10,000. Wireless access is no longer simply "nice to have" or just for conference and meeting rooms; support for mobility is expected by most, if not all, of the CERN community. In this context, a full renewal of the CERN Wi-Fi network has been launched to deliver a state-of-the-art campus-wide Wi-Fi infrastructure. We aim to deliver, in more than 200 office buildings with a total surface area of over 400,000 m², including many high-priority and high-occupancy zones, an end-user experience comparable, for most applications, to a wired connection, with seamless mobility support. We describe here the studies and tests performed at CERN to ensure that the solution we are deploying can meet these goals while delivering a single, simple, flexible and open management platform.

  5. Thermostructural characterization and structural elastic property optimization of novel high luminosity LHC collimation materials at CERN

    NASA Astrophysics Data System (ADS)

    Borg, M.; Bertarelli, A.; Carra, F.; Gradassi, P.; Guardia-Valenzuela, J.; Guinchard, M.; Izquierdo, G. Arnau; Mollicone, P.; Sacristan-de-Frutos, O.; Sammut, N.

    2018-03-01

    The CERN Large Hadron Collider is currently being upgraded, through the High Luminosity upgrade, to operate at a stored beam energy of 680 MJ. LHC performance depends on the functionality of the beam collimation systems, which are essential for safe beam cleaning and machine protection. A dedicated beam experiment at the CERN High Radiation to Materials facility was set up under the HRMT-23 experimental campaign. It investigates the behavior of three collimation jaws with novel composite absorbers made of copper diamond, molybdenum carbide graphite, and carbon fiber carbon, under accident scenarios involving direct beam impact on the material. Material characterization is imperative for the design, execution and analysis of such experiments. This paper presents new data and analysis of the thermostructural characteristics of some of the absorber materials, commissioned within CERN facilities. In turn, the characterized elastic properties are optimized through the development and implementation of a mixed numerical-experimental optimization technique.
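    A mixed numerical-experimental identification of elastic properties generally works by adjusting a model's elastic constants until its predicted response (for example, resonance frequencies) matches measurements in a least-squares sense. The sketch below illustrates only this general idea on a deliberately simplified one-parameter problem; the frequency model, the 410 GPa value and all constants are invented for illustration and are not the paper's actual technique:

```python
import math

def model_freq(E_gpa: float, mode: int) -> float:
    """Hypothetical model: the frequency of mode n scales with sqrt(E).

    The 0.5 prefactor is arbitrary; a real model would come from a finite
    element or analytical vibration analysis of the specimen.
    """
    return 0.5 * mode * math.sqrt(E_gpa)

def fit_modulus(measured: dict, lo: float, hi: float, tol: float = 1e-6) -> float:
    """Least-squares fit of the elastic modulus to measured frequencies.

    Uses golden-section search: for this model the squared-error cost is
    unimodal in E, so a bracketing line search converges to the minimum.
    """
    def cost(E: float) -> float:
        return sum((model_freq(E, n) - f) ** 2 for n, f in measured.items())

    phi = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    while b - a > tol:
        c, d = b - phi * (b - a), a + phi * (b - a)
        if cost(c) < cost(d):
            b = d  # minimum lies in [a, d]
        else:
            a = c  # minimum lies in [c, b]
    return (a + b) / 2

# Synthetic "measurements" generated from an assumed E = 410 GPa:
measured = {n: model_freq(410.0, n) for n in (1, 2, 3)}
E_fit = fit_modulus(measured, 100.0, 800.0)
```

    In the real campaign the "measurements" would be experimentally determined quantities and the model a structural simulation, but the loop structure, i.e. propose elastic constants, predict, compare, update, is the essence of a mixed numerical-experimental method.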

  6. ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization

    NASA Astrophysics Data System (ADS)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; Ganis, G.; Gheata, A.; Maline, D. Gonzalez; Goto, M.; Iwaszkiewicz, J.; Kreshuk, A.; Segura, D. Marcos; Maunder, R.; Moneta, L.; Naumann, A.; Offermann, E.; Onuchin, V.; Panacek, S.; Rademakers, F.; Russo, P.; Tadel, M.

    2011-06-01

    A new stable version ("production version") v5.28.00 of ROOT [1] has been published [2]. It features several major improvements in many areas, most notably in data storage performance and in the statistics and graphics features. Some of these improvements were already anticipated in the original publication, Antcheva et al. (2009) [3]. This version will be maintained for at least 6 months; new minor revisions ("patch releases") will be published [4] to solve problems reported with this version.
    New version program summary
    Program title: ROOT
    Catalogue identifier: AEFA_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFA_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU Lesser Public License v.2.1
    No. of lines in distributed program, including test data, etc.: 2 934 693
    No. of bytes in distributed program, including test data, etc.: 1009
    Distribution format: tar.gz
    Programming language: C++
    Computer: Intel i386, Intel x86-64, Motorola PPC, Sun Sparc, HP PA-RISC
    Operating system: GNU/Linux, Windows XP/Vista/7, Mac OS X, FreeBSD, OpenBSD, Solaris, HP-UX, AIX
    Has the code been vectorized or parallelized?: Yes
    RAM: > 55 Mbytes
    Classification: 4, 9, 11.9, 14
    Catalogue identifier of previous version: AEFA_v1_0
    Journal reference of previous version: Comput. Phys. Commun. 180 (2009) 2499
    Does the new version supersede the previous version?: Yes
    Nature of problem: Storage, analysis and visualization of scientific data
    Solution method: Object store, wide range of analysis algorithms and visualization methods
    Reasons for new version: Added features and corrections of deficiencies
    Summary of revisions: The release notes at http://root.cern.ch/root/v528/Version528.news.html give a module-oriented overview of the changes in v5.28.00. Highlights include:
    File format: Reading of TTrees has been improved dramatically with respect to CPU time (30%) and notably with respect to disk space.
    Histograms: A new TEfficiency class handles the calculation of efficiencies and their uncertainties; TH2Poly supports polygon-shaped bins (e.g. maps); TKDE provides kernel density estimation; TSVDUnfold performs unfolding based on singular value decomposition.
    Graphics: Kerning is now supported in TLatex, PostScript and PDF; a table of contents can be added to PDF files. A new font provides italic symbols. A TPad containing GL can be stored in a binary (i.e. non-vector) image file, with added support for full-scene anti-aliasing. Usability enhancements to EVE.
    Math: New interfaces for generating random numbers according to a given distribution, goodness-of-fit tests of unbinned data, binning multidimensional data, and several advanced statistical functions were added.
    RooFit: Introduction of HistFactory; major additions to RooStats.
    TMVA: Updated to version 4.1.0, adding e.g. support for simultaneous classification of multiple output classes for several multivariate methods.
    PROOF: Many new features adding to PROOF's usability, plus improvements and fixes.
    PyROOT: Support for Python 3 has been added.
    Tutorials: Several new tutorials were provided for the above new features (notably RooStats).
    A detailed list of all the changes is available at http://root.cern.ch/root/htmldoc/examples/V5.
    Additional comments: For an up-to-date author list see: http://root.cern.ch/drupal/content/root-development-team and http://root.cern.ch/drupal/content/former-root-developers. The distribution file for this program is over 30 Mbytes and therefore is not delivered directly when download or E-mail is requested. Instead a html file giving details of how the program can be obtained is sent.
    Running time: Depending on the data size and complexity of the analysis algorithms.
    References:
    [1] http://root.cern.ch.
    [2] http://root.cern.ch/drupal/content/production-version-528.
    [3] I. Antcheva, M. Ballintijn, B. Bellenot, M. Biskup, R. Brun, N. Buncic, Ph. Canal, D. Casadei, O. Couet, V. Fine, L. Franco, G. Ganis, A. Gheata, D. Gonzalez Maline, M. Goto, J. Iwaszkiewicz, A. Kreshuk, D. Marcos Segura, R. Maunder, L. Moneta, A. Naumann, E. Offermann, V. Onuchin, S. Panacek, F. Rademakers, P. Russo, M. Tadel, ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization, Comput. Phys. Commun. 180 (2009) 2499.
    [4] http://root.cern.ch/drupal/content/root-version-v5-28-00-patch-release-notes.
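
    Among the statistics additions, the idea behind the new TKDE class is easy to illustrate. The following is a minimal pure-Python sketch of a fixed-bandwidth Gaussian kernel density estimator; it is illustrative only, not ROOT's implementation (TKDE itself is considerably more featureful):

    ```python
    import math

    def kde(samples, bandwidth):
        """Return a 1-D density estimate built from `samples`.

        Toy sketch of kernel density estimation: each sample contributes
        a Gaussian kernel of width `bandwidth`; the prefactor normalises
        the estimate so it integrates to one.
        """
        norm = 1.0 / (len(samples) * bandwidth * math.sqrt(2.0 * math.pi))

        def density(x):
            return norm * sum(
                math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples
            )

        return density

    # A density estimated from a handful of points integrates to ~1.
    density = kde([0.1, 0.4, 0.5, 0.9], bandwidth=0.2)
    area = sum(density(-3.0 + i * 0.001) * 0.001 for i in range(7000))
    ```

    The same smoothing idea, with adaptive bandwidths and proper boundary treatment, is what the class offers behind a histogram-like interface.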

  7. Highlights from the CERN/ESO/NordForsk ''Gender in Physics Day''

    NASA Astrophysics Data System (ADS)

    Primas, F.; Guinot, G.; Strandberg, L.

    2017-03-01

    In their role as observers on the EU Gender Equality Network in the European Research Area (GENERA) project, funded under the Horizon 2020 framework, CERN, ESO and NordForsk joined forces and organised a Gender in Physics Day at the CERN Globe of Science and Innovation. The one-day conference aimed to examine innovative activities promoting gender equality, and to discuss gender-oriented policies and best practice in the European Research Area (with special emphasis on intergovernmental organisations), as well as the importance of building solid networks. The event was very well attended and was declared a success. The main highlights of the meeting are reported.

  8. Dissemination of data measured at the CERN n_TOF facility

    NASA Astrophysics Data System (ADS)

    Dupont, E.; Otuka, N.; Cabellos, O.; Aberle, O.; Aerts, G.; Altstadt, S.; Alvarez, H.; Alvarez-Velarde, F.; Andriamonje, S.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Badurek, G.; Balibrea, J.; Barbagallo, M.; Barros, S.; Baumann, P.; Bécares, V.; Bečvář, F.; Beinrucker, C.; Belloni, F.; Berthier, B.; Berthoumieux, E.; Billowes, J.; Boccone, V.; Bosnar, D.; Brown, A.; Brugger, M.; Caamaño, M.; Calviani, M.; Calviño, F.; Cano-Ott, D.; Capote, R.; Cardella, R.; Carrapiço, C.; Casanovas, A.; Castelluccio, D. M.; Cennini, P.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Chin, M.; Colonna, N.; Cortés, G.; Cortés-Giraldo, M. A.; Cosentino, L.; Couture, A.; Cox, J.; Damone, L. A.; David, S.; Deo, K.; Diakaki, M.; Dillmann, I.; Domingo-Pardo, C.; Dressler, R.; Dridi, W.; Duran, I.; Eleftheriadis, C.; Embid-Segura, M.; Fernández-Domínguez, B.; Ferrant, L.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Fraval, K.; Frost, R. J. W.; Fujii, K.; Furman, W.; Ganesan, S.; Garcia, A. R.; Gawlik, A.; Gheorghe, I.; Gilardoni, S.; Giubrone, G.; Glodariu, T.; Göbel, K.; Gomez-Hornillos, M. B.; Goncalves, I. F.; Gonzalez-Romero, E.; Goverdovski, A.; Gramegna, F.; Griesmayer, E.; Guerrero, C.; Gunsing, F.; Gurusamy, P.; Haight, R.; Harada, H.; Heftrich, T.; Heil, M.; Heinitz, S.; Hernández-Prieto, A.; Heyse, J.; Igashira, M.; Isaev, S.; Jenkins, D. G.; Jericha, E.; Kadi, Y.; Kaeppeler, F.; Kalamara, A.; Karadimos, D.; Karamanis, D.; Katabuchi, T.; Kavrigin, P.; Kerveno, M.; Ketlerov, V.; Khryachkov, V.; Kimura, A.; Kivel, N.; Kokkoris, M.; Konovalov, V.; Krtička, M.; Kroll, J.; Kurtulgil, D.; Lampoudis, C.; Langer, C.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Naour, C. Le; Lerendegui-Marco, J.; Leong, L. S.; Licata, M.; Meo, S. Lo; Lonsdale, S. J.; Losito, R.; Lozano, M.; Macina, D.; Manousos, A.; Marganiec, J.; Martinez, T.; Marrone, S.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. 
M.; Mingrone, F.; Mirea, M.; Mondelaers, W.; Montesano, S.; Moreau, C.; Mosconi, M.; Musumarra, A.; Negret, A.; Nolte, R.; O'Brien, S.; Oprea, A.; Palomo-Pinto, F. R.; Pancin, J.; Paradela, C.; Patronis, N.; Pavlik, A.; Pavlopoulos, P.; Perkowski, J.; Perrot, L.; Pigni, M. T.; Plag, R.; Plompen, A.; Plukis, L.; Poch, A.; Porras, I.; Praena, J.; Pretel, C.; Quesada, J. M.; Radeck, D.; Rajeev, K.; Rauscher, T.; Reifarth, R.; Riego, A.; Robles, M.; Roman, F.; Rout, P. C.; Rudolf, G.; Rubbia, C.; Rullhusen, P.; Ryan, J. A.; Sabaté-Gilarte, M.; Salgado, J.; Santos, C.; Sarchiapone, L.; Sarmento, R.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Sedyshev, P.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Stephan, C.; Suryanarayana, S. V.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tarrío, D.; Tassan-Got, L.; Tavora, L.; Terlizzi, R.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Versaci, R.; Vermeulen, M. J.; Villamarin, D.; Vicente, M. C.; Vlachoudis, V.; Vlastou, R.; Voss, F.; Wallner, A.; Walter, S.; Ware, T.; Warren, S.; Weigand, M.; Weiß, C.; Wolf, C.; Wiesher, M.; Wisshak, K.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    The n_TOF neutron time-of-flight facility at CERN is used for high quality nuclear data measurements from thermal energy up to hundreds of MeV. In line with the CERN open data policy, the n_TOF Collaboration takes actions to preserve its unique data, facilitate access to them in standardised format, and allow their re-use by a wide community in the fields of nuclear physics, nuclear astrophysics and various nuclear technologies. The present contribution briefly describes the n_TOF outcomes, as well as the status of dissemination and preservation of n_TOF final data in the international EXFOR library.

  9. How to create successful Open Hardware projects — About White Rabbits and open fields

    NASA Astrophysics Data System (ADS)

    van der Bij, E.; Arruat, M.; Cattin, M.; Daniluk, G.; Gonzalez Cobas, J. D.; Gousiou, E.; Lewis, J.; Lipinski, M. M.; Serrano, J.; Stana, T.; Voumard, N.; Wlostowski, T.

    2013-12-01

    CERN's accelerator control group has embraced ''Open Hardware'' (OH) to facilitate peer review, avoid vendor lock-in and make support tasks scalable. A web-based tool for easing collaborative work was set up and the CERN OH Licence was created. New ADC, TDC, fine delay and carrier cards based on VITA and PCI-SIG standards were designed and drivers for Linux were written. Often industry was paid for developments, while quality and documentation was controlled by CERN. An innovative timing network was also developed with the OH paradigm. Industry now sells and supports these designs that find their way into new fields.

  10. Medical Applications at CERN and the ENLIGHT Network

    PubMed Central

    Dosanjh, Manjit; Cirilli, Manuela; Myers, Steve; Navin, Sparsh

    2016-01-01

    State-of-the-art techniques derived from particle accelerators, detectors, and physics computing are routinely used in clinical practice and medical research centers: from imaging technologies to dedicated accelerators for cancer therapy and nuclear medicine, simulations, and data analytics. Principles of particle physics themselves are the foundation of a cutting edge radiotherapy technique for cancer treatment: hadron therapy. This article is an overview of the involvement of CERN, the European Organization for Nuclear Research, in medical applications, with specific focus on hadron therapy. It also presents the history, achievements, and future scientific goals of the European Network for Light Ion Hadron Therapy, whose co-ordination office is at CERN. PMID:26835422

  11. Medical Applications at CERN and the ENLIGHT Network.

    PubMed

    Dosanjh, Manjit; Cirilli, Manuela; Myers, Steve; Navin, Sparsh

    2016-01-01

    State-of-the-art techniques derived from particle accelerators, detectors, and physics computing are routinely used in clinical practice and medical research centers: from imaging technologies to dedicated accelerators for cancer therapy and nuclear medicine, simulations, and data analytics. Principles of particle physics themselves are the foundation of a cutting edge radiotherapy technique for cancer treatment: hadron therapy. This article is an overview of the involvement of CERN, the European Organization for Nuclear Research, in medical applications, with specific focus on hadron therapy. It also presents the history, achievements, and future scientific goals of the European Network for Light Ion Hadron Therapy, whose co-ordination office is at CERN.

  12. Preparation of a primary argon beam for the CERN fixed target physics.

    PubMed

    Küchler, D; O'Neil, M; Scrivens, R; Thomae, R

    2014-02-01

    The fixed target experiment NA61 in the North Area of the Super Proton Synchrotron is studying phase transitions in strongly interacting matter. Up to now they used the primary beams available from the CERN accelerator complex (protons and lead ions) or fragmented beams created from the primary lead ion beam. To explore a wider range of energies and densities a request was made to provide primary argon and xenon beams. This paper describes the results of the setting up and 10 week test run of the Ar(11+) beam from the 14.5 GHz ECR ion source and the linear accelerator (Linac3) at CERN.

  13. Status and Roadmap of CernVM

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Blomer, J.; Buncic, P.; Charalampidis, I.; Ganis, G.; Meusel, R.

    2015-12-01

    Cloud resources nowadays contribute an essential share of the resources for computing in high-energy physics. Such resources can be provided either by private or public IaaS clouds (e.g. OpenStack, Amazon EC2, Google Compute Engine) or by volunteer computers (e.g. LHC@Home 2.0). In either case, experiments need to prepare a virtual machine image that provides the execution environment for the physics application at hand. Since version 3, the CernVM virtual machine is a minimal and versatile virtual machine image capable of booting different operating systems. The virtual machine image is less than 20 megabytes in size; the actual operating system is delivered on demand by the CernVM File System. CernVM 3 has matured from a prototype into a production environment. It is used, for instance, to run LHC applications in the cloud, to tune event generators using a network of volunteer computers, and as a container for the historic Scientific Linux 5 and Scientific Linux 4 based software environments in the course of the long-term data preservation efforts of the ALICE, CMS, and ALEPH experiments. We present experience and lessons learned from the use of CernVM at scale. We also provide an outlook on the upcoming developments, including support for Scientific Linux 7, the use of container virtualization such as provided by Docker, and the streamlining of virtual machine contextualization towards the cloud-init industry standard.
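
    The contextualization mentioned above targets the cloud-init convention. A user-data fragment of the kind a CernVM instance could consume might look like the following sketch; the user name, repository list and proxy address are invented placeholders, not official CernVM settings:

    ```yaml
    #cloud-config
    # Illustrative only: keys and values are example placeholders,
    # not an official CernVM contextualization.
    users:
      - name: physicist
        ssh-authorized-keys:
          - ssh-rsa AAAA... user@laptop
    write_files:
      - path: /etc/cvmfs/default.local
        content: |
          CVMFS_REPOSITORIES=atlas.cern.ch,alice.cern.ch
          CVMFS_HTTP_PROXY="http://squid.example.org:3128"
    ```

    The appeal of the cloud-init route is that the same user-data works unchanged on private clouds, public clouds and volunteer machines.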

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher. This video is Part 11 in the series.

  15. A possible biomedical facility at the European Organization for Nuclear Research (CERN).

    PubMed

    Dosanjh, M; Jones, B; Myers, S

    2013-05-01

    A well-attended meeting, called "Brainstorming discussion for a possible biomedical facility at CERN", was held by the European Organization for Nuclear Research (CERN) at the European Laboratory for Particle Physics on 25 June 2012. This was concerned with adapting an existing, but little used, 78-m circumference CERN synchrotron to deliver a wide range of ion species, preferably from protons to at least neon ions, with beam specifications that match existing clinical facilities. The potential extensive research portfolio discussed included beam ballistics in humanoid phantoms, advanced dosimetry, remote imaging techniques and technical developments in beam delivery, including gantry design. In addition, a modern laboratory for biomedical characterisation of these beams would allow important radiobiological studies, such as relative biological effectiveness, in a dedicated facility with standardisation of experimental conditions and biological end points. A control photon and electron beam would be required nearby for relative biological effectiveness comparisons. Research beam time availability would far exceed that at other facilities throughout the world. This would allow more rapid progress in several biomedical areas, such as in charged hadron therapy of cancer, radioisotope production and radioprotection. The ethos of CERN, in terms of open access, peer-reviewed projects and governance has been so successful for High Energy Physics that application of the same to biomedicine would attract high-quality research, with possible contributions from Europe and beyond, along with potential new funding streams.

  16. Mapping Remote and Multidisciplinary Learning Barriers: Lessons from "Challenge-Based Innovation" at CERN

    ERIC Educational Resources Information Center

    Jensen, Matilde Bisballe; Utriainen, Tuuli Maria; Steinert, Martin

    2018-01-01

    This paper presents the experienced difficulties of students participating in the multidisciplinary, remote collaborating engineering design course challenge-based innovation at CERN. This is with the aim to identify learning barriers and improve future learning experiences. We statistically analyse the rated differences between distinct design…

  17. DG's New Year's presentation

    ScienceCinema

    Heuer, R.-D.

    2018-05-22

    CERN general staff meeting. Looking back at key messages: Highest priority: LHC physics in 2009; Increase diversity of the scientific program; Prepare for future projects; Establish open and direct communication; Prepare CERN towards a global laboratory; Increase consolidation efforts; Financial situation--tight; Knowledge and technology transfer--proactive; Contract policy and internal mobility--lessons learned.

  18. Knowledge and Technology: Sharing With Society

    NASA Astrophysics Data System (ADS)

    Benvenuti, Cristoforo; Sutton, Christine; Wenninger, Horst

    The following sections are included:
    * A Core Mission of CERN
    * Medical Accelerators: A Tool for Tumour Therapy
    * Medipix: The Image is the Message
    * Crystal Clear: From Higgs to PET
    * Solar Collectors: When Nothing is Better
    * The TARC Experiment at CERN: Modern Alchemy
    * A CLOUD Chamber with a Silvery Lining
    * References

  19. Contextualized Magnetism in Secondary School: Learning from the LHC (CERN)

    ERIC Educational Resources Information Center

    Cid, Ramon

    2005-01-01

    Physics teachers in secondary schools usually mention the world's largest particle physics laboratory--CERN (European Organization for Nuclear Research)--only because of the enormous size of the accelerators and detectors used there, the number of scientists involved in their activities and also the necessary international scientific…

  20. WorldWide Web: Hypertext from CERN.

    ERIC Educational Resources Information Center

    Nickerson, Gord

    1992-01-01

    Discussion of software tools for accessing information on the Internet focuses on the WorldWideWeb (WWW) system, which was developed at the European Particle Physics Laboratory (CERN) in Switzerland to build a worldwide network of hypertext links using available networking technology. Its potential for use with multimedia documents is also…

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aaboud, M.; Aad, G.; Abbott, B.

    Measurements of the production cross section of a Z boson in association with jets in proton–proton collisions at √s = 13 TeV are presented, using data corresponding to an integrated luminosity of 3.16 fb⁻¹ collected by the ATLAS experiment at the CERN Large Hadron Collider in 2015. Inclusive and differential cross sections are measured for events containing a Z boson decaying to electrons or muons and produced in association with up to seven jets with pT > 30 GeV and |y| < 2.5. Predictions from different Monte Carlo generators based on leading-order and next-to-leading-order matrix elements for up to two additional partons interfaced with parton shower, and fixed-order predictions at next-to-leading order and next-to-next-to-leading order, are compared with the measured cross sections. Good agreement within the uncertainties is observed for most of the modelled quantities, in particular with the generators which use next-to-leading-order matrix elements and the more recent next-to-next-to-leading-order fixed-order predictions.

  2. Assessment of thermal loads in the CERN SPS crab cavities cryomodule 1

    DOE PAGES

    Carra, F.; Apeland, J.; Calaga, R.; ...

    2017-07-20

    As a part of the HL-LHC upgrade, we designed a cryomodule to host two crab cavities for a first test with protons in the SPS machine. The evaluation of the cryomodule heat loads is essential to dimension the cryogenic infrastructure of the system. The current design features two cryogenic circuits. The first circuit adopts superfluid helium at 2 K to maintain the cavities in the superconducting state. The second circuit, based on helium gas at a temperature between 50 K and 70 K, is connected to the thermal screen, also serving as heat intercept for all the interfaces between the cold mass and the external environment. We present an overview of the heat loads to both circuits, and the combined numerical and analytical estimations. The heat load of each element is detailed for the static and dynamic scenarios, with considerations on the design choices for the thermal optimization of the most critical components.

  3. Investigation of High-Level Synthesis tools’ applicability to data acquisition systems design based on the CMS ECAL Data Concentrator Card example

    NASA Astrophysics Data System (ADS)

    HUSEJKO, Michal; EVANS, John; RASTEIRO DA SILVA, Jose Carlos

    2015-12-01

    High-Level Synthesis (HLS) for Field-Programmable Gate Array (FPGA) programming is becoming a practical alternative to the well-established VHDL and Verilog languages. This paper describes a case study in the use of HLS tools to design FPGA-based data acquisition systems (DAQ). We present the implementation of the CERN CMS detector ECAL Data Concentrator Card (DCC) functionality in HLS, and the lessons learned from using the HLS design flow. The DCC functionality and a definition of the initial system-level performance requirements (latency, bandwidth, and throughput) are presented. We describe how its packet-processing, control-centric algorithm was implemented with the VHDL and Verilog languages. We then show how the HLS flow can speed up design-space exploration by providing loose coupling between a function's interface design and its algorithm implementation. We conclude with results of real-life hardware tests performed with the HLS-generated design and a DCC Tester system.

  4. Ceph-based storage services for Run2 and beyond

    NASA Astrophysics Data System (ADS)

    van der Ster, Daniel C.; Lamanna, Massimo; Mascetti, Luca; Peters, Andreas J.; Rousseau, Hervé

    2015-12-01

    In 2013, CERN IT evaluated and then deployed a petabyte-scale Ceph cluster to support OpenStack use-cases in production. Now, with more than a year of smooth operation, we present our experience and tuning best practices. Beyond the cloud storage use-cases, we have been exploring Ceph-based services to satisfy the growing storage requirements during and after Run2. First, we have developed a Ceph back-end for CASTOR, allowing this service to deploy thin disk-server nodes which act as gateways to Ceph; this feature marries the strong data-archival and cataloguing features of CASTOR with the resilient and high-performance Ceph subsystem for disk. Second, we have developed RADOSFS, a lightweight storage API which builds a POSIX-like filesystem on top of the Ceph object layer. When combined with Xrootd, RADOSFS can offer a scalable object interface compatible with our HEP data processing applications. Lastly, the same object layer is being used to build a scalable and inexpensive NFS service for several user communities.
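
    The layering idea behind RADOSFS, a file-system-like namespace built over a flat object store, can be sketched in a few lines. This toy model is illustrative of the concept only; the class and method names are invented and bear no relation to the actual RADOSFS API:

    ```python
    class ObjectFS:
        """Toy POSIX-like namespace over a flat object store:
        files are objects keyed by their full path, and directory
        listings are prefix scans over the key space."""

        def __init__(self):
            self.objects = {}  # stands in for the object layer (e.g. RADOS)

        def write(self, path, data):
            self.objects[path] = data

        def read(self, path):
            return self.objects[path]

        def listdir(self, dirpath):
            prefix = dirpath.rstrip("/") + "/"
            names = set()
            for key in self.objects:
                if key.startswith(prefix):
                    # keep only the first path component below dirpath
                    names.add(key[len(prefix):].split("/", 1)[0])
            return sorted(names)

    fs = ObjectFS()
    fs.write("/data/run1/hits.root", b"...")
    fs.write("/data/run2/hits.root", b"...")
    ```

    A real implementation adds metadata, permissions and scalable listing, but the key design point survives the simplification: the namespace is a convention over object keys, so it inherits the object layer's scalability.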

  5. NaNet: a configurable NIC bridging the gap between HPC and real-time HEP GPU computing

    NASA Astrophysics Data System (ADS)

    Lonardo, A.; Ameli, F.; Ammendola, R.; Biagioni, A.; Cotta Ramusino, A.; Fiorini, M.; Frezza, O.; Lamanna, G.; Lo Cicero, F.; Martinelli, M.; Neri, I.; Paolucci, P. S.; Pastorelli, E.; Pontisso, L.; Rossetti, D.; Simeone, F.; Simula, F.; Sozzi, M.; Tosoratto, L.; Vicini, P.

    2015-04-01

    NaNet is an FPGA-based PCIe Network Interface Card (NIC) design with GPUDirect and Remote Direct Memory Access (RDMA) capabilities featuring a configurable and extensible set of network channels. The design currently supports standard channels (GbE 1000BASE-T and 10GbE 10GBASE-R) as well as custom ones (34 Gbps APElink and 2.5 Gbps deterministic-latency KM3link), and its modularity allows for straightforward inclusion of other link technologies. The GPUDirect feature, combined with a transport-layer offload module and a data-stream processing stage, makes NaNet a low-latency NIC suitable for real-time GPU processing. In this paper we describe the NaNet architecture and its performance, exhibiting two of its use cases: the GPU-based low-level trigger for the RICH detector in the NA62 experiment at CERN and the on-/off-shore data transport system for the KM3NeT-IT underwater neutrino telescope.

  6. Graphical processors for HEP trigger systems

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Di Lorenzo, S.; Fantechi, R.; Fiorini, M.; Frezza, O.; Lamanna, G.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Neri, I.; Paolucci, P. S.; Pastorelli, E.; Piandani, R.; Pontisso, L.; Rossetti, D.; Simula, F.; Sozzi, M.; Vicini, P.

    2017-02-01

    General-purpose computing on GPUs is emerging as a new paradigm in several fields of science, although so far applications have been tailored to employ GPUs as accelerators in offline computations. With the steady decrease of GPU latencies and the increase in link and memory throughputs, the time is ripe for real-time applications using GPUs in high-energy physics data acquisition and trigger systems. We discuss the use of online parallel computing on GPUs for synchronous low-level trigger systems, focusing on tests performed on the trigger of the CERN NA62 experiment. The latencies of all components need to be analysed, networking being the most critical; to keep it under control, we envisioned NaNet, an FPGA-based PCIe Network Interface Card (NIC) enabling a GPUDirect connection. Moreover, we discuss how specific trigger algorithms can be parallelised and thus benefit from a GPU implementation in terms of increased execution speed. Such improvements are particularly relevant for the foreseen LHC luminosity upgrade, where highly selective algorithms will be crucial to maintain sustainable trigger rates with very high pileup.

  7. Applications of Emerging Parallel Optical Link Technology to High Energy Physics Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chramowicz, J.; Kwan, S.; Prosser, A.

    2011-09-01

    Modern particle detectors depend upon optical fiber links to deliver event data to upstream trigger and data processing systems. Future detector systems can benefit from the development of dense arrangements of high speed optical links emerging from the telecommunications and storage area network market segments. These links support data transfers in each direction at rates up to 120 Gbps in packages that minimize or even eliminate edge connector requirements. Emerging products include a class of devices known as optical engines, which permit assembly of the optical transceivers in close proximity to the electrical interfaces of the ASICs and FPGAs which handle the data in parallel electrical format. Such assemblies will reduce the required printed circuit board area and minimize electromagnetic interference and susceptibility. We present test results for some of these parallel components and report on the development of pluggable FPGA Mezzanine Cards equipped with optical engines to provide to collaborators on the Versatile Link Common Project for the HL-LHC at CERN.

  8. Preparation of a primary argon beam for the CERN fixed target physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Küchler, D., E-mail: detlef.kuchler@cern.ch; O’Neil, M.; Scrivens, R.

    2014-02-15

    The fixed target experiment NA61 in the North Area of the Super Proton Synchrotron is studying phase transitions in strongly interacting matter. Up to now they used the primary beams available from the CERN accelerator complex (protons and lead ions) or fragmented beams created from the primary lead ion beam. To explore a wider range of energies and densities a request was made to provide primary argon and xenon beams. This paper describes the results of the setting up and 10 week test run of the Ar(11+) beam from the 14.5 GHz ECR ion source and the linear accelerator (Linac3) at CERN.

  9. Deployment and Operational Experiences with CernVM-FS at the GridKa Tier-1 Center

    NASA Astrophysics Data System (ADS)

    Alef, Manfred; Jäger, Axel; Petzold, Andreas; Verstege, Bernhard

    2012-12-01

    In 2012 the GridKa Tier-1 computing center hosts 130 kHS06 of computing resources and 14 PB of disk and 17 PB of tape space. These resources are shared between the four LHC VOs and a number of national and international VOs from high energy physics and other sciences. CernVM-FS has been deployed at GridKa to supplement the existing NFS-based system for accessing VO software on the worker nodes. It provides a solution tailored to the requirements of the LHC VOs. We focus on the first operational experiences and on the monitoring of CernVM-FS on the worker nodes and the squid caches.

  10. Open Media Training Session

    ScienceCinema

    None

    2017-12-09

    Have you ever wondered how the media work and why some topics make it into the news and others don't? Would you like to know how (and how not) to give an interview to a journalist? With the LHC preparing for first collisions at high energies, the world's media are again turning their attention to CERN. We're all likely to be called upon to explain what is happening at CERN to media, friends and neighbours. The seminar will be given by BBC television news journalists Liz Pike and Nadia Marchant, and will deal with the kind of questions we're likely to be confronted with through the restart period. Follow the webcast: http://webcast.cern.ch/

  11. CERN - Six Decades of Science, Innovation, Cooperation, and Inspiration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quigg, Chris

    The European Laboratory for Particle Physics, which straddles the Swiss-French border northwest of Geneva, celebrates its sixtieth birthday in 2014. CERN is the preeminent particle-physics institution in the world, currently emphasizing the study of collisions of protons and heavy nuclei at very high energies and the exploration of physics on the electroweak scale (energies where electromagnetism and the weak nuclear force merge). With brilliant accomplishments in research, innovation, and education, and a sustained history of cooperation among people from different countries and cultures, CERN ranks as one of the signal achievements of the postwar European Project. For physicists the world over, the laboratory is a source of pride and inspiration.

  12. More "Hands-On" Particle Physics: Learning with ATLAS at CERN

    ERIC Educational Resources Information Center

    Long, Lynne

    2011-01-01

    This article introduces teachers and students to a new portal of resources called Learning with ATLAS at CERN (http://learningwithatlas-portal.eu/), which has been developed by a European consortium of academic researchers and schools' liaison and outreach providers from countries across Europe. It includes the use of some of the mind-boggling…

  13. History of Cern

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2007-12-20

    Ceremony on the occasion of the publication of the first volume of the book on the history of CERN, attended by several people who played an important role in this European organisation, whose success owes to the spirit of the founding members, a spirit that is and will remain essential.

  14. Big data analytics as a service infrastructure: challenges, desired properties and solutions

    NASA Astrophysics Data System (ADS)

    Martín-Márquez, Manuel

    2015-12-01

    CERN's accelerator complex generates a very large amount of data: a large volume of heterogeneous data is constantly produced by control equipment and monitoring agents, and these data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has limited the necessary collaboration and, more importantly, cross-domain data analytics. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient day-to-day accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs of CERN's research and engineering community; (3) deliver real-time and batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.
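
    As a sketch of goal (4), the following shows the shape of a minimal ETL step between two relational stores, with Python's sqlite3 standing in for the real repositories. All table and column names are invented for the example; they are not a CERN schema:

    ```python
    import sqlite3

    def etl(src, dst):
        """Extract raw monitoring rows from `src`, transform the
        temperature from Celsius to Kelvin, and load into `dst`."""
        dst.execute("CREATE TABLE IF NOT EXISTS readings (sensor TEXT, kelvin REAL)")
        for sensor, celsius in src.execute("SELECT sensor, celsius FROM raw"):
            dst.execute("INSERT INTO readings VALUES (?, ?)", (sensor, celsius + 273.15))
        dst.commit()

    # In-memory stand-ins for a source repository and the analytics store.
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE raw (sensor TEXT, celsius REAL)")
    src.executemany("INSERT INTO raw VALUES (?, ?)",
                    [("magnet_a", 20.0), ("magnet_b", -1.9)])
    dst = sqlite3.connect(":memory:")
    etl(src, dst)
    rows = list(dst.execute("SELECT sensor, kelvin FROM readings ORDER BY sensor"))
    ```

    The point of centralising such steps, as the paper argues, is that each team currently re-implements this plumbing against its own repository.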

  15. Software Aspects of IEEE Floating-Point Computations for Numerical Applications in High Energy Physics

    ScienceCinema

    Arnold, Jeffrey

    2018-05-14

    Floating-point computations are at the heart of much of the computing done in high energy physics. The correctness, speed and accuracy of these computations are of paramount importance. The lack of any of these characteristics can mean the difference between new, exciting physics and an embarrassing correction. This talk will examine practical aspects of IEEE 754-2008 floating-point arithmetic as encountered in HEP applications. After describing the basic features of IEEE floating-point arithmetic, the presentation will cover: common hardware implementations (SSE, x87); techniques for improving the accuracy of summation, multiplication and data interchange; compiler options for gcc and icc affecting floating-point operations; and hazards to be avoided. About the speaker: Jeffrey M Arnold is a Senior Software Engineer in the Intel Compiler and Languages group at Intel Corporation. He has been part of the Digital->Compaq->Intel compiler organization for nearly 20 years; part of that time, he worked on both low- and high-level math libraries. Prior to that, he was in the VMS Engineering organization at Digital Equipment Corporation. In the late 1980s, Jeff spent 2½ years at CERN as part of the CERN/Digital Joint Project. In 2008, he returned to CERN to spend 10 weeks working with CERN/openlab. Since that time, he has returned to CERN multiple times to teach at openlab workshops and consult with various LHC experiments. Jeff received his Ph.D. in physics from Case Western Reserve University.
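
    A standard example of the summation-accuracy techniques this area covers is Kahan compensated summation; the sketch below is generic and not code from the talk.

```python
# Kahan compensated summation: reduce rounding error when summing many
# floats by carrying the lost low-order bits in a correction term.
def kahan_sum(values):
    total = 0.0
    c = 0.0                  # running compensation for lost low-order bits
    for v in values:
        y = v - c            # apply the correction from the previous step
        t = total + y        # low-order bits of y may be lost in this add
        c = (t - total) - y  # algebraically zero; captures the lost bits
        total = t
    return total

# Ten additions of 0.1 do not give exactly 1.0 with naive left-to-right
# summation; the compensated sum is at least as close to 1.0.
naive = sum([0.1] * 10)
compensated = kahan_sum([0.1] * 10)
print(abs(naive - 1.0), abs(compensated - 1.0))
```

    The same idea appears in `math.fsum`, which goes further and tracks the error exactly.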

  16. TOWARDS A NOVEL MODULAR ARCHITECTURE FOR CERN RADIATION MONITORING.

    PubMed

    Boukabache, Hamza; Pangallo, Michel; Ducos, Gael; Cardines, Nicola; Bellotta, Antonio; Toner, Ciarán; Perrin, Daniel; Forkel-Wirth, Doris

    2017-04-01

    The European Organization for Nuclear Research (CERN) has the legal obligation to protect the public and the people working on its premises from any unjustified exposure to ionising radiation. In this context, radiation monitoring is one of the main concerns of the Radiation Protection Group. After 30 y of reliable service, the ARea CONtroller (ARCON) system is approaching the end of its lifecycle, which raises the need for new, more efficient radiation monitors with a high level of modularity to ensure better maintainability. Based on these two main principles, new detectors are currently being developed that will be capable of measuring very low dose rates down to 50 nSv h⁻¹, whilst being able to measure radiation over an extensive range of 8 decades without any auto-scaling. To achieve this performance, CERN Radiation MOnitoring Electronics (CROME), the new generation of CERN radiation monitors, is based on a versatile architecture that includes new read-out electronics developed by the Instrumentation and Logistics section of the CERN Radiation Protection Group, as well as a reconfigurable system-on-chip capable of performing complex processing calculations. Besides CROME's capability to continuously measure the ambient dose rate, the system generates radiation alarms, provides interlock signals, drives alarm display units through a fieldbus and provides long-term, permanent and reliable data logging. The measurement tests performed during the first phase of the development show very promising results that pave the way to the second phase: the certification. © The Author 2016. Published by Oxford University Press.

  17. TOWARDS A NOVEL MODULAR ARCHITECTURE FOR CERN RADIATION MONITORING

    PubMed Central

    Boukabache, Hamza; Pangallo, Michel; Ducos, Gael; Cardines, Nicola; Bellotta, Antonio; Toner, Ciarán; Perrin, Daniel; Forkel-Wirth, Doris

    2017-01-01

    Abstract The European Organization for Nuclear Research (CERN) has the legal obligation to protect the public and the people working on its premises from any unjustified exposure to ionising radiation. In this context, radiation monitoring is one of the main concerns of the Radiation Protection Group. After 30 y of reliable service, the ARea CONtroller (ARCON) system is approaching the end of its lifecycle, which raises the need for new, more efficient radiation monitors with a high level of modularity to ensure better maintainability. Based on these two main principles, new detectors are currently being developed that will be capable of measuring very low dose rates down to 50 nSv h−1, whilst being able to measure radiation over an extensive range of 8 decades without any auto-scaling. To achieve this performance, CERN Radiation MOnitoring Electronics (CROME), the new generation of CERN radiation monitors, is based on a versatile architecture that includes new read-out electronics developed by the Instrumentation and Logistics section of the CERN Radiation Protection Group, as well as a reconfigurable system-on-chip capable of performing complex processing calculations. Besides CROME's capability to continuously measure the ambient dose rate, the system generates radiation alarms, provides interlock signals, drives alarm display units through a fieldbus and provides long-term, permanent and reliable data logging. The measurement tests performed during the first phase of the development show very promising results that pave the way to the second phase: the certification. PMID:27909154

  18. Got Questions About the Higgs Boson? Ask a Scientist

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hinchliffe, Ian

    Ask a scientist about the Higgs boson. There's a lot of buzz this week over new data from CERN's Large Hadron Collider (LHC) and the final data from Fermilab's Tevatron about the Higgs boson. It raises questions about what scientists have found and what still remains to be found -- and what it all means. Berkeley Lab's Ian Hinchliffe invites you to send in questions about the Higgs. He'll answer a few of your questions in a follow-up video later this week. Hinchliffe is a theoretical physicist who heads Berkeley Lab's sizable contingent with the ATLAS experiment at CERN. • Post your questions in the comment box • E-mail your questions to askascientist@lbl.gov • Tweet to @BerkeleyLab • Or post on our facebook page: facebook/berkeleylab Update on July 5: Ian responds to several of your questions in this video: http://youtu.be/1BkpD1IS62g. Update on 7/04: Here's CERN's press release from earlier today on the latest preliminary results in the search for the long sought Higgs particle: http://press.web.cern.ch/press/PressReleases/Releases2012/PR17.12E.htm. And here's a Q&A on what the news tells us: http://cdsweb.cern.ch/journal/CERNBulletin/2012/28/News%20Articles/1459460?ln=en. CERN will present the new LHC data at a seminar July 4th at 9:00 in the morning Geneva time (3:00 in the morning Eastern Daylight Time, midnight on the Pacific Coast), where the ATLAS collaboration and their rivals in the CMS experiment will announce their results. Tevatron results were announced by Fermilab on Monday morning. For more background on the LHC's search for the Higgs boson, visit http://newscenter.lbl.gov/feature-stories/2012/06/28/higgs-2012/.

  19. Got Questions About the Higgs Boson? Ask a Scientist

    ScienceCinema

    Hinchliffe, Ian

    2017-12-12

    Ask a scientist about the Higgs boson. There's a lot of buzz this week over new data from CERN's Large Hadron Collider (LHC) and the final data from Fermilab's Tevatron about the Higgs boson. It raises questions about what scientists have found and what still remains to be found -- and what it all means. Berkeley Lab's Ian Hinchliffe invites you to send in questions about the Higgs. He'll answer a few of your questions in a follow-up video later this week. Hinchliffe is a theoretical physicist who heads Berkeley Lab's sizable contingent with the ATLAS experiment at CERN. • Post your questions in the comment box • E-mail your questions to askascientist@lbl.gov • Tweet to @BerkeleyLab • Or post on our facebook page: facebook/berkeleylab Update on July 5: Ian responds to several of your questions in this video: http://youtu.be/1BkpD1IS62g. Update on 7/04: Here's CERN's press release from earlier today on the latest preliminary results in the search for the long sought Higgs particle: http://press.web.cern.ch/press/PressReleases/Releases2012/PR17.12E.htm. And here's a Q&A on what the news tells us: http://cdsweb.cern.ch/journal/CERNBulletin/2012/28/News%20Articles/1459460?ln=en. CERN will present the new LHC data at a seminar July 4th at 9:00 in the morning Geneva time (3:00 in the morning Eastern Daylight Time, midnight on the Pacific Coast), where the ATLAS collaboration and their rivals in the CMS experiment will announce their results. Tevatron results were announced by Fermilab on Monday morning. For more background on the LHC's search for the Higgs boson, visit http://newscenter.lbl.gov/feature-stories/2012/06/28/higgs-2012/.

  20. Of people, particles and prejudice

    NASA Astrophysics Data System (ADS)

    Jackson, Penny; Greene, Anne; Mears, Matt; Spacecadet1; Green, Christian; Hunt, Devin J.; Berglyd Olsen, Veronica K.; Ilya, Komarov; Pierpont, Elaine; Gillman, Matthew

    2016-05-01

    In reply to Louise Mayor's feature article “Where people and particles collide”, about the experiences of researchers at CERN who are lesbian, gay, bisexual or transgender (LGBT), efforts to make LGBT CERN an officially recognized club, and incidents where posters advertising the club have been torn down or defaced (March pp31-36, http://ow.ly/YVP2Z).

  1. The Secret Chambers in the Chephren Pyramid

    ERIC Educational Resources Information Center

    Gutowski, Bartosz; Józwiak, Witold; Joos, Markus; Kempa, Janusz; Komorowska, Kamila; Krakowski, Kamil; Pijus, Ewa; Szymczak, Kamil; Trojanowska, Malgorzata

    2018-01-01

    In 2016, we (seven high school students from a school in Plock, Poland) participated in the CERN Beamline for Schools competition. Together with our team coach, Mr. Janusz Kempa, we submitted a proposal to CERN that was selected as one of two winning proposals that year. This paper describes our experiment from the early days of brainstorming to…

  2. Lead Ions and Coulomb's Law at the LHC (CERN)

    ERIC Educational Resources Information Center

    Cid-Vidal, Xabier; Cid, Ramon

    2018-01-01

    Although for most of the time the Large Hadron Collider (LHC) at CERN collides protons, for around one month every year lead ions are collided, to expand the diversity of the LHC research programme. Furthermore, in an effort not originally foreseen, proton-lead collisions are also taking place, with results of high interest to the physics…

  3. From strangeness enhancement to quark-gluon plasma discovery

    NASA Astrophysics Data System (ADS)

    Koch, Peter; Müller, Berndt; Rafelski, Johann

    2017-11-01

    This is a short survey of signatures and characteristics of the quark-gluon plasma in the light of experimental results that have been obtained over the past three decades. In particular, we present an in-depth discussion of the strangeness observable, including a chronology of the experimental effort to detect QGP at CERN-SPS, BNL-RHIC, and CERN-LHC.

  4. Ceremony 25th birthday Cern

    ScienceCinema

    None

    2018-05-18

    Celebration of CERN's 25th birthday with a speech by L. Van Hove and J.B. Adams, musical interludes by Ms. Mey and her colleagues (starting with Beethoven). The general managers then proceed with the presentation of souvenirs to members of the personnel who have 25 years of service in the organization. A gesture of recognition is also given to Zwerner.

  5. Committees

    NASA Astrophysics Data System (ADS)

    2004-10-01

    Fritz Caspers (CERN, Switzerland), Michel Chanel (CERN, Switzerland), Håkan Danared (MSL, Sweden), Bernhard Franzke (GSI, Germany), Manfred Grieser (MPI für Kernphysik, Germany), Dieter Habs (LMU München, Germany), Jeffrey Hangst (University of Aarhus, Denmark), Takeshi Katayama (RIKEN/Univ. Tokyo, Japan), H.-Jürgen Kluge (GSI, Germany), Shyh-Yuan Lee (Indiana University, USA), Rudolf Maier (FZ Jülich, Germany), John Marriner (FNAL, USA), Igor Meshkov (JINR, Russia), Dieter Möhl (CERN, Switzerland), Vasily Parkhomchuk (BINP, Russia), Robert Pollock (Indiana University), Dieter Prasuhn (FZ Jülich, Germany), Dag Reistad (TSL, Sweden), John Schiffer (ANL, USA), Andrew Sessler (LBNL, USA), Alexander Skrinsky (BINP, Russia), Markus Steck (GSI, Germany), Jie Wei (BNL, USA), Andreas Wolf (MPI für Kernphysik, Germany), Hongwei Zhao (IMP, People's Rep. of China).

  6. Across Europe to CERN: Taking students on the ultimate physics experience

    NASA Astrophysics Data System (ADS)

    Wheeler, Sam

    2018-05-01

    In 2013, I was an Einstein Fellow with the U.S. Department of Energy, and I was asked by a colleague working in a senator's office if I would join him in a meeting with a physicist to "translate" the science into something more understandable. That meeting turned out to be a wonderful opportunity I would never otherwise have had. During the meeting I met Michael Tuts, a physicist working on the ATLAS project at CERN. Afterwards, I walked with him out of the Senate office building to Union Station and, in parting, he gave me his card and told me that if I were ever in Geneva he could help me get a tour of CERN and the LHC.

  7. User and group storage management at the CMS CERN T2 centre

    NASA Astrophysics Data System (ADS)

    Cerminara, G.; Franzoni, G.; Pfeiffer, A.

    2015-12-01

    A wide range of detector commissioning, calibration and data analysis tasks is carried out by CMS using dedicated storage resources available at the CMS CERN Tier-2 centre. Relying on the functionalities of the EOS disk-only storage technology, the optimal exploitation of the CMS user/group resources has required the introduction of policies for data access management, data protection, cleanup campaigns based on access pattern, and long-term tape archival. The resource management has been organised around the definition of working groups and the delegation of each group's composition to an identified responsible person. In this paper we illustrate the user/group storage management, and the development and operational experience at the CMS CERN Tier-2 centre in the 2012-2015 period.

  8. [CERN-MEDICIS (Medical Isotopes Collected from ISOLDE): a new facility].

    PubMed

    Viertl, David; Buchegger, Franz; Prior, John O; Forni, Michel; Morel, Philippe; Ratib, Osman; Bühler, Léo H; Stora, Thierry

    2015-06-17

    CERN-MEDICIS is a facility dedicated to research and development in life science and medical applications. The research platform was inaugurated in October 2014 and will produce an increasing range of innovative isotopes using the proton beam of ISOLDE for fundamental studies in cancer research, for new imaging and therapy protocols in cell and animal models and for preclinical trials, possibly extended to specific early-phase clinical studies (phase 0) up to phase I trials. CERN, the University Hospital of Geneva (HUG), the University Hospital of Lausanne (CHUV) and the Swiss Institute for Experimental Cancer Research (ISREC) at the Swiss Federal Institute of Technology in Lausanne (EPFL), which currently support the project, will benefit from the initial production, which will then be extended to other centers.

  9. LCG MCDB—a knowledgebase of Monte-Carlo simulated events

    NASA Astrophysics Data System (ADS)

    Belov, S.; Dudko, L.; Galkin, E.; Gusev, A.; Pokorski, W.; Sherstnev, A.

    2008-02-01

    In this paper we report on the LCG Monte-Carlo Data Base (MCDB) and the software which has been developed to operate MCDB. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC Collaborations by experts. In many cases, modern Monte-Carlo simulation of physical processes requires expert knowledge of Monte-Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase mainly dedicated to accumulating simulated events of this type. The main motivation behind LCG MCDB is to make sophisticated MC event samples available to various physics groups. All the data in MCDB are accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project.
    Program summary
    Program title: LCG Monte-Carlo Data Base
    Catalogue identifier: ADZX_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZX_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public Licence
    No. of lines in distributed program, including test data, etc.: 30 129
    No. of bytes in distributed program, including test data, etc.: 216 943
    Distribution format: tar.gz
    Programming language: Perl
    Computer: CPU: Intel Pentium 4, RAM: 1 Gb, HDD: 100 Gb
    Operating system: Scientific Linux CERN 3/4
    RAM: 1 073 741 824 bytes (1 Gb)
    Classification: 9
    External routines: perl >= 5.8.5; Perl modules DBD-mysql >= 2.9004, File::Basename, GD::SecurityImage, GD::SecurityImage::AC, Linux::Statistics, XML::LibXML > 1.6, XML::SAX, XML::NamespaceSupport; Apache HTTP Server >= 2.0.59; mod_auth_external >= 2.2.9; edg-utils-system RPM package; gd >= 2.0.28; rpm package CASTOR-client >= 2.1.2-4; arc-server (optional)
    Nature of problem: Different groups of experimentalists often prepare similar samples of particle collision events, or turn to the same group of Monte-Carlo (MC) generator authors to prepare the events. For example, the same MC samples of Standard Model (SM) processes can be employed either in SM analyses (as signal) or in searches for new phenomena in Beyond Standard Model analyses (as background). If the samples are made publicly available and equipped with comprehensive documentation, cross checks of the samples themselves and of the physical models applied can be sped up. Some event samples require substantial computing resources to prepare, so central storage of the samples prevents researcher time and computing resources from being wasted preparing the same events many times.
    Solution method: Creation of a special knowledgebase (MCDB) designed to keep event samples for the LHC experimental and phenomenological community. The knowledgebase is realised as a separate web server (http://mcdb.cern.ch). All event samples are kept on tapes at CERN. Documentation describing the events is the main content of MCDB. Users can browse the knowledgebase, read and comment on articles (documentation), and download event samples. Authors can upload new event samples, create new articles, and edit their own articles.
    Restrictions: The software is tailored to the problems described in the article; there are no additional restrictions.
    Unusual features: The software provides a framework to store and document large files with a flexible authentication and authorisation system. Different external storage systems with large capacity can be used to keep the files. The web content management system provides all of the necessary interfaces for the authors of the files, end users and administrators.
    Running time: Real-time operation.
    References: [1] The main LCG MCDB server, http://mcdb.cern.ch/. [2] P. Bartalini, L. Dudko, A. Kryukov, I.V. Selyuzhenkov, A. Sherstnev, A. Vologdin, LCG Monte-Carlo data base, hep-ph/0404241. [3] J.P. Baud, B. Couturier, C. Curran, J.D. Durand, E. Knezo, S. Occhetti, O. Barring, CASTOR: status and evolution, cs.oh/0305047.

  10. Transaction aware tape-infrastructure monitoring

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Fotios; Kruse, Daniele Francesco

    2014-06-01

    Administering a large-scale, multi-protocol, hierarchical tape infrastructure like the CERN Advanced STORage manager (CASTOR)[2], which now stores 100 PB and grows by about 25 PB per year, requires an adequate monitoring system for quick spotting of malfunctions, easier debugging and on-demand report generation. The main challenges for such a system are: coping with CASTOR's log format diversity and its information scattered among several log files; the need for long-term information archival; the strict reliability requirements; and the group-based GUI visualisation. For this purpose, we have designed, developed and deployed a centralized system consisting of four independent layers: the Log Transfer layer for collecting log lines from all tape servers to a single aggregation server, the Data Mining layer for combining log data into transaction context, the Storage layer for archiving the resulting transactions and finally the Web UI layer for accessing the information. With flexibility, extensibility and maintainability in mind, each layer is designed to work as a message broker for the next layer, providing a clean and generic interface while ensuring consistency, redundancy and ultimately fault tolerance. This system unifies information previously dispersed over several monitoring tools into a single user interface, using Splunk, which also allows us to provide information visualization based on access control lists (ACL). Since its deployment, it has been successfully used by CASTOR tape operators for a quick overview of transactions, performance evaluation and malfunction detection, and by managers for report generation.
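
    The core idea of the Data Mining layer, reassembling log lines scattered across files into one context per transaction, can be sketched as follows (the log format, field names and values here are invented for illustration, not CASTOR's actual format):

```python
# Illustrative sketch: group key=value log lines into per-transaction
# contexts, ordered by timestamp. Not the CASTOR monitoring code.
from collections import defaultdict

raw_logs = [
    "ts=1 txn=42 host=tps01 msg=mount_requested",
    "ts=2 txn=43 host=tps02 msg=mount_requested",
    "ts=3 txn=42 host=tps01 msg=read_started",
    "ts=5 txn=42 host=tps01 msg=read_done",
]

def parse(line):
    """Turn a 'k=v k=v ...' log line into a dict."""
    return dict(field.split("=", 1) for field in line.split())

def by_transaction(lines):
    """Group parsed events by transaction id, sorted by timestamp."""
    txns = defaultdict(list)
    for rec in map(parse, lines):
        txns[rec["txn"]].append(rec)
    for events in txns.values():
        events.sort(key=lambda r: int(r["ts"]))
    return dict(txns)

txns = by_transaction(raw_logs)
print(len(txns["42"]))  # three scattered lines become one transaction
```

    In the real system this grouping runs as a broker between the Log Transfer and Storage layers, so downstream consumers only ever see whole transactions.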

  11. Asymmetric B-factory note

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calderon, M.

    The three main purposes of our visit to CERN, ESRF and DESY were to: assess the current thinking at CERN on whether Eta, the gas desorption coefficient, would continue to decrease with continued beam cleaning; determine if the time between NEG reconditionings could be extended; and acquire knowledge of the basic fabrication processes and techniques for producing copper beam vacuum chambers.

  12. The Proton Synchrotron (PS): At the Core of the CERN Accelerators

    NASA Astrophysics Data System (ADS)

    Cundy, Donald; Gilardoni, Simone

    The following sections are included: * Introduction * Extraction: Getting the Beam to Leave the Accelerator * Acceleration and Bunch Gymnastics * Boosting PS Beam Intensity * Capacitive Energy Storage Replaces Flywheel * Taking the Neutrinos by the Horns * OMEGA: Towards the Electronic Bubble Chamber * ISOLDE: Targeting a New Era in Nuclear Physics * The CERN n_TOF Facility: Catching Neutrons on the Fly * References

  13. Taking Energy to the Physics Classroom from the Large Hadron Collider at CERN

    ERIC Educational Resources Information Center

    Cid, Xabier; Cid, Ramon

    2009-01-01

    In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…

  14. The Higgs Boson: Is the End in Sight?

    ERIC Educational Resources Information Center

    Lincoln, Don

    2012-01-01

    This summer, perhaps while you were lounging around the pool in the blistering heat, the blogosphere was buzzing about data taken at the Large Hadron Collider at CERN. The buzz reached a crescendo in the first week of July when both Fermilab and CERN announced the results of their searches for the Higgs boson. Hard data confronted a theory nearly…

  15. The kaon identification system in the NA62 experiment at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romano, A.

    2015-07-01

    The main goal of the NA62 experiment at CERN is to measure the branching ratio of the ultra-rare K{sup +} → π{sup +} ν ν-bar decay with 10% accuracy. NA62 will use a 750 MHz high-energy unseparated charged hadron beam, with kaons corresponding to ∼6% of the beam, and a kaon decay-in-flight technique. The positive identification of kaons is performed with a differential Cherenkov detector (CEDAR), filled with nitrogen gas and placed in the incoming beam. To cope with the kaon rate (45 MHz average) and meet the performance required in NA62, the Cherenkov detector has been upgraded (KTAG) with new photon detectors, readout, mechanics and cooling systems. The KTAG provides a fast identification of kaons with an efficiency of at least 95% and precise time information with a resolution below 100 ps. A half-equipped KTAG detector was commissioned during a technical run at CERN in 2012, while the fully equipped detector, its readout and front-end were commissioned during a pilot run at CERN in October 2014. The measured time resolution and efficiency are within the required performance. (authors)

  16. Lecture archiving on a larger scale at the University of Michigan and CERN

    NASA Astrophysics Data System (ADS)

    Herr, Jeremy; Lougheed, Robert; Neal, Homer A.

    2010-04-01

    The ATLAS Collaboratory Project at the University of Michigan has been a leader in the area of collaborative tools since 1999. Its activities include the development of standards, software and hardware tools for lecture archiving, and making recommendations for videoconferencing and remote teaching facilities. Starting in 2006 our group became involved in classroom recordings, and in early 2008 we launched CARMA, a University-wide recording service. This service uses a new portable recording system that we developed. Capture, archiving and dissemination of rich multimedia content from lectures, tutorials and classes are increasingly widespread activities among universities and research institutes. A growing array of related commercial and open source technologies is becoming available, with several new products introduced in the last couple of years. As the result of a new close partnership between U-M and CERN IT, a market survey of these products was conducted and a summary of the results is presented here. It is informing an ambitious effort in 2009 to equip many CERN rooms with automated lecture archiving systems, on a much larger scale than before. This new technology is being integrated with CERN's existing webcast, CDS, and Indico applications.

  17. DataBase on Demand

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, R.; Gomez, D.; Coterillo Coz, I.; Wojcik, D.

    2012-12-01

    At CERN a number of key database applications run on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the centralised Oracle-based database services. Database on Demand (DBoD) empowers users to perform certain actions that have traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, at present the open community version of MySQL and single-instance Oracle database servers. This article describes the technology approach taken to face this challenge, the service level agreement (SLA) that the project provides, and possible evolution scenarios.

  18. 65th birthday Jack Steinberger

    ScienceCinema

    None

    2017-12-09

    Laudation for Jack Steinberger, born 25 May 1921, on the occasion of his 65th birthday and his official retirement, for his invaluable collaboration at CERN. Nevertheless, his main activity will continue as before with his research at CERN. Several speakers take the floor (e.g. E. Picasso) to congratulate him and pay him tribute.

  19. History of Cern

    ScienceCinema

    None

    2017-12-09

    Ceremony on the occasion of the publication of the first volume of the book on the history of CERN, with several people present who played an important role in this European organization, crowned with success thanks to the spirit of the founding members, which is and will remain essential.

  20. Investigating the Inverse Square Law with the Timepix Hybrid Silicon Pixel Detector: A CERN [at] School Demonstration Experiment

    ERIC Educational Resources Information Center

    Whyntie, T.; Parker, B.

    2013-01-01

    The Timepix hybrid silicon pixel detector has been used to investigate the inverse square law of radiation from a point source as a demonstration of the CERN [at] school detector kit capabilities. The experiment described uses a Timepix detector to detect the gamma rays emitted by an [superscript 241]Am radioactive source at a number of different…

  1. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

    The increasing availability of Cloud resources is arising as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing is an easy way to access resources from both systems efficiently. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data centre infrastructures, allowing virtual clusters to be created on demand, including public, private and hybrid clouds. This approach required developing an extension to the previous DIRAC Virtual Machine engine, which was developed for Amazon EC2, allowing the connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, which permits other Virtual Machine sources and operating systems to be used. In both cases, the CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, such as the DIRAC Web Portal. The main purpose of this integration is to get a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC to a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach to the existing Grid solution.
License Notice: Published under licence in Journal of Physics: Conference Series by IOP Publishing Ltd.

  2. Radiation protection challenges in the management of radioactive waste from high-energy accelerators.

    PubMed

    Ulrici, Luisa; Algoet, Yvon; Bruno, Luca; Magistris, Matteo

    2015-04-01

    The European Laboratory for Particle Physics (CERN) has operated high-energy accelerators for fundamental physics research for nearly 60 y. The side-product of this activity is the radioactive waste, which is mainly generated as a result of preventive and corrective maintenance, upgrading activities and the dismantling of experiments or accelerator facilities. Prior to treatment and disposal, it is common practice to temporarily store radioactive waste on CERN's premises and it is a legal requirement that these storage facilities are safe and secure. Waste treatment typically includes sorting, segregation, volume and size reduction and packaging, which will depend on the type of component, its chemical composition, residual activity and possible surface contamination. At CERN, these activities are performed in a dedicated waste treatment centre under the supervision of the Radiation Protection Group. This paper gives an overview of the radiation protection challenges in the conception of a temporary storage and treatment centre for radioactive waste in an accelerator facility, based on the experience gained at CERN. The CERN approach consists of the classification of waste items into 'families' with similar radiological and physical-chemical properties. This classification allows the use of specific, family-dependent techniques for radiological characterisation and treatment, which are simultaneously efficient and compliant with best practices in radiation protection. The storage was planned on the basis of radiological and other possible hazards such as toxicity, pollution and fire load. Examples are given of technical choices for the treatment and radiological characterisation of selected waste families, which could be of interest to other accelerator facilities. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. Air liquide 1.8 K refrigeration units for CERN LHC project

    NASA Astrophysics Data System (ADS)

    Hilbert, Benoît; Gistau-Baguer, Guy M.; Caillaud, Aurélie

    2002-05-01

    The Large Hadron Collider (LHC) will be CERN's next research instrument for high energy physics. This 27 km long circular accelerator will make intensive use of superconducting magnets, operated below 2.0 K. It will thus require high capacity refrigeration below 2.0 K [1, 2]. Coupled to a refrigerator providing 18 kW equivalent at 4.5 K [3], these systems will be able to absorb a cryogenic power of 2.4 kW at 1.8 K in nominal conditions. Air Liquide has designed one Cold Compressor System (CCS) pre-series unit for CERN, preceding three more (among eight in total located around the machine). These systems, which use cryogenic centrifugal compressors in a series arrangement coupled to room-temperature screw compressors, are presented. Key component characteristics are given.

  4. Upgrade of the cryogenic CERN RF test facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pirotte, O.; Benda, V.; Brunner, O.

    2014-01-29

    With the large number of superconducting radiofrequency (RF) cryomodules to be tested for the former LEP and the present LHC accelerator, an RF test facility was erected early in the 1990s in the largest cryogenic test facility at CERN, located at Point 18. This facility consisted of four vertical test stands for single cavities and originally one, then two, horizontal test benches for RF cryomodules operating at 4.5 K in saturated helium. CERN is presently working on the upgrade of its accelerator infrastructure, which requires new superconducting cavities operating below 2 K in saturated superfluid helium. Consequently, the RF test facility has been renewed in order to allow efficient cavity and cryomodule tests in superfluid helium and to improve its thermal performance. The new RF test facility is described and its performance is presented.

  5. Wolfgang Kummer at CERN

    NASA Astrophysics Data System (ADS)

    Schopper, Herwig

    Wolfgang Kummer was not only a great theorist but also a man with a noble spirit and broad education, rooted in a fascinating, long-standing Austrian cultural tradition. As an experimentalist I am not sufficiently knowledgeable to evaluate his contributions to theoretical physics; this will certainly be done by more competent scientists. Nevertheless I admired him for being attached not only to fundamental and abstract problems like quantum field theory, quantum gravity or black holes, but also to down-to-earth questions like electron-proton scattering or the toponium mass. I got to know Wolfgang Kummer very well and came to appreciate his human qualities during his long attachment to CERN, in particular when he served as president of the CERN Council, the highest decision-making authority of this international research centre, from 1985 to 1987, which fell within my term as Director-General…

  6. Database on Demand: insight how to build your own DBaaS

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, Ruben; Coterillo Coz, Ignacio

    2015-12-01

    At CERN, a number of key database applications run on user-managed MySQL, PostgreSQL and Oracle database services. The Database on Demand (DBoD) project was born out of an idea to provide the CERN user community with an environment to develop and run database services as a complement to the central Oracle-based database service. Database on Demand empowers users to perform certain actions that had traditionally been done by database administrators, providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines; at present, offerings cover three major RDBMS (relational database management system) vendors. In this article we show the status of the service after almost three years of operations, give some insight into our redesigned software engineering, and outline its near-future evolution.

  7. The beam test of muon detector parameters for the SHiP experiment at CERN

    NASA Astrophysics Data System (ADS)

    Likhacheva, V. L.; Kudenko, Yu. G.; Mefodiev, A. V.; Mineev, O. V.; Khotyantsev, A. N.

    2018-01-01

    Scintillation detectors based on extruded plastics have been tested in a 10 GeV/c beam at CERN. The scintillation signal readout was provided by Kuraray Y11 wavelength-shifting optical fibers and Hamamatsu MPPC micropixel avalanche photodiodes. The light yield was scanned along and across the detectors. The time resolution was determined by fitting the rise of the digitized MPPC pulse, among other methods.

  8. Determining the structure of Higgs couplings at the CERN Large Hadron Collider.

    PubMed

    Plehn, Tilman; Rainwater, David; Zeppenfeld, Dieter

    2002-02-04

    Higgs boson production via weak boson fusion at the CERN Large Hadron Collider has the capability to determine the dominant CP nature of a Higgs boson, via the tensor structure of its coupling to weak bosons. This information is contained in the azimuthal angle distribution of the two outgoing forward tagging jets. The technique is independent of both the Higgs boson mass and the observed decay channel.

  9. CERN data services for LHC computing

    NASA Astrophysics Data System (ADS)

    Espinal, X.; Bocchi, E.; Chan, B.; Fiorot, A.; Iven, J.; Lo Presti, G.; Lopez, J.; Gonzalez, H.; Lamanna, M.; Mascetti, L.; Moscicki, J.; Pace, A.; Peters, A.; Ponce, S.; Rousseau, H.; van der Ster, D.

    2017-10-01

    Dependability, resilience, adaptability and efficiency: growing requirements call for tailored storage services and novel solutions. Unprecedented volumes of data coming from the broad number of experiments at CERN need to be quickly available in a highly scalable way for large-scale processing and data distribution, while in parallel being routed to tape for long-term archival. These activities are critical for the success of HEP experiments. Nowadays we operate at high incoming throughput (14 GB/s during the 2015 LHC Pb-Pb run and 11 PB in July 2016) and with concurrent, complex production workloads. In parallel, our systems provide the platform for continuous user- and experiment-driven workloads for large-scale data analysis, including end-user access and sharing. The storage services at CERN cover the needs of our community: EOS and CASTOR as large-scale storage; CERNBox for end-user access and sharing; Ceph as data back-end for the CERN OpenStack infrastructure, NFS services and S3 functionality; AFS for legacy distributed-file-system services. In this paper we summarise the experience in supporting LHC experiments and the transition of our infrastructure from static monolithic systems to flexible components providing a more coherent environment with pluggable protocols, tuneable QoS, sharing capabilities and fine-grained ACL management, while continuing to guarantee dependable and robust services.

  10. Commissioning results of CERN HIE-ISOLDE and INFN ALPI cryogenic control systems

    NASA Astrophysics Data System (ADS)

    Inglese, V.; Pezzetti, M.; Calore, A.; Modanese, P.; Pengo, R.

    2017-02-01

    The cryogenic systems of both accelerators, namely HIE ISOLDE (High Intensity and Energy Isotope Separator On Line DEvice) at CERN and ALPI (Acceleratore Lineare Per Ioni) at LNL, have been refurbished. HIE ISOLDE is a major upgrade of the existing ISOLDE facilities, which required the construction of a superconducting linear accelerator consisting of six cryomodules, each containing five superconducting RF cavities and superconducting solenoids. The ALPI linear accelerator, similar to HIE ISOLDE, is located at Legnaro National Laboratories (LNL) and became operational in the early 1990s. It is composed of 74 superconducting RF cavities, assembled inside 22 cryostats. The new control systems are equipped with PLCs developed on the CERN UNICOS framework, including Schneider and Siemens PLCs and various fieldbuses (Profibus DP and PA, WorldFIP). The control systems were developed in synergy between CERN and LNL in order to build, effectively and with an optimized use of resources, control systems that enhance ease of operation, maintainability and long-term availability. This paper describes (i) the cryogenic systems, with special focus on the design of the control system hardware and software, (ii) the strategy adopted in order to achieve a synergic approach, and (iii) the commissioning results after the cool-down of the cryomodules to 4.5 K.

  11. The CMS tracker control system

    NASA Astrophysics Data System (ADS)

    Dierlamm, A.; Dirkes, G. H.; Fahrer, M.; Frey, M.; Hartmann, F.; Masetti, L.; Militaru, O.; Shah, S. Y.; Stringer, R.; Tsirou, A.

    2008-07-01

    The Tracker Control System (TCS) is distributed control software that operates about 2000 power supplies for the silicon modules of the CMS Tracker and monitors its environmental sensors. TCS must thus be able to handle about 10^4 power supply parameters, about 10^3 environmental probes from the Programmable Logic Controllers of the Tracker Safety System (TSS), and about 10^5 parameters read via DAQ from the DCUs in all front-end hybrids and from CCUs in all control groups. TCS is built on top of an industrial SCADA program (PVSS), extended with a framework developed at CERN (JCOP) and used by all LHC experiments. The logical partitioning of the detector is reflected in the hierarchical structure of the TCS, where commands move down to the individual hardware devices, while states are reported up to the root, which is interfaced to the broader CMS control system. The system computes and continuously monitors the mean and maximum values of critical parameters and updates the percentage of currently operating hardware. Automatic procedures switch off selected parts of the detector with fine granularity, avoiding widespread TSS intervention.
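    The command/state pattern described above (commands propagate down a device tree, states aggregate up to the root) can be sketched in a few lines. This is a hypothetical illustration, not the actual PVSS/JCOP implementation; the node names and the severity ordering are assumptions.

```python
# Hypothetical sketch of a hierarchical control tree: commands move down
# to the leaf devices, while each parent reports the worst child state up.
SEVERITY = {"OK": 0, "WARNING": 1, "ERROR": 2}

class Node:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.state = "OK"

    def command(self, cmd):
        # Commands propagate down to the individual hardware devices.
        self.apply(cmd)
        for child in self.children:
            child.command(cmd)

    def apply(self, cmd):
        # Toy device behaviour (illustrative assumption).
        if cmd == "SWITCH_OFF":
            self.state = "OK"

    def aggregate_state(self):
        # States are reported up: a parent takes the worst state below it.
        states = [c.aggregate_state() for c in self.children] + [self.state]
        return max(states, key=lambda s: SEVERITY[s])

# A toy two-level partition of the detector.
ps1, ps2 = Node("PS-1"), Node("PS-2")
sector = Node("Sector-A", [ps1, ps2])
root = Node("Tracker", [sector])

ps2.state = "ERROR"
print(root.aggregate_state())  # prints "ERROR": the worst state reaches the root
```

    In the same spirit, an automatic procedure would issue a command on one sub-tree only, leaving the rest of the hierarchy untouched.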

  12. Optimising LAN access to grid enabled storage elements

    NASA Astrophysics Data System (ADS)

    Stewart, G. A.; Cowan, G. A.; Dunne, B.; Elwell, A.; Millar, A. P.

    2008-07-01

    When operational, the Large Hadron Collider experiments at CERN will collect tens of petabytes of physics data per year. The worldwide LHC computing grid (WLCG) will distribute this data to over two hundred Tier-1 and Tier-2 computing centres, enabling particle physicists around the globe to access the data for analysis. Although different middleware solutions exist for effective management of storage systems at collaborating institutes, the patterns of access envisaged for Tier-2s fall into two distinct categories. The first involves bulk transfer of data between different Grid storage elements using protocols such as GridFTP. This data movement will principally involve writing ESD and AOD files into Tier-2 storage. Secondly, once datasets are stored at a Tier-2, physics analysis jobs will read the data from the local SE. Such jobs require a POSIX-like interface to the storage so that individual physics events can be extracted. In this paper we consider the performance of POSIX-like access to files held in Disk Pool Manager (DPM) storage elements, a popular lightweight SRM storage manager from EGEE.

  13. Interoperating Cloud-based Virtual Farms

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Colamaria, F.; Colella, D.; Casula, E.; Elia, D.; Franco, A.; Lusso, S.; Luparello, G.; Masera, M.; Miniello, G.; Mura, D.; Piano, S.; Vallero, S.; Venaruzzo, M.; Vino, G.

    2015-12-01

    The present work aims at optimizing the use of computing resources available at the Italian Tier-2 grid sites of the ALICE experiment at the CERN LHC by making them accessible to interactive distributed analysis, thanks to modern solutions based on cloud computing. The scalability and elasticity of the computing resources via dynamic (“on-demand”) provisioning is essentially limited by the size of the computing site, reaching the theoretical optimum only in the asymptotic case of infinite resources. The main challenge of the project is to overcome this limitation by federating different sites through a distributed cloud facility. Storage capacities of the participating sites are seen as a single federated storage area, avoiding the need to mirror data across them: high data access efficiency is guaranteed by location-aware analysis software and storage interfaces, in a way transparent to the end user. Moreover, interactive analysis on the federated cloud reduces the execution time with respect to grid batch jobs. Tests of the investigated solutions for both cloud computing and distributed storage on a wide area network are presented.

  14. Orthos, an alarm system for the ALICE DAQ operations

    NASA Astrophysics Data System (ADS)

    Chapeland, Sylvain; Carena, Franco; Carena, Wisla; Chibante Barroso, Vasco; Costa, Filippo; Denes, Ervin; Divia, Roberto; Fuchs, Ulrich; Grigore, Alexandru; Simonetti, Giuseppe; Soos, Csaba; Telesca, Adriana; Vande Vyvre, Pierre; von Haller, Barthelemy

    2012-12-01

    ALICE (A Large Ion Collider Experiment) is the heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). The DAQ (Data Acquisition System) facilities handle the data flow from the detector electronics up to mass storage. The DAQ system is based on a large farm of commodity hardware consisting of more than 600 devices (Linux PCs, storage, network switches), and controls hundreds of distributed hardware and software components interacting together. This paper presents Orthos, the alarm system used to detect, log, report, and follow up on abnormal situations on the DAQ machines at the experimental area. The main objective of this package is to integrate alarm detection and notification mechanisms with a full-featured issue tracker, in order to prioritize, assign, and fix system failures optimally. The tool relies on a database repository with a logic engine, SQL interfaces to inject or query metrics, and dynamic web pages for user interaction. We describe the system architecture, the technologies used for the implementation, and the integration with existing monitoring tools.

  15. The VISPA internet platform for outreach, education and scientific research in various experiments

    NASA Astrophysics Data System (ADS)

    van Asseldonk, D.; Erdmann, M.; Fischer, B.; Fischer, R.; Glaser, C.; Heidemann, F.; Müller, G.; Quast, T.; Rieger, M.; Urban, M.; Welling, C.

    2015-12-01

    VISPA provides a graphical front-end to computing infrastructures, giving its users all the functionality needed for working conditions comparable to a personal computer. It is a framework that can be extended with custom applications to support individual needs, e.g. graphical interfaces for experiment-specific software. By design, VISPA serves as a multipurpose platform for many disciplines and experiments, as demonstrated by the following use cases: a GUI to the OFFLINE analysis framework of the Pierre Auger collaboration, submission and monitoring of computing jobs, university teaching of hundreds of students, and outreach activity, especially in CERN's open data initiative. Serving heterogeneous user groups and applications has given us a great deal of experience. This helps us mature the system, i.e. improve its robustness and responsiveness and the interplay of its components. Among the lessons learned are the choice of a file system, the implementation of websockets, efficient load balancing, and the fine-tuning of existing technologies such as RPC over SSH. We present the improved server setup in detail and report on the performance, user acceptance and realized applications of the system.

  16. Heat transfer at a sapphire - indium interface in the 30 mK - 300 mK temperature range

    NASA Astrophysics Data System (ADS)

    Liberadzka, J.; Koettig, T.; Bremer, J.; van der Post, C. C. W.; ter Brake, H. J. M.

    2017-02-01

    Within the framework of the AEgIS (Antimatter Experiment: Gravity, Interferometry, Spectroscopy) project, a direct measurement of the Earth's gravitational acceleration on antihydrogen will be carried out. In order to obtain satisfactory measurement precision, the thermal motion of the particles must be reduced. Therefore a Penning trap, which is used to trap antiprotons and create antihydrogen, will be placed on the mixing chamber of a specially designed dilution refrigerator. The trap consists of 10 electrodes, which need to be electrically insulated but thermally anchored. To ensure that the trap remains at a temperature below 100 mK, the heat transfer at the metal-dielectric boundary is investigated. A copper - indium - sapphire - indium - copper sandwich setup was mounted on the CERN Cryolab dilution refrigerator. Keeping the mixing chamber at a constant low temperature in the range of 30 mK to 300 mK, steady-state measurements with indium in the normal-conducting and superconducting states have been performed. The results obtained, along with a precise description of our setup, are presented.

  17. A Medipix3 readout system based on the National Instruments FlexRIO card and using the LabVIEW programming environment

    NASA Astrophysics Data System (ADS)

    Horswell, I.; Gimenez, E. N.; Marchal, J.; Tartoni, N.

    2011-01-01

    Hybrid silicon photon-counting detectors are becoming standard equipment for many synchrotron applications. The latest in the Medipix family of readout chips designed as part of the Medipix Collaboration at CERN is the Medipix3, which, while maintaining the same pixel size as its predecessor, offers increased functionality and more operating modes. The active area of the Medipix3 chip is approximately 14 mm × 14 mm (containing 256 × 256 pixels), which is not large enough for many detector applications; this results in the need to tile many sensors and chips. As a first step on the road to developing such a detector, it was decided to build a prototype single-chip readout system to gain the necessary experience in operating a Medipix3 chip. To provide a flexible learning and development tool, an interface was built based on the recently released FlexRIO™ system from National Instruments, using the LabVIEW™ graphical programming environment. This system and the achieved performance are described in this paper.

  18. About Separation of Hadron and Electromagnetic Cascades in the Pamela Calorimeter

    NASA Astrophysics Data System (ADS)

    Stozhkov, Yuri I.; Basili, A.; Bencardino, R.; Casolino, M.; de Pascale, M. P.; Furano, G.; Menicucci, A.; Minori, M.; Morselli, A.; Picozza, P.; Sparvoli, R.; Wischnewski, R.; Bakaldin, A.; Galper, A. M.; Koldashov, S. V.; Korotkov, M. G.; Mikhailov, V. V.; Voronov, S. A.; Yurkin, Y. T.; Adriani, O.; Bonechi, L.; Bongi, M.; Papini, P.; Ricciarini, S. B.; Spillantini, P.; Straulino, S.; Taccetti, F.; Vannuccini, E.; Castellini, G.; Boezio, M.; Bonvicini, M.; Mocchiutti, E.; Schiavon, P.; Vacchi, A.; Zampa, G.; Zampa, N.; Carlson, P.; Lund, J.; Lundquist, J.; Orsi, S.; Pearce, M.; Barbarino, G. C.; Campana, D.; Osteria, G.; Rossi, G.; Russo, S.; Boscherini, M.; Mennh, W.; Simonh, M.; Bongiorno, L.; Ricci, M.; Ambriola, M.; Bellotti, R.; Cafagna, F.; Circella, M.; de Marzo, C.; Giglietto, N.; Mirizzi, N.; Romita, M.; Spinelli, P.; Bogomolov, E.; Krutkov, S.; Vasiljev, G.; Bazilevskaya, G. A.; Kvashnin, A. N.; Logachev, V. I.; Makhmutov, V. S.; Maksumov, O. S.; Stozhkov, Yu. I.; Mitchell, J. W.; Streitmatter, R. E.; Stochaj, S. J.

    Results of the calibration of the PAMELA instrument at the CERN facilities are discussed. In September 2003, the calibration of the Neutron Detector together with the Calorimeter was performed with CERN beams of electrons and protons with energies of 20 - 180 GeV. The addition of the Neutron Detector increases the rejection factor of hadrons against electrons by about a factor of ten. The results of the calibration are in agreement with calculations.

  19. DAMPE prototype and its beam test results at CERN

    NASA Astrophysics Data System (ADS)

    Wu, Jian; Hu, Yiming; Chang, Jin

    The first Chinese high-energy cosmic particle detector (DAMPE) aims to detect electrons and gammas in the range between 5 GeV and 10 TeV in space. A prototype of this detector was built and tested using both cosmic muons and a test beam at CERN. Energy and spatial resolution, as well as strong electron/proton separation power, are shown in the results. The detector structure is illustrated as well.

  20. Measurement of the inclusive jet cross section at the CERN pp collider

    NASA Astrophysics Data System (ADS)

    Arnison, G.; Albrow, M. G.; Allkofer, O. C.; Astbury, A.; Aubert, B.; Bacci, C.; Batley, J. R.; Bauer, G.; Bettini, A.; Bézaguet, A.; Bock, R. K.; Bos, K.; Buckley, E.; Bunn, J.; Busetto, G.; Catz, P.; Cennini, P.; Centro, S.; Ceradini, F.; Ciapetti, G.; Cittolin, S.; Clarke, D.; Cline, D.; Cochet, C.; Colas, J.; Colas, P.; Corden, M.; Cox, G.; Dallman, D.; Dau, D.; Debeer, M.; Debrion, J. P.; Degiorgi, M.; della Negra, M.; Demoulin, M.; Denby, B.; Denegri, D.; Diciaccio, A.; Dobrzynski, L.; Dorenbosch, J.; Dowell, J. D.; Duchovni, E.; Edgecock, R.; Eggert, K.; Eisenhandler, E.; Ellis, N.; Erhard, P.; Faissner, H.; Fince Keeler, M.; Flynn, P.; Fontaine, G.; Frey, R.; Frühwirth, R.; Garvey, J.; Gee, D.; Geer, S.; Ghesquière, C.; Ghez, P.; Ghio, F.; Giacomelli, P.; Gibson, W. R.; Giraud-Héraud, Y.; Givernaud, A.; Gonidec, A.; Goodman, M.; Grassmann, H.; Grayer, G.; Guryn, W.; Hansl-Kozanecka, T.; Haynes, W.; Haywood, S. J.; Hoffmann, H.; Holthuizen, D. J.; Homer, R. J.; Homer, R. J.; Honma, A.; Jank, W.; Jimack, M.; Jorat, G.; Kalmus, P. I. P.; Karimäri, V.; Keeler, R.; Kenyon, I.; Kernan, A.; Kienzle, W.; Kinnunen, R.; Kozanecki, W.; Kroll, J.; Kryn, D.; Kyberd, P.; Lacava, F.; Laugier, J. P.; Lees, J. P.; Leuchs, R.; Levegrun, S.; Lévêque, A.; Levi, M.; Linglin, D.; Locci, E.; Long, K.; Markiewicz, T.; Markytan, M.; Martin, T.; Maurin, F.; McMahon, T.; Mendiburu, J.-P.; Meneguzzo, A.; Meyer, O.; Meyer, T.; Minard, M.-N.; Mohammadi, M.; Morgan, K.; Moricca, M.; Moser, H.; Mours, B.; Muller, Th.; Nandi, A.; Naumann, L.; Norton, A.; Paoluzi, L.; Pascoli, D.; Pauss, F.; Perault, C.; Piano Mortari, G.; Pietarinen, E.; Pigot, C.; Pimiä, M.; Pitman, D.; Placci, A.; Porte, J.-P.; Radermacher, E.; Ransdell, J.; Redelberger, T.; Reithler, H.; Revol, J. 
P.; Richman, J.; Rijssenbeek, M.; Rohlf, J.; Rossi, P.; Roberts, C.; Ruhm, W.; Rubbia, C.; Sajot, G.; Salvini, G.; Sass, J.; Sadoulet, B.; Samyn, D.; Savoy-Navarro, A.; Schinzel, D.; Schwartz, A.; Scott, W.; Scott, W.; Shah, T. P.; Sheer, I.; Siotis, I.; Smith, D.; Sobie, R.; Sphicas, P.; Strauss, J.; Streets, J.; Stubenrauch, C.; Summers, D.; Sumorok, K.; Szonczo, F.; Tao, C.; Ten Have, I.; Thompson, G.; Tscheslog, E.; Tuominiemi, J.; van Eijk, B.; Verecchia, P.; Vialle, J. P.; Virdee, T. S.; von der Schmitt, H.; von Schlippe, W.; Vrana, J.; Vuillemin, V.; Wahl, H. D.; Watkins, P.; Wilke, R.; Wilson, J.; Wingerter, I.; Wimpenny, S. J.; Wulz, C.-E.; Wyatt, T.; Yvert, M.; Zacharov, I.; Zaganidis, N.; Zanello, L.; Zotto, P.

    1986-05-01

    The inclusive jet cross section has been measured in the UA1 experiment at the CERN pp Collider at centre-of-mass energies √s = 546 GeV and √s = 630 GeV. The cross sections are found to be consistent with QCD predictions. The observed change in the cross section with the centre-of-mass energy √s is accounted for in terms of xT scaling.
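    The xT-scaling comparison invoked here rests on a standard dimensionless variable; a brief sketch in textbook notation (my notation, not taken from the paper itself):

```latex
x_T = \frac{2 p_T}{\sqrt{s}}, \qquad
E \, \frac{\mathrm{d}^3\sigma}{\mathrm{d}p^3} = \frac{F(x_T)}{p_T^{\,n}}
```

    In the naive parton model n = 4, so comparing the cross sections at √s = 546 GeV and 630 GeV at fixed x_T tests how this power-law behaviour, together with its QCD corrections, accounts for the observed energy dependence.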

  1. Highlights from High Energy Neutrino Experiments at CERN

    NASA Astrophysics Data System (ADS)

    Schlatter, W.-D.

    2015-07-01

    Experiments with high-energy neutrino beams at CERN provided early quantitative tests of the Standard Model. This article describes results from studies of the quark structure of the nucleon and of the weak current, together with the precise measurement of the weak mixing angle. These results established a new standard of quality for tests of the electroweak model. In addition, measurements of the nucleon structure functions in deep inelastic neutrino scattering allowed the first quantitative tests of QCD.

  2. PARTICLE PHYSICS: CERN Collider Glimpses Supersymmetry--Maybe.

    PubMed

    Seife, C

    2000-07-14

    Last week, particle physicists at the CERN laboratory in Switzerland announced that by smashing together matter and antimatter in four experiments, they detected an unexpected effect in the sprays of particles that ensued. The anomaly is subtle, and physicists caution that it might still be a statistical fluke. If confirmed, however, it could mark the long-sought discovery of a whole zoo of new particles--and the end of a long-standing model of particle physics.

  3. An alternative model to distribute VO software to WLCG sites based on CernVM-FS: a prototype at PIC Tier1

    NASA Astrophysics Data System (ADS)

    Lanciotti, E.; Merino, G.; Bria, A.; Blomer, J.

    2011-12-01

    In a distributed computing model such as WLCG, experiment-specific application software has to be efficiently distributed to every site of the Grid. Application software is currently installed in a shared area of the site, visible to all Worker Nodes (WNs) through some protocol (NFS, AFS or other). The software is installed at the site by jobs which run on a privileged node of the computing farm where the shared area is mounted in write mode. This model presents several drawbacks which cause a non-negligible rate of job failure. An alternative model for software distribution based on the CERN Virtual Machine File System (CernVM-FS) has been tried at PIC, the Spanish Tier-1 site of WLCG. The test bed used and the results are presented in this paper.

  4. The management of large cabling campaigns during the Long Shutdown 1 of LHC

    NASA Astrophysics Data System (ADS)

    Meroli, S.; Machado, S.; Formenti, F.; Frans, M.; Guillaume, J. C.; Ricci, D.

    2014-03-01

    The Large Hadron Collider at CERN entered its first 18-month-long shutdown period in February 2013. During this period the entire CERN accelerator complex will undergo major consolidation and upgrade works, preparing the machines for LHC operation at nominal energy (7 TeV/beam). One of the most challenging activities concerns the cabling infrastructure (copper and optical-fibre cables) serving the CERN data acquisition, networking and control systems. About 1000 kilometres of cables, distributed in different machine areas, will be installed, representing an investment of about 15 MCHF. This implies an extraordinary challenge in terms of project management, including resource and activity planning, work execution and quality control. The preparation phase of this project started well before its implementation, by defining technical solutions and setting financial plans for staff recruitment and material supply. Enhanced task coordination was further implemented by deploying selected competences to form a central support team.

  5. CERN@school: demonstrating physics with the Timepix detector

    NASA Astrophysics Data System (ADS)

    Whyntie, T.; Bithray, H.; Cook, J.; Coupe, A.; Eddy, D.; Fickling, R. L.; McKenna, J.; Parker, B.; Paul, A.; Shearer, N.

    2015-10-01

    This article shows how the Timepix hybrid silicon pixel detector, developed by the Medipix2 Collaboration, can be used by students and teachers alike to demonstrate some key aspects of any well-rounded physics curriculum with CERN@school. After an overview of the programme, the detector's capabilities for measuring and visualising ionising radiation are examined. The classification of clusters - groups of adjacent pixels - is discussed with respect to identifying the different types of particles. Three demonstration experiments - background radiation measurements, radiation profiles and the attenuation of radiation - are described; these can be used as part of lessons or as inspiration for independent research projects. Results for exemplar data-sets are presented for reference, as well as details of ongoing research projects inspired by these experiments. Interested readers are encouraged to join the CERN@school Collaboration and so contribute to achieving the programme's aim of inspiring the next generation of scientists and engineers.
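    The cluster classification mentioned above boils down to grouping adjacent hit pixels and inspecting the cluster shape. A minimal sketch of the idea follows; the flood-fill grouping is standard, but the size thresholds and labels are illustrative assumptions, not the CERN@school analysis code.

```python
# Sketch: group 8-connected hit pixels into clusters (flood fill), then
# use cluster size as a crude particle-type hint. Thresholds are
# illustrative assumptions only.

def find_clusters(hits):
    """Group 8-connected hit pixels; hits is an iterable of (x, y) tuples."""
    hits = set(hits)
    clusters = []
    while hits:
        stack = [hits.pop()]
        cluster = []
        while stack:
            x, y = stack.pop()
            cluster.append((x, y))
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    n = (x + dx, y + dy)
                    if n in hits:
                        hits.remove(n)
                        stack.append(n)
        clusters.append(cluster)
    return clusters

def classify(cluster):
    # Illustrative size-based labels: alpha particles leave large blobs,
    # single-pixel hits are typically photon or low-energy electron candidates.
    if len(cluster) == 1:
        return "gamma/X-ray candidate"
    if len(cluster) >= 8:
        return "alpha candidate"
    return "beta/muon candidate"

# A toy frame: one isolated pixel plus one 8-pixel blob.
frame = [(0, 0), (10, 10), (10, 11), (11, 10), (11, 11),
         (12, 10), (12, 11), (10, 12), (11, 12)]
clusters = find_clusters(frame)
print([classify(c) for c in sorted(clusters, key=len)])
# prints ['gamma/X-ray candidate', 'alpha candidate']
```

    A fuller analysis would also use cluster shape (straight tracks versus blobs) and the per-pixel energy information the Timepix can record.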

  6. CERN's approach to public outreach

    NASA Astrophysics Data System (ADS)

    Landua, Rolf

    2016-03-01

    CERN's communication goes beyond publishing scientific results. Education and outreach are equally important ways of communicating with the general public, and in particular with the young generation. Over the last decade, CERN has significantly increased its efforts to accommodate the very large interest of the general public (about 300,000 visit requests per year), by ramping up its capacity for guided tours from 25,000 to more than 100,000 visitors per year, by creating six new state-of-the-art exhibitions on-site, by building and operating a modern physics laboratory for school teachers and students, and by showing several travelling exhibitions in about 10 countries per year. The offer for school teachers has also been expanded, to 35-40 weeks of teacher courses with more than 1000 participants from more than 50 countries per year. The talk will give an overview of these and related activities.

  7. The ADAM project: a generic web interface for retrieval and display of ATLAS TDAQ information

    NASA Astrophysics Data System (ADS)

    Harwood, A.; Lehmann Miotto, G.; Magnoni, L.; Vandelli, W.; Savu, D.

    2012-06-01

    This paper describes a new approach to the visualization of information about the operation of the ATLAS Trigger and Data Acquisition system. ATLAS is one of the two general-purpose detectors positioned along the Large Hadron Collider at CERN. Its data acquisition system consists of several thousand computers interconnected via multiple gigabit Ethernet networks, which are constantly monitored via different tools. Operational parameters ranging from the temperature of the computers to the network utilization are stored in several databases for later analysis. Although the ability to view these data-sets individually is already in place, currently there is no way to view this data together, in a uniform format, from one location. The ADAM project has been launched in order to overcome this limitation. It defines a uniform web interface to collect data from multiple providers that have different structures. It is capable of aggregating and correlating the data according to user-defined criteria. Finally, it visualizes the collected data using a flexible and interactive front-end web system. Structurally, the project comprises three main levels of the data collection cycle. Level 0 represents the information sources within ATLAS. These providers do not store information in a uniform fashion. The first step of the project was to define a common interface through which to expose stored data. The interface designed for the project originates from the Google Data Protocol API. The idea is to allow read-only access to data providers, through HTTP requests similar in format to the SQL query structure. This provides a standardized way to access these different information sources within ATLAS. Level 1 can be considered the engine of the system. The primary task of Level 1 is to gather data from multiple data sources via the common interface, to correlate this data together, or over a defined time series, and to expose the combined data as a whole to the Level 2 web interface. Level 2 is designed to present the data in a similar style and aesthetic, despite the different data sources. Pages can be constructed, edited and personalized by users to suit the specific data being shown. Pages can show a collection of graphs displaying data potentially coming from multiple sources. The project as a whole has a great deal of scope thanks to the uniform approach chosen for exposing data and the flexibility of Level 2 in presenting results. The paper describes the design and implementation of this new tool in detail. In particular we go through the project architecture, the implementation choices and examples of usage of the system in place within the ATLAS TDAQ infrastructure.
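    The "read-only HTTP requests similar in format to the SQL query structure" described above can be sketched as a simple query-URL builder. The endpoint, parameter names and query grammar below are hypothetical illustrations of the pattern, not the actual ADAM interface.

```python
# Hypothetical sketch of an SQL-like, read-only HTTP query against a
# Level 0 data provider. Endpoint and parameter names are assumptions.
from urllib.parse import urlencode

def build_query(base_url, source, fields, where=None, since=None):
    """Compose an HTTP GET URL resembling a SELECT over a provider."""
    params = {"q": f"SELECT {','.join(fields)} FROM {source}"}
    if where:
        params["q"] += f" WHERE {where}"
    if since:
        params["since"] = since  # illustrative time-series bound
    return f"{base_url}?{urlencode(params)}"

url = build_query("http://provider.example/adam", "node_temps",
                  ["node", "temp_c"], where="temp_c > 60",
                  since="2012-06-01T00:00:00")
print(url)
```

    A Level 1 engine in this scheme would issue such GET requests against several providers, correlate the returned series, and hand the merged result to the Level 2 pages.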

  8. International Workshop on Linear Colliders 2010

    ScienceCinema

    Lebrun, Ph.

    2018-06-20

    IWLC2010, International Workshop on Linear Colliders 2010. ECFA-CLIC-ILC joint meeting: Monday 18 October - Friday 22 October 2010. Venue: CERN and CICG (International Conference Centre Geneva, Switzerland). This year, the International Workshop on Linear Colliders organized by the European Committee for Future Accelerators (ECFA) will study the physics, detectors and accelerator complex of a linear collider, covering both CLIC and ILC options. Contact: Workshop Secretariat. IWLC2010 is hosted by CERN.

  9. CERN: A global project

    NASA Astrophysics Data System (ADS)

    Voss, Rüdiger

    2017-07-01

    In the most important shift of paradigm of its membership rules in 60 years, CERN in 2010 introduced a policy of “Geographical Enlargement” which for the first time opened the door for membership of non-European States in the Organization. This short article reviews briefly the history of CERN’s membership rules, discusses the rationale behind the new policy, its relationship with the emerging global roadmap of particle physics, and gives a short overview of the status of the enlargement process.

  10. International Workshop on Linear Colliders 2010

    ScienceCinema

    Yamada, Sakue

    2018-05-24

    IWLC2010, International Workshop on Linear Colliders 2010. ECFA-CLIC-ILC joint meeting: Monday 18 October - Friday 22 October 2010. Venue: CERN and CICG (International Conference Centre Geneva, Switzerland). This year, the International Workshop on Linear Colliders organized by the European Committee for Future Accelerators (ECFA) will study the physics, detectors and accelerator complex of a linear collider, covering both CLIC and ILC options. Contact: Workshop Secretariat. IWLC2010 is hosted by CERN.

  11. Performance of a liquid argon time projection chamber exposed to the CERN West Area Neutrino Facility neutrino beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arneodo, F.; Cavanna, F.; Mitri, I. De

    2006-12-01

    We present the results of the first exposure of a Liquid Argon TPC to a multi-GeV neutrino beam. The data were collected with a 50-litre ICARUS-like chamber located between the CHORUS and NOMAD experiments at the CERN West Area Neutrino Facility (WANF). We discuss both the instrumental performance of the detector and its capability to identify and reconstruct low-multiplicity neutrino interactions.

  12. Upper limits of the proton magnetic form factor in the time-like region from p̄p → e⁺e⁻ at the CERN-ISR

    NASA Astrophysics Data System (ADS)

    Baglin, C.; Baird, S.; Bassompierre, G.; Borreani, G.; Brient, J. C.; Broll, C.; Brom, J. M.; Bugge, L.; Buran, T.; Burq, J. P.; Bussière, A.; Buzzo, A.; Cester, R.; Chemarin, M.; Chevallier, M.; Escoubes, B.; Fay, J.; Ferroni, S.; Gracco, V.; Guillaud, J. P.; Khan-Aronsen, E.; Kirsebom, K.; Ille, B.; Lambert, M.; Leistam, L.; Lundby, A.; Macri, M.; Marchetto, F.; Mattera, L.; Menichetti, E.; Mouellic, B.; Pastrone, N.; Petrillo, L.; Pia, M. G.; Poulet, M.; Pozzo, A.; Rinaudo, G.; Santroni, A.; Severi, M.; Skjevling, G.; Stapnes, S.; Stugu, B.; Tomasini, F.; Valbusa, U.

    1985-11-01

    From the measurement of e⁺e⁻ pairs from the reaction p̄p → e⁺e⁻ at the CERN-ISR, using an antiproton beam and a hydrogen jet target, we derived upper limits for the proton magnetic form factor in the time-like region at Q² ≈ 8.9 (GeV/c)² and Q² ≈ 12.5 (GeV/c)².

  13. Diffractive Higgs boson production at the Fermilab Tevatron and the CERN Large Hadron Collider.

    PubMed

    Enberg, R; Ingelman, G; Kissavos, A; Tîmneanu, N

    2002-08-19

    Improved possibilities to find the Higgs boson in diffractive events, having less hadronic activity, depend on whether the cross section is large enough. Based on the soft color interaction models that successfully describe diffractive hard scattering at DESY HERA and the Fermilab Tevatron, we find that only a few diffractive Higgs events may be produced at the Tevatron, but we predict a substantial rate at the CERN Large Hadron Collider.

  14. Integrating new Storage Technologies into EOS

    NASA Astrophysics Data System (ADS)

    Peters, Andreas J.; van der Ster, Dan C.; Rocha, Joaquim; Lensing, Paul

    2015-12-01

    The EOS[1] storage software was designed to cover CERN disk-only storage use cases in the medium term, trading scalability against latency. To cover and prepare for long-term requirements, the CERN IT Data and Storage Services (DSS) group is actively conducting R&D and making open source contributions to experiment with a next-generation storage software based on Ceph[3] and Ethernet-enabled disk drives. Ceph provides the scale-out object storage system RADOS and, in addition, various optional high-level services such as an S3 gateway, RADOS block devices and the POSIX-compliant file system CephFS. The acquisition of Ceph by Red Hat underlines the promising role of Ceph as the open source storage platform of the future. CERN IT runs a Ceph service in the context of OpenStack on a moderate scale of 1 PB of replicated storage. Building a 100+ PB storage system based on Ceph will require software and hardware tuning, and it is of capital importance to demonstrate the feasibility and iron out bottlenecks and blocking issues beforehand. The main idea behind this R&D is to leverage and contribute to existing building blocks in the Ceph storage stack and to implement a few CERN-specific requirements in a thin, customisable storage layer. A second research topic is the integration of Ethernet-enabled disks. This paper introduces various ongoing open source developments, their status and applicability.

  15. A possible biomedical facility at the European Organization for Nuclear Research (CERN)

    PubMed Central

    Dosanjh, M; Myers, S

    2013-01-01

    A well-attended meeting, called “Brainstorming discussion for a possible biomedical facility at CERN”, was held by the European Organization for Nuclear Research (CERN) at the European Laboratory for Particle Physics on 25 June 2012. It was concerned with adapting an existing, but little used, 78-m circumference CERN synchrotron to deliver a wide range of ion species, preferably from protons to at least neon ions, with beam specifications that match existing clinical facilities. The potential extensive research portfolio discussed included beam ballistics in humanoid phantoms, advanced dosimetry, remote imaging techniques and technical developments in beam delivery, including gantry design. In addition, a modern laboratory for biomedical characterisation of these beams would allow important radiobiological studies, such as relative biological effectiveness, in a dedicated facility with standardisation of experimental conditions and biological end points. A control photon and electron beam would be required nearby for relative biological effectiveness comparisons. Research beam time availability would far exceed that at other facilities throughout the world. This would allow more rapid progress in several biomedical areas, such as charged hadron therapy of cancer, radioisotope production and radioprotection. The ethos of CERN, in terms of open access, peer-reviewed projects and governance, has been so successful for high-energy physics that application of the same to biomedicine would attract high-quality research, with possible contributions from Europe and beyond, along with potential new funding streams. PMID:23549990

  16. Measurements and FLUKA Simulations of Bismuth, Aluminium and Indium Activation at the upgraded CERN Shielding Benchmark Facility (CSBF)

    NASA Astrophysics Data System (ADS)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.; Yashima, H.

    2018-06-01

    The CERN High energy AcceleRator Mixed field (CHARM) facility is situated in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, 5×10¹¹ protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7×10¹⁰ protons per second. The extracted proton beam impacts on a cylindrical copper target. The shielding of the CHARM facility includes the CERN Shielding Benchmark Facility (CSBF), situated laterally above the target, which allows deep shielding penetration benchmark studies of various shielding materials. This facility was significantly upgraded during the extended technical stop at the beginning of 2016. It now consists of 40 cm of cast iron shielding, a 200 cm long removable sample holder concrete block with 3 inserts for activation samples, and a material test location that is used for the measurement of the attenuation length of different shielding materials as well as for sample activation at different thicknesses of the shielding materials. Activation samples of bismuth, aluminium and indium were placed in the CSBF in September 2016 to characterize the upgraded version of the CSBF. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields of bismuth isotopes (²⁰⁶Bi, ²⁰⁵Bi, ²⁰⁴Bi, ²⁰³Bi, ²⁰²Bi, ²⁰¹Bi) from ²⁰⁹Bi, of ²⁴Na from ²⁷Al and of ¹¹⁵ᵐIn from ¹¹⁵In for these samples. The production yields estimated by the FLUKA Monte Carlo simulations are compared to the production yields obtained from γ-spectroscopy measurements of the samples, taking the beam intensity profile into account. The agreement between the FLUKA predictions and the γ-spectroscopy measurements for the production yields is at the level of a factor of 2.
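The quoted level of agreement ("within a factor of 2") between simulated and measured yields reduces to a simple ratio check, sketched below. The sample numbers are arbitrary placeholders for illustration, not the published yields.

```python
def agreement_factor(simulated, measured):
    """Return the factor (>= 1) by which a simulated prediction and a
    gamma-spectroscopy measurement differ from each other."""
    ratio = simulated / measured
    return max(ratio, 1.0 / ratio)

# Illustrative placeholder yields (arbitrary units), NOT the published values.
yields = {
    "204Bi": (3.2e-4, 1.9e-4),
    "24Na":  (5.0e-5, 6.5e-5),
}
for isotope, (sim, meas) in yields.items():
    f = agreement_factor(sim, meas)
    print(isotope, round(f, 2), "within factor 2" if f <= 2.0 else "outside")
```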

  17. Two-particle correlations in azimuthal angle and pseudorapidity in inelastic p + p interactions at the CERN Super Proton Synchrotron

    DOE PAGES

    Aduszkiewicz, A.; Ali, Y.; Andronov, E.; ...

    2017-01-30

    Results on two-particle Δη-Δφ correlations in inelastic p + p interactions at 20, 31, 40, 80, and 158 GeV/c are presented. The measurements were performed using the large-acceptance NA61/SHINE hadron spectrometer at the CERN Super Proton Synchrotron. The data show structures which can be attributed mainly to effects of resonance decays, momentum conservation, and quantum statistics. Furthermore, the results are compared with the EPOS and UrQMD models.

  18. News UK public libraries offer walk-in access to research Atoms for Peace? The Atomic Weapons Establishment and UK universities Students present their research to academics: CERN@school Science in a suitcase: Marvin and Milo visit Ethiopia Inspiring telescopes A day for everyone teaching physics 2014 Forthcoming Events

    NASA Astrophysics Data System (ADS)

    2014-05-01

    UK public libraries offer walk-in access to research Atoms for Peace? The Atomic Weapons Establishment and UK universities Students present their research to academics: CERN@school Science in a suitcase: Marvin and Milo visit Ethiopia Inspiring telescopes A day for everyone teaching physics 2014 Forthcoming Events

  19. Overview of LHC physics results at ICHEP

    ScienceCinema

    Mangano, Michelangelo

    2018-06-20

    This month's LHC physics day will review the physics results presented by the LHC experiments at the 2010 ICHEP in Paris. The experimental presentations will be preceded by the bi-weekly LHC accelerator status report. The meeting will be broadcast via EVO (detailed info will appear at the time of the meeting in the "Video Services" item on the left menu bar). For those attending, information on accommodation, access to CERN and laptop registration is available from http://cern.ch/lpcc/visits

  20. CERN at 60: giant magnet journeys through Geneva

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2014-07-01

    More than 30,000 people descended onto Geneva's harbour last month to celebrate the bicentenary of the city's integration into Switzerland with a parade through the city. Joining the 1200 participants at the Genève200 celebrations were staff from the CERN particle-physics lab, which is located on the outskirts of Geneva, who paraded a superconducting dipole magnet - similar to the thousands used in the Large Hadron Collider - through the city's narrow streets on a 20 m lorry.

  1. Astronomie, écologie et poésie par Hubert Reeves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-09-21

    Hubert Reeves. The astrophysicist gives a lecture and then talks with the writer François Bon about "Astronomie, écologie et poésie" (Astronomy, ecology and poetry). For more information: http://outreach.web.cern.ch/outreach/FR/evenements/conferences.html. Limited number of seats; reservation required at the CERN Reception: +41 22 767 76 76. The evening will be broadcast live on the Web: http://webcast.cern.ch/

  2. Retirement Kjell Johnsen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2007-12-05

    On the occasion of his 65th birthday, several speakers (including the Ambassador of Norway) thank Kjell Johnsen, born in June 1921 in Norway, for his 34 years of service at CERN and retrace his life and work. K. Johnsen took part in the first studies on the accelerators of the future physics centre and was also the father and first director of the CERN Accelerator School (CAS).

  3. News Conference: Physics brings the community together Training: CERN trains physics teachers Education: World conference fosters physics collaborations Lecture: Physics education live at ASE Prize: Physics teacher wins first Moore medal Festival: European presidents patronize Science on Stage festival Videoconference: Videoconference brings Durban closer to the classroom

    NASA Astrophysics Data System (ADS)

    2012-03-01

    Conference: Physics brings the community together Training: CERN trains physics teachers Education: World conference fosters physics collaborations Lecture: Physics education live at ASE Prize: Physics teacher wins first Moore medal Festival: European presidents patronize Science on Stage festival Videoconference: Videoconference brings Durban closer to the classroom

  4. CERN's Common Unix and X Terminal Environment

    NASA Astrophysics Data System (ADS)

    Cass, Tony

    The Desktop Infrastructure Group of CERN's Computing and Networks Division has developed a Common Unix and X Terminal Environment to ease the migration to Unix-based interactive computing. The CUTE architecture relies on a distributed filesystem—currently Transarc's AFS—to enable essentially interchangeable client workstations to access both "home directory" and program files transparently. Additionally, we provide a suite of programs to configure workstations for CUTE and to ensure continued compatibility. This paper describes the different components and the development of the CUTE architecture.

  5. News Festival: Science on stage deadline approaches Conference: Welsh conference attracts teachers Data: New phase of CERN openlab tackles exascale IT challenges for science Meeting: German Physical Society holds its physics education spring meeting Conference: Association offers golden opportunity in Norway Competition: So what's the right answer then?

    NASA Astrophysics Data System (ADS)

    2012-07-01

    Festival: Science on stage deadline approaches Conference: Welsh conference attracts teachers Data: New phase of CERN openlab tackles exascale IT challenges for science Meeting: German Physical Society holds its physics education spring meeting Conference: Association offers golden opportunity in Norway Competition: So what's the right answer then?

  6. Overview of LHC physics results at ICHEP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2011-02-25

    This month's LHC physics day will review the physics results presented by the LHC experiments at the 2010 ICHEP in Paris. The experimental presentations will be preceded by the bi-weekly LHC accelerator status report. The meeting will be broadcast via EVO (detailed info will appear at the time of the meeting in the "Video Services" item on the left menu bar). For those attending, information on accommodation, access to CERN and laptop registration is available from http://cern.ch/lpcc/visits

  7. Measurement of the antiproton-nucleus annihilation cross-section at low energy

    NASA Astrophysics Data System (ADS)

    Aghai-Khozani, H.; Bianconi, A.; Corradini, M.; Hayano, R.; Hori, M.; Leali, M.; Lodi Rizzini, E.; Mascagna, V.; Murakami, Y.; Prest, M.; Vallazza, E.; Venturelli, L.; Yamada, H.

    2018-02-01

    Systematic measurements of the annihilation cross sections of low-energy antinucleons were performed at CERN in the 1980s and 1990s. However, the antiproton data on medium-heavy and heavy nuclear targets are scarce. The ASACUSA Collaboration at CERN has measured the antiproton annihilation cross section on carbon at 5.3 MeV: the value is (1.73 ± 0.25) barn. The result is compared with the antineutron experimental data and with theoretical predictions.

  8. High Energy Electron Detection with ATIC

    NASA Technical Reports Server (NTRS)

    Chang, J.; Schmidt, W. K. H.; Adams, James H., Jr.; Ahn, H.; Ampe, J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The ATIC (Advanced Thin Ionization Calorimeter) balloon-borne ionization calorimeter is well suited to record and identify high-energy cosmic ray electrons. The instrument was exposed to high-energy beams at the CERN H2 beamline in September 1999. We have simulated the performance of the instrument and compare the simulations with actual high-energy electron exposures at the CERN accelerator. The simulations and measurements do not agree exactly in every detail, but overall the simulations predict the actual measured behavior quite well.

  9. Optical fibres in the radiation environment of CERN

    NASA Astrophysics Data System (ADS)

    Guillermain, E.

    2017-11-01

    CERN, the European Organization for Nuclear Research (in Geneva, Switzerland), is home to a complex scientific instrument: the 27-kilometre Large Hadron Collider (LHC) collides beams of high-energy particles at close to the speed of light. Optical fibres are widely used at CERN, both in surface areas (e.g. for inter-building IT networks) and in the accelerator complex underground (e.g. for cryogenics, vacuum and safety systems). Optical fibres in the accelerator are exposed to mixed radiation fields (mainly composed of protons, pions, neutrons and other hadrons, gamma rays and electrons), with dose rates depending on the particular installation zone and with radiation levels often significantly higher than those encountered in space. In the LHC and its injector chain, radiation levels range from relatively low annual doses of a few Gy up to hundreds of kGy. Optical fibres suffer from Radiation Induced Attenuation (RIA, expressed in dB per unit length), which affects light transmission and depends on the irradiation conditions (e.g. dose rate, total dose, temperature). In the CERN accelerator complex, the failure of an optical link can affect the proper functioning of control or monitoring systems and cause an interruption of accelerator operation. The qualification of optical fibres for installation in critical radiation areas is therefore crucial. Thus, all optical fibre types installed in radiation areas at CERN are subject to laboratory irradiation tests in order to evaluate their RIA at different total doses and dose rates. This allows the selection of the appropriate optical fibre type (conventional or radiation resistant) compliant with the requirements of each installation. Irradiation tests are performed in collaboration with Fraunhofer INT (irradiation facilities and expert team in Euskirchen, Germany). Conventional off-the-shelf optical fibres can be installed for optical links exposed to low radiation levels (i.e. an annual dose typically below a few kGy). Nevertheless, conventional optical fibres must be carefully qualified, as a spread in RIA of a factor of 10 is observed among optical fibres of different types and dopants. In higher-radiation areas, special radiation resistant optical fibres are installed. For total doses above 1 kGy, the RIA of these special optical fibres is at least 10 times lower than that of conventional optical fibres under the same irradiation conditions. 2400 km of these special radiation resistant optical fibres were recently procured at CERN. As part of this procurement process, a quality assurance plan including the irradiation testing of all 65 produced batches was set up. This presentation reviews the selection process of the appropriate optical fibre types to be installed in the radiation environment of CERN. The methodology for choosing the irradiation parameters for the laboratory tests is discussed, together with an overview of the RIA of different optical fibre types under several irradiation conditions.
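The fibre selection logic described in the abstract (RIA in dB per unit length, weighed against the optical power margin of a given link) can be sketched as a simple budget check. The RIA values, intrinsic loss and margin below are invented for illustration, not measured CERN figures.

```python
def link_loss_db(ria_db_per_km, length_km, intrinsic_db_per_km=0.35):
    """Total attenuation of a fibre link: intrinsic fibre loss plus
    radiation-induced attenuation (RIA), both expressed in dB/km."""
    return (ria_db_per_km + intrinsic_db_per_km) * length_km

def fibre_ok(ria_db_per_km, length_km, power_margin_db):
    """A fibre type qualifies for a link if its end-of-life loss
    stays within the link's optical power margin."""
    return link_loss_db(ria_db_per_km, length_km) <= power_margin_db

# Hypothetical end-of-life RIA at the installation dose (dB/km):
conventional_ria = 30.0
rad_hard_ria = 3.0   # ~10x lower, as the abstract reports above 1 kGy

print(fibre_ok(conventional_ria, 0.5, 6.0))  # conventional fibre fails
print(fibre_ok(rad_hard_ria, 0.5, 6.0))      # radiation-resistant fibre passes
```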

  10. MCNP Output Data Analysis with ROOT (MODAR)

    NASA Astrophysics Data System (ADS)

    Carasco, C.

    2010-06-01

    MCNP Output Data Analysis with ROOT (MODAR) is a tool based on CERN's ROOT software. MODAR has been designed to handle time-energy data produced by MCNP simulations of neutron inspection devices using the associated particle technique. MODAR exploits ROOT's graphical user interface and functionalities to visualize and process MCNP simulation results in a fast and user-friendly way. MODAR makes it possible to take into account the detection system's time resolution (which is not possible with MCNP) as well as the detectors' energy response function and counting statistics in a straightforward way.
    Program summary
    Program title: MODAR
    Catalogue identifier: AEGA_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGA_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 155 373
    No. of bytes in distributed program, including test data, etc.: 14 815 461
    Distribution format: tar.gz
    Programming language: C++
    Computer: Most Unix workstations and PCs
    Operating system: Most Unix systems, Linux and Windows, provided the ROOT package has been installed. Examples were tested under SUSE Linux and Windows XP.
    RAM: Depends on the size of the MCNP output file. The example presented in the article, which involves three two-dimensional 139×740-bin histograms, allocates about 60 MB. These figures were obtained running under ROOT and include consumption by ROOT itself.
    Classification: 17.6
    External routines: ROOT version 5.24.00 (http://root.cern.ch/drupal/)
    Nature of problem: The output of an MCNP simulation is an ASCII file. The data processing is usually performed by copying and pasting the relevant parts of the ASCII file into Microsoft Excel. Such an approach is satisfactory when the quantity of data is small, but is not efficient when the simulated data are large, for example when time-energy correlations are studied in detail, as in problems involving the associated particle technique. In addition, since the finite time resolution of the simulated detector cannot be modeled with MCNP, systems in which the time-energy correlation is crucial cannot be described in a satisfactory way. Finally, realistic particle energy deposit in detectors is calculated with MCNP in a two-step process involving type-5 then type-8 tallies. In the first step, the photon flux energy spectrum associated with a time region is selected and serves as a source energy distribution for the second step. Thus, several files must be manipulated before getting the result, which can be time consuming if one needs to study several time regions or different detector performances. In the same way, modeling the counting statistics obtained in a limited acquisition time requires several steps and can also be time consuming.
    Solution method: In order to overcome the previous limitations, the MODAR C++ code has been written to make use of CERN's ROOT data analysis software. MCNP output data are read from the MCNP output file with dedicated routines. Two-dimensional histograms are filled and can be handled efficiently within the ROOT framework. To keep the analysis tool user friendly, all processing and data display can be done by means of the ROOT graphical user interface. Specific routines have been written to include the detectors' finite time resolution and energy response function, as well as counting statistics, in a straightforward way.
    Additional comments: The possibility of adding tallies has also been incorporated in MODAR in order to describe systems in which the signal from several detectors can be summed. Moreover, MODAR can be adapted to handle other problems involving two-dimensional data.
    Running time: The CPU time needed to smear a two-dimensional histogram depends on the size of the histogram. In the presented example, the time-energy smearing of one of the 139×740 two-dimensional histograms takes 3 minutes on a Dell computer equipped with an Intel Core 2.
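The time-resolution smearing that MODAR adds on top of the raw MCNP tallies amounts to convolving each energy slice of the time-energy histogram with a Gaussian detector response along the time axis. A minimal pure-Python sketch of that idea (not MODAR's actual code, which operates on ROOT TH2 objects in C++):

```python
import math

def gaussian_kernel(sigma_bins, half_width=4):
    """Discrete Gaussian kernel (sigma in units of time bins), unit sum."""
    n = int(math.ceil(half_width * sigma_bins))
    raw = [math.exp(-0.5 * (i / sigma_bins) ** 2) for i in range(-n, n + 1)]
    s = sum(raw)
    return [x / s for x in raw]

def smear_time_axis(hist, sigma_bins):
    """Convolve each row of a 2D histogram (rows = energy slices,
    columns = time bins) with the Gaussian response."""
    k = gaussian_kernel(sigma_bins)
    n = len(k) // 2
    out = []
    for row in hist:
        smeared = [
            sum(row[j] * k[i - j + n]
                for j in range(max(0, i - n), min(len(row), i + n + 1)))
            for i in range(len(row))
        ]
        out.append(smeared)
    return out

# A single spike in time spreads into a Gaussian-shaped peak:
h = smear_time_axis([[0.0, 0.0, 1.0, 0.0, 0.0]], sigma_bins=1.0)
print([round(v, 3) for v in h[0]])
```

Edge bins lose a small fraction of the content (the kernel tail falls outside the histogram), which is the usual behaviour of a truncated convolution.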

  11. LHC, le Big Bang en éprouvette

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Our understanding of the Universe is changing… Science café - open to the public. Debate moderated by Marie-Odile Montchicourt, journalist at France Info. Event held by videoconference between the Globe of Science and Innovation, the Baloard bar in Montpellier and the Maison des Métallos in Paris. Speakers at CERN: Philippe Charpentier and Daniel Froideveaux, physicists at CERN. Speakers in Paris: Vincent Bontemps, philosopher and researcher at CEA; Jacques Arnould, philosopher, historian of science and theologian; Jean-Jacques Beineix, film director, producer and screenwriter. Speakers in Montpellier (LPTA): André Neveu, theoretical physicist and research director at CNRS; Gilbert Moultaka, theoretical physicist and research fellow at CNRS. Partnership: CERN, CEA, IN2P3, Université MPL2 (LPTA). Part of the Fête de la Science 2008.

  12. Disk storage at CERN

    NASA Astrophysics Data System (ADS)

    Mascetti, L.; Cano, E.; Chan, B.; Espinal, X.; Fiorot, A.; González Labrador, H.; Iven, J.; Lamanna, M.; Lo Presti, G.; Mościcki, JT; Peters, AJ; Ponce, S.; Rousseau, H.; van der Ster, D.

    2015-12-01

    CERN IT DSS operates the main storage resources for data taking and physics analysis, mainly via three systems: AFS, CASTOR and EOS. The total usable space available on disk for users is about 100 PB (with relative ratios 1:20:120). EOS actively uses the two CERN Tier-0 centres (Meyrin and Wigner) with a 50:50 ratio. IT DSS also provides sizeable on-demand resources for IT services, most notably OpenStack and NFS-based clients: this is provided by a Ceph infrastructure (3 PB) and a few proprietary servers (NetApp). We describe our operational experience and recent changes to these systems, with special emphasis on the present usage for LHC data taking and the convergence to commodity hardware (nodes with 200 TB each, with optional SSDs) shared across all services. We also describe our experience in coupling commodity and home-grown solutions (e.g. CERNBox integration in EOS; Ceph disk pools for AFS, CASTOR and NFS) and finally the future evolution of these systems for WLCG and beyond.
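The quoted 1:20:120 AFS:CASTOR:EOS ratio over roughly 100 PB translates into per-system shares with a quick back-of-the-envelope calculation (the total and ratios are the approximate figures from the abstract):

```python
def shares(total_pb, ratios):
    """Split a total capacity according to relative ratios."""
    s = sum(ratios.values())
    return {name: total_pb * r / s for name, r in ratios.items()}

disk = shares(100.0, {"AFS": 1, "CASTOR": 20, "EOS": 120})
for name, pb in disk.items():
    print(f"{name}: ~{pb:.1f} PB")
# EOS ends up with roughly 85 PB of the ~100 PB total.
```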

  13. First test of BNL electron beam ion source with high current density electron beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pikin, Alexander, E-mail: pikin@bnl.gov; Alessi, James G.; Beebe, Edward N.

    A new electron gun with electrostatic compression has been installed at the Electron Beam Ion Source (EBIS) Test Stand at BNL. This is a collaborative effort by the BNL and CERN teams with a common goal to study an EBIS with an electron beam current up to 10 A, a current density up to 10,000 A/cm² and an energy of more than 50 keV. Intense and pure beams of heavy highly charged ions with mass-to-charge ratio < 4.5 are requested by many heavy ion research facilities, including the NASA Space Radiation Laboratory (NSRL) at BNL and HIE-ISOLDE at CERN. With a multiampere electron gun, the EBIS should be capable of delivering highly charged ions both for RHIC facility applications at BNL and for ISOLDE experiments at CERN. Details of the electron gun simulations and design, and of the Test EBIS electrostatic and magnetostatic structures with the new electron gun, are presented. Experimental results on the electron beam transmission are given.

  14. Protocols for Scholarly Communication

    NASA Astrophysics Data System (ADS)

    Pepe, A.; Yeomans, J.

    2007-10-01

    CERN, the European Organization for Nuclear Research, has operated an institutional preprint repository for more than 10 years. The repository contains over 850,000 records of which more than 450,000 are full-text OA preprints, mostly in the field of particle physics, and it is integrated with the library's holdings of books, conference proceedings, journals and other grey literature. In order to encourage effective propagation and open access to scholarly material, CERN is implementing a range of innovative library services into its document repository: automatic keywording, reference extraction, collaborative management tools and bibliometric tools. Some of these services, such as user reviewing and automatic metadata extraction, could make up an interesting testbed for future publishing solutions and certainly provide an exciting environment for e-science possibilities. The future protocol for scientific communication should guide authors naturally towards OA publication, and CERN wants to help reach a full open access publishing environment for the particle physics community and related sciences in the next few years.

  15. First experimental evidence of hydrodynamic tunneling of ultra-relativistic protons in extended solid copper target at the CERN HiRadMat facility

    NASA Astrophysics Data System (ADS)

    Schmidt, R.; Blanco Sancho, J.; Burkart, F.; Grenier, D.; Wollmann, D.; Tahir, N. A.; Shutov, A.; Piriz, A. R.

    2014-08-01

    A novel experiment has been performed at the CERN HiRadMat test facility to study the impact of the 440 GeV proton beam generated by the Super Proton Synchrotron on extended solid copper cylindrical targets. Substantial hydrodynamic tunneling of the protons in the target material has been observed that leads to significant lengthening of the projectile range, which confirms our previous theoretical predictions [N. A. Tahir et al., Phys. Rev. Spec. Top.-Accel. Beams 15, 051003 (2012)]. Simulation results show very good agreement with the experimental measurements. These results have very important implications on the machine protection design for powerful machines like the Large Hadron Collider (LHC), the future High Luminosity LHC, and the proposed huge 80 km circumference Future Circular Collider, which is currently being discussed at CERN. Another very interesting outcome of this work is that one may also study the field of High Energy Density Physics at this test facility.

  16. First experience with carbon stripping foils for the 160 MeV H- injection into the CERN PSB

    NASA Astrophysics Data System (ADS)

    Weterings, Wim; Bracco, Chiara; Jorat, Louise; Noulibos, Remy; van Trappen, Pieter

    2018-05-01

    A 160 MeV H- beam will be delivered from the new CERN linear accelerator (Linac4) to the Proton Synchrotron Booster (PSB) using an H- charge-exchange injection system. A 200 µg/cm² carbon stripping foil will convert the H- ions into protons by stripping off the electrons. The H- charge-exchange injection principle will be used for the first time in the CERN accelerator complex and involves many challenges. In order to gain experience with the foil changing mechanism and the very fragile foils, a stripping foil test stand was installed in the Linac4 transfer line in 2016, prior to the installation in the PSB. In addition, parts of the future PSB injection equipment are also temporarily installed in the Linac4 transfer line for tests with a 160 MeV H- commissioning proton beam. This paper describes the foil changing mechanism and control system, summarizes the practical experience of gluing and handling these foils and reports on the first results with beam.

  17. Chicago Ebola Response Network (CERN): A Citywide Cross-hospital Collaborative for Infectious Disease Preparedness.

    PubMed

    Lateef, Omar; Hota, Bala; Landon, Emily; Kociolek, Larry K; Morita, Julie; Black, Stephanie; Noskin, Gary; Kelleher, Michael; Curell, Krista; Galat, Amy; Ansell, David; Segreti, John; Weber, Stephen G

    2015-11-15

    The 2014-2015 Ebola virus disease (EVD) epidemic and international public health emergency has been referred to as a "black swan" event, or an event that is unlikely, hard to predict, and highly impactful once it occurs. The Chicago Ebola Response Network (CERN) was formed in response to EVD and is capable of receiving and managing new cases of EVD, while also laying the foundation for a public health network that can anticipate, manage, and prevent the next black swan public health event. By sharing expertise, risk, and resources among 4 major academic centers, Chicago created a sustainable network to respond to the latest in a series of public health emergencies. In this respect, CERN is a roadmap for how a region can prepare to respond to public health emergencies, thereby preventing negative impacts through planning and implementation. © The Author 2015. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. x509-free access to WLCG resources

    NASA Astrophysics Data System (ADS)

    Short, H.; Manzi, A.; De Notaris, V.; Keeble, O.; Kiryanov, A.; Mikkonen, H.; Tedesco, P.; Wartel, R.

    2017-10-01

    Access to WLCG resources is authenticated using an x509 and PKI infrastructure. Even though HEP users have always been exposed to certificates directly, the development of modern web applications by the LHC experiments calls for simplified authentication processes that keep the underlying software unmodified. In this work we show a solution with the goal of providing access to WLCG resources using the credentials of the user's home organisation, without the need for user-acquired x509 certificates. In particular, we focus on identity providers within eduGAIN, which interconnects research and education organisations worldwide and enables the trustworthy exchange of identity-related information. eduGAIN has been integrated into the CERN SSO infrastructure so that users can authenticate without the need for a CERN account. This solution achieves x509-free access to Grid resources with the help of two services: an STS and an online CA. The STS (Security Token Service) allows credential translation from the SAML2 format used by identity federations to the VOMS-enabled x509 format used by most of the Grid. The IOTA CA (Identifier-Only Trust Assurance Certification Authority) is responsible for the automatic issuing of short-lived x509 certificates. The IOTA CA deployed at CERN has been accepted by EUGridPMA as the CERN LCG IOTA CA, included in the IGTF trust anchor distribution and installed by the sites in WLCG. We also describe the first pilot projects which are integrating the solution.
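    The credential translation performed by the STS can be pictured as a mapping from federated identity attributes to the fields of a short-lived certificate request. A minimal, illustrative sketch follows; all class names, fields, and the DN format are hypothetical stand-ins (the real STS speaks SAML2/WS-Trust, not this toy API):

```python
from dataclasses import dataclass

@dataclass
class SamlAssertion:
    # Hypothetical, simplified stand-in for a SAML2 assertion from eduGAIN
    issuer: str
    subject: str          # e.g. a federated principal name
    attributes: dict

@dataclass
class ShortLivedCertRequest:
    # Fields a hypothetical online CA (IOTA-profile) might need
    distinguished_name: str
    vo: str               # Virtual Organisation for VOMS attributes
    lifetime_hours: int

def translate(assertion: SamlAssertion, vo: str) -> ShortLivedCertRequest:
    """Toy credential translation: derive a DN from the federated identity."""
    dn = f"/DC=example/CN={assertion.subject}"
    # IOTA-profile certificates are short-lived by design
    return ShortLivedCertRequest(distinguished_name=dn, vo=vo, lifetime_hours=12)

req = translate(SamlAssertion("https://idp.example.org", "jdoe@example.org",
                              {"eduPersonAffiliation": "member"}), vo="atlas")
print(req.distinguished_name)  # -> /DC=example/CN=jdoe@example.org
```

    The point of the sketch is the separation of concerns the abstract describes: the STS maps identities, while the issuing of the actual x509 credential is delegated to the online CA.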

  19. News Music: Here comes science that rocks Student trip: Two views of the future of CERN Classroom: Researchers can motivate pupils Appointment: AstraZeneca trust appoints new director Multimedia: Physics Education comes to YouTube Competition: Students compete in European Union Science Olympiad 2010 Physics roadshow: Pupils see wonders of physics

    NASA Astrophysics Data System (ADS)

    2010-07-01

    Music: Here comes science that rocks
    Student trip: Two views of the future of CERN
    Classroom: Researchers can motivate pupils
    Appointment: AstraZeneca trust appoints new director
    Multimedia: Physics Education comes to YouTube
    Competition: Students compete in European Union Science Olympiad 2010
    Physics roadshow: Pupils see wonders of physics

  20. AMS data production facilities at science operations center at CERN

    NASA Astrophysics Data System (ADS)

    Choutko, V.; Egorov, A.; Eline, A.; Shan, B.

    2017-10-01

    The Alpha Magnetic Spectrometer (AMS) is a high energy physics experiment on board the International Space Station (ISS). This paper presents the hardware and software facilities of the Science Operations Center (SOC) at CERN. Data production is built around the production server, a scalable distributed service which links together a set of different programming modules for science data transformation and reconstruction. The server has the capacity to manage 1000 parallel job producers, i.e. up to 32K logical processors. The monitoring and management tool with its production GUI is also described.
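    The coordination pattern described above (one server fanning work out to many parallel job producers and collecting their results) can be pictured with a single-host stand-in using Python's multiprocessing. The real AMS production server is a distributed, multi-node service; the function and result shapes here are purely illustrative:

```python
from multiprocessing import Pool

def produce(run_id: int) -> tuple:
    """Stand-in for one job producer transforming a block of science data."""
    reconstructed_events = run_id * 10  # placeholder for real reconstruction work
    return (run_id, reconstructed_events)

if __name__ == "__main__":
    # The coordinator dispatches runs to a pool of producers and gathers
    # results, mirroring (in miniature) the production server's role.
    with Pool(processes=4) as pool:
        results = dict(pool.map(produce, range(8)))
    print(results[3])  # -> 30
```

    Scaling this idea across hosts rather than processes is what lets one service drive thousands of producers.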

  1. Ceremony for the 25th anniversary of CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2006-05-08

    Celebration of CERN's 25th anniversary (day by day), with speeches by L. Van Hove and J. B. Adams and musical interludes offered by Mme Mey and her colleagues (opening with the first movement of L. van Beethoven's Piano Quartet No. 3). The Directors-General will present the commemorative gift to staff members with 25 years of service in the Organization. A token of gratitude is also given to the interpreter Mme Zwerner.

  2. Experience in running relational databases on clustered storage

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, Ruben; Potocky, Miroslav

    2015-12-01

    For the past eight years, the CERN IT Database group has based its backend storage on a NAS (Network-Attached Storage) architecture, providing database access via the NFS (Network File System) protocol. In the last two and a half years, our storage has evolved from a scale-up architecture to a scale-out one. This paper describes our setup and a set of functionalities providing key features to other services, such as Database on Demand [1] or the CERN Oracle backup and recovery service. It also outlines a possible evolution path that storage for databases could follow.

  3. CERN Computing in Commercial Clouds

    NASA Astrophysics Data System (ADS)

    Cordeiro, C.; Field, L.; Garrido Bear, B.; Giordano, D.; Jones, B.; Keeble, O.; Manzi, A.; Martelli, E.; McCance, G.; Moreno-García, D.; Traylen, S.

    2017-10-01

    By the end of 2016, more than 10 million core-hours of computing resources had been delivered by several commercial cloud providers to the four LHC experiments to run their production workloads, from simulation to full-chain processing. In this paper we describe the experience gained at CERN in procuring and exploiting commercial cloud resources for the computing needs of the LHC experiments. The mechanisms used for provisioning, monitoring, accounting, alarming and benchmarking are discussed, as well as the involvement of the LHC collaborations in terms of managing the workflows of the experiments within a multicloud environment.

  4. The ISOLDE LEGO® robot: building interest in frontier research

    NASA Astrophysics Data System (ADS)

    Elias Cocolios, Thomas; Lynch, Kara M.; Nichols, Emma

    2017-07-01

    An outreach programme centred around nuclear physics making use of a LEGO® Mindstorm® kit is presented. It consists of a presentation given by trained undergraduate students as science ambassadors followed by a workshop where the target audience programs the LEGO® Mindstorm® robots to familiarise themselves with the concepts in an interactive and exciting way. This programme has been coupled with the CERN-ISOLDE 50th anniversary and the launch of the CERN-MEDICIS facility in Geneva, Switzerland. The modular aspect of the programme readily allows its application to other topics.

  5. Neutron-induced fission cross section measurement of 233U, 241Am and 243Am in the energy range 0.5 MeV ≤ En ≤ 20 MeV at n TOF at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belloni, F.; Milazzo, P. M.; Calviani, M.

    2012-01-01

    Neutron-induced fission cross section measurements of 233U, 243Am and 241Am relative to 235U have been carried out at the neutron time-of-flight facility n TOF at CERN. A fast ionization chamber was employed. All samples were located in the same detector; the studied elements and the reference 235U target were therefore subject to the same neutron beam.
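    The relative measurement works because both the sample under study and the 235U reference see the same neutron flux, so the flux cancels in the ratio of fission counts. Schematically (the numbers below are illustrative, not the paper's data):

```python
def cross_section_relative(counts_x, counts_ref, atoms_x, atoms_ref, sigma_ref):
    """sigma_x = sigma_ref * (C_x / C_ref) * (N_ref / N_x).

    Valid when both targets are exposed to the same beam, as in the
    shared ionization-chamber geometry described above; efficiency
    corrections are ignored in this sketch.
    """
    return sigma_ref * (counts_x / counts_ref) * (atoms_ref / atoms_x)

# Illustrative numbers only (counts, atom numbers, reference sigma in barns):
sigma = cross_section_relative(counts_x=5200, counts_ref=10400,
                               atoms_x=1.0e20, atoms_ref=2.0e20,
                               sigma_ref=1.2)
print(round(sigma, 3))  # -> 1.2
```

    In a real analysis the counts would also be binned in neutron time of flight, giving the cross section as a function of energy.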

  6. The CERN-EU high-energy Reference Field (CERF) facility: applications and latest developments

    NASA Astrophysics Data System (ADS)

    Silari, Marco; Pozzi, Fabio

    2017-09-01

    The CERF facility at CERN provides an almost unique high-energy workplace reference radiation field for the calibration and testing of radiation protection instrumentation employed at high-energy accelerator facilities and for aircraft and space dosimetry. This paper describes the main features of the facility and supplies a non-exhaustive list of applications for which CERF has been used since 2005. Upgrade work, started in 2015 to provide the scientific and industrial communities with a state-of-the-art reference facility, is also discussed.

  7. Windows Terminal Servers Orchestration

    NASA Astrophysics Data System (ADS)

    Bukowiec, Sebastian; Gaspar, Ricardo; Smith, Tim

    2017-10-01

    Windows Terminal Servers provide application gateways for various parts of the CERN accelerator complex, used by hundreds of CERN users every day. The combination of new tools such as Puppet, HAProxy and the Microsoft System Center suite enables automation of provisioning workflows to provide a terminal server infrastructure that can scale up and down in an automated manner. The orchestration not only reduces the time and effort necessary to deploy new instances, but also facilitates operations such as patching, analysis and recreation of compromised nodes, as well as catering for workload peaks.
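    A setup like the one described typically puts a TCP load balancer such as the HAProxy mentioned above in front of the terminal server pool. A minimal, hypothetical fragment for RDP traffic (hostnames and server count are invented; the actual CERN configuration is not described in this abstract):

```
# Hypothetical HAProxy fragment: RDP traffic balanced across a scalable pool
frontend rdp_gateway
    bind *:3389
    mode tcp
    default_backend terminal_servers

backend terminal_servers
    mode tcp
    balance leastconn          # keep session counts even across nodes
    server ts01 ts01.example.cern.ch:3389 check
    server ts02 ts02.example.cern.ch:3389 check
```

    `balance leastconn` suits long-lived RDP sessions better than round-robin, and adding or draining a `server` line is what makes automated scale-up and scale-down straightforward.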

  8. Astronomy, ecology and poetry by Hubert Reeves

    ScienceCinema

    None

    2017-12-09

    Hubert Reeves: the astrophysicist gives a lecture and then talks with the writer François Bon about "Astronomy, ecology and poetry". For more information: http://outreach.web.cern.ch/outreach/FR/evenements/conferences.html. Limited number of seats; reservation required at the CERN Reception: +41 22 767 76 76. The evening is broadcast live on the web: http://webcast.cern.ch/

  9. Unified Monitoring Architecture for IT and Grid Services

    NASA Astrophysics Data System (ADS)

    Aimar, A.; Aguado Corman, A.; Andrade, P.; Belov, S.; Delgado Fernandez, J.; Garrido Bear, B.; Georgiou, M.; Karavakis, E.; Magnoni, L.; Rama Ballesteros, R.; Riahi, H.; Rodriguez Martinez, J.; Saiz, P.; Zolnai, D.

    2017-10-01

    This paper provides a detailed overview of the Unified Monitoring Architecture (UMA) that aims at merging the monitoring of the CERN IT data centres and the WLCG monitoring using common and widely-adopted open source technologies such as Flume, Elasticsearch, Hadoop, Spark, Kibana, Grafana and Zeppelin. It provides insights and details on the lessons learned, explaining the work performed in order to monitor the CERN IT data centres and the WLCG computing activities such as the job processing, data access and transfers, and the status of sites and services.
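    A unified pipeline of this kind usually normalises heterogeneous inputs into one common event envelope before they flow through Flume into Elasticsearch. A toy normaliser is sketched below; the envelope field names are assumptions for illustration, not the actual UMA schema:

```python
import json
import time

def normalise(producer, payload, ts=None):
    """Wrap a raw monitoring record in a minimal common envelope."""
    event = {
        "producer": producer,            # e.g. "job-monitor", "transfer-monitor"
        "timestamp": ts if ts is not None else time.time(),
        "data": payload,                 # the source-specific record, unchanged
    }
    # Serialised documents of one shape can be indexed and queried uniformly.
    return json.dumps(event, sort_keys=True)

doc = normalise("job-monitor", {"site": "CERN-PROD", "running": 1423}, ts=0.0)
print(doc)
```

    Keeping the source-specific record intact under a common envelope is what lets one Kibana/Grafana stack serve both data centre and WLCG metrics.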

  10. Centralized Monitoring of the Microsoft Windows-based computers of the LHC Experiment Control Systems

    NASA Astrophysics Data System (ADS)

    Varela Rodriguez, F.

    2011-12-01

    The control system of each of the four major experiments at the CERN Large Hadron Collider (LHC) is distributed over up to 160 computers running either Linux or Microsoft Windows. A quick response to abnormal situations of the computer infrastructure is crucial to maximize the physics usage. For this reason, a tool was developed to supervise, identify errors in and troubleshoot such a large system. Although monitoring of the performance of the Linux computers and their processes has been available since the first versions of the tool, only recently has the software package been extended to provide similar functionality for the nodes running Microsoft Windows, as this platform is the most commonly used in the LHC detector control systems. In this paper, the architecture and functionality of the Windows Management Instrumentation (WMI) client developed to provide centralized monitoring of the nodes running different flavours of the Microsoft platform, as well as the interface to the SCADA software of the control systems, are presented. The tool is currently being commissioned by the experiments and has already proven to be very efficient in optimizing the running systems and in detecting misbehaving processes or nodes.

  11. Commissioning and initial experience with the ALICE on-line

    NASA Astrophysics Data System (ADS)

    Altini, V.; Anticic, T.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Dénes, E.; Divià, R.; Fuchs, U.; Kiss, T.; Makhlyueva, I.; Roukoutakis, F.; Schossmaier, K.; Soós, C.; Vande Vyvre, P.; von Haller, B.; ALICE Collaboration

    2010-04-01

    ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). A large-bandwidth and flexible Data Acquisition System (DAQ) has been designed and deployed to collect sufficient statistics in the short running time available per year for heavy ions and to accommodate the very different requirements originating from the 18 sub-detectors. This paper will present the large scale tests conducted to assess the standalone DAQ performance, the interfaces with the other online systems and the extensive commissioning performed in order to be fully prepared for physics data taking. It will review the experience accumulated since May 2007 during the standalone commissioning of the main detectors and the global cosmic runs, and the lessons learned from this exposure on the "battle field". It will also discuss the test protocol followed to integrate and validate each sub-detector with the online systems, and it will conclude with the first results of the LHC injection tests and startup in September 2008. Several papers at the same conference present elements of the ALICE DAQ system in more detail.

  12. YODA++: A proposal for a semi-automatic space mission control

    NASA Astrophysics Data System (ADS)

    Casolino, M.; de Pascale, M. P.; Nagni, M.; Picozza, P.

    YODA++ is a proposal for a semi-automated data handling and analysis system for the PAMELA space experiment. The core of the routines has been developed to process a stream of raw data downlinked from the Resurs DK1 satellite (housing PAMELA) to the ground station in Moscow. Raw data consist of scientific data complemented by housekeeping information. Housekeeping information will be analyzed within a short time from download (1 h) in order to monitor the status of the experiment and to plan the mission acquisition. A prototype for the data visualization will run on an Apache Tomcat web application server, providing an off-line analysis tool usable from a browser and part of the code for system maintenance. The data retrieval development is in the production phase, while a GUI interface for human-friendly monitoring is in a preliminary phase, as is a JavaServer Pages/JavaServer Faces (JSP/JSF) web application facility. On a longer timescale (1-3 h from download) the scientific data are analyzed. The data storage core will be a mix of CERN's ROOT file structure and MySQL as a relational database. YODA++ is currently being used in the integration and testing on ground of PAMELA data.
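    The "ROOT files plus MySQL" split keeps bulk event data in files while only searchable metadata lives in the relational database. A sketch of the relational side follows, using sqlite3 as a stand-in for MySQL; the table and column names are invented for illustration:

```python
import sqlite3

# In-memory stand-in for the MySQL catalogue; bulk event data would stay
# in ROOT files, referenced here only by name.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE downlinks (
    session_id      INTEGER PRIMARY KEY,
    root_file       TEXT NOT NULL,   -- where the event data actually lives
    downloaded_at   TEXT,
    housekeeping_ok INTEGER          -- result of the ~1 h housekeeping check
)""")
db.execute("INSERT INTO downlinks VALUES (?, ?, ?, ?)",
           (1, "pamela_session_0001.root", "2006-07-01T12:00:00", 1))

# The catalogue answers "which sessions are good to analyse?" without
# opening any ROOT file.
row = db.execute(
    "SELECT root_file FROM downlinks WHERE housekeeping_ok = 1").fetchone()
print(row[0])  # -> pamela_session_0001.root
```

    Queries stay fast because the database holds only small per-session records; the heavy I/O happens only when a selected ROOT file is actually opened.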

  13. Mad-X a worthy successor for MAD8?

    NASA Astrophysics Data System (ADS)

    Schmidt, F.

    2006-03-01

    MAD-X is the successor at CERN to MAD8, a program for accelerator design and simulation with a long history. We had to give up on MAD8 since the code had evolved in such a way that maintenance and upgrades had become increasingly difficult. In particular, the memory management with the Zebra banks seemed outdated. MAD-X was first released in June 2002. It offers most of the MAD8 functionality, with some additions, corrections, and extensions. The most important of these extensions is the interface to PTC, the Polymorphic Tracking Code by E. Forest. The most relevant new features of MAD-X are: languages C, Fortran77 and Fortran90; dynamic memory allocation in the core program written in C; a strictly modular organization; a modified and extended input language; symplectic and arbitrarily exact description of all elements via PTC; and Taylor maps and normal form techniques using PTC. It is also important to note that we have adopted a new style for program development and maintenance that relies heavily on active maintenance of modules by the users themselves. Proposals for collaboration, such as those with KEK (Japan) and GSI (Germany), are therefore very welcome.

  14. Graphics Processors in HEP Low-Level Trigger Systems

    NASA Astrophysics Data System (ADS)

    Ammendola, Roberto; Biagioni, Andrea; Chiozzi, Stefano; Cotta Ramusino, Angelo; Cretaro, Paolo; Di Lorenzo, Stefano; Fantechi, Riccardo; Fiorini, Massimiliano; Frezza, Ottorino; Lamanna, Gianluca; Lo Cicero, Francesca; Lonardo, Alessandro; Martinelli, Michele; Neri, Ilaria; Paolucci, Pier Stanislao; Pastorelli, Elena; Piandani, Roberto; Pontisso, Luca; Rossetti, Davide; Simula, Francesco; Sozzi, Marco; Vicini, Piero

    2016-11-01

    Usage of Graphics Processing Units (GPUs) in the so called general-purpose computing is emerging as an effective approach in several fields of science, although so far applications have been employing GPUs typically for offline computations. Taking into account the steady performance increase of GPU architectures in terms of computing power and I/O capacity, the real-time applications of these devices can thrive in high-energy physics data acquisition and trigger systems. We will examine the use of online parallel computing on GPUs for the synchronous low-level trigger, focusing on tests performed on the trigger system of the CERN NA62 experiment. To successfully integrate GPUs in such an online environment, latencies of all components need analysing, networking being the most critical. To keep it under control, we envisioned NaNet, an FPGA-based PCIe Network Interface Card (NIC) enabling GPUDirect connection. Furthermore, it is assessed how specific trigger algorithms can be parallelized and thus benefit from a GPU implementation, in terms of increased execution speed. Such improvements are particularly relevant for the foreseen Large Hadron Collider (LHC) luminosity upgrade where highly selective algorithms will be essential to maintain sustainable trigger rates with very high pileup.

  15. Measurements of the production cross section of a $Z$ boson in association with jets in pp collisions at $$\\sqrt{s} = 13$$ TeV with the ATLAS detector

    DOE PAGES

    Aaboud, M.; Aad, G.; Abbott, B.; ...

    2017-05-31

    Measurements of the production cross section of a Z boson in association with jets in proton–proton collisions at √s = 13 TeV are presented, using data corresponding to an integrated luminosity of 3.16 fb⁻¹ collected by the ATLAS experiment at the CERN Large Hadron Collider in 2015. Inclusive and differential cross sections are measured for events containing a Z boson decaying to electrons or muons and produced in association with up to seven jets with pT > 30 GeV and |y| < 2.5. Predictions from different Monte Carlo generators based on leading-order and next-to-leading-order matrix elements for up to two additional partons interfaced with parton shower, and fixed-order predictions at next-to-leading order and next-to-next-to-leading order, are compared with the measured cross sections. Good agreement within the uncertainties is observed for most of the modelled quantities, in particular with the generators which use next-to-leading-order matrix elements and the more recent next-to-next-to-leading-order fixed-order predictions.

  16. Antiproton Trapping for Advanced Space Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Smith, Gerald A.

    1998-01-01

    The Summary of Research parallels the Statement of Work (Appendix I) submitted with the proposal, and funded effective Feb. 1, 1997 for one year. A proposal was submitted to CERN in October, 1996 to carry out an experiment on the synthesis and study of fundamental properties of atomic antihydrogen. Since confined atomic antihydrogen is potentially the most powerful and elegant source of propulsion energy known, its confinement and properties are of great interest to the space propulsion community. Appendix II includes an article published in the technical magazine Compressed Air, June 1997, which describes CERN antiproton facilities, and ATHENA. During the period of this grant, Prof. Michael Holzscheiter served as spokesman for ATHENA and, in collaboration with Prof. Gerald Smith, worked on the development of the antiproton confinement trap, which is an important part of the ATHENA experiment. Appendix III includes a progress report submitted to CERN on March 12, 1997 concerning development of the ATHENA detector. Section 4.1 reviews technical responsibilities within the ATHENA collaboration, including the Antiproton System, headed by Prof. Holzscheiter. The collaboration was advised (see Appendix IV) on June 13, 1997 that the CERN Research Board had approved ATHENA for operation at the new Antiproton Decelerator (AD), presently under construction. First antiproton beams are expected to be delivered to experiments in about one year. Progress toward assembly of the ATHENA detector and initial testing expected in 1999 has been excellent. Appendix V includes a copy of the minutes of the most recently documented collaboration meeting held at CERN of October 24, 1997, which provides more information on development of systems, including the antiproton trapping apparatus. On February 10, 1998 Prof. Smith gave a 3 hour lecture on the Physics of Antimatter, as part of the Physics for the Third Millennium Lecture Series held at MSFC. 
Included in Appendix VI are notes and graphs presented on the ATHENA experiment. A portable antiproton trap has also been under development; the goal is to store and transport antiprotons from a production site, such as Fermilab near Chicago, to a distant site, such as Huntsville, AL, thus demonstrating the portability of antiprotons.

  17. [The Big Data Game : On the Ludic Constitution of the Collaborative Production of Knowledge in High-Energy Physics at CERN].

    PubMed

    Dippel, Anne

    2017-12-01

    This article looks at how games and play contribute to the big data-driven production of knowledge in High-Energy Physics, with a particular focus on the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN), where the author has been conducting anthropological fieldwork since 2014. The ludic (playful) aspect of knowledge production is analyzed here in three different dimensions: the Symbolic, the Ontological, and the Epistemic. The first one points towards CERN as place where a cosmological game of probability is played with the help of Monte-Carlo simulations. The second one can be seen in the agonistic infrastructures of competing experimental collaborations. The third dimension unfolds in ludic platforms, such as online Challenges and citizen science games, which contribute to the development of machine learning algorithms, whose function is necessary in order to process the huge amount of data gathered from experimental events. Following Clifford Geertz, CERN itself is characterized as a site of deep play, a concept that contributes to understanding wider social and cultural orders through the analysis of ludic collective phenomena. The article also engages with Peter Galison's idea of the trading zone, proposing to comprehend it in the age of big data as a Playground. Thus the author hopes to contribute to a wider discussion in the historiographical and social study of science and technology, as well as in cultural anthropology, by recognizing the ludic in science as a central element of understanding collaborative knowledge production.

  18. Comparison of electromagnetic and hadronic models generated using Geant 4 with antiproton dose measured in CERN.

    PubMed

    Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan

    2015-01-01

    After the idea of antiproton cancer treatment was proposed in 1984, many experiments were launched to investigate the physical and radiobiological properties of antiprotons arising from their annihilation reactions. One of these experiments has been carried out at the European Organization for Nuclear Research (CERN) using the Antiproton Decelerator. The ultimate goal of this experiment was to assess the dosimetric and radiobiological properties of antiproton beams in order to estimate the suitability of antiprotons for radiotherapy. One difficulty was that the antiproton beam at CERN was unavailable for a long time, so the verification of Monte Carlo codes able to simulate the antiproton depth dose could be useful. Among available simulation codes, Geant4 provides acceptable flexibility and extensibility, which have progressively led to the development of novel Geant4 applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to the CERN antiproton beam energy was calculated with Geant4, using all the standard physics lists currently available and benchmarked for other use cases. Overall, none of the standard physics lists was able to reproduce the antiproton percentage depth dose. Although the results with some models were promising, the Bragg peak level remained the point of concern for our study. It is concluded that the Bertini model with high-precision neutron tracking (QGSP_BERT_HP) best matches the experimental data, although it is also the slowest model to simulate events among the physics lists.
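    The comparison criterion in such a study reduces to normalising each simulated depth-dose curve to its maximum and comparing the Bragg-peak position and plateau level against measurement. A schematic helper on synthetic numbers (these are not CERN or Geant4 data; bin width and values are invented):

```python
def percentage_depth_dose(dose):
    """Normalise a depth-dose curve so its maximum (the Bragg peak) is 100%."""
    peak = max(dose)
    return [100.0 * d / peak for d in dose]

def bragg_peak_depth(dose, step_mm):
    """Depth of the maximum-dose bin, assuming uniform scoring bins."""
    return dose.index(max(dose)) * step_mm

# Synthetic curve: shallow entrance plateau, sharp peak near end of range.
dose = [1.0, 1.1, 1.2, 1.4, 2.0, 4.5, 9.0, 3.0, 0.2]
pdd = percentage_depth_dose(dose)
print(bragg_peak_depth(dose, step_mm=2.0))  # -> 12.0
print(round(pdd[0], 2))                     # entrance dose in % of peak
```

    A physics list "reproducing the percentage depth dose" then means both the peak depth and the plateau-to-peak ratio agree with the measured curve within uncertainties.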

  19. Monitoring Evolution at CERN

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Fiorini, B.; Murphy, S.; Pigueiras, L.; Santos, M.

    2015-12-01

    Over the past two years, the operation of the CERN Data Centres went through significant changes with the introduction of new mechanisms for hardware procurement and new services for cloud provisioning and configuration management, among other improvements. These changes resulted in an increase of resources being operated in a more dynamic environment. Today, the CERN Data Centres provide over 11000 multi-core processor servers, 130 PB of disk servers, 100 PB of tape robots, and 150 high performance tape drives. To cope with these developments, an evolution of the data centre monitoring tools was also required. This modernisation was based on a number of guiding rules: sustain the increase of resources, adapt to the new dynamic nature of the data centres, make monitoring data easier to share, give more flexibility to Service Managers in how they publish and consume monitoring metrics and logs, establish a common repository of monitoring data, optimise the handling of monitoring notifications, and replace the previous toolset with new open source technologies with large adoption and community support. This contribution describes how these improvements were delivered, presents the architecture and technologies of the new monitoring tools, and reviews the experience of their production deployment.

  20. COMMITTEES: SQM 2007 - International Conference On Strangeness In Quark Matter SQM 2007 - International Conference On Strangeness In Quark Matter

    NASA Astrophysics Data System (ADS)

    2008-04-01

    Local Organising Committee: Ivan Králik (IEP SAS, Košice); Vojtěch Petráček (Czech Technical University, Prague); Ján Pišút (Comenius University, Bratislava); Emanuele Quercigh (CERN); Karel Šafařík (CERN), co-chair; Ladislav Šándor (IEP SAS, Košice), co-chair; Boris Tomášik (Matej Bel University, Banská Bystrica); Jozef Urbán (UPJŠ, Košice).
    International Advisory Committee: Jörg Aichelin (Nantes); Federico Antinori (Padova); Tamás Biró (Budapest); Peter Braun-Munzinger (GSI); Jean Cleymans (Cape Town); László Csernai (Bergen); Timothy Hallman (BNL); Huan Zhong Huang (UCLA); Sonja Kabana (Nantes); Roy A. Lacey (Stony Brook); Carlos Lourenço (CERN); Yu-Gang Ma (Shanghai); Jes Madsen (Aarhus); Yasuo Miake (Tsukuba); Berndt Müller (Duke); Grazyna Odyniec (LBNL); Helmut Oeschler (Darmstadt); Jan Rafelski (Arizona); Hans Georg Ritter (LBNL); Jack Sandweiss (Yale); George S. F. Stephans (MIT); Horst Stöcker (Frankfurt); Thomas Ullrich (BNL); Orlando Villalobos-Baillie (Birmingham); William A. Zajc (Columbia).

  1. First experimental evidence of hydrodynamic tunneling of ultra–relativistic protons in extended solid copper target at the CERN HiRadMat facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, R.; Grenier, D.; Wollmann, D.

    2014-08-15

    A novel experiment has been performed at the CERN HiRadMat test facility to study the impact of the 440 GeV proton beam generated by the Super Proton Synchrotron on extended solid copper cylindrical targets. Substantial hydrodynamic tunneling of the protons in the target material has been observed that leads to significant lengthening of the projectile range, which confirms our previous theoretical predictions [N. A. Tahir et al., Phys. Rev. Spec. Top.-Accel. Beams 15, 051003 (2012)]. Simulation results show very good agreement with the experimental measurements. These results have very important implications for the machine protection design of powerful machines like the Large Hadron Collider (LHC), the future High Luminosity LHC, and the proposed huge 80 km circumference Future Circular Collider, which is currently being discussed at CERN. Another very interesting outcome of this work is that one may also study the field of High Energy Density Physics at this test facility.

  2. LHC, the Big Bang in a test tube

    ScienceCinema

    None

    2017-12-09

    Our understanding of the Universe is changing… Science Café, open to the public. Debate moderated by Marie-Odile Montchicourt, journalist at France Info. Event held by videoconference between the Globe of Science and Innovation, the Baloard bar in Montpellier and the Maison des Métallos in Paris. Speakers at CERN: Philippe Charpentier and Daniel Froideveaux, physicists at CERN. Speakers in Paris: Vincent Bontemps, philosopher and researcher at CEA; Jacques Arnould, philosopher, historian of science and theologian; Jean-Jacques Beineix, film director, producer and screenwriter. Speakers in Montpellier (LPTA): André Neveu, theoretical physicist and research director at CNRS; Gilbert Moultaka, theoretical physicist and research fellow at CNRS. Partners: CERN, CEA, IN2P3, Université MPL2 (LPTA). Part of the 2008 Fête de la science.

  3. Laser resonance ionization spectroscopy on lutetium for the MEDICIS project

    NASA Astrophysics Data System (ADS)

    Gadelshin, V.; Cocolios, T.; Fedoseev, V.; Heinke, R.; Kieck, T.; Marsh, B.; Naubereit, P.; Rothe, S.; Stora, T.; Studer, D.; Van Duppen, P.; Wendt, K.

    2017-11-01

    The MEDICIS-PROMED Innovative Training Network under the EU Horizon 2020 programme aims to establish a network of early-stage researchers, involving scientific exchange and active cooperation between leading European research institutions, universities, hospitals, and industry. Its primary scientific goal is to provide and test novel radioisotopes for nuclear medical imaging and radionuclide therapy. Within a closely linked project at CERN, a dedicated electromagnetic mass separator system is presently being installed for the production of innovative radiopharmaceutical isotopes at the new CERN-MEDICIS laboratory, directly adjacent to the existing CERN-ISOLDE radioactive ion beam facility. It is planned to implement a resonance ionization laser ion source (RILIS) to ensure high efficiency and unrivaled purity in the production of radioactive ions. To provide a highly efficient ionization process, the identification and characterization of a specific multi-step laser ionization scheme is required for each individual element with isotopes of interest. The element lutetium is of primary relevance and was therefore considered as the first candidate. Three two-step excitation schemes for lutetium atoms are presented in this work, and the spectroscopic results are compared with data from other authors.

  4. Outsourcing strategy and tendering methodology for the operation and maintenance of CERN’s cryogenic facilities

    NASA Astrophysics Data System (ADS)

    Serio, L.; Bremer, J.; Claudet, S.; Delikaris, D.; Ferlin, G.; Ferrand, F.; Pezzetti, M.; Pirotte, O.

    2017-12-01

    CERN operates and maintains the world's largest cryogenic infrastructure, ranging from ageing but well-maintained installations feeding detectors, test facilities and general services, to the state-of-the-art cryogenic system serving the flagship LHC machine complex. A study was conducted and a methodology proposed to outsource the operation and maintenance of the whole cryogenic infrastructure to industry. The cryogenic installations coupled to non-LHC detectors, test facilities and general services infrastructure have been fully outsourced for operation and maintenance on the basis of performance obligations. The contractor is responsible for the operational performance of the installations based on a yearly operation schedule provided by CERN. The maintenance of the cryogenic system serving the LHC machine and its detectors has been outsourced on the basis of task-oriented obligations, monitored by key performance indicators. The CERN operation team, with the support of the contractor's operation team, remains responsible for the operational strategy and performance. We report the analysis, strategy, and definition of the requirements and technical specifications, as well as the technical and economic performance achieved after one year of operation.

  5. Novel apparatus and methods for performing remotely controlled particle-solid interaction experiments at CERN

    NASA Astrophysics Data System (ADS)

    Krause, H. F.; Deveney, E. F.; Jones, N. L.; Vane, C. R.; Datz, S.; Knudsen, H.; Grafström, P.; Schuch, R.

    1997-04-01

    Recent atomic physics studies involving ultrarelativistic Pb ions required solid target positioners, scintillators, and a sophisticated data acquisition and control system placed in a remote location at the CERN Super Proton Synchrotron near Geneva, Switzerland. The apparatus, installed in a high-radiation zone underground, had to (i) function for months, (ii) automatically respond to failures such as power outages and particle-induced computer upsets, and (iii) communicate with the outside world via a telephone line. The heart of the apparatus developed was an Apple Macintosh-based CAMAC system that answered the telephone and interpreted and executed remote control commands that (i) sensed and set targets, (ii) controlled voltages and discriminator levels for scintillators, (iii) modified data acquisition hardware logic, (iv) reported control information, and (v) automatically synchronized data acquisition to the CERN spill cycle via a modem signal and transmitted experimental data to a remote computer. No problems were experienced using intercontinental telephone connections at 1200 baud. Our successful "virtual laboratory" approach that uses off-the-shelf electronics is generally adaptable to more conventional bench-type experiments.

  6. Novel approaches for inspiring students and electrifying the public

    NASA Astrophysics Data System (ADS)

    Lidström, Suzy; Read, Alex; Parke, Stephen; Allen, Roland; Goldfarb, Steven; Mehlhase, Sascha; Ekelöf, Tord; Walker, Alan

    2014-03-01

    We will briefly summarize a wide variety of innovative approaches for inspiring students and stimulating broad public interest in fundamental physics research, as exemplified by recent activities related to the Higgs boson discovery and the Englert-Higgs Nobel Prize on behalf of the Swedish Academy, CERN, Fermilab, and the Niels Bohr Institute. Personal interactions with the scientists themselves can be particularly electrifying, and these were encouraged by the wearing of "Higgs Boson? Ask Me!" badges, which will be made available to those attending this talk. At CERN, activities include Virtual Visits, (Google) Hangout with CERN, initiatives to grab attention (LEGO models, music videos, art programs, pins, etc.), substantive communication (lab visits and events, museum exhibits, traveling exhibits, local visits, Masterclasses, etc.), and educational activities (summer student programs, semester abroad programs, internships, graduate programs, etc.). For serious students and their teachers, or scientists in other areas, tutorial articles are appropriate. These are most effective if they also incorporate innovative approaches - for example, attractive figures that immediately illustrate the concepts, analogies that will resonate with the reader, and a broadening of perspective. Physica Scripta, Royal Swedish Academy of Sciences.

  7. Enhancing moral agency: clinical ethics residency for nurses.

    PubMed

    Robinson, Ellen M; Lee, Susan M; Zollfrank, Angelika; Jurchak, Martha; Frost, Debra; Grace, Pamela

    2014-09-01

    One antidote to moral distress is stronger moral agency-that is, an enhanced ability to act to bring about change. The Clinical Ethics Residency for Nurses, an educational program developed and run in two large northeastern academic medical centers with funding from the Health Resources and Services Administration, intended to strengthen nurses' moral agency. Drawing on Improving Competencies in Clinical Ethics Consultation: An Education Guide, by the American Society for Bioethics and Humanities, and on the goals of the nursing profession, CERN sought to change attitudes, increase knowledge, and develop skills to act on one's knowledge. One of the key insights the faculty members brought to the design of this program is that knowledge of clinical ethics is not enough to develop moral agency. In addition to lecture-style classes, CERN employed a variety of methods based in adult learning theory, such as active application of ethics knowledge to patient scenarios in classroom discussion, simulation, and the clinical practicum. Overwhelmingly, the feedback from the participants (sixty-seven over three years of the program) indicated that CERN achieved transformative learning. © 2014 by The Hastings Center.

  8. Techniques for hazard analysis and their use at CERN.

    PubMed

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, and the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have been shown to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  9. Evaluation results of xTCA equipment for HEP experiments at CERN

    NASA Astrophysics Data System (ADS)

    Di Cosmo, M.; Bobillier, V.; Haas, S.; Joos, M.; Mico, S.; Vasey, F.; Vichoudis, P.

    2013-12-01

    The MicroTCA and AdvancedTCA industry standards are candidate modular electronic platforms for the upgrade of the current generation of high energy physics experiments. The PH-ESE group at CERN launched in 2011 the xTCA evaluation project with the aim of performing technical evaluations and eventually providing support for commercially available components. Different devices from different vendors have been acquired, evaluated and interoperability tests have been performed. This paper presents the test procedures and facilities that have been developed and focuses on the evaluation results including electrical, thermal and interoperability aspects.

  10. ALICE inner tracking system readout electronics prototype testing with the CERN "Giga Bit Transceiver''

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schambach, Joachim; Rossewij, M. J.; Sielewicz, K. M.

    The ALICE Collaboration is preparing a major detector upgrade for the LHC Run 3, which includes the construction of a new silicon pixel based Inner Tracking System (ITS). The ITS readout system consists of 192 readout boards to control the sensors and their power system, receive triggers, and deliver sensor data to the DAQ. To prototype various aspects of this readout system, an FPGA based carrier board and an associated FMC daughter card containing the CERN Gigabit Transceiver (GBT) chipset have been developed. Furthermore, this contribution describes laboratory and radiation testing results with this prototype board set.

  11. Retirement Kjell Johnsen

    ScienceCinema

    None

    2017-12-09

    On the occasion of his 65th birthday, several speakers (including the ambassador of Norway) thank Kjell Johnsen, born in Norway in June 1921, for his 34 years of service at CERN, and retrace his life and work. K. Johnsen took part in the first studies of accelerators for the future physics centre, and was also the father and first director of the CERN Accelerator School (CAS).

  12. ALICE inner tracking system readout electronics prototype testing with the CERN "Giga Bit Transceiver''

    DOE PAGES

    Schambach, Joachim; Rossewij, M. J.; Sielewicz, K. M.; ...

    2016-12-28

    The ALICE Collaboration is preparing a major detector upgrade for the LHC Run 3, which includes the construction of a new silicon pixel based Inner Tracking System (ITS). The ITS readout system consists of 192 readout boards to control the sensors and their power system, receive triggers, and deliver sensor data to the DAQ. To prototype various aspects of this readout system, an FPGA based carrier board and an associated FMC daughter card containing the CERN Gigabit Transceiver (GBT) chipset have been developed. Furthermore, this contribution describes laboratory and radiation testing results with this prototype board set.

  13. Future Approach to tier-0 extension

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Cordeiro, C.; Giordano, D.; Traylen, S.; Moreno García, D.

    2017-10-01

    The current tier-0 processing at CERN is done on two managed sites, the CERN computer centre and the Wigner computer centre. With the proliferation of public cloud resources at increasingly competitive prices, we have been investigating how to transparently increase our compute capacity to include these providers. The approach taken has been to integrate these resources using our existing deployment and computer management tools and to provide them in a way that exposes them to users as part of the same site. The paper will describe the architecture, the toolset and the current production experiences of this model.

  14. Anomalous single production of the fourth generation quarks at the CERN LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ciftci, R.

    Possible anomalous single productions of the fourth standard model generation up and down type quarks at the CERN Large Hadron Collider are studied. Namely, pp → u₄(d₄)X with the subsequent u₄ → bW⁺ process followed by the leptonic decay of the W boson, and the d₄ → bγ (and its H.c.) decay channel, are considered. Signatures of these processes and corresponding standard model backgrounds are discussed in detail. Discovery limits for the quark mass and achievable values of the anomalous coupling strength are determined.

  15. ALICE inner tracking system readout electronics prototype testing with the CERN ``Giga Bit Transceiver''

    NASA Astrophysics Data System (ADS)

    Schambach, J.; Rossewij, M. J.; Sielewicz, K. M.; Aglieri Rinella, G.; Bonora, M.; Ferencei, J.; Giubilato, P.; Vanat, T.

    2016-12-01

    The ALICE Collaboration is preparing a major detector upgrade for the LHC Run 3, which includes the construction of a new silicon pixel based Inner Tracking System (ITS). The ITS readout system consists of 192 readout boards to control the sensors and their power system, receive triggers, and deliver sensor data to the DAQ. To prototype various aspects of this readout system, an FPGA based carrier board and an associated FMC daughter card containing the CERN Gigabit Transceiver (GBT) chipset have been developed. This contribution describes laboratory and radiation testing results with this prototype board set.

  16. W production at large transverse momentum at the CERN Large Hadron Collider.

    PubMed

    Gonsalves, Richard J; Kidonakis, Nikolaos; Sabio Vera, Agustín

    2005-11-25

    We study the production of W bosons at large transverse momentum in pp collisions at the CERN Large Hadron Collider. We calculate the complete next-to-leading order (NLO) corrections to the differential cross section. We find that the NLO corrections provide a large increase to the cross section but, surprisingly, do not reduce the scale dependence relative to leading order (LO). We also calculate next-to-next-to-leading-order (NNLO) soft-gluon corrections and find that, although they are small, they significantly reduce the scale dependence thus providing a more stable result.

  17. Lower limit on dark matter production at the CERN Large Hadron Collider.

    PubMed

    Feng, Jonathan L; Su, Shufang; Takayama, Fumihiro

    2006-04-21

    We evaluate the prospects for finding evidence of dark matter production at the CERN Large Hadron Collider. We consider weakly interacting massive particles (WIMPs) and superWIMPs and characterize their properties through model-independent parametrizations. The observed relic density then implies lower bounds on dark matter production rates as functions of a few parameters. For WIMPs, the resulting signal is indistinguishable from background. For superWIMPs, however, this analysis implies significant production of metastable charged particles. For natural parameters, these rates may far exceed Drell-Yan cross sections and yield spectacular signals.

  18. New radiation protection calibration facility at CERN.

    PubMed

    Brugger, Markus; Carbonez, Pierre; Pozzi, Fabio; Silari, Marco; Vincke, Helmut

    2014-10-01

    The CERN radiation protection group has designed a new state-of-the-art calibration laboratory to replace the present facility, which is >20 y old. The new laboratory, presently under construction, will be equipped with neutron and gamma sources, as well as an X-ray generator and a beta irradiator. The present work describes the project to design the facility, including the facility placement criteria, the 'point-zero' measurements and the shielding study performed via FLUKA Monte Carlo simulations. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. ENLIGHT: European network for Light ion hadron therapy.

    PubMed

    Dosanjh, Manjit; Amaldi, Ugo; Mayer, Ramona; Poetter, Richard

    2018-04-03

    The European Network for Light Ion Hadron Therapy (ENLIGHT) was established in 2002, following various European particle therapy network initiatives during the 1980s and 1990s (e.g. the EORTC task group and the EULIMA/PIMMS accelerator designs). ENLIGHT started its work on major topics related to hadron therapy (HT), such as patient selection, clinical trials, technology, radiobiology, imaging and health economics. It was initiated through CERN and ESTRO and dealt with various disciplines such as (medical) physics and engineering, radiation biology and radiation oncology. ENLIGHT was funded until 2005 through the EC FP5 programme. A regular annual meeting structure was started in 2002 and continues today, bringing together the various disciplines, projects and institutions in the field of HT at different European venues for a regular exchange of information on best practices, research and development. Starting in 2006, ENLIGHT coordination was continued through CERN in collaboration with ESTRO and other partners involved in HT. Major projects within the EC FP7 programme (2008-2014) were launched for R&D and transnational access (ULICE, ENVISION) and for education and training networks (Marie Curie ITNs: PARTNER, ENTERVISION). These projects were instrumental in strengthening the field of hadron therapy. With the start of four European carbon ion and proton centres and the upcoming numerous European proton therapy centres, the future scope of ENLIGHT will focus on strengthening current and developing European particle therapy research, multidisciplinary education and training, and general R&D in technology and biology, with annual meetings and continuously strong CERN support. Collaboration with the European Particle Therapy Network (EPTN) and other similar networks will be pursued. Copyright © 2018 CERN. Published by Elsevier B.V. All rights reserved.

  20. Security in the CernVM File System and the Frontier Distributed Database Caching System

    NASA Astrophysics Data System (ADS)

    Dykstra, D.; Blomer, J.

    2014-06-01

    Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.
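    The security model described above rests on content integrity rather than transport privacy: data may travel through untrusted http proxies, but a client only accepts an object whose secure hash matches a signed catalog entry. A minimal Python sketch of that check follows; the function name and payload are illustrative assumptions, and real CVMFS additionally verifies a digitally signed repository manifest, which is not shown here.

    ```python
    import hashlib

    def verify_object(content: bytes, expected_hash: str) -> bool:
        """Content-addressed integrity check in the CVMFS style: an object
        is accepted only if its secure hash matches the catalog entry.
        (Sketch only; the catalog itself is covered by a digital signature.)"""
        return hashlib.sha1(content).hexdigest() == expected_hash

    # A client fetching through an untrusted proxy recomputes the hash:
    data = b"experiment software payload"
    catalog_hash = hashlib.sha1(data).hexdigest()  # published by the repository
    ok = verify_object(data, catalog_hash)
    tampered_ok = verify_object(b"tampered payload", catalog_hash)
    ```

    Any in-flight modification by a proxy changes the hash, so the client rejects the object without needing to trust the cache hierarchy.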

  1. Effects of bulk viscosity and hadronic rescattering in heavy ion collisions at energies available at the BNL Relativistic Heavy Ion Collider and at the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Ryu, Sangwook; Paquet, Jean-François; Shen, Chun; Denicol, Gabriel; Schenke, Björn; Jeon, Sangyong; Gale, Charles

    2018-03-01

    We describe ultrarelativistic heavy ion collisions at the BNL Relativistic Heavy Ion Collider and the CERN Large Hadron Collider with a hybrid model using the IP-Glasma model for the earliest stage and viscous hydrodynamics and microscopic transport for the later stages of the collision. We demonstrate that within this framework the bulk viscosity of the plasma plays an important role in describing the experimentally observed radial flow and azimuthal anisotropy simultaneously. We further investigate the dependence of observables on the temperature below which we employ the microscopic transport description.

  2. Optimising the Active Muon Shield for the SHiP Experiment at CERN

    NASA Astrophysics Data System (ADS)

    Baranov, A.; Burnaev, E.; Derkach, D.; Filatov, A.; Klyuchnikov, N.; Lantwin, O.; Ratnikov, F.; Ustyuzhanin, A.; Zaitsev, A.

    2017-12-01

    The SHiP experiment is designed to search for very weakly interacting particles beyond the Standard Model which are produced in a 400 GeV/c proton beam dump at the CERN SPS. The critical challenge for this experiment is to keep the Standard Model background level negligible. In the beam dump, around 10¹¹ muons will be produced per second. The muon rate in the spectrometer has to be reduced by at least four orders of magnitude to avoid muon-induced backgrounds. It is demonstrated that a new, improved active muon shield may be used to magnetically deflect the muons out of the acceptance of the spectrometer.
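    The shielding requirement quoted above is simple rate arithmetic; a short sketch using the figures from the abstract (the resulting tolerable rate is an inference from those two numbers, not a value stated in the paper):

    ```python
    # Rate arithmetic from the SHiP abstract: ~1e11 muons/s produced in the
    # beam dump, and a suppression of at least four orders of magnitude
    # required before the spectrometer.
    produced_rate_hz = 1e11
    min_suppression = 1e4
    max_tolerable_rate_hz = produced_rate_hz / min_suppression  # 1e7 muons/s
    ```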

  3. A Bonner Sphere Spectrometer with extended response matrix

    NASA Astrophysics Data System (ADS)

    Birattari, C.; Dimovasili, E.; Mitaroff, A.; Silari, M.

    2010-08-01

    This paper describes the design, calibration and applications at high-energy accelerators of an extended-range Bonner Sphere neutron Spectrometer (BSS). The BSS was designed using the FLUKA Monte Carlo code, investigating several combinations of materials and diameters of the moderators for the high-energy channels. The system was calibrated at PTB in Braunschweig, Germany, using monoenergetic neutron beams in the energy range 144 keV-19 MeV. It was subsequently tested with Am-Be source neutrons and in the simulated workplace neutron field at CERF (the CERN-EU high-energy reference field facility). Since 2002, it has been employed for neutron spectral measurements around CERN accelerators.
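    The role of the response matrix can be illustrated with a toy unfolding exercise: each sphere's count rate is a folding of the neutron spectrum with that sphere's energy-dependent response, and the spectrum is recovered by inverting the system. The matrix and spectrum below are invented numbers, not the FLUKA-computed responses of the actual instrument; real BSS unfolding uses many more energy bins and regularized or iterative methods.

    ```python
    import numpy as np

    # Toy unfolding problem behind a Bonner Sphere Spectrometer: sphere i
    # measures counts c_i = sum_j R_ij * phi_j, with R the pre-computed
    # response matrix and phi the neutron fluence per energy bin.
    R = np.array([[0.9, 0.3, 0.05],   # bare/small sphere: low-energy sensitive
                  [0.4, 0.8, 0.30],   # medium moderator
                  [0.1, 0.4, 0.85]])  # extended-range sphere: high-energy channel

    phi_true = np.array([2.0, 1.0, 0.5])  # "true" spectrum (arbitrary units)
    counts = R @ phi_true                  # idealized, noise-free readings

    # With as many spheres as bins and no noise, least squares recovers phi:
    phi_unfolded, *_ = np.linalg.lstsq(R, counts, rcond=None)
    ```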

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fernandez T, Arturo

    The use of the sophisticated and large underground detectors at CERN for cosmic ray studies has been considered by several groups, e.g. UA1, LEP and LHC detectors. They offer the opportunity to provide a large sensitive area with magnetic analysis, which allows a precise determination of the direction of cosmic ray muons as well as their momentum up to the order of some TeV. The aim of this article is to review the observation of high-energy cosmic ray muons using precise spectrometers at CERN, mainly LEP detectors, as well as the possibility of improving those measurements with LHC apparatus, giving special emphasis to the ACORDE-ALICE cosmic ray physics program.

  5. HST at CERN an Amazing Adventure

    NASA Astrophysics Data System (ADS)

    Restivo, Evelyn

    2009-04-01

    The High School Teacher Program (HST) at the European Organization for Nuclear Research, CERN, in Geneva, Switzerland was initiated in 1998 by a group of scientists, as a multicultural international program designed to introduce high school physics teachers to high-energy physics. The goal of the program is to provide experiences and materials that will help teachers lead their students to a better understanding of the physical world. Interacting with physics teachers from around the world leads to new approaches for dealing with educational issues that all teachers encounter. The program includes a variety of tours, a series of lectures and classroom activities about the physics expected from the Large Hadron Collider.

  6. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; ...

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN; part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code's physics models, with particular emphasis on the hadronic and nuclear sector.

  7. Mechanical qualification of the support structure for MQXF, the Nb3Sn low-β quadrupole for the high luminosity LHC

    DOE PAGES

    Juchno, M.; Ambrosio, G.; Anerella, M.; ...

    2016-01-26

    Within the scope of the High Luminosity LHC project, the collaboration between CERN and U.S. LARP is developing new low-β quadrupoles using the Nb3Sn superconducting technology for the upgrade of the LHC interaction regions. The magnet support structure of the first short model was designed, and two units were fabricated and tested at CERN and at LBNL. The structure provides the preload to the collars-coils subassembly by an arrangement of outer aluminum shells pre-tensioned with water-pressurized bladders. For the mechanical qualification of the structure and the assembly procedure, the superconducting coils were replaced with solid aluminum “dummy coils”, the structure was preloaded at room temperature, and then cooled down to 77 K. The mechanical behavior of the magnet structure was monitored with strain gauges installed on the aluminum shells, the dummy coils and the axial preload system. This paper reports on the outcome of the assembly and cool-down tests with dummy coils, which were performed at CERN and at LBNL, and presents the strain gauge measurements compared to the 3D finite element model predictions.

  8. Hadron-collider limits on new electroweak interactions from the heterotic string

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    del Aguila, F.; Moreno, J.M.; Quiros, M.

    1990-01-01

    We evaluate the Z′ → l⁺l⁻ cross section at present and future hadron colliders for the minimal (E₆) extended electroweak models inspired by superstrings (including renormalization effects on new gauge couplings and new mixing angles). Popular models are discussed for comparison. Analytical expressions for the bounds on the mass of a new gauge boson, M_Z′, as a function of the bound on the ratio R ≡ σ(Z′)B(Z′ → l⁺l⁻)/σ(Z)B(Z → l⁺l⁻), are given for the CERN SppS, Fermilab Tevatron, Serpukhov UNK, CERN Large Hadron Collider, and Superconducting Super Collider for the different models. In particular, the M_Z′ bounds from the present R limit at CERN, as well as from the R limits eventually available at Fermilab and at the future hadron colliders (after three months of running at the expected luminosity), are given explicitly.
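    The mass bounds discussed in this record follow from comparing a model's predicted ratio R = σ(Z′)B(Z′ → l⁺l⁻)/σ(Z)B(Z → l⁺l⁻) against its experimental limit. A minimal sketch of that comparison follows; every numerical value below is a made-up placeholder, not a result from the paper.

    ```python
    # Illustrative arithmetic for the dilepton ratio R used to bound M_Z'.
    # All numbers are invented placeholders for the sake of the example.

    def r_ratio(sigma_zprime_pb, br_zprime, sigma_z_pb, br_z):
        """Ratio of the Z' dilepton rate to the standard Z dilepton rate."""
        return (sigma_zprime_pb * br_zprime) / (sigma_z_pb * br_z)

    # A model point is excluded when its predicted R exceeds the limit:
    r_limit = 1e-3                                # placeholder experimental bound
    r_model = r_ratio(0.05, 0.03, 2000.0, 0.034)  # placeholder model prediction
    excluded = r_model > r_limit
    ```

    Inverting this comparison along the (falling) σ(Z′) vs. M_Z′ curve is what turns an R limit into a lower bound on the Z′ mass.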

  9. Design and performance of the virtualization platform for offline computing on the ATLAS TDAQ Farm

    NASA Astrophysics Data System (ADS)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Twomey, M. S.; Zaytsev, A.

    2014-06-01

    With the LHC collider at CERN currently going through the period of Long Shutdown 1, there is an opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is suitable for running Monte Carlo (MC) production jobs that are mostly CPU and not I/O bound. This contribution gives a thorough review of the design and deployment of a virtualized platform running on this computing resource and of its use to run large groups of CernVM based virtual machines operating as a single CERN-P1 WLCG site. This platform has been designed to guarantee the security and the usability of the ATLAS private network, and to minimize interference with TDAQ's usage of the farm. OpenStack has been chosen to provide a cloud management layer. The experience gained in the last 3.5 months shows that the use of the TDAQ farm for MC simulation contributes to ATLAS data processing at the level of a large Tier-1 WLCG site, despite the opportunistic nature of the underlying computing resources.

  10. Hadron Collider Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Incandela, J.R.

    2000-03-07

    Experiments are being prepared at the Fermilab Tevatron and the CERN Large Hadron Collider that promise to deliver extraordinary insights into the nature of spontaneous symmetry breaking and the role of supersymmetry in the universe. This article reviews the goals, challenges, and designs of these experiments. The first hadron collider, the ISR at CERN, had to overcome two initial obstacles. The first was low luminosity, which steadily improved over time. The second was the broad angular spread of interesting events. In this regard Maurice Jacob noted (1): The answer is ... sophisticated detectors covering at least the whole central region (45° ≤ θ ≤ 135°) and full azimuth. This statement, while obvious today, reflects the major revelation of the ISR period that hadrons have partonic substructure. The result was an unexpectedly strong hadronic yield at large transverse momentum (pT). Partly because of this, the ISR missed the discovery of the J/ψ and later missed the Υ. The ISR era was therefore somewhat less auspicious than it might have been. It did, however, make important contributions in areas such as jet production and charm excitation, and it paved the way for the SPS collider, also at CERN.

  11. A Simulation of the Front End Signal Digitization for the ATLAS Muon Spectrometer thin RPC trigger upgrade project

    NASA Astrophysics Data System (ADS)

    Meng, Xiangting; Chapman, John; Levin, Daniel; Dai, Tiesheng; Zhu, Junjie; Zhou, Bing; Um Atlas Group Team

    2016-03-01

    The ATLAS Muon Spectrometer Phase-I (and Phase-II) upgrade includes the BIS78 muon trigger detector project: two sets of eight very thin Resistive Plate Chambers (tRPCs) combined with small Monitored Drift Tube (MDT) chambers in the pseudorapidity region 1 < |η| < 1.3. The tRPCs will be comprised of triplet readout layers in each of the η and azimuthal φ coordinates, with about 400 readout strips per layer. The anticipated hit rate is 100-200 kHz per strip. Digitization of the strip signals will be done by 32-channel CERN HPTDC chips. The HPTDC is a highly configurable ASIC designed by the CERN Microelectronics group. It can work in both triggered and trigger-less modes, and can be read out in parallel or serially. For Phase-I operation, a stringent latency requirement of 43 bunch crossings (1075 ns) is imposed. The latency budget for the front end digitization must be kept to a minimal value, ideally less than 350 ns. We conducted detailed HPTDC latency simulations using the behavioral Verilog code from the CERN group. We will report the results of these simulations, run for the anticipated detector operating environment and for various HPTDC configurations.
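    The latency figures quoted above are fixed arithmetic once the 25 ns LHC bunch spacing is assumed; a short sketch (the remaining-budget split is an illustration derived from the two numbers in the abstract, not a breakdown given there):

    ```python
    # Phase-I trigger latency budget from the abstract: 43 bunch crossings
    # at the LHC's 25 ns bunch spacing, with <350 ns reserved for the
    # front-end digitization.
    BUNCH_SPACING_NS = 25
    total_latency_ns = 43 * BUNCH_SPACING_NS       # 1075 ns, as quoted
    front_end_budget_ns = 350
    remaining_ns = total_latency_ns - front_end_budget_ns  # left for
                                                           # transmission/logic
    ```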

  12. Analysis of SEL on Commercial SRAM Memories and Mixed-Field Characterization of a Latchup Detection Circuit for LEO Space Applications

    NASA Astrophysics Data System (ADS)

    Secondo, R.; Alía, R. Garcia; Peronnard, P.; Brugger, M.; Masi, A.; Danzeca, S.; Merlenghi, A.; Vaillé, J.-R.; Dusseau, L.

    2017-08-01

    A single event latchup (SEL) experiment based on commercial static random access memory (SRAM) memories has recently been proposed in the framework of the European Organization for Nuclear Research (CERN) Latchup Experiment and Student Satellite nanosatellite low Earth orbit (LEO) space mission. SEL characterization of three commercial SRAM memories has been carried out at the Paul Scherrer Institut (PSI) facility, using monoenergetic focused proton beams and different acquisition setups. The best target candidate was selected, and a circuit for SEL detection has been proposed and tested at CERN in the CERN High Energy AcceleRator Mixed-field facility (CHARM). Measurements were carried out at test locations representative of the LEO environment, thus providing a full characterization of the SRAM cross sections, together with an analysis of the single-event effects and total ionizing dose of the latchup detection circuit in relation to the particle spectra expected during the mission. The setups used for SEL monitoring are described, and details of the proposed circuit components and topology are presented. Experimental results obtained both at the PSI and CHARM facilities are discussed.
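    The per-device cross sections extracted in such a beam campaign follow from simple counting: the cross section is the number of observed latchups divided by the delivered fluence. A hedged sketch, with placeholder numbers rather than measured values from the experiment:

    ```python
    # SEL cross section from a beam test: sigma = N_SEL / fluence,
    # with fluence in particles/cm^2. Numbers below are illustrative
    # placeholders, not results from the paper.

    def sel_cross_section_cm2(n_events: int, fluence_per_cm2: float) -> float:
        """Cross section in cm^2 from the latchup count and beam fluence."""
        return n_events / fluence_per_cm2

    sigma = sel_cross_section_cm2(12, 3.0e10)  # e.g. 12 latchups at 3e10 p/cm^2
    ```

    Folding such a cross section with the particle spectra expected in orbit is what yields the predicted in-mission latchup rate.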

  13. Lectures from the European RTN Winter School on Strings, Supergravity and Gauge Fields, CERN, 15-19 January 2007

    NASA Astrophysics Data System (ADS)

    Derendinger, J.-P.; Scrucca, C. A.; Uranga, A.

    2007-11-01

    This special issue is devoted to the proceedings of the conference 'Winter School on Strings, Supergravity and Gauge Theories', which took place at CERN, the European Organization for Nuclear Research, in Geneva, Switzerland, from 15 to 19 January 2007. This event was organized in the framework of the European Mobility Research and Training Network entitled 'Constituents, Fundamental Forces and Symmetries of the Universe'. It is part of a yearly series of scientific schools that is by now a well-established tradition. The previous conferences were held at SISSA, in Trieste, Italy, in February 2005 and at CERN in January 2006. The next will again take place at CERN, in January 2008. The school was primarily meant for young doctoral students and postdoctoral researchers working in the area of string theory. It consisted of several general lectures of four hours each, the notes of which are published in the present proceedings, and seven working group discussion sessions focused on specific topics of the network research program. It was attended by approximately 250 participants. The topics of the lectures were chosen to provide an introduction to some of the areas of recent progress, and to the open problems, in string theory. String theory is a compelling candidate for a theory of all interactions. A basic challenge in this field is therefore to explore the connection between string theory models and the laws of physics in different realms, such as high-energy particle physics, early cosmology, or the physics of strongly coupled gauge theories. Concerning the exploration of string theory compactifications leading to realistic models of particle physics, one of the main obstacles is the proper understanding of supersymmetry breaking. The lecture notes by Nathan Seiberg review the realization of spontaneous breaking of supersymmetry in field theory, including recent developments via the use of metastable long-lived vacua. 
It is possible that such an understanding will prove crucial for the realization of supersymmetry breaking in string theory. A second long-standing obstacle, which is being tackled with recent techniques, is moduli stabilization, namely the removal of unwanted massless scalar fields from string models. The present status of this problem, and its prospects of solution via the introduction of general sets of fluxes in the compactification space, were covered in the lectures by Brian Wecht. Applying these ideas to connect string theory to particle physics will require a good understanding of the experimental situation at the forthcoming LHC collider at CERN, and of the detection tools for signals of new physics, as reviewed in the lectures by Joe Lykken (not covered in the present issue). Along a different line, the role of moduli fields in string theory is expected to provide a natural explanation of models of inflation, and thus of the origin of the cosmological evolution of our universe. The lecture notes by Cliff Burgess provide a review of big bang cosmology, inflation, and its possible explanation in terms of string theory constructions, including some of the most recent results in the field (these notes also appear in the proceedings of two other schools held in the same period). A surprising recent application of string theory is the description, via the ideas of holography and duality between string theories and gauge theories, of physical properties of quantum chromodynamics at high temperature. Indeed, experimental data on the physical properties of the quark-gluon plasma, produced in heavy-ion collisions at RHIC in Brookhaven (and soon at the LHC at CERN), can be recovered, at a semi-quantitative level, from computations in a string theory dual of the system. These applications are reviewed in the lectures by David Mateos. The conference was financially supported by the European Commission under contract MRTN-CT-2004-005104 and by CERN. 
It was jointly organized by the Physics Institute of the University of Neuchâtel and the Theory Unit of the Physics Division of CERN. It is a great pleasure for us to warmly thank the Theory Unit of CERN for its very kind hospitality and for the high quality of the assistance and infrastructure it provided. We also acknowledge helpful administrative assistance from the Physics Institute of the University of Neuchâtel. A special acknowledgement also goes to Denis Frank, for his very valuable help in preparing the conference web pages.

  14. Measurements of the production cross section of a [Formula: see text] boson in association with jets in pp collisions at [Formula: see text] TeV with the ATLAS detector.

    PubMed

    Aaboud, M; Aad, G; Abbott, B; Abdallah, J; Abdinov, O; Abeloos, B; Aben, R; AbouZeid, O S; Abraham, N L; Abramowicz, H; Abreu, H; Abreu, R; Abulaiti, Y; Acharya, B S; Adachi, S; Adamczyk, L; Adams, D L; Adelman, J; Adomeit, S; Adye, T; Affolder, A A; Agatonovic-Jovin, T; Aguilar-Saavedra, J A; Ahlen, S P; Ahmadov, F; Aielli, G; Akerstedt, H; Åkesson, T P A; Akimov, A V; Alberghi, G L; Albert, J; Albrand, S; Alconada Verzini, M J; Aleksa, M; Aleksandrov, I N; Alexa, C; Alexander, G; Alexopoulos, T; Alhroob, M; Ali, B; Aliev, M; Alimonti, G; Alison, J; Alkire, S P; Allbrooke, B M M; Allen, B W; Allport, P P; Aloisio, A; Alonso, A; Alonso, F; Alpigiani, C; Alshehri, A A; Alstaty, M; Alvarez Gonzalez, B; Álvarez Piqueras, D; Alviggi, M G; Amadio, B T; Amaral Coutinho, Y; Amelung, C; Amidei, D; Amor Dos Santos, S P; Amorim, A; Amoroso, S; Amundsen, G; Anastopoulos, C; Ancu, L S; Andari, N; Andeen, T; Anders, C F; Anders, G; Anders, J K; Anderson, K J; Andreazza, A; Andrei, V; Angelidakis, S; Angelozzi, I; Angerami, A; Anghinolfi, F; Anisenkov, A V; Anjos, N; Annovi, A; Antel, C; Antonelli, M; Antonov, A; Antrim, D J; Anulli, F; Aoki, M; Aperio Bella, L; Arabidze, G; Arai, Y; Araque, J P; Arce, A T H; Arduh, F A; Arguin, J-F; Argyropoulos, S; Arik, M; Armbruster, A J; Armitage, L J; Arnaez, O; Arnold, H; Arratia, M; Arslan, O; Artamonov, A; Artoni, G; Artz, S; Asai, S; Asbah, N; Ashkenazi, A; Åsman, B; Asquith, L; Assamagan, K; Astalos, R; Atkinson, M; Atlay, N B; Augsten, K; Avolio, G; Axen, B; Ayoub, M K; Azuelos, G; Baak, M A; Baas, A E; Baca, M J; Bachacou, H; Bachas, K; Backes, M; Backhaus, M; Bagiacchi, P; Bagnaia, P; Bai, Y; Baines, J T; Bajic, M; Baker, O K; Baldin, E M; Balek, P; Balestri, T; Balli, F; Balunas, W K; Banas, E; Banerjee, Sw; Bannoura, A A E; Barak, L; Barberio, E L; Barberis, D; Barbero, M; Barillari, T; Barisits, M-S; Barklow, T; Barlow, N; Barnes, S L; Barnett, B M; Barnett, R M; Barnovska-Blenessy, Z; Baroncelli, A; Barone, G; Barr, A J; 
Barranco Navarro, L; Barreiro, F; Barreiro Guimarães da Costa, J; Bartoldus, R; Barton, A E; Bartos, P; Basalaev, A; Bassalat, A; Bates, R L; Batista, S J; Batley, J R; Battaglia, M; Bauce, M; Bauer, F; Bawa, H S; Beacham, J B; Beattie, M D; Beau, T; Beauchemin, P H; Bechtle, P; Beck, H P; Becker, K; Becker, M; Beckingham, M; Becot, C; Beddall, A J; Beddall, A; Bednyakov, V A; Bedognetti, M; Bee, C P; Beemster, L J; Beermann, T A; Begel, M; Behr, J K; Bell, A S; Bella, G; Bellagamba, L; Bellerive, A; Bellomo, M; Belotskiy, K; Beltramello, O; Belyaev, N L; Benary, O; Benchekroun, D; Bender, M; Bendtz, K; Benekos, N; Benhammou, Y; Benhar Noccioli, E; Benitez, J; Benjamin, D P; Bensinger, J R; Bentvelsen, S; Beresford, L; Beretta, M; Berge, D; Bergeaas Kuutmann, E; Berger, N; Beringer, J; Berlendis, S; Bernard, N R; Bernius, C; Bernlochner, F U; Berry, T; Berta, P; Bertella, C; Bertoli, G; Bertolucci, F; Bertram, I A; Bertsche, C; Bertsche, D; Besjes, G J; Bessidskaia Bylund, O; Bessner, M; Besson, N; Betancourt, C; Bethani, A; Bethke, S; Bevan, A J; Bianchi, R M; Bianco, M; Biebel, O; Biedermann, D; Bielski, R; Biesuz, N V; Biglietti, M; Bilbao De Mendizabal, J; Billoud, T R V; Bilokon, H; Bindi, M; Bingul, A; Bini, C; Biondi, S; Bisanz, T; Bjergaard, D M; Black, C W; Black, J E; Black, K M; Blackburn, D; Blair, R E; Blazek, T; Bloch, I; Blocker, C; Blue, A; Blum, W; Blumenschein, U; Blunier, S; Bobbink, G J; Bobrovnikov, V S; Bocchetta, S S; Bocci, A; Bock, C; Boehler, M; Boerner, D; Bogaerts, J A; Bogavac, D; Bogdanchikov, A G; Bohm, C; Boisvert, V; Bokan, P; Bold, T; Boldyrev, A S; Bomben, M; Bona, M; Boonekamp, M; Borisov, A; Borissov, G; Bortfeldt, J; Bortoletto, D; Bortolotto, V; Bos, K; Boscherini, D; Bosman, M; Bossio Sola, J D; Boudreau, J; Bouffard, J; Bouhova-Thacker, E V; Boumediene, D; Bourdarios, C; Boutle, S K; Boveia, A; Boyd, J; Boyko, I R; Bracinik, J; Brandt, A; Brandt, G; Brandt, O; Bratzler, U; Brau, B; Brau, J E; Breaden Madden, W D; 
Brendlinger, K; Brennan, A J; Brenner, L; Brenner, R; Bressler, S; Bristow, T M; Britton, D; Britzger, D; Brochu, F M; Brock, I; Brock, R; Brooijmans, G; Brooks, T; Brooks, W K; Brosamer, J; Brost, E; Broughton, J H; Bruckman de Renstrom, P A; Bruncko, D; Bruneliere, R; Bruni, A; Bruni, G; Bruni, L S; Brunt, B H; Bruschi, M; Bruscino, N; Bryant, P; Bryngemark, L; Buanes, T; Buat, Q; Buchholz, P; Buckley, A G; Budagov, I A; Buehrer, F; Bugge, M K; Bulekov, O; Bullock, D; Burckhart, H; Burdin, S; Burgard, C D; Burger, A M; Burghgrave, B; Burka, K; Burke, S; Burmeister, I; Burr, J T P; Busato, E; Büscher, D; Büscher, V; Bussey, P; Butler, J M; Buttar, C M; Butterworth, J M; Butti, P; Buttinger, W; Buzatu, A; Buzykaev, A R; Cabrera Urbán, S; Caforio, D; Cairo, V M; Cakir, O; Calace, N; Calafiura, P; Calandri, A; Calderini, G; Calfayan, P; Callea, G; Caloba, L P; Calvente Lopez, S; Calvet, D; Calvet, S; Calvet, T P; Camacho Toro, R; Camarda, S; Camarri, P; Cameron, D; Caminal Armadans, R; Camincher, C; Campana, S; Campanelli, M; Camplani, A; Campoverde, A; Canale, V; Canepa, A; Cano Bret, M; Cantero, J; Cao, T; Capeans Garrido, M D M; Caprini, I; Caprini, M; Capua, M; Carbone, R M; Cardarelli, R; Cardillo, F; Carli, I; Carli, T; Carlino, G; Carminati, L; Carney, R M D; Caron, S; Carquin, E; Carrillo-Montoya, G D; Carter, J R; Carvalho, J; Casadei, D; Casado, M P; Casolino, M; Casper, D W; Castaneda-Miranda, E; Castelijn, R; Castelli, A; Castillo Gimenez, V; Castro, N F; Catinaccio, A; Catmore, J R; Cattai, A; Caudron, J; Cavaliere, V; Cavallaro, E; Cavalli, D; Cavalli-Sforza, M; Cavasinni, V; Ceradini, F; Cerda Alberich, L; Cerqueira, A S; Cerri, A; Cerrito, L; Cerutti, F; Cervelli, A; Cetin, S A; Chafaq, A; Chakraborty, D; Chan, S K; Chan, Y L; Chang, P; Chapman, J D; Charlton, D G; Chatterjee, A; Chau, C C; Chavez Barajas, C A; Che, S; Cheatham, S; Chegwidden, A; Chekanov, S; Chekulaev, S V; Chelkov, G A; Chelstowska, M A; Chen, C; Chen, H; Chen, K; Chen, S; Chen, S; 
Chen, X; Chen, Y; Cheng, H C; Cheng, H J; Cheng, Y; Cheplakov, A; Cheremushkina, E; Cherkaoui El Moursli, R; Chernyatin, V; Cheu, E; Chevalier, L; Chiarella, V; Chiarelli, G; Chiodini, G; Chisholm, A S; Chitan, A; Chizhov, M V; Choi, K; Chomont, A R; Chouridou, S; Chow, B K B; Christodoulou, V; Chromek-Burckhart, D; Chudoba, J; Chuinard, A J; Chwastowski, J J; Chytka, L; Ciapetti, G; Ciftci, A K; Cinca, D; Cindro, V; Cioara, I A; Ciocca, C; Ciocio, A; Cirotto, F; Citron, Z H; Citterio, M; Ciubancan, M; Clark, A; Clark, B L; Clark, M R; Clark, P J; Clarke, R N; Clement, C; Coadou, Y; Cobal, M; Coccaro, A; Cochran, J; Colasurdo, L; Cole, B; Colijn, A P; Collot, J; Colombo, T; Compostella, G; Conde Muiño, P; Coniavitis, E; Connell, S H; Connelly, I A; Consorti, V; Constantinescu, S; Conti, G; Conventi, F; Cooke, M; Cooper, B D; Cooper-Sarkar, A M; Cormier, F; Cormier, K J R; Cornelissen, T; Corradi, M; Corriveau, F; Cortes-Gonzalez, A; Cortiana, G; Costa, G; Costa, M J; Costanzo, D; Cottin, G; Cowan, G; Cox, B E; Cranmer, K; Crawley, S J; Cree, G; Crépé-Renaudin, S; Crescioli, F; Cribbs, W A; Crispin Ortuzar, M; Cristinziani, M; Croft, V; Crosetti, G; Cueto, A; Cuhadar Donszelmann, T; Cummings, J; Curatolo, M; Cúth, J; Czirr, H; Czodrowski, P; D'amen, G; D'Auria, S; D'Onofrio, M; Da Cunha Sargedas De Sousa, M J; Via, C Da; Dabrowski, W; Dado, T; Dai, T; Dale, O; Dallaire, F; Dallapiccola, C; Dam, M; Dandoy, J R; Dang, N P; Daniells, A C; Dann, N S; Danninger, M; Dano Hoffmann, M; Dao, V; Darbo, G; Darmora, S; Dassoulas, J; Dattagupta, A; Davey, W; David, C; Davidek, T; Davies, M; Davison, P; Dawe, E; Dawson, I; De, K; de Asmundis, R; De Benedetti, A; De Castro, S; De Cecco, S; De Groot, N; de Jong, P; De la Torre, H; De Lorenzi, F; De Maria, A; De Pedis, D; De Salvo, A; De Sanctis, U; De Santo, A; De Regie, J B De Vivie; Dearnaley, W J; Debbe, R; Debenedetti, C; Dedovich, D V; Dehghanian, N; Deigaard, I; Del Gaudio, M; Del Peso, J; Del Prete, T; Delgove, D; Deliot, F; 
Delitzsch, C M; Dell'Acqua, A; Dell'Asta, L; Dell'Orso, M; Della Pietra, M; Della Volpe, D; Delmastro, M; Delsart, P A; DeMarco, D A; Demers, S; Demichev, M; Demilly, A; Denisov, S P; Denysiuk, D; Derendarz, D; Derkaoui, J E; Derue, F; Dervan, P; Desch, K; Deterre, C; Dette, K; Deviveiros, P O; Dewhurst, A; Dhaliwal, S; Di Ciaccio, A; Di Ciaccio, L; Di Clemente, W K; Di Donato, C; Di Girolamo, A; Di Girolamo, B; Di Micco, B; Di Nardo, R; Di Petrillo, K F; Di Simone, A; Di Sipio, R; Di Valentino, D; Diaconu, C; Diamond, M; Dias, F A; Diaz, M A; Diehl, E B; Dietrich, J; Díez Cornell, S; Dimitrievska, A; Dingfelder, J; Dita, P; Dita, S; Dittus, F; Djama, F; Djobava, T; Djuvsland, J I; do Vale, M A B; Dobos, D; Dobre, M; Doglioni, C; Dolejsi, J; Dolezal, Z; Donadelli, M; Donati, S; Dondero, P; Donini, J; Dopke, J; Doria, A; Dova, M T; Doyle, A T; Drechsler, E; Dris, M; Du, Y; Duarte-Campderros, J; Duchovni, E; Duckeck, G; Ducu, O A; Duda, D; Dudarev, A; Dudder, A Chr; Duffield, E M; Duflot, L; Dührssen, M; Dumancic, M; Duncan, A K; Dunford, M; Duran Yildiz, H; Düren, M; Durglishvili, A; Duschinger, D; Dutta, B; Dyndal, M; Eckardt, C; Ecker, K M; Edgar, R C; Edwards, N C; Eifert, T; Eigen, G; Einsweiler, K; Ekelof, T; El Kacimi, M; Ellajosyula, V; Ellert, M; Elles, S; Ellinghaus, F; Elliot, A A; Ellis, N; Elmsheuser, J; Elsing, M; Emeliyanov, D; Enari, Y; Endner, O C; Ennis, J S; Erdmann, J; Ereditato, A; Ernis, G; Ernst, J; Ernst, M; Errede, S; Ertel, E; Escalier, M; Esch, H; Escobar, C; Esposito, B; Etienvre, A I; Etzion, E; Evans, H; Ezhilov, A; Fabbri, F; Fabbri, L; Facini, G; Fakhrutdinov, R M; Falciano, S; Falla, R J; Faltova, J; Fang, Y; Fanti, M; Farbin, A; Farilla, A; Farina, C; Farina, E M; Farooque, T; Farrell, S; Farrington, S M; Farthouat, P; Fassi, F; Fassnacht, P; Fassouliotis, D; Faucci Giannelli, M; Favareto, A; Fawcett, W J; Fayard, L; Fedin, O L; Fedorko, W; Feigl, S; Feligioni, L; Feng, C; Feng, E J; Feng, H; Fenyuk, A B; Feremenga, L; Fernandez 
Martinez, P; Fernandez Perez, S; Ferrando, J; Ferrari, A; Ferrari, P; Ferrari, R; Ferreira de Lima, D E; Ferrer, A; Ferrere, D; Ferretti, C; Fiedler, F; Filipčič, A; Filipuzzi, M; Filthaut, F; Fincke-Keeler, M; Finelli, K D; Fiolhais, M C N; Fiorini, L; Fischer, A; Fischer, C; Fischer, J; Fisher, W C; Flaschel, N; Fleck, I; Fleischmann, P; Fletcher, G T; Fletcher, R R M; Flick, T; Flierl, B M; Flores Castillo, L R; Flowerdew, M J; Forcolin, G T; Formica, A; Forti, A; Foster, A G; Fournier, D; Fox, H; Fracchia, S; Francavilla, P; Franchini, M; Francis, D; Franconi, L; Franklin, M; Frate, M; Fraternali, M; Freeborn, D; Fressard-Batraneanu, S M; Friedrich, F; Froidevaux, D; Frost, J A; Fukunaga, C; Fullana Torregrosa, E; Fusayasu, T; Fuster, J; Gabaldon, C; Gabizon, O; Gabrielli, A; Gabrielli, A; Gach, G P; Gadatsch, S; Gagliardi, G; Gagnon, L G; Gagnon, P; Galea, C; Galhardo, B; Gallas, E J; Gallop, B J; Gallus, P; Galster, G; Gan, K K; Ganguly, S; Gao, J; Gao, Y; Gao, Y S; Garay Walls, F M; García, C; García Navarro, J E; Garcia-Sciveres, M; Gardner, R W; Garelli, N; Garonne, V; Gascon Bravo, A; Gasnikova, K; Gatti, C; Gaudiello, A; Gaudio, G; Gauthier, L; Gavrilenko, I L; Gay, C; Gaycken, G; Gazis, E N; Gecse, Z; Gee, C N P; Geich-Gimbel, Ch; Geisen, M; Geisler, M P; Gellerstedt, K; Gemme, C; Genest, M H; Geng, C; Gentile, S; Gentsos, C; George, S; Gerbaudo, D; Gershon, A; Ghasemi, S; Ghneimat, M; Giacobbe, B; Giagu, S; Giannetti, P; Gibson, S M; Gignac, M; Gilchriese, M; Gillam, T P S; Gillberg, D; Gilles, G; Gingrich, D M; Giokaris, N; Giordani, M P; Giorgi, F M; Giraud, P F; Giromini, P; Giugni, D; Giuli, F; Giuliani, C; Giulini, M; Gjelsten, B K; Gkaitatzis, S; Gkialas, I; Gkougkousis, E L; Gladilin, L K; Glasman, C; Glatzer, J; Glaysher, P C F; Glazov, A; Goblirsch-Kolb, M; Godlewski, J; Goldfarb, S; Golling, T; Golubkov, D; Gomes, A; Gonçalo, R; Costa, J Goncalves Pinto Firmino Da; Gonella, G; Gonella, L; Gongadze, A; de la Hoz, S González; Gonzalez-Sevilla, 
S; Goossens, L; Gorbounov, P A; Gordon, H A; Gorelov, I; Gorini, B; Gorini, E; Gorišek, A; Gornicki, E; Goshaw, A T; Gössling, C; Gostkin, M I; Goudet, C R; Goujdami, D; Goussiou, A G; Govender, N; Gozani, E; Graber, L; Grabowska-Bold, I; Gradin, P O J; Grafström, P; Gramling, J; Gramstad, E; Grancagnolo, S; Gratchev, V; Gravila, P M; Gray, H M; Graziani, E; Greenwood, Z D; Grefe, C; Gregersen, K; Gregor, I M; Grenier, P; Grevtsov, K; Griffiths, J; Grillo, A A; Grimm, K; Grinstein, S; Gris, Ph; Grivaz, J-F; Groh, S; Gross, E; Grosse-Knetter, J; Grossi, G C; Grout, Z J; Guan, L; Guan, W; Guenther, J; Guescini, F; Guest, D; Gueta, O; Gui, B; Guido, E; Guillemin, T; Guindon, S; Gul, U; Gumpert, C; Guo, J; Guo, Y; Gupta, R; Gupta, S; Gustavino, G; Gutierrez, P; Gutierrez Ortiz, N G; Gutschow, C; Guyot, C; Gwenlan, C; Gwilliam, C B; Haas, A; Haber, C; Hadavand, H K; Hadef, A; Hageböck, S; Hagihara, M; Hajduk, Z; Hakobyan, H; Haleem, M; Haley, J; Halladjian, G; Hallewell, G D; Hamacher, K; Hamal, P; Hamano, K; Hamilton, A; Hamity, G N; Hamnett, P G; Han, L; Hanagaki, K; Hanawa, K; Hance, M; Haney, B; Hanke, P; Hanna, R; Hansen, J B; Hansen, J D; Hansen, M C; Hansen, P H; Hara, K; Hard, A S; Harenberg, T; Hariri, F; Harkusha, S; Harrington, R D; Harrison, P F; Hartjes, F; Hartmann, N M; Hasegawa, M; Hasegawa, Y; Hasib, A; Hassani, S; Haug, S; Hauser, R; Hauswald, L; Havranek, M; Hawkes, C M; Hawkings, R J; Hayakawa, D; Hayden, D; Hays, C P; Hays, J M; Hayward, H S; Haywood, S J; Head, S J; Heck, T; Hedberg, V; Heelan, L; Heim, S; Heim, T; Heinemann, B; Heinrich, J J; Heinrich, L; Heinz, C; Hejbal, J; Helary, L; Hellman, S; Helsens, C; Henderson, J; Henderson, R C W; Heng, Y; Henkelmann, S; Henriques Correia, A M; Henrot-Versille, S; Herbert, G H; Herde, H; Herget, V; Hernández Jiménez, Y; Herten, G; Hertenberger, R; Hervas, L; Hesketh, G G; Hessey, N P; Hetherly, J W; Higón-Rodriguez, E; Hill, E; Hill, J C; Hiller, K H; Hillier, S J; Hinchliffe, I; Hines, E; Hinman, R R; 
Hirose, M; Hirschbuehl, D; Hoad, X; Hobbs, J; Hod, N; Hodgkinson, M C; Hodgson, P; Hoecker, A; Hoeferkamp, M R; Hoenig, F; Hohn, D; Holmes, T R; Homann, M; Honda, T; Hong, T M; Hooberman, B H; Hopkins, W H; Horii, Y; Horton, A J; Hostachy, J-Y; Hou, S; Hoummada, A; Howarth, J; Hoya, J; Hrabovsky, M; Hristova, I; Hrivnac, J; Hryn'ova, T; Hrynevich, A; Hsu, P J; Hsu, S-C; Hu, Q; Hu, S; Huang, Y; Hubacek, Z; Hubaut, F; Huegging, F; Huffman, T B; Hughes, E W; Hughes, G; Huhtinen, M; Huo, P; Huseynov, N; Huston, J; Huth, J; Iacobucci, G; Iakovidis, G; Ibragimov, I; Iconomidou-Fayard, L; Ideal, E; Iengo, P; Igonkina, O; Iizawa, T; Ikegami, Y; Ikeno, M; Ilchenko, Y; Iliadis, D; Ilic, N; Introzzi, G; Ioannou, P; Iodice, M; Iordanidou, K; Ippolito, V; Ishijima, N; Ishino, M; Ishitsuka, M; Ishmukhametov, R; Issever, C; Istin, S; Ito, F; Iturbe Ponce, J M; Iuppa, R; Iwanski, W; Iwasaki, H; Izen, J M; Izzo, V; Jabbar, S; Jackson, B; Jackson, P; Jain, V; Jakobi, K B; Jakobs, K; Jakobsen, S; Jakoubek, T; Jamin, D O; Jana, D K; Jansky, R; Janssen, J; Janus, M; Janus, P A; Jarlskog, G; Javadov, N; Javůrek, T; Javurkova, M; Jeanneau, F; Jeanty, L; Jejelava, J; Jeng, G-Y; Jennens, D; Jenni, P; Jeske, C; Jézéquel, S; Ji, H; Jia, J; Jiang, H; Jiang, Y; Jiang, Z; Jiggins, S; Jimenez Pena, J; Jin, S; Jinaru, A; Jinnouchi, O; Jivan, H; Johansson, P; Johns, K A; Johnson, W J; Jon-And, K; Jones, G; Jones, R W L; Jones, S; Jones, T J; Jongmanns, J; Jorge, P M; Jovicevic, J; Ju, X; Juste Rozas, A; Köhler, M K; Kaczmarska, A; Kado, M; Kagan, H; Kagan, M; Kahn, S J; Kaji, T; Kajomovitz, E; Kalderon, C W; Kaluza, A; Kama, S; Kamenshchikov, A; Kanaya, N; Kaneti, S; Kanjir, L; Kantserov, V A; Kanzaki, J; Kaplan, B; Kaplan, L S; Kapliy, A; Kar, D; Karakostas, K; Karamaoun, A; Karastathis, N; Kareem, M J; Karentzos, E; Karnevskiy, M; Karpov, S N; Karpova, Z M; Karthik, K; Kartvelishvili, V; Karyukhin, A N; Kasahara, K; Kashif, L; Kass, R D; Kastanas, A; Kataoka, Y; Kato, C; Katre, A; Katzy, J; 
Kawade, K; Kawagoe, K; Kawamoto, T; Kawamura, G; Kazanin, V F; Keeler, R; Kehoe, R; Keller, J S; Kempster, J J; Keoshkerian, H; Kepka, O; Kerševan, B P; Kersten, S; Keyes, R A; Khader, M; Khalil-Zada, F; Khanov, A; Kharlamov, A G; Kharlamova, T; Khoo, T J; Khovanskiy, V; Khramov, E; Khubua, J; Kido, S; Kilby, C R; Kim, H Y; Kim, S H; Kim, Y K; Kimura, N; Kind, O M; King, B T; King, M; Kirk, J; Kiryunin, A E; Kishimoto, T; Kisielewska, D; Kiss, F; Kiuchi, K; Kivernyk, O; Kladiva, E; Klein, M H; Klein, M; Klein, U; Kleinknecht, K; Klimek, P; Klimentov, A; Klingenberg, R; Klioutchnikova, T; Kluge, E-E; Kluit, P; Kluth, S; Knapik, J; Kneringer, E; Knoops, E B F G; Knue, A; Kobayashi, A; Kobayashi, D; Kobayashi, T; Kobel, M; Kocian, M; Kodys, P; Koffas, T; Koffeman, E; Köhler, N M; Koi, T; Kolanoski, H; Kolb, M; Koletsou, I; Komar, A A; Komori, Y; Kondo, T; Kondrashova, N; Köneke, K; König, A C; Kono, T; Konoplich, R; Konstantinidis, N; Kopeliansky, R; Koperny, S; Köpke, L; Kopp, A K; Korcyl, K; Kordas, K; Korn, A; Korol, A A; Korolkov, I; Korolkova, E V; Kortner, O; Kortner, S; Kosek, T; Kostyukhin, V V; Kotwal, A; Koulouris, A; Kourkoumeli-Charalampidi, A; Kourkoumelis, C; Kouskoura, V; Kowalewska, A B; Kowalewski, R; Kowalski, T Z; Kozakai, C; Kozanecki, W; Kozhin, A S; Kramarenko, V A; Kramberger, G; Krasnopevtsev, D; Krasny, M W; Krasznahorkay, A; Kravchenko, A; Kretz, M; Kretzschmar, J; Kreutzfeldt, K; Krieger, P; Krizka, K; Kroeninger, K; Kroha, H; Kroll, J; Kroseberg, J; Krstic, J; Kruchonak, U; Krüger, H; Krumnack, N; Kruse, M C; Kruskal, M; Kubota, T; Kucuk, H; Kuday, S; Kuechler, J T; Kuehn, S; Kugel, A; Kuger, F; Kuhl, T; Kukhtin, V; Kukla, R; Kulchitsky, Y; Kuleshov, S; Kuna, M; Kunigo, T; Kupco, A; Kurashige, H; Kurchaninov, L L; Kurochkin, Y A; Kurth, M G; Kus, V; Kuwertz, E S; Kuze, M; Kvita, J; Kwan, T; Kyriazopoulos, D; La Rosa, A; La Rosa Navarro, J L; La Rotonda, L; Lacasta, C; Lacava, F; Lacey, J; Lacker, H; Lacour, D; Lacuesta, V R; Ladygin, E; 
Lafaye, R; Laforge, B; Lagouri, T; Lai, S; Lammers, S; Lampl, W; Lançon, E; Landgraf, U; Landon, M P J; Lanfermann, M C; Lang, V S; Lange, J C; Lankford, A J; Lanni, F; Lantzsch, K; Lanza, A; Laplace, S; Lapoire, C; Laporte, J F; Lari, T; Lasagni Manghi, F; Lassnig, M; Laurelli, P; Lavrijsen, W; Law, A T; Laycock, P; Lazovich, T; Lazzaroni, M; Le, B; Le Dortz, O; Le Guirriec, E; Le Quilleuc, E P; LeBlanc, M; LeCompte, T; Ledroit-Guillon, F; Lee, C A; Lee, S C; Lee, L; Lefebvre, B; Lefebvre, G; Lefebvre, M; Legger, F; Leggett, C; Lehan, A; Lehmann Miotto, G; Lei, X; Leight, W A; Leister, A G; Leite, M A L; Leitner, R; Lellouch, D; Lemmer, B; Leney, K J C; Lenz, T; Lenzi, B; Leone, R; Leone, S; Leonidopoulos, C; Leontsinis, S; Lerner, G; Leroy, C; Lesage, A A J; Lester, C G; Levchenko, M; Levêque, J; Levin, D; Levinson, L J; Levy, M; Lewis, D; Leyton, M; Li, B; Li, C; Li, H; Li, L; Li, L; Li, Q; Li, S; Li, X; Li, Y; Liang, Z; Liberti, B; Liblong, A; Lichard, P; Lie, K; Liebal, J; Liebig, W; Limosani, A; Lin, S C; Lin, T H; Lindquist, B E; Lionti, A E; Lipeles, E; Lipniacka, A; Lisovyi, M; Liss, T M; Lister, A; Litke, A M; Liu, B; Liu, D; Liu, H; Liu, H; Liu, J; Liu, J B; Liu, K; Liu, L; Liu, M; Liu, Y L; Liu, Y; Livan, M; Lleres, A; Llorente Merino, J; Lloyd, S L; Lo Sterzo, F; Lobodzinska, E M; Loch, P; Loebinger, F K; Loew, K M; Loginov, A; Lohse, T; Lohwasser, K; Lokajicek, M; Long, B A; Long, J D; Long, R E; Longo, L; Looper, K A; Lopez, J A; Lopez Mateos, D; Lopez Paredes, B; Lopez Paz, I; Lopez Solis, A; Lorenz, J; Lorenzo Martinez, N; Losada, M; Lösel, P J; Lou, X; Lounis, A; Love, J; Love, P A; Lu, H; Lu, N; Lubatti, H J; Luci, C; Lucotte, A; Luedtke, C; Luehring, F; Lukas, W; Luminari, L; Lundberg, O; Lund-Jensen, B; Luzi, P M; Lynn, D; Lysak, R; Lytken, E; Lyubushkin, V; Ma, H; Ma, L L; Ma, Y; Maccarrone, G; Macchiolo, A; Macdonald, C M; Maček, B; Machado Miguens, J; Madaffari, D; Madar, R; Maddocks, H J; Mader, W F; Madsen, A; Maeda, J; Maeland, S; Maeno, 
T; Maevskiy, A; Magradze, E; Mahlstedt, J; Maiani, C; Maidantchik, C; Maier, A A; Maier, T; Maio, A; Majewski, S; Makida, Y; Makovec, N; Malaescu, B; Malecki, Pa; Maleev, V P; Malek, F; Mallik, U; Malon, D; Malone, C; Malone, C; Maltezos, S; Malyukov, S; Mamuzic, J; Mancini, G; Mandelli, L; Mandić, I; Maneira, J; Manhaes de Andrade Filho, L; Manjarres Ramos, J; Mann, A; Manousos, A; Mansoulie, B; Mansour, J D; Mantifel, R; Mantoani, M; Manzoni, S; Mapelli, L; Marceca, G; March, L; Marchiori, G; Marcisovsky, M; Marjanovic, M; Marley, D E; Marroquim, F; Marsden, S P; Marshall, Z; Marti-Garcia, S; Martin, B; Martin, T A; Martin, V J; Martin Dit Latour, B; Martinez, M; Martinez Outschoorn, V I; Martin-Haugh, S; Martoiu, V S; Martyniuk, A C; Marzin, A; Masetti, L; Mashimo, T; Mashinistov, R; Masik, J; Maslennikov, A L; Massa, I; Massa, L; Mastrandrea, P; Mastroberardino, A; Masubuchi, T; Mättig, P; Mattmann, J; Maurer, J; Maxfield, S J; Maximov, D A; Mazini, R; Maznas, I; Mazza, S M; Mc Fadden, N C; Mc Goldrick, G; Mc Kee, S P; McCarn, A; McCarthy, R L; McCarthy, T G; McClymont, L I; McDonald, E F; Mcfayden, J A; Mchedlidze, G; McMahon, S J; McPherson, R A; Medinnis, M; Meehan, S; Mehlhase, S; Mehta, A; Meier, K; Meineck, C; Meirose, B; Melini, D; Mellado Garcia, B R; Melo, M; Meloni, F; Menary, S B; Meng, L; Meng, X T; Mengarelli, A; Menke, S; Meoni, E; Mergelmeyer, S; Mermod, P; Merola, L; Meroni, C; Merritt, F S; Messina, A; Metcalfe, J; Mete, A S; Meyer, C; Meyer, C; Meyer, J-P; Meyer, J; Meyer Zu Theenhausen, H; Miano, F; Middleton, R P; Miglioranzi, S; Mijović, L; Mikenberg, G; Mikestikova, M; Mikuž, M; Milesi, M; Milic, A; Miller, D W; Mills, C; Milov, A; Milstead, D A; Minaenko, A A; Minami, Y; Minashvili, I A; Mincer, A I; Mindur, B; Mineev, M; Minegishi, Y; Ming, Y; Mir, L M; Mistry, K P; Mitani, T; Mitrevski, J; Mitsou, V A; Miucci, A; Miyagawa, P S; Mizukami, A; Mjörnmark, J U; Mlynarikova, M; Moa, T; Mochizuki, K; Mogg, P; Mohapatra, S; Molander, S; 
Moles-Valls, R; Monden, R; Mondragon, M C; Mönig, K; Monk, J; Monnier, E; Montalbano, A; Montejo Berlingen, J; Monticelli, F; Monzani, S; Moore, R W; Morange, N; Moreno, D; Moreno Llácer, M; Morettini, P; Morgenstern, S; Mori, D; Mori, T; Morii, M; Morinaga, M; Morisbak, V; Moritz, S; Morley, A K; Mornacchi, G; Morris, J D; Morvaj, L; Moschovakos, P; Mosidze, M; Moss, H J; Moss, J; Motohashi, K; Mount, R; Mountricha, E; Moyse, E J W; Muanza, S; Mudd, R D; Mueller, F; Mueller, J; Mueller, R S P; Mueller, T; Muenstermann, D; Mullen, P; Mullier, G A; Munoz Sanchez, F J; Murillo Quijada, J A; Murray, W J; Musheghyan, H; Muškinja, M; Myagkov, A G; Myska, M; Nachman, B P; Nackenhorst, O; Nagai, K; Nagai, R; Nagano, K; Nagasaka, Y; Nagata, K; Nagel, M; Nagy, E; Nairz, A M; Nakahama, Y; Nakamura, K; Nakamura, T; Nakano, I; Naranjo Garcia, R F; Narayan, R; Narrias Villar, D I; Naryshkin, I; Naumann, T; Navarro, G; Nayyar, R; Neal, H A; Nechaeva, P Yu; Neep, T J; Negri, A; Negrini, M; Nektarijevic, S; Nellist, C; Nelson, A; Nemecek, S; Nemethy, P; Nepomuceno, A A; Nessi, M; Neubauer, M S; Neumann, M; Neves, R M; Nevski, P; Newman, P R; Nguyen, D H; Nguyen Manh, T; Nickerson, R B; Nicolaidou, R; Nielsen, J; Nikiforov, A; Nikolaenko, V; Nikolic-Audit, I; Nikolopoulos, K; Nilsen, J K; Nilsson, P; Ninomiya, Y; Nisati, A; Nisius, R; Nobe, T; Nomachi, M; Nomidis, I; Nooney, T; Norberg, S; Nordberg, M; Norjoharuddeen, N; Novgorodova, O; Nowak, S; Nozaki, M; Nozka, L; Ntekas, K; Nurse, E; Nuti, F; O'grady, F; O'Neil, D C; O'Rourke, A A; O'Shea, V; Oakham, F G; Oberlack, H; Obermann, T; Ocariz, J; Ochi, A; Ochoa, I; Ochoa-Ricoux, J P; Oda, S; Odaka, S; Ogren, H; Oh, A; Oh, S H; Ohm, C C; Ohman, H; Oide, H; Okawa, H; Okumura, Y; Okuyama, T; Olariu, A; Oleiro Seabra, L F; Olivares Pino, S A; Oliveira Damazio, D; Olszewski, A; Olszowska, J; Onofre, A; Onogi, K; Onyisi, P U E; Oreglia, M J; Oren, Y; Orestano, D; Orlando, N; Orr, R S; Osculati, B; Ospanov, R; Otero Y Garzon, G; Otono, H; 
Ouchrif, M; Ould-Saada, F; Ouraou, A; Oussoren, K P; Ouyang, Q; Owen, M; Owen, R E; Ozcan, V E; Ozturk, N; Pachal, K; Pacheco Pages, A; Pacheco Rodriguez, L; Padilla Aranda, C; Pagan Griso, S; Paganini, M; Paige, F; Pais, P; Pajchel, K; Palacino, G; Palazzo, S; Palestini, S; Palka, M; Pallin, D; Panagiotopoulou, E St; Panagoulias, I; Pandini, C E; Panduro Vazquez, J G; Pani, P; Panitkin, S; Pantea, D; Paolozzi, L; Papadopoulou, Th D; Papageorgiou, K; Paramonov, A; Paredes Hernandez, D; Parker, A J; Parker, M A; Parker, K A; Parodi, F; Parsons, J A; Parzefall, U; Pascuzzi, V R; Pasqualucci, E; Passaggio, S; Pastore, Fr; Pásztor, G; Pataraia, S; Pater, J R; Pauly, T; Pearce, J; Pearson, B; Pedersen, L E; Pedraza Lopez, S; Pedro, R; Peleganchuk, S V; Penc, O; Peng, C; Peng, H; Penwell, J; Peralva, B S; Perego, M M; Perepelitsa, D V; Perez Codina, E; Perini, L; Pernegger, H; Perrella, S; Peschke, R; Peshekhonov, V D; Peters, K; Peters, R F Y; Petersen, B A; Petersen, T C; Petit, E; Petridis, A; Petridou, C; Petroff, P; Petrolo, E; Petrov, M; Petrucci, F; Pettersson, N E; Peyaud, A; Pezoa, R; Phillips, P W; Piacquadio, G; Pianori, E; Picazio, A; Piccaro, E; Piccinini, M; Pickering, M A; Piegaia, R; Pilcher, J E; Pilkington, A D; Pin, A W J; Pinamonti, M; Pinfold, J L; Pingel, A; Pires, S; Pirumov, H; Pitt, M; Plazak, L; Pleier, M-A; Pleskot, V; Plotnikova, E; Pluth, D; Poettgen, R; Poggioli, L; Pohl, D; Polesello, G; Poley, A; Policicchio, A; Polifka, R; Polini, A; Pollard, C S; Polychronakos, V; Pommès, K; Pontecorvo, L; Pope, B G; Popeneciu, G A; Poppleton, A; Pospisil, S; Potamianos, K; Potrap, I N; Potter, C J; Potter, C T; Poulard, G; Poveda, J; Pozdnyakov, V; Pozo Astigarraga, M E; Pralavorio, P; Pranko, A; Prell, S; Price, D; Price, L E; Primavera, M; Prince, S; Prokofiev, K; Prokoshin, F; Protopopescu, S; Proudfoot, J; Przybycien, M; Puddu, D; Purohit, M; Puzo, P; Qian, J; Qin, G; Qin, Y; Quadt, A; Quayle, W B; Queitsch-Maitland, M; Quilty, D; Raddum, S; Radeka, 
V; Radescu, V; Radhakrishnan, S K; Radloff, P; Rados, P; Ragusa, F; Rahal, G; Raine, J A; Rajagopalan, S; Rammensee, M; Rangel-Smith, C; Ratti, M G; Rauch, D M; Rauscher, F; Rave, S; Ravenscroft, T; Ravinovich, I; Raymond, M; Read, A L; Readioff, N P; Reale, M; Rebuzzi, D M; Redelbach, A; Redlinger, G; Reece, R; Reed, R G; Reeves, K; Rehnisch, L; Reichert, J; Reiss, A; Rembser, C; Ren, H; Rescigno, M; Resconi, S; Rezanova, O L; Reznicek, P; Rezvani, R; Richter, R; Richter, S; Richter-Was, E; Ricken, O; Ridel, M; Rieck, P; Riegel, C J; Rieger, J; Rifki, O; Rijssenbeek, M; Rimoldi, A; Rimoldi, M; Rinaldi, L; Ristić, B; Ritsch, E; Riu, I; Rizatdinova, F; Rizvi, E; Rizzi, C; Robertson, S H; Robichaud-Veronneau, A; Robinson, D; Robinson, J E M; Robson, A; Roda, C; Rodina, Y; Rodriguez Perez, A; Rodriguez Rodriguez, D; Roe, S; Rogan, C S; Røhne, O; Roloff, J; Romaniouk, A; Romano, M; Romano Saez, S M; Romero Adam, E; Rompotis, N; Ronzani, M; Roos, L; Ros, E; Rosati, S; Rosbach, K; Rose, P; Rosien, N-A; Rossetti, V; Rossi, E; Rossi, L P; Rosten, J H N; Rosten, R; Rotaru, M; Roth, I; Rothberg, J; Rousseau, D; Rozanov, A; Rozen, Y; Ruan, X; Rubbo, F; Rudolph, M S; Rühr, F; Ruiz-Martinez, A; Rurikova, Z; Rusakovich, N A; Ruschke, A; Russell, H L; Rutherfoord, J P; Ruthmann, N; Ryabov, Y F; Rybar, M; Rybkin, G; Ryu, S; Ryzhov, A; Rzehorz, G F; Saavedra, A F; Sabato, G; Sacerdoti, S; Sadrozinski, H F-W; Sadykov, R; Safai Tehrani, F; Saha, P; Sahinsoy, M; Saimpert, M; Saito, T; Sakamoto, H; Sakurai, Y; Salamanna, G; Salamon, A; Salazar Loyola, J E; Salek, D; Sales De Bruin, P H; Salihagic, D; Salnikov, A; Salt, J; Salvatore, D; Salvatore, F; Salvucci, A; Salzburger, A; Sammel, D; Sampsonidis, D; Sánchez, J; Sanchez Martinez, V; Sanchez Pineda, A; Sandaker, H; Sandbach, R L; Sandhoff, M; Sandoval, C; Sankey, D P C; Sannino, M; Sansoni, A; Santoni, C; Santonico, R; Santos, H; Santoyo Castillo, I; Sapp, K; Sapronov, A; Saraiva, J G; Sarrazin, B; Sasaki, O; Sato, K; Sauvan, E; 
Savage, G; Savard, P; Savic, N; Sawyer, C; Sawyer, L; Saxon, J; Sbarra, C; Sbrizzi, A; Scanlon, T; Scannicchio, D A; Scarcella, M; Scarfone, V; Schaarschmidt, J; Schacht, P; Schachtner, B M; Schaefer, D; Schaefer, L; Schaefer, R; Schaeffer, J; Schaepe, S; Schaetzel, S; Schäfer, U; Schaffer, A C; Schaile, D; Schamberger, R D; Scharf, V; Schegelsky, V A; Scheirich, D; Schernau, M; Schiavi, C; Schier, S; Schillo, C; Schioppa, M; Schlenker, S; Schmidt-Sommerfeld, K R; Schmieden, K; Schmitt, C; Schmitt, S; Schmitz, S; Schneider, B; Schnoor, U; Schoeffel, L; Schoening, A; Schoenrock, B D; Schopf, E; Schott, M; Schouwenberg, J F P; Schovancova, J; Schramm, S; Schreyer, M; Schuh, N; Schulte, A; Schultens, M J; Schultz-Coulon, H-C; Schulz, H; Schumacher, M; Schumm, B A; Schune, Ph; Schwartzman, A; Schwarz, T A; Schweiger, H; Schwemling, Ph; Schwienhorst, R; Schwindling, J; Schwindt, T; Sciolla, G; Scuri, F; Scutti, F; Searcy, J; Seema, P; Seidel, S C; Seiden, A; Seifert, F; Seixas, J M; Sekhniaidze, G; Sekhon, K; Sekula, S J; Seliverstov, D M; Semprini-Cesari, N; Serfon, C; Serin, L; Serkin, L; Sessa, M; Seuster, R; Severini, H; Sfiligoj, T; Sforza, F; Sfyrla, A; Shabalina, E; Shaikh, N W; Shan, L Y; Shang, R; Shank, J T; Shapiro, M; Shatalov, P B; Shaw, K; Shaw, S M; Shcherbakova, A; Shehu, C Y; Sherwood, P; Shi, L; Shimizu, S; Shimmin, C O; Shimojima, M; Shirabe, S; Shiyakova, M; Shmeleva, A; Saadi, D Shoaleh; Shochet, M J; Shojaii, S; Shope, D R; Shrestha, S; Shulga, E; Shupe, M A; Sicho, P; Sickles, A M; Sidebo, P E; Sideras Haddad, E; Sidiropoulou, O; Sidorov, D; Sidoti, A; Siegert, F; Sijacki, Dj; Silva, J; Silverstein, S B; Simak, V; Simic, Lj; Simion, S; Simioni, E; Simmons, B; Simon, D; Simon, M; Sinervo, P; Sinev, N B; Sioli, M; Siragusa, G; Sivoklokov, S Yu; Sjölin, J; Skinner, M B; Skottowe, H P; Skubic, P; Slater, M; Slavicek, T; Slawinska, M; Sliwa, K; Slovak, R; Smakhtin, V; Smart, B H; Smestad, L; Smiesko, J; Smirnov, S Yu; Smirnov, Y; Smirnova, L N; 
Smirnova, O; Smith, J W; Smith, M N K; Smith, R W; Smizanska, M; Smolek, K; Snesarev, A A; Snyder, I M; Snyder, S; Sobie, R; Socher, F; Soffer, A; Soh, D A; Sokhrannyi, G; Solans Sanchez, C A; Solar, M; Soldatov, E Yu; Soldevila, U; Solodkov, A A; Soloshenko, A; Solovyanov, O V; Solovyev, V; Sommer, P; Son, H; Song, H Y; Sood, A; Sopczak, A; Sopko, V; Sorin, V; Sosa, D; Sotiropoulou, C L; Soualah, R; Soukharev, A M; South, D; Sowden, B C; Spagnolo, S; Spalla, M; Spangenberg, M; Spanò, F; Sperlich, D; Spettel, F; Spighi, R; Spigo, G; Spiller, L A; Spousta, M; St Denis, R D; Stabile, A; Stamen, R; Stamm, S; Stanecka, E; Stanek, R W; Stanescu, C; Stanescu-Bellu, M; Stanitzki, M M; Stapnes, S; Starchenko, E A; Stark, G H; Stark, J; Stark, S H; Staroba, P; Starovoitov, P; Stärz, S; Staszewski, R; Steinberg, P; Stelzer, B; Stelzer, H J; Stelzer-Chilton, O; Stenzel, H; Stewart, G A; Stillings, J A; Stockton, M C; Stoebe, M; Stoicea, G; Stolte, P; Stonjek, S; Stradling, A R; Straessner, A; Stramaglia, M E; Strandberg, J; Strandberg, S; Strandlie, A; Strauss, M; Strizenec, P; Ströhmer, R; Strom, D M; Stroynowski, R; Strubig, A; Stucci, S A; Stugu, B; Styles, N A; Su, D; Su, J; Suchek, S; Sugaya, Y; Suk, M; Sulin, V V; Sultansoy, S; Sumida, T; Sun, S; Sun, X; Sundermann, J E; Suruliz, K; Suster, C J E; Sutton, M R; Suzuki, S; Svatos, M; Swiatlowski, M; Swift, S P; Sykora, I; Sykora, T; Ta, D; Taccini, C; Tackmann, K; Taenzer, J; Taffard, A; Tafirout, R; Taiblum, N; Takai, H; Takashima, R; Takeshita, T; Takubo, Y; Talby, M; Talyshev, A A; Tan, K G; Tanaka, J; Tanaka, M; Tanaka, R; Tanaka, S; Tanioka, R; Tannenwald, B B; Tapia Araya, S; Tapprogge, S; Tarem, S; Tartarelli, G F; Tas, P; Tasevsky, M; Tashiro, T; Tassi, E; Tavares Delgado, A; Tayalati, Y; Taylor, A C; Taylor, G N; Taylor, P T E; Taylor, W; Teischinger, F A; Teixeira-Dias, P; Temming, K K; Temple, D; Ten Kate, H; Teng, P K; Teoh, J J; Tepel, F; Terada, S; Terashi, K; Terron, J; Terzo, S; Testa, M; Teuscher, R J; 
Theveneaux-Pelzer, T; Thomas, J P; Thomas-Wilsker, J; Thompson, P D; Thompson, A S; Thomsen, L A; Thomson, E; Tibbetts, M J; Ticse Torres, R E; Tikhomirov, V O; Tikhonov, Yu A; Timoshenko, S; Tipton, P; Tisserant, S; Todome, K; Todorov, T; Todorova-Nova, S; Tojo, J; Tokár, S; Tokushuku, K; Tolley, E; Tomlinson, L; Tomoto, M; Tompkins, L; Toms, K; Tong, B; Tornambe, P; Torrence, E; Torres, H; Torró Pastor, E; Toth, J; Touchard, F; Tovey, D R; Trefzger, T; Tricoli, A; Trigger, I M; Trincaz-Duvoid, S; Tripiana, M F; Trischuk, W; Trocmé, B; Trofymov, A; Troncon, C; Trottier-McDonald, M; Trovatelli, M; Truong, L; Trzebinski, M; Trzupek, A; Tseng, J C-L; Tsiareshka, P V; Tsipolitis, G; Tsirintanis, N; Tsiskaridze, S; Tsiskaridze, V; Tskhadadze, E G; Tsui, K M; Tsukerman, I I; Tsulaia, V; Tsuno, S; Tsybychev, D; Tu, Y; Tudorache, A; Tudorache, V; Tulbure, T T; Tuna, A N; Tupputi, S A; Turchikhin, S; Turgeman, D; Turk Cakir, I; Turra, R; Tuts, P M; Ucchielli, G; Ueda, I; Ughetto, M; Ukegawa, F; Unal, G; Undrus, A; Unel, G; Ungaro, F C; Unno, Y; Unverdorben, C; Urban, J; Urquijo, P; Urrejola, P; Usai, G; Usui, J; Vacavant, L; Vacek, V; Vachon, B; Valderanis, C; Valdes Santurio, E; Valencic, N; Valentinetti, S; Valero, A; Valery, L; Valkar, S; Valls Ferrer, J A; Van Den Wollenberg, W; Van Der Deijl, P C; van der Graaf, H; van Eldik, N; van Gemmeren, P; Van Nieuwkoop, J; van Vulpen, I; van Woerden, M C; Vanadia, M; Vandelli, W; Vanguri, R; Vaniachine, A; Vankov, P; Vardanyan, G; Vari, R; Varnes, E W; Varol, T; Varouchas, D; Vartapetian, A; Varvell, K E; Vasquez, J G; Vasquez, G A; Vazeille, F; Vazquez Schroeder, T; Veatch, J; Veeraraghavan, V; Veloce, L M; Veloso, F; Veneziano, S; Ventura, A; Venturi, M; Venturi, N; Venturini, A; Vercesi, V; Verducci, M; Verkerke, W; Vermeulen, J C; Vest, A; Vetterli, M C; Viazlo, O; Vichou, I; Vickey, T; Vickey Boeriu, O E; Viehhauser, G H A; Viel, S; Vigani, L; Villa, M; Villaplana Perez, M; Vilucchi, E; Vincter, M G; Vinogradov, V B; 
Vittori, C; Vivarelli, I; Vlachos, S; Vlasak, M; Vogel, M; Vokac, P; Volpi, G; Volpi, M; von der Schmitt, H; von Toerne, E; Vorobel, V; Vorobev, K; Vos, M; Voss, R; Vossebeld, J H; Vranjes, N; Vranjes Milosavljevic, M; Vrba, V; Vreeswijk, M; Vuillermet, R; Vukotic, I; Wagner, P; Wagner, W; Wahlberg, H; Wahrmund, S; Wakabayashi, J; Walder, J; Walker, R; Walkowiak, W; Wallangen, V; Wang, C; Wang, C; Wang, F; Wang, H; Wang, H; Wang, J; Wang, J; Wang, K; Wang, R; Wang, S M; Wang, T; Wang, W; Wanotayaroj, C; Warburton, A; Ward, C P; Wardrope, D R; Washbrook, A; Watkins, P M; Watson, A T; Watson, M F; Watts, G; Watts, S; Waugh, B M; Webb, S; Weber, M S; Weber, S W; Weber, S A; Webster, J S; Weidberg, A R; Weinert, B; Weingarten, J; Weiser, C; Weits, H; Wells, P S; Wenaus, T; Wengler, T; Wenig, S; Wermes, N; Werner, M D; Werner, P; Wessels, M; Wetter, J; Whalen, K; Whallon, N L; Wharton, A M; White, A; White, M J; White, R; Whiteson, D; Wickens, F J; Wiedenmann, W; Wielers, M; Wiglesworth, C; Wiik-Fuchs, L A M; Wildauer, A; Wilk, F; Wilkens, H G; Williams, H H; Williams, S; Willis, C; Willocq, S; Wilson, J A; Wingerter-Seez, I; Winklmeier, F; Winston, O J; Winter, B T; Wittgen, M; Wolf, T M H; Wolff, R; Wolter, M W; Wolters, H; Worm, S D; Wosiek, B K; Wotschack, J; Woudstra, M J; Wozniak, K W; Wu, M; Wu, M; Wu, S L; Wu, X; Wu, Y; Wyatt, T R; Wynne, B M; Xella, S; Xi, Z; Xu, D; Xu, L; Yabsley, B; Yacoob, S; Yamaguchi, D; Yamaguchi, Y; Yamamoto, A; Yamamoto, S; Yamanaka, T; Yamauchi, K; Yamazaki, Y; Yan, Z; Yang, H; Yang, H; Yang, Y; Yang, Z; Yao, W-M; Yap, Y C; Yasu, Y; Yatsenko, E; Yau Wong, K H; Ye, J; Ye, S; Yeletskikh, I; Yildirim, E; Yorita, K; Yoshida, R; Yoshihara, K; Young, C; Young, C J S; Youssef, S; Yu, D R; Yu, J; Yu, J M; Yu, J; Yuan, L; Yuen, S P Y; Yusuff, I; Zabinski, B; Zacharis, G; Zaidan, R; Zaitsev, A M; Zakharchuk, N; Zalieckas, J; Zaman, A; Zambito, S; Zanello, L; Zanzi, D; Zeitnitz, C; Zeman, M; Zemla, A; Zeng, J C; Zeng, Q; Zenin, O; Ženiš, T; 
Zerwas, D; Zhang, D; Zhang, F; Zhang, G; Zhang, H; Zhang, J; Zhang, L; Zhang, L; Zhang, M; Zhang, R; Zhang, R; Zhang, X; Zhang, Z; Zhao, X; Zhao, Y; Zhao, Z; Zhemchugov, A; Zhong, J; Zhou, B; Zhou, C; Zhou, L; Zhou, L; Zhou, M; Zhou, N; Zhu, C G; Zhu, H; Zhu, J; Zhu, Y; Zhuang, X; Zhukov, K; Zibell, A; Zieminska, D; Zimine, N I; Zimmermann, C; Zimmermann, S; Zinonos, Z; Zinser, M; Ziolkowski, M; Živković, L; Zobernig, G; Zoccoli, A; Zur Nedden, M; Zwalinski, L

    2017-01-01

    Measurements of the production cross section of a [Formula: see text] boson in association with jets in proton-proton collisions at [Formula: see text] TeV are presented, using data corresponding to an integrated luminosity of 3.16 fb[Formula: see text] collected by the ATLAS experiment at the CERN Large Hadron Collider in 2015. Inclusive and differential cross sections are measured for events containing a [Formula: see text] boson decaying to electrons or muons and produced in association with up to seven jets with [Formula: see text] GeV and [Formula: see text]. Predictions from different Monte Carlo generators based on leading-order and next-to-leading-order matrix elements for up to two additional partons interfaced with parton shower and fixed-order predictions at next-to-leading order and next-to-next-to-leading order are compared with the measured cross sections. Good agreement within the uncertainties is observed for most of the modelled quantities, in particular with the generators which use next-to-leading-order matrix elements and the more recent next-to-next-to-leading-order fixed-order predictions.

  15. Storage and retrieval of digital images in dermatology.

    PubMed

    Bittorf, A; Krejci-Papa, N C; Diepgen, T L

    1995-11-01

    Differential diagnosis in dermatology relies on the interpretation of visual information in the form of clinical and histopathological images. Until now, reference images have had to be retrieved from textbooks and/or appropriate journals. To overcome the inherent limitations of those storage media with respect to the number of images stored and the display and search parameters available, we designed a computer-based database of digitized dermatologic images. Images were taken from the photo archive of the Dermatological Clinic of the University of Erlangen. A database was designed using the Entity-Relationship approach and implemented on a PC-Windows platform using MS Access and MS Visual Basic. A Sparc 10 workstation running the CERN Hypertext Transfer Protocol Daemon (httpd) 3.0 pre 6 served as the WWW server. For compressed storage on a hard drive, a quality factor of 60 allowed on-screen differential diagnosis and corresponded to a compression factor of 1:35 for clinical images and 1:40 for histopathological images. Hierarchical keys of clinical or histopathological criteria permitted multi-criteria searches. A script using the Common Gateway Interface (CGI) enabled remote search and image retrieval via the World-Wide-Web (W3). A dermatologic image database featuring clinical and histopathological images was thus constructed, allowing multi-parameter searches and world-wide remote access.
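    The hierarchical-key, multi-criteria search described in this record can be sketched as follows; the data layout, key names and function names here are hypothetical illustrations, not taken from the paper:

```python
# Hypothetical sketch: each image record carries hierarchical keys of
# clinical/histopathological criteria, and a query matches records whose
# keys start with every requested prefix (multi-criteria AND search).

def matches(record, criteria):
    """True if every queried criterion is a prefix of some key in the record."""
    return all(
        any(key.startswith(prefix) for key in record["keys"])
        for prefix in criteria
    )

def search(records, criteria):
    """Return all records satisfying every criterion."""
    return [r for r in records if matches(r, criteria)]

archive = [
    {"id": 1, "keys": ["clinical/eczema/atopic", "site/flexural"]},
    {"id": 2, "keys": ["histo/spongiosis", "clinical/eczema/contact"]},
    {"id": 3, "keys": ["clinical/psoriasis", "site/scalp"]},
]

hits = search(archive, ["clinical/eczema"])           # records 1 and 2
both = search(archive, ["clinical/eczema", "histo"])  # record 2 only
```

    In the system described above, a CGI script would run a query of this kind against the database and return the matching images to the remote browser.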

  16. The ALICE Electronic Logbook

    NASA Astrophysics Data System (ADS)

    Altini, V.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Divià, R.; Fuchs, U.; Makhlyueva, I.; Roukoutakis, F.; Schossmaier, K.; Soòs, C.; Vande Vyvre, P.; Von Haller, B.; ALICE Collaboration

    2010-04-01

    All major experiments need tools that keep a record of events and activities, both during commissioning and operations. In ALICE (A Large Ion Collider Experiment) at CERN, this task is performed by the ALICE Electronic Logbook (eLogbook), a custom-made application developed and maintained by the Data-Acquisition (DAQ) group. Started as a statistics repository, the eLogbook has evolved to become not only a fully functional electronic logbook, but also a massive information repository used to store the conditions and statistics of several online systems. It is currently used by more than 600 users in 30 different countries and plays an important role in daily ALICE collaboration activities. This paper will describe the LAMP (Linux, Apache, MySQL and PHP) based architecture of the eLogbook, the database schema and the relevance of the information stored in the eLogbook to the different ALICE actors, not only for near-real-time procedures but also for long-term data mining and analysis. It will also present the web interface, including the technologies used, the implemented security measures and the current main features. Finally, it will present the roadmap for the future, including a migration to the Web 2.0 paradigm, the handling of the database's ever-increasing data volume and the deployment of data-mining tools.
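    As a rough illustration of the LAMP-style storage described above, the following sketch models a minimal log-entry table and a typical query, using SQLite in place of MySQL; the schema and field names are invented for illustration and are not the actual ALICE eLogbook schema:

```python
# Minimal electronic-logbook store in the spirit of the LAMP-based eLogbook
# (illustrative schema only; SQLite stands in for MySQL).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE log_entry (
           id      INTEGER PRIMARY KEY,
           author  TEXT NOT NULL,
           system  TEXT NOT NULL,   -- which online system the entry concerns
           posted  TEXT NOT NULL,   -- ISO-8601 timestamp
           body    TEXT NOT NULL
       )"""
)

entries = [
    ("operator1", "DAQ", "2010-04-01T08:00:00", "Run 1234 started"),
    ("expert2",   "HLT", "2010-04-01T09:30:00", "Farm node rebooted"),
    ("operator1", "DAQ", "2010-04-01T10:15:00", "Run 1234 stopped"),
]
conn.executemany(
    "INSERT INTO log_entry (author, system, posted, body) VALUES (?, ?, ?, ?)",
    entries,
)

# A typical query: all DAQ entries in chronological order.
daq = conn.execute(
    "SELECT body FROM log_entry WHERE system = 'DAQ' ORDER BY posted"
).fetchall()
```

    A PHP front end over such a table is enough for both day-to-day browsing and the longer-term statistics queries the record mentions.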

  17. EDITORIAL: Lectures from the European RTN Winter School on Strings, Supergravity and Gauge Theories, CERN, 21-25 January 2008

    NASA Astrophysics Data System (ADS)

    Derendinger, J.-P.; Orlando, D.; Uranga, A.

    2008-11-01

    This special issue is devoted to the proceedings of the conference 'RTN Winter School on Strings, Supergravity and Gauge Theories', which took place at CERN, the European Organization for Nuclear Research, in Geneva, Switzerland, on 21-25 January 2008. The event was organized in the framework of the European Mobility Research and Training Network entitled 'Constituents, Fundamental Forces and Symmetries of the Universe'. It is part of a yearly series of scientific schools, which represents what is by now a well-established tradition. The previous ones were held at SISSA, in Trieste, Italy, in February 2005 and at CERN in January 2006. The next one will again take place at CERN, in February 2009. The school was primarily meant for young doctoral students and postdoctoral researchers working in the area of string theory. It consisted of several general lectures of four hours each, whose notes are published in the present proceedings, and five working-group discussion sessions focused on specific topics of the network research program. It was attended by approximately 250 participants. The topics of the lectures were chosen to provide an introduction to some of the areas of recent progress, and to the open problems, in string theory. One of the most active areas in string theory in recent years is the AdS/CFT or gauge/gravity correspondence, which proposes the complete equivalence of string theory on (asymptotically) anti-de Sitter spacetimes with gauge theories. The duality relates the weak coupling regime of one system to the strongly coupled regime of the other, and is therefore very non-trivial to test beyond the supersymmetry-protected BPS sector. One of the key ideas used to quantitatively match several quantities on both sides is integrability, on both the gauge theory and the string side. The lecture notes by Nick Dorey provide a pedagogical introduction to the fascinating topic of integrability in AdS/CFT. 
    On the string theory side, progress has been limited by the difficulties of quantizing the worldsheet theory in the presence of RR backgrounds. There is increasing hope that these difficulties can be overcome using the pure spinor formulation of string theory. The lectures by Yaron Oz review the present status of this proposal. The gauge/gravity correspondence is already leading to important insights into questions of quantum gravity, such as the entropy of black holes and its interpretation in terms of microstates. These questions can be addressed in string theory for certain classes of supersymmetric black holes. The lectures by Vijay Balasubramanian, Jan de Boer, Sheer El-Showk and Ilies Messamah review recent progress in this direction. Throughout the years, formal developments in string theory have systematically led to an improved understanding of how it may relate to nature. In this respect, the lectures by Henning Samtleben describe how formal developments on gauged supergravities can be used to describe compactification vacua in string theory, and their implications for moduli stabilization and supersymmetry breaking. Indeed, softly broken supersymmetry is one of the leading proposals to describe particle physics at the TeV energy range, as described in the lectures by Gian Giudice (not covered in this issue). This connection with TeV-scale physics is most appropriate and timely, given that this energy range will shortly become experimentally accessible at the LHC at CERN. The conference was financially supported by the European Commission under contract MRTN-CT-2004-005104 and by CERN. It was jointly organized by the Physics Institute of the University of Neuchâtel and the Theory Unit of the Physics Division of CERN. It is a great pleasure for us to warmly thank the Theory Unit of CERN for its very kind hospitality and for the high quality of the assistance and infrastructure it provided. We also acknowledge helpful administrative assistance from the Physics Institute of the University of Neuchâtel. Special thanks also go to Denis Frank for his very valuable help in preparing the conference web pages.

  18. Final Scientific EFNUDAT Workshop

    ScienceCinema

    None

    2018-05-23

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation, Cross section measurements, Experimental techniques, Uncertainties and covariances, Fission properties, and Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France), T. Belgya (IKI KFKI, Hungary), E. Gonzalez (CIEMAT, Spain), F. Gunsing (CEA, France), F.-J. Hambsch (IRMM, Belgium), A. Junghans (FZD, Germany), R. Nolte (PTB, Germany), S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman), Marco Calviani, Samuel Andriamonje, Eric Berthoumieux, Carlos Guerrero, Roberto Losito, Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean.

  19. Final Scientific EFNUDAT Workshop

    ScienceCinema

    None

    2018-06-20

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation, Cross section measurements, Experimental techniques, Uncertainties and covariances, Fission properties, and Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France), T. Belgya (IKI KFKI, Hungary), E. Gonzalez (CIEMAT, Spain), F. Gunsing (CEA, France), F.-J. Hambsch (IRMM, Belgium), A. Junghans (FZD, Germany), R. Nolte (PTB, Germany), S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman), Marco Calviani, Samuel Andriamonje, Eric Berthoumieux, Carlos Guerrero, Roberto Losito, Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean.

  20. Final Scientific EFNUDAT Workshop

    ScienceCinema

    Garbil, Roger

    2018-04-16

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation, Cross section measurements, Experimental techniques, Uncertainties and covariances, Fission properties, and Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France), T. Belgya (IKI KFKI, Hungary), E. Gonzalez (CIEMAT, Spain), F. Gunsing (CEA, France), F.-J. Hambsch (IRMM, Belgium), A. Junghans (FZD, Germany), R. Nolte (PTB, Germany), S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman), Marco Calviani, Samuel Andriamonje, Eric Berthoumieux, Carlos Guerrero, Roberto Losito, Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean.

  1. Final Scientific EFNUDAT Workshop

    ScienceCinema

    Lantz, Mattias; Neudecker, Denise

    2018-05-25

    Part 5 of the Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation, Cross section measurements, Experimental techniques, Uncertainties and covariances, Fission properties, and Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France), T. Belgya (IKI KFKI, Hungary), E. Gonzalez (CIEMAT, Spain), F. Gunsing (CEA, France), F.-J. Hambsch (IRMM, Belgium), A. Junghans (FZD, Germany), R. Nolte (PTB, Germany), S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman), Marco Calviani, Samuel Andriamonje, Eric Berthoumieux, Carlos Guerrero, Roberto Losito, Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean.

  2. Final Scientific EFNUDAT Workshop

    ScienceCinema

    Wilson, J.N.

    2018-05-24

    Part 7 of the Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation, Cross section measurements, Experimental techniques, Uncertainties and covariances, Fission properties, and Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France), T. Belgya (IKI KFKI, Hungary), E. Gonzalez (CIEMAT, Spain), F. Gunsing (CEA, France), F.-J. Hambsch (IRMM, Belgium), A. Junghans (FZD, Germany), R. Nolte (PTB, Germany), S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman), Marco Calviani, Samuel Andriamonje, Eric Berthoumieux, Carlos Guerrero, Roberto Losito, Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean.

  3. CERN experience and strategy for the maintenance of cryogenic plants and distribution systems

    NASA Astrophysics Data System (ADS)

    Serio, L.; Bremer, J.; Claudet, S.; Delikaris, D.; Ferlin, G.; Pezzetti, M.; Pirotte, O.; Tavian, L.; Wagner, U.

    2015-12-01

    CERN operates and maintains the world's largest cryogenic infrastructure, ranging from ageing installations feeding detectors, test facilities and general services, to the state-of-the-art cryogenic system serving the flagship LHC machine complex. After several years of exploitation of a wide range of cryogenic installations, and in particular following the recent two-year major shutdown to maintain and consolidate the LHC machine, we have analysed and reviewed the maintenance activities to implement an efficient and reliable exploitation of the installations. We report the results, statistics and lessons learned from the maintenance activities performed, in particular the required consolidations and major overhauls, as well as the organization, management and methodologies implemented.

  4. Effects of bulk viscosity and hadronic rescattering in heavy ion collisions at energies available at the BNL Relativistic Heavy Ion Collider and at the CERN Large Hadron Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryu, Sangwook; Paquet, Jean-Francois; Shen, Chun

    Here, we describe ultrarelativistic heavy ion collisions at the BNL Relativistic Heavy Ion Collider and the CERN Large Hadron Collider with a hybrid model using the IP-Glasma model for the earliest stage and viscous hydrodynamics and microscopic transport for the later stages of the collision. We demonstrate that within this framework the bulk viscosity of the plasma plays an important role in describing the experimentally observed radial flow and azimuthal anisotropy simultaneously. Finally, we further investigate the dependence of observables on the temperature below which we employ the microscopic transport description.

  5. PanDA for COMPASS at JINR

    NASA Astrophysics Data System (ADS)

    Petrosyan, A. Sh.

    2016-09-01

    PanDA (Production and Distributed Analysis System) is a workload management system widely used for data processing at experiments at the Large Hadron Collider and elsewhere. COMPASS is a high-energy physics experiment at the Super Proton Synchrotron. Data processing for COMPASS runs locally at CERN, on lxbatch, with the data stored in CASTOR. In 2014, the idea arose to start running COMPASS production through PanDA. Such a transformation of the experiment's data processing would allow the COMPASS community to use not only CERN resources but also Grid resources worldwide. During the spring and summer of 2015, installation, validation and migration work was performed at JINR. The details and results of this process are presented in this paper.

  6. Ian Hinchliffe Answers Your Higgs Boson Questions

    ScienceCinema

    Hinchliffe, Ian

    2017-12-09

    Ian Hinchliffe, a physicist working on the ATLAS experiment at CERN, answers many of your questions about the Higgs boson. Ian invited viewers to send in questions about the Higgs via email, Twitter, Facebook, or YouTube in an "Ask a Scientist" video posted July 3: http://youtu.be/xhuA3wCg06s. CERN's July 4 announcement that the ATLAS and CMS experiments at the Large Hadron Collider have discovered a particle "consistent with the Higgs boson" has raised questions about what scientists have found, what still remains to be found, and what it all means. If you have suggestions for future "Ask a Scientist" videos, post them below or send ideas to askascientist@lbl.gov

  7. Studies for the electro-magnetic calorimeter SplitCal for the SHiP experiment at CERN with shower direction reconstruction capability

    NASA Astrophysics Data System (ADS)

    Bonivento, Walter M.

    2018-02-01

    This paper describes the basic ideas and first simulation results of a new electromagnetic calorimeter concept, named SplitCal, aimed at optimising the measurement of the photon direction in a fixed-target experiment configuration with high photon detection efficiency. The calorimeter was designed for the invariant-mass reconstruction of axion-like particles decaying into two photons in the mass range 200 MeV to 1 GeV for the proposed proton beam-dump experiment SHiP at CERN. Preliminary results indicate that angular resolutions better than those obtained by past experiments can be achieved with this design. An implementation of this concept with real technologies is under study.

  8. Adam: a Unix Desktop Application Manager

    NASA Astrophysics Data System (ADS)

    Liébana, M.; Marquina, M.; Ramos, R.

    ADAM stands for Affordable Desktop Application Manager. It is a GUI developed at CERN with the aim of easing access to applications. The motivation to develop ADAM came from the unavailability of environments like COSE/CDE and their heavy resource consumption. ADAM has proven to be user friendly: new users are able to customize it to their needs in a few minutes. Groups of users may share a common application environment through ADAM. ADAM also integrates the Unix and PC worlds: PC users can access Unix applications in the same way as their usual Windows applications. This paper describes all the ADAM features, how they are used at CERN Public Services, and the future plans for ADAM.

  9. Effects of bulk viscosity and hadronic rescattering in heavy ion collisions at energies available at the BNL Relativistic Heavy Ion Collider and at the CERN Large Hadron Collider

    DOE PAGES

    Ryu, Sangwook; Paquet, Jean-Francois; Shen, Chun; ...

    2018-03-15

    Here, we describe ultrarelativistic heavy ion collisions at the BNL Relativistic Heavy Ion Collider and the CERN Large Hadron Collider with a hybrid model using the IP-Glasma model for the earliest stage and viscous hydrodynamics and microscopic transport for the later stages of the collision. We demonstrate that within this framework the bulk viscosity of the plasma plays an important role in describing the experimentally observed radial flow and azimuthal anisotropy simultaneously. Finally, we further investigate the dependence of observables on the temperature below which we employ the microscopic transport description.

  10. Accelerating hydrodynamic description of pseudorapidity density and the initial energy density in p +p , Cu + Cu, Au + Au, and Pb + Pb collisions at energies available at the BNL Relativistic Heavy Ion Collider and the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Ze-Fang, Jiang; Chun-Bin, Yang; Csanád, Máté; Csörgő, Tamás

    2018-06-01

    A known class of analytic, exact, accelerating solutions of perfect relativistic hydrodynamics with longitudinal acceleration is utilized to describe results on the pseudorapidity distributions for different collision systems. These results include dN/dη measured in p+p, Cu+Cu, Au+Au, and Pb+Pb collisions at the BNL Relativistic Heavy Ion Collider and the CERN Large Hadron Collider, in a broad centrality range. Going beyond the traditional Bjorken model, from the accelerating hydrodynamic description we determine the initial energy density and other thermodynamic quantities in those collisions.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valerio-Lizarraga, Cristhian A., E-mail: cristhian.alfonso.valerio.lizarraga@cern.ch; Departamento de Investigación en Física, Universidad de Sonora, Hermosillo; Lallement, Jean-Baptiste

    The space charge effect of low-energy, unbunched ion beams can be compensated by the trapping of ions or electrons in the beam potential. This has been studied for the 45 keV negative hydrogen ion beam in the CERN Linac4 Low Energy Beam Transport using the package IBSimu [T. Kalvas et al., Rev. Sci. Instrum. 81, 02B703 (2010)], which allows the space charge calculation of the particle trajectories. The results of the beam simulations are compared to emittance measurements of an H{sup −} beam at the CERN Linac4 3 MeV test stand, where the injection of hydrogen gas directly into the beam transport region has been used to modify the degree of space charge compensation.

  12. The beam and detector of the NA62 experiment at CERN

    NASA Astrophysics Data System (ADS)

    Cortina Gil, E.; Martín Albarrán, E.; Minucci, E.; Nüssle, G.; Padolski, S.; Petrov, P.; Szilasi, N.; Velghe, B.; Georgiev, G.; Kozhuharov, V.; Litov, L.; Husek, T.; Kampf, K.; Zamkovsky, M.; Aliberti, R.; Geib, K. H.; Khoriauli, G.; Kleinknecht, K.; Kunze, J.; Lomidze, D.; Marchevski, R.; Peruzzo, L.; Vormstein, M.; Wanke, R.; Winhart, A.; Bolognesi, M.; Carassiti, V.; Chiozzi, S.; Cotta Ramusino, A.; Gianoli, A.; Malaguti, R.; Dalpiaz, P.; Fiorini, M.; Gamberini, E.; Neri, I.; Norton, A.; Petrucci, F.; Statera, M.; Wahl, H.; Bucci, F.; Ciaranfi, R.; Lenti, M.; Maletta, F.; Volpe, R.; Bizzeti, A.; Cassese, A.; Iacopini, E.; Antonelli, A.; Capitolo, E.; Capoccia, C.; Cecchetti, A.; Corradi, G.; Fascianelli, V.; Gonnella, F.; Lamanna, G.; Lenci, R.; Mannocchi, G.; Martellotti, S.; Moulson, M.; Paglia, C.; Raggi, M.; Russo, V.; Santoni, M.; Spadaro, T.; Tagnani, D.; Valeri, S.; Vassilieva, T.; Cassese, F.; Roscilli, L.; Ambrosino, F.; Capussela, T.; Di Filippo, D.; Massarotti, P.; Mirra, M.; Napolitano, M.; Saracino, G.; Barbanera, M.; Cenci, P.; Checcucci, B.; Duk, V.; Farnesini, L.; Gersabeck, E.; Lupi, M.; Papi, A.; Pepe, M.; Piccini, M.; Scolieri, G.; Aisa, D.; Anzivino, G.; Bizzarri, M.; Campeggi, C.; Imbergamo, E.; Piluso, A.; Santoni, C.; Berretta, L.; Bianucci, S.; Burato, A.; Cerri, C.; Fantechi, R.; Galeotti, S.; Magazzu', G.; Minuti, M.; Orsini, A.; Petragnani, G.; Pontisso, L.; Raffaelli, F.; Spinella, F.; Collazuol, G.; Mannelli, I.; Avanzini, C.; Costantini, F.; Di Lella, L.; Doble, N.; Giorgi, M.; Giudici, S.; Pedreschi, E.; Piandani, R.; Pierazzini, G.; Pinzino, J.; Sozzi, M.; Zaccarelli, L.; Biagioni, A.; Leonardi, E.; Lonardo, A.; Valente, P.; Vicini, P.; D'Agostini, G.; Ammendola, R.; Bonaiuto, V.; De Simone, N.; Federici, L.; Fucci, A.; Paoluzzi, G.; Salamon, A.; Salina, G.; Sargeni, F.; Biino, C.; Dellacasa, G.; Garbolino, S.; Marchetto, F.; Martoiu, S.; Mazza, G.; Rivetti, A.; Arcidiacono, R.; Bloch-Devaux, B.; Boretto, M.; Iacobuzio, L.; 
Menichetti, E.; Soldi, D.; Engelfried, J.; Estrada-Tristan, N.; Bragadireanu, A. M.; Hutanu, O. E.; Azorskiy, N.; Elsha, V.; Enik, T.; Falaleev, V.; Glonti, L.; Gusakov, Y.; Kakurin, S.; Kekelidze, V.; Kilchakovskaya, S.; Kislov, E.; Kolesnikov, A.; Madigozhin, D.; Misheva, M.; Movchan, S.; Polenkevich, I.; Potrebenikov, Y.; Samsonov, V.; Shkarovskiy, S.; Sotnikov, S.; Tarasova, L.; Zaytseva, M.; Zinchenko, A.; Bolotov, V.; Fedotov, S.; Gushin, E.; Khotjantsev, A.; Khudyakov, A.; Kleimenova, A.; Kudenko, Yu.; Shaikhiev, A.; Gorin, A.; Kholodenko, S.; Kurshetsov, V.; Obraztsov, V.; Ostankov, A.; Rykalin, V.; Semenov, V.; Sugonyaev, V.; Yushchenko, O.; Bician, L.; Blazek, T.; Cerny, V.; Koval, M.; Lietava, R.; Aglieri Rinella, G.; Arroyo Garcia, J.; Balev, S.; Battistin, M.; Bendotti, J.; Bergsma, F.; Bonacini, S.; Butin, F.; Ceccucci, A.; Chiggiato, P.; Danielsson, H.; Degrange, J.; Dixon, N.; Döbrich, B.; Farthouat, P.; Gatignon, L.; Golonka, P.; Girod, S.; Goncalves Martins De Oliveira, A.; Guida, R.; Hahn, F.; Harrouch, E.; Hatch, M.; Jarron, P.; Jamet, O.; Jenninger, B.; Kaplon, J.; Kluge, A.; Lehmann-Miotto, G.; Lichard, P.; Maire, G.; Mapelli, A.; Morant, J.; Morel, M.; Noël, J.; Noy, M.; Palladino, V.; Pardons, A.; Perez-Gomez, F.; Perktold, L.; Perrin-Terrin, M.; Petagna, P.; Poltorak, K.; Riedler, P.; Romagnoli, G.; Ruggiero, G.; Rutter, T.; Rouet, J.; Ryjov, V.; Saputi, A.; Schneider, T.; Stefanini, G.; Theis, C.; Tiuraniemi, S.; Vareia Rodriguez, F.; Venditti, S.; Vergain, M.; Vincke, H.; Wertelaers, P.; Brunetti, M. B.; Edwards, S.; Goudzovski, E.; Hallgren, B.; Krivda, M.; Lazzeroni, C.; Lurkin, N.; Munday, D.; Newson, F.; Parkinson, C.; Pyatt, S.; Romano, A.; Serghi, X.; Sergi, A.; Staley, R.; Sturgess, A.; Heath, H.; Page, R.; Angelucci, B.; Britton, D.; Protopopescu, D.; Skillicorn, I.; Cooke, P.; Dainton, J. B.; Fry, J. 
R.; Fulton, L.; Hutchcroft, D.; Jones, E.; Jones, T.; Massri, K.; Maurice, E.; McCormick, K.; Sutcliffe, P.; Wrona, B.; Conovaloff, A.; Cooper, P.; Coward, D.; Rubin, P.; Winston, R.

    2017-05-01

    NA62 is a fixed-target experiment at the CERN SPS dedicated to measurements of rare kaon decays. Such measurements, like the branching fraction of the K+ → π+νν̄ decay, have the potential to bring significant insight into new physics processes when compared with precise theoretical predictions. For this purpose, innovative techniques have been developed, in particular in the domain of low-mass tracking devices. Detector construction spanned several years, from 2009 to 2014. The collaboration started detector commissioning in 2014 and will collect data until the end of 2018. The beam line and detector components are described, together with their early performance obtained from 2014 and 2015 data.

  13. Final Scientific EFNUDAT Workshop

    ScienceCinema

    None

    2018-05-24

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France), T. Belgya (IKI KFKI, Hungary), E. Gonzalez (CIEMAT, Spain), F. Gunsing (CEA, France), F.-J. Hambsch (IRMM, Belgium), A. Junghans (FZD, Germany), R. Nolte (PTB, Germany), S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman), Marco Calviani, Samuel Andriamonje, Eric Berthoumieux, Carlos Guerrero, Roberto Losito, Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean

  14. Development of radiation tolerant components for the Quench Protection System at CERN

    NASA Astrophysics Data System (ADS)

    Bitterling, O.; Denz, R.; Steckert, J.; Uznanski, S.

    2016-01-01

    This paper describes the results of irradiation campaigns with the high-resolution Analog-to-Digital Converter (ADC) ADS1281. This ADC will be used as part of a revised quench detection circuit for the 600 A corrector magnets at the CERN Large Hadron Collider (LHC). To verify the radiation tolerance of the ADC, an irradiation campaign using a proton beam, applying doses up to 3.4 kGy, was conducted. The resulting data and an analysis of the observed failure modes are discussed in this paper. Several mitigation measures are described that reduce the error rate to levels acceptable for operation as part of the LHC QPS.
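Qualification campaigns of this kind (counting failures under a known beam exposure) are commonly summarized by a device cross-section and a predicted in-situ error rate, computed from the particle fluence rather than the dose. A minimal sketch of that standard bookkeeping; all numbers are invented for illustration and are not values from the paper.

```python
# Single-event-effect cross-section from an irradiation test, and the
# expected error rate at a target location. All numbers below are
# illustrative placeholders, not results from the paper.

def see_cross_section(n_errors: int, fluence_cm2: float) -> float:
    """Device cross-section sigma = N_errors / Phi  [cm^2]."""
    return n_errors / fluence_cm2

def expected_error_rate(sigma_cm2: float, flux_cm2_s: float) -> float:
    """Expected errors per second at a location with the given hadron flux."""
    return sigma_cm2 * flux_cm2_s

# Example: 12 errors observed under a fluence of 3e11 p/cm^2 (hypothetical)
sigma = see_cross_section(12, 3e11)        # 4e-11 cm^2
rate = expected_error_rate(sigma, 1e4)     # errors/s at a 1e4 cm^-2 s^-1 flux
per_year = rate * 3.15e7                   # scaled to one year of operation
```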

  15. Search for the ¹P₁ charmonium state in p̄p annihilations at the CERN Intersecting Storage Rings

    NASA Astrophysics Data System (ADS)

    Baglin, C.; Baird, S.; Bassompierre, G.; Borreani, G.; Brient, J.-C.; Broll, C.; Brom, J.-M.; Bugge, L.; Buran, T.; Burq, J.-P.; Bussière, A.; Buzzo, A.; Cester, R.; Chemarin, M.; Chevallier, M.; Escoubes, B.; Fay, J.; Ferroni, S.; Gracco, V.; Guillaud, J.-P.; Khan-Aronsen, E.; Kirsebom, K.; Kylling, A.; Ille, B.; Lambert, M.; Leistam, L.; Lundby, A.; Macri, M.; Marchetto, F.; Menichetti, E.; Mörch, Ch.; Mouellic, B.; Olsen, D.; Pastrone, N.; Petrillo, L.; Pia, M. G.; Poole, J.; Poulet, M.; Rinaudo, G.; Santroni, A.; Severi, M.; Skjevling, G.; Stapnes, S.; Stugu, B.; R704 Collaboration

    1986-04-01

    This experiment has been performed at the CERN Intersecting Storage Rings to study the direct formation of charmonium states in antiproton-proton annihilations. The experimental program has partly been devoted to an inclusive scan for p̄p → J/ψ + X in the range 3520-3530 MeV/c². A cluster of five events has been observed in a narrow energy band, centred on the centre of gravity of the ³P_J states, where the ¹P₁ is expected to be. When interpreted as a new resonance, these data yield a mass m = 3525.4 ± 0.8 MeV/c².

  16. The accuracy of the ATLAS muon X-ray tomograph

    NASA Astrophysics Data System (ADS)

    Avramidou, R.; Berbiers, J.; Boudineau, C.; Dechelette, C.; Drakoulakos, D.; Fabjan, C.; Grau, S.; Gschwendtner, E.; Maugain, J.-M.; Rieder, H.; Rangod, S.; Rohrbach, F.; Sbrissa, E.; Sedykh, E.; Sedykh, I.; Smirnov, Y.; Vertogradov, L.; Vichou, I.

    2003-01-01

    A gigantic detector, ATLAS, is under construction at CERN for particle physics research at the Large Hadron Collider, which is to be ready by 2006. An X-ray tomograph has been developed, designed and constructed at CERN in order to control the mechanical quality of the ATLAS muon chambers. We reached a measurement accuracy of 2 μm systematic and 2 μm statistical uncertainty in the horizontal and vertical directions over a working area of 220 cm (horizontal) × 60 cm (vertical). Here we describe in detail the basic principle chosen to achieve such good accuracy. To cross-check our precision, key measurement results are presented.

  17. Brightness and uniformity measurements of plastic scintillator tiles at the CERN H2 test beam

    NASA Astrophysics Data System (ADS)

    Chatrchyan, S.; Sirunyan, A. M.; Tumasyan, A.; Litomin, A.; Mossolov, V.; Shumeiko, N.; Van De Klundert, M.; Van Haevermaet, H.; Van Mechelen, P.; Van Spilbeeck, A.; Alves, G. A.; Aldá Júnior, W. L.; Hensel, C.; Carvalho, W.; Chinellato, J.; De Oliveira Martins, C.; Matos Figueiredo, D.; Mora Herrera, C.; Nogima, H.; Prado Da Silva, W. L.; Tonelli Manganote, E. J.; Vilela Pereira, A.; Finger, M.; Finger, M., Jr.; Kveton, A.; Tomsa, J.; Adamov, G.; Tsamalaidze, Z.; Behrens, U.; Borras, K.; Campbell, A.; Costanza, F.; Gunnellini, P.; Lobanov, A.; Melzer-Pellmann, I.-A.; Muhl, C.; Roland, B.; Sahin, M.; Saxena, P.; Hegde, V.; Kothekar, K.; Pandey, S.; Sharma, S.; Beri, S. B.; Bhawandeep, B.; Chawla, R.; Kalsi, A.; Kaur, A.; Kaur, M.; Walia, G.; Bhattacharya, S.; Ghosh, S.; Nandan, S.; Purohit, A.; Sharan, M.; Banerjee, S.; Bhattacharya, S.; Chatterjee, S.; Das, P.; Guchait, M.; Jain, S.; Kumar, S.; Maity, M.; Majumder, G.; Mazumdar, K.; Patil, M.; Sarkar, T.; Juodagalvis, A.; Afanasiev, S.; Bunin, P.; Ershov, Y.; Golutvin, I.; Malakhov, A.; Moisenz, P.; Smirnov, V.; Zarubin, A.; Chadeeva, M.; Chistov, R.; Danilov, M.; Popova, E.; Rusinov, V.; Andreev, Yu.; Dermenev, A.; Karneyeu, A.; Krasnikov, N.; Tlisov, D.; Toropin, A.; Epshteyn, V.; Gavrilov, V.; Lychkovskaya, N.; Popov, V.; Pozdnyakov, I.; Safronov, G.; Toms, M.; Zhokin, A.; Baskakov, A.; Belyaev, A.; Boos, E.; Dubinin, M.; Dudko, L.; Ershov, A.; Gribushin, A.; Kaminskiy, A.; Klyukhin, V.; Kodolova, O.; Lokhtin, I.; Miagkov, I.; Obraztsov, S.; Petrushanko, S.; Savrin, V.; Snigirev, A.; Andreev, V.; Azarkin, M.; Dremin, I.; Kirakosyan, M.; Leonidov, A.; Terkulov, A.; Bitioukov, S.; Elumakhov, D.; Kalinin, A.; Krychkine, V.; Mandrik, P.; Petrov, V.; Ryutin, R.; Sobol, A.; Troshin, S.; Volkov, A.; Sekmen, S.; Medvedeva, T.; Rumerio, P.; Adiguzel, A.; Bakirci, N.; Boran, F.; Cerci, S.; Damarseckin, S.; Demiroglu, Z. 
S.; Dölek, F.; Dozen, C.; Dumanoglu, I.; Eskut, E.; Girgis, S.; Gokbulut, G.; Guler, Y.; Hos, I.; Kangal, E. E.; Kara, O.; Kayis Topaksu, A.; Işik, C.; Kiminsu, U.; Oglakci, M.; Onengut, G.; Ozdemir, K.; Ozturk, S.; Polatoz, A.; Sunar Cerci, D.; Tali, B.; Tok, U. G.; Topakli, H.; Turkcapar, S.; Zorbakir, I. S.; Zorbilmez, C.; Bilin, B.; Isildak, B.; Karapinar, G.; Murat Guler, A.; Ocalan, K.; Yalvac, M.; Zeyrek, M.; Atakisi, I. O.; Gülmez, E.; Kaya, M.; Kaya, O.; Koseyan, O. K.; Ozcelik, O.; Ozkorucuklu, S.; Tekten, S.; Yetkin, E. A.; Yetkin, T.; Cankocak, K.; Sen, S.; Boyarintsev, A.; Grynyov, B.; Levchuk, L.; Popov, V.; Sorokin, P.; Flacher, H.; Borzou, A.; Call, K.; Dittmann, J.; Hatakeyama, K.; Liu, H.; Pastika, N.; Buccilli, A.; Cooper, S. I.; Henderson, C.; West, C.; Arcaro, D.; Gastler, D.; Hazen, E.; Rohlf, J.; Sulak, L.; Wu, S.; Zou, D.; Hakala, J.; Heintz, U.; Kwok, K. H. M.; Laird, E.; Landsberg, G.; Mao, Z.; Yu, D. R.; Gary, J. W.; Ghiasi Shirazi, S. M.; Lacroix, F.; Long, O. R.; Wei, H.; Bhandari, R.; Heller, R.; Stuart, D.; Yoo, J. H.; Chen, Y.; Duarte, J.; Lawhorn, J. M.; Nguyen, T.; Spiropulu, M.; Winn, D.; Abdullin, S.; Apresyan, A.; Apyan, A.; Banerjee, S.; Chlebana, F.; Freeman, J.; Green, D.; Hare, D.; Hirschauer, J.; Joshi, U.; Lincoln, D.; Los, S.; Pedro, K.; Spalding, W. J.; Strobbe, N.; Tkaczyk, S.; Whitbeck, A.; Linn, S.; Markowitz, P.; Martinez, G.; Bertoldi, M.; Hagopian, S.; Hagopian, V.; Kolberg, T.; Baarmand, M. M.; Noonan, D.; Roy, T.; Yumiceva, F.; Bilki, B.; Clarida, W.; Debbins, P.; Dilsiz, K.; Durgut, S.; Gandrajula, R. P.; Haytmyradov, M.; Khristenko, V.; Merlo, J.-P.; Mermerkaya, H.; Mestvirishvili, A.; Miller, M.; Moeller, A.; Nachtman, J.; Ogul, H.; Onel, Y.; Ozok, F.; Penzo, A.; Schmidt, I.; Snyder, C.; Southwick, D.; Tiras, E.; Yi, K.; Al-bataineh, A.; Bowen, J.; Castle, J.; McBrayer, W.; Murray, M.; Wang, Q.; Kaadze, K.; Maravin, Y.; Mohammadi, A.; Saini, L. K.; Baden, A.; Belloni, A.; Calderon, J. D.; Eno, S. C.; Feng, Y. 
B.; Ferraioli, C.; Grassi, T.; Hadley, N. J.; Jeng, G.-Y.; Kellogg, R. G.; Kunkle, J.; Mignerey, A.; Ricci-Tam, F.; Shin, Y. H.; Skuja, A.; Yang, Z. S.; Yao, Y.; Brandt, S.; D'Alfonso, M.; Hu, M.; Klute, M.; Niu, X.; Chatterjee, R. M.; Evans, A.; Frahm, E.; Kubota, Y.; Lesko, Z.; Mans, J.; Ruckstuhl, N.; Heering, A.; Karmgard, D. J.; Musienko, Y.; Ruchti, R.; Wayne, M.; Benaglia, A. D.; Mei, K.; Tully, C.; Bodek, A.; de Barbaro, P.; Galanti, M.; Garcia-Bellido, A.; Khukhunaishvili, A.; Lo, K. H.; Vishnevskiy, D.; Zielinski, M.; Agapitos, A.; Amouzegar, M.; Chou, J. P.; Hughes, E.; Saka, H.; Sheffield, D.; Akchurin, N.; Damgov, J.; De Guio, F.; Dudero, P. R.; Faulkner, J.; Gurpinar, E.; Kunori, S.; Lamichhane, K.; Lee, S. W.; Libeiro, T.; Mengke, T.; Muthumuni, S.; Undleeb, S.; Volobouev, I.; Wang, Z.; Goadhouse, S.; Hirosky, R.; Wang, Y.

    2018-01-01

    We study the light output, light collection efficiency and signal timing of a variety of organic scintillators that are being considered for the upgrade of the hadronic calorimeter of the CMS detector. The experimental data are collected at the H2 test-beam area at CERN, using a 150 GeV muon beam. In particular, we investigate the usage of over-doped and green-emitting plastic scintillators, two solutions that have not been extensively considered. We present a study of the energy distribution in plastic-scintillator tiles, the hit efficiency as a function of the hit position, and a study of the signal timing for blue and green scintillators.

  18. The NA62 trigger system

    NASA Astrophysics Data System (ADS)

    Krivda, M.; NA62 Collaboration

    2013-08-01

    The main aim of the NA62 experiment (NA62 Technical Design Report [1]) is to study ultra-rare kaon decays. In order to select rare events over the overwhelming background, high-performance central systems with high bandwidth, flexibility and configurability are necessary, minimizing dead time while maximizing data-collection reliability. The NA62 experiment consists of 12 sub-detector systems and several trigger and control systems, for a total channel count of less than 100,000. The GigaTracKer (GTK) has the largest number of channels (54,000), and the Liquid Krypton (LKr) calorimeter shares with it the largest raw data rate (19 GB/s). The NA62 trigger system works with three trigger levels. The first level is based on a hardware central trigger unit, the so-called L0 Trigger Processor (L0TP), and Local Trigger Units (LTU), all located in the experimental cavern. The other two trigger levels are implemented in software on a computer farm located on the surface. The L0TP receives information from the triggering sub-detectors asynchronously via Ethernet; it processes the information and then transmits a final trigger decision synchronously to each sub-detector through the Trigger and Timing Control (TTC) system. The interface between the L0TP and the TTC system, which is used for trigger and clock distribution, is provided by the Local Trigger Unit (LTU) board. The LTU can work in two modes: global and stand-alone. In global mode, the LTU provides the interface between the L0TP and the TTC system. In stand-alone mode, the LTU can fully emulate the L0TP, giving each sub-detector an independent way of testing and calibrating. In addition to the emulation functionality, a further function allows the LTU clock to be synchronized with the L0TP and the TTC system. For testing and debugging purposes, a Snap Shot Memory (SSM) interface is implemented that can work in both input and output modes.
The trigger rates will be permanently monitored by reading counters at regular intervals. This paper describes the overall NA62 trigger system focusing on the setup for the dry and technical runs in 2012.
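The asynchronous arrival of trigger primitives and the central decision step described above can be illustrated with a toy time-window coincidence. The detector names, the 10 ns window and the primitive format are illustrative assumptions, not NA62 parameters.

```python
# Toy sketch of the matching a central trigger processor performs
# conceptually: timestamped primitives from different sub-detectors are
# accepted when primitives from all required detectors fall within a
# coincidence window. Names and window width are invented.

def l0_decisions(primitives, required, window_ns=10.0):
    """primitives: list of (timestamp_ns, detector); returns accepted times."""
    primitives = sorted(primitives)  # order by arrival timestamp
    accepted = []
    for i, (t0, _) in enumerate(primitives):
        seen = set()
        for t, det in primitives[i:]:
            if t - t0 > window_ns:  # outside the coincidence window
                break
            seen.add(det)
        if required <= seen:  # all required detectors fired in time
            accepted.append(t0)
    return accepted

hits = [(100.0, "RICH"), (103.0, "CHOD"), (250.0, "RICH")]
print(l0_decisions(hits, required={"RICH", "CHOD"}))  # → [100.0]
```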

  19. MADANALYSIS 5, a user-friendly framework for collider phenomenology

    NASA Astrophysics Data System (ADS)

    Conte, Eric; Fuks, Benjamin; Serret, Guillaume

    2013-01-01

    We present MADANALYSIS 5, a new framework for phenomenological investigations at particle colliders. Based on a C++ kernel, this program allows us to efficiently perform, in a straightforward and user-friendly fashion, sophisticated physics analyses of event files such as those generated by a large class of Monte Carlo event generators. MADANALYSIS 5 comes with two modes of running. The first, easier to handle, uses the strengths of a powerful PYTHON interface to implement physics analyses by means of a set of intuitive commands. The second requires the user to implement the analyses in the C++ programming language, directly within the core of the analysis framework. This opens unlimited possibilities for the level of complexity which can be reached, limited only by the programming skills and originality of the user.
    Program summary
    Program title: MadAnalysis 5
    Catalogue identifier: AENO_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENO_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Permission to use, copy, modify and distribute this program is granted under the terms of the GNU General Public License.
    No. of lines in distributed program, including test data, etc.: 31087
    No. of bytes in distributed program, including test data, etc.: 399105
    Distribution format: tar.gz
    Programming language: PYTHON, C++
    Computer: All platforms on which Python version 2.7, Root version 5.27 and the g++ compiler are available. Compatibility with newer versions of these programs is also ensured; however, the Python version must be below version 3.0.
    Operating system: Unix, Linux and Mac OS operating systems on which the above-mentioned versions of Python and Root, as well as g++, are available.
    Classification: 11.1
    External routines: ROOT (http://root.cern.ch/drupal/)
    Nature of problem: Implementing sophisticated phenomenological analyses in high-energy physics in a flexible, efficient and straightforward fashion, starting from event files such as those produced by Monte Carlo event generators. The event files may or may not have been matched to parton showering, and may or may not have been processed by a (fast) detector simulation. Depending on the sophistication level of the event files (parton level, hadron level, reconstructed level), several input formats are possible.
    Solution method: We implement an interface allowing the production of predefined as well as user-defined histograms for a large class of kinematical distributions, after applying a set of event selection cuts specified by the user. This allows robust and novel search strategies for collider experiments, such as those currently running at the Large Hadron Collider at CERN, to be devised very efficiently.
    Restrictions: Unsupported event file formats.
    Unusual features: The code is fully based on object representations for events, particles, reconstructed objects and cuts, which facilitates the implementation of an analysis.
    Running time: Depends on the purposes of the user and on the number of events to process; it varies from a few seconds to the order of a minute for several million events.
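The cut-and-histogram workflow the program summary describes (event selection cuts followed by histograms of kinematical distributions) can be illustrated generically. The event content, cut values and binning below are made up for illustration and are not MadAnalysis 5 commands.

```python
# Generic cut-and-histogram loop of the kind such a framework automates:
# apply sequential selection cuts to events and histogram a kinematical
# variable for the survivors. All values are invented.

events = [
    {"pt": 45.0, "eta": 0.3, "n_jets": 3},
    {"pt": 12.0, "eta": 1.1, "n_jets": 1},
    {"pt": 80.0, "eta": 2.7, "n_jets": 4},
    {"pt": 60.0, "eta": 0.9, "n_jets": 2},
]

cuts = [
    lambda e: e["pt"] > 20.0,       # transverse-momentum cut
    lambda e: abs(e["eta"]) < 2.5,  # detector-acceptance cut
]

def fill_histogram(events, cuts, var, edges):
    """Count events passing all cuts, binned in `var` by `edges`."""
    counts = [0] * (len(edges) - 1)
    for e in events:
        if all(cut(e) for cut in cuts):
            for i in range(len(edges) - 1):
                if edges[i] <= e[var] < edges[i + 1]:
                    counts[i] += 1
                    break
    return counts

print(fill_histogram(events, cuts, "pt", [0, 25, 50, 75, 100]))  # → [0, 1, 1, 0]
```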

  20. A new approach to characterize very-low-level radioactive waste produced at hadron accelerators.

    PubMed

    Zaffora, Biagio; Magistris, Matteo; Chevalier, Jean-Pierre; Luccioni, Catherine; Saporta, Gilbert; Ulrici, Luisa

    2017-04-01

    Radioactive waste is produced as a consequence of preventive and corrective maintenance during the operation of high-energy particle accelerators, or of associated dismantling campaigns. Its radiological characterization must be performed to ensure appropriate disposal. The radiological characterization of waste includes establishing the list of produced radionuclides, called the "radionuclide inventory", and estimating their activity. The present paper describes the process adopted at CERN to characterize very-low-level radioactive waste, with a focus on activated metals. The characterization method consists of measuring and estimating the activity of the produced radionuclides either by experimental methods or by statistical and numerical approaches. We adapted the so-called Scaling Factor (SF) and Correlation Factor (CF) techniques to the needs of hadron accelerators, and applied them to very-low-level metallic waste produced at CERN. For each type of metal we calculated the radionuclide inventory and identified the radionuclides that contribute most to hazard factors. The methodology proposed is of general validity: it can be extended to other activated materials and used for the characterization of waste produced in particle accelerators and research centres where the activation mechanisms are comparable to those occurring at CERN. Copyright © 2017 Elsevier Ltd. All rights reserved.
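The Scaling Factor technique mentioned above estimates the activity of a difficult-to-measure radionuclide as SF times the measured activity of an easy-to-measure key nuclide. A minimal sketch, assuming the common convention of taking the SF as the geometric mean of measured activity ratios; the nuclide pair and all numbers are invented for illustration, not CERN data.

```python
# Sketch of the Scaling Factor technique: A(difficult) ≈ SF * A(key),
# with SF taken here as the geometric mean of measured activity ratios
# (one common convention, assumed here). All activities are invented.
import math

def scaling_factor(ratio_samples):
    """Geometric mean of measured difficult/key activity ratios."""
    logs = [math.log(r) for r in ratio_samples]
    return math.exp(sum(logs) / len(logs))

# Hypothetical Ni-63 / Co-60 activity ratios measured on a few samples
ratios = [0.8, 1.2, 1.0]
sf = scaling_factor(ratios)

# Estimate the Ni-63 activity from a gamma-spectrometry Co-60 measurement
a_co60_bq_g = 5.0
a_ni63_est = sf * a_co60_bq_g
```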

  1. Controlled longitudinal emittance blow-up using band-limited phase noise in CERN PSB

    NASA Astrophysics Data System (ADS)

    Quartullo, D.; Shaposhnikova, E.; Timko, H.

    2017-07-01

    Controlled longitudinal emittance blow-up (from 1 eVs to 1.4 eVs) for LHC beams in the CERN PS Booster is currently achieved using sinusoidal phase modulation of a dedicated high-harmonic RF system. In 2021, after the LHC injectors upgrade, 3 eVs should be extracted to the PS. Even though the current method may satisfy the new requirements, it relies on low-power-level RF improvements. In this paper another blow-up method is considered: the injection of band-limited phase noise into the main RF system (h = 1), never tried in the PSB but already used in the CERN SPS and LHC under different conditions (longer cycles). This technique, which lowers the peak line density and therefore the impact of intensity effects in the PSB and the PS, can also complement the present method. Longitudinal space charge, dominant in the PSB, causes significant synchrotron-frequency shifts with intensity, and its effect should be taken into account. Another complication arises from the interaction of the phase loop with the injected noise, since both act on the RF phase. All these elements were studied in simulations of the PSB cycle with the BLonD code, and the required blow-up was achieved.
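The key ingredient of this method, band-limited phase noise, can be generated by filtering white noise in the frequency domain so that only components within the synchrotron-frequency band survive. A minimal sketch; the band edges, sample counts and RMS amplitude are illustrative assumptions, not PSB parameters.

```python
# Minimal sketch of band-limited phase-noise generation: white noise is
# Fourier-transformed, components outside the target band are zeroed,
# and the result is transformed back and scaled to a requested RMS.
import numpy as np

def band_limited_noise(n, dt, f_low, f_high, rms, seed=0):
    """Return n phase-noise samples with a flat spectrum in [f_low, f_high] Hz."""
    rng = np.random.default_rng(seed)
    spectrum = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n, d=dt)
    spectrum[(freqs < f_low) | (freqs > f_high)] = 0.0  # keep only the band
    noise = np.fft.irfft(spectrum, n)
    return noise * rms / noise.std()  # normalize to the requested RMS

# E.g. noise covering a band around a ~1 kHz synchrotron frequency (invented)
phase_noise = band_limited_noise(n=4096, dt=1e-4, f_low=800.0, f_high=1200.0, rms=0.05)
```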

  2. New vertical cryostat for the high field superconducting magnet test station at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vande Craen, A.; Atieh, S.; Bajko, M.

    2014-01-29

    In the framework of the R&D program for new superconducting magnets for the Large Hadron Collider accelerator upgrades, CERN is building a new vertical test station to test high-field superconducting magnets of unprecedentedly large size. This facility will allow magnets to be tested by vertical insertion in a pressurized liquid-helium bath, cooled to a controlled temperature between 4.2 K and 1.9 K. The dimensions of the cryostat will allow testing of magnets up to 2.5 m in length, with a maximum diameter of 1.5 m and a mass of 15 tons. To allow faster insertion and removal of the magnets and to reduce the risk of helium leaks, all cryogenic supply lines are foreseen to remain permanently connected to the cryostat. A specifically designed 100 W heat exchanger is integrated in the cryostat helium vessel for controlled cooling of the magnet from 4.2 K down to 1.9 K in a 3 m³ helium bath. This paper describes the cryostat and its main functions, focusing on features specifically developed for this project. The status of the construction and the plans for assembly and installation at CERN are also presented.

  3. Common HEP UNIX Environment

    NASA Astrophysics Data System (ADS)

    Taddei, Arnaud

    After it was decided to design a common user environment for UNIX platforms among HEP laboratories, a joint project between DESY and CERN was started. The project consists of two phases: 1. provide a common user environment at shell level; 2. provide a common user environment at graphical level (X11). Phase 1 is in production at DESY and CERN, as well as at PISA and RAL. It was developed around the scripts originally designed at DESY Zeuthen, improved and extended during a two-month project at CERN with a contribution from DESY Hamburg. It consists of a set of files which customize the environment for the six main shells (sh, csh, ksh, bash, tcsh, zsh) on the main platforms (AIX, HP-UX, IRIX, SunOS, Solaris 2, OSF/1, ULTRIX, etc.), divided into several "sociological" levels: HEP, site, machine, cluster, group of users and user, some of which are optional. The second phase is under design and a first proposal has been published. A first version of phase 2 already exists for AIX and Solaris, and it should be available for all other platforms by the time of the conference. This is a major collective work between the several HEP laboratories involved in the HEPiX-scripts and HEPiX-X11 working groups.
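The layered customization described above (HEP, site, machine, cluster, group, user) amounts to an ordered override resolution: a more specific level overrides settings from a more general one. A sketch of that resolution logic; the level contents are invented for illustration.

```python
# Sketch of layered environment resolution: settings are applied in order
# from the most general level (HEP-wide) to the most specific (user),
# with later levels overriding earlier ones. Settings are invented.

def resolve_environment(levels):
    """levels: list of (name, settings_dict), ordered general -> specific."""
    env = {}
    for _, settings in levels:
        env.update(settings)  # a more specific level overrides earlier ones
    return env

layers = [
    ("HEP",  {"PRINTER": "hep-default", "EDITOR": "vi"}),
    ("site", {"PRINTER": "bldg-513"}),
    ("user", {"EDITOR": "emacs"}),
]
print(resolve_environment(layers))  # → {'PRINTER': 'bldg-513', 'EDITOR': 'emacs'}
```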

  4. Virtuality and efficiency - overcoming past antinomy in the remote collaboration experience

    NASA Astrophysics Data System (ADS)

    Fernandes, Joao; Bjorkli, Knut; Clavo, David Martin; Baron, Thomas

    2010-04-01

    Several recent initiatives have been put in place by the CERN IT Department to improve the user experience in remote dispersed meetings, and in remote collaboration at large, in the LHC communities worldwide. We will present an analysis of the factors which historically limited the efficiency of remote dispersed meetings and describe the consequent actions undertaken at CERN to overcome these limitations. After giving a status update on the equipment available at CERN to enable virtual sessions and on the various collaborative tools currently proposed to users, we will focus on the evolution of this market: how the new technological trends (among others, HD videoconferencing, telepresence, unified communications) can positively impact the user experience, and how to make the best use of them. Finally, looking to the future, we will give some hints on how to answer the difficult question of selecting the next generation of collaborative tools: which set of tools among the various offers (systems like Vidyo H264 SVC, next-generation EVO, groupware offers, standard H323 systems, etc.) is best suited for our environment, and how to unify this set for the common user. This will finally allow us to definitively overcome the past antinomy between virtuality and efficiency.

  5. Raymond Stora's obituary

    NASA Astrophysics Data System (ADS)

    Becchi, C.

    2015-10-01

    On Monday, July 20, 2015 Raymond Stora passed away; although he was seriously ill, his death was unexpected, the result of a sudden heart attack. Raymond was born on September 18, 1930. He had been sick for many months, yet continued to go to CERN where he was able to discuss the problems in physics and mathematics that interested him. In fact, his last publication (recorded on SPIRES) carries the date of December 2014, just before he contracted pneumonia, which dramatically reduced his mobility and hence the possibility of going to CERN. Still, this last project revived Raymond's interest in algebraic curves, and he spent a large part of his last months at home reading papers and books on this subject. In 2013, despite the large amount of time that his various therapies required, Raymond made a fundamental contribution to a difficult problem on renormalization in configuration space based on the subtle technical properties of homogeneous distributions. His knowledge of physics and, in particular, of quantum field theory, as well as of many fields of mathematics was so well known that many members of and visitors to CERN frequently asked Raymond for advice and assistance, which he gave with great enthusiasm and in the most gracious way. Ivan Todorov, commenting on Raymond's death, noted that we must remember Raymond's remarkable qualities, which were both human and scientific.

  6. DNS load balancing in the CERN cloud

    NASA Astrophysics Data System (ADS)

    Reguero Naredo, Ignacio; Lobato Pardavila, Lorena

    2017-10-01

    Load balancing is one of the technologies enabling deployment of large-scale applications on cloud resources. A DNS Load Balancer Daemon (LBD) has been developed at CERN as a cost-effective way to balance applications that accept DNS timing dynamics and do not require persistence. It currently serves over 450 load-balanced aliases with two small VMs acting as master and slave. The aliases are mapped to DNS subdomains, which are managed with DDNS according to a load metric collected from the alias member nodes with SNMP. In recent years, several improvements have been brought to the software, for instance: support for IPv6, parallelization of the status requests, reimplementation of the client in Python to allow multiple aliases with differentiated states on the same machine, and support for application state. The configuration of the load balancer is currently managed by a Puppet type, which discovers the alias member nodes and gets the alias definitions from the Ermis REST service. The Aiermis self-service GUI for the management of the LB aliases has been produced; it is based on the Ermis service, which implements a form of Load Balancing as a Service (LBaaS). The Ermis REST API has authorisation based on Foreman hostgroups. The CERN DNS LBD is open-source software under the Apache 2 license.
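The core decision the daemon makes (publish the least-loaded alias members according to a collected metric) can be sketched as a simple selection over per-host metrics. Hostnames and metric values are invented; treating a negative metric as "node asks to be excluded" is an assumption made here for illustration, not a documented LBD convention.

```python
# Toy sketch of the load-balancing decision: each alias member reports a
# load metric (collected via SNMP in the real service); the least-loaded
# members become the next set of DNS records. All values are invented.

def pick_best(members, n=2):
    """members: dict host -> load metric; return the n least-loaded hosts."""
    eligible = {h: m for h, m in members.items() if m >= 0}  # drop opted-out nodes
    return sorted(eligible, key=eligible.get)[:n]

metrics = {"lxplus701": 12, "lxplus702": 3, "lxplus703": -1, "lxplus704": 7}
print(pick_best(metrics))  # → ['lxplus702', 'lxplus704']
```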

  7. Feasibility study for a biomedical experimental facility based on LEIR at CERN.

    PubMed

    Abler, Daniel; Garonna, Adriano; Carli, Christian; Dosanjh, Manjit; Peach, Ken

    2013-07-01

    In light of the recent European developments in ion beam therapy, there is a strong interest from the biomedical research community to have more access to clinically relevant beams. Beamtime for pre-clinical studies is currently very limited and a new dedicated facility would allow extensive research into the radiobiological mechanisms of ion beam radiation and the development of more refined techniques of dosimetry and imaging. This basic research would support the current clinical efforts of the new treatment centres in Europe (for example HIT, CNAO and MedAustron). This paper presents first investigations on the feasibility of an experimental biomedical facility based on the CERN Low Energy Ion Ring LEIR accelerator. Such a new facility could provide beams of light ions (from protons to neon ions) in a collaborative and cost-effective way, since it would rely partly on CERN's competences and infrastructure. The main technical challenges linked to the implementation of a slow extraction scheme for LEIR and to the design of the experimental beamlines are described and first solutions presented. These include introducing new extraction septa into one of the straight sections of the synchrotron, changing the power supply configuration of the magnets, and designing a new horizontal beamline suitable for clinical beam energies, and a low-energy vertical beamline for particular radiobiological experiments.

  8. Feasibility study for a biomedical experimental facility based on LEIR at CERN

    PubMed Central

    Abler, Daniel; Garonna, Adriano; Carli, Christian; Dosanjh, Manjit; Peach, Ken

    2013-01-01

    In light of the recent European developments in ion beam therapy, there is a strong interest from the biomedical research community to have more access to clinically relevant beams. Beamtime for pre-clinical studies is currently very limited and a new dedicated facility would allow extensive research into the radiobiological mechanisms of ion beam radiation and the development of more refined techniques of dosimetry and imaging. This basic research would support the current clinical efforts of the new treatment centres in Europe (for example HIT, CNAO and MedAustron). This paper presents first investigations on the feasibility of an experimental biomedical facility based on the CERN Low Energy Ion Ring LEIR accelerator. Such a new facility could provide beams of light ions (from protons to neon ions) in a collaborative and cost-effective way, since it would rely partly on CERN's competences and infrastructure. The main technical challenges linked to the implementation of a slow extraction scheme for LEIR and to the design of the experimental beamlines are described and first solutions presented. These include introducing new extraction septa into one of the straight sections of the synchrotron, changing the power supply configuration of the magnets, and designing a new horizontal beamline suitable for clinical beam energies, and a low-energy vertical beamline for particular radiobiological experiments. PMID:23824122

  9. Challenges in coupling LTER with environmental assessments: An insight from potential and reality of the Chinese Ecological Research Network in servicing environment assessments.

    PubMed

    Xia, Shaoxia; Liu, Yu; Yu, Xiubo; Fu, Bojie

    2018-08-15

    Environmental assessments estimate, evaluate and predict the consequences of natural processes and human activities on the environment. Long-term ecosystem observation and research networks (LTERs) are potentially valuable infrastructure to support environmental assessments. However, very few environmental assessments have successfully incorporated them. In this study, we try to reveal the current status of coupling LTERs with environmental assessments and examine the challenges involved in improving this coupling by exploring the role that the Chinese Ecological Research Network (CERN), China's LTER, currently plays in regional environmental assessments. A review of official protocols and standards, regional assessments and CERN research related to ecosystems and the environment shows that there is great potential for coupling CERN with environmental assessments. However, in practice, CERN does not currently play the expected role. Remote sensing and irregular inventory data are still the main data sources used in regional assessments. Several causes led to the present situation: (1) insufficient cross-site research and failure to scale up site-level variables to the regional scale; (2) data barriers resulting from incompatible protocols and low data usability due to a lack of data assimilation and scaling; and (3) the absence of indicators relevant to human activities in existing monitoring protocols. For these reasons, enhancing cross-site monitoring and research, data assimilation and scaling up are critical steps required to improve the coupling of LTERs with environmental assessments. Site-focused long-term monitoring should be combined with wide-scale ground surveys and remote sensing to establish an effective connection between different environmental monitoring platforms for regional assessments. It is also necessary to revise the current monitoring protocols to include human activities and their impacts on the ecosystem, or to transform LTERs into Long-Term Socio-Ecological Research (LTSER) networks.

  10. Experience from the 1st Year running a Massive High Quality Videoconferencing Service for the LHC

    NASA Astrophysics Data System (ADS)

    Fernandes, Joao; Baron, Thomas; Bompastor, Bruno

    2014-06-01

    In the last few years, we have witnessed an explosion of visual collaboration initiatives in industry. Several advances in video services and in their underlying infrastructure are improving the way people collaborate globally. These advances are creating new usage paradigms: any device in any network can be used to collaborate, in most cases with overall high quality. To keep pace with this technological progression, the CERN IT Department launched a service based on the Vidyo product. This new service architecture introduces Adaptive Video Layering, which dynamically optimizes the video for each endpoint by leveraging H.264 Scalable Video Coding (SVC) compression technology. It combines intelligent AV routing techniques with the flexibility of H.264 SVC video compression in order to achieve resilient video collaboration over the Internet, 3G and WiFi. We present an overview of the results that have been achieved after this major change. In particular, the first year of operation of the CERN Vidyo service is described in terms of performance and scale: the service became part of the daily activity of the LHC collaborations, reaching a monthly usage of more than 3200 meetings with a peak of 750 simultaneous connections. We also present some key features such as the integration with CERN Indico: LHC users can now join a Vidyo meeting either from their personal computer or from a CERN videoconference room directly from an Indico event page, with the ease of a single click. The roadmap for future improvements, service extensions and core infrastructure trends such as cloud-based services and virtualization of system components is also discussed. Vidyo's strengths allowed us to build a universal service (accessible from PCs, but also from videoconference rooms, traditional phones, tablets and smartphones), developed with 3 key ideas in mind: ease of use, full integration and high quality.

  11. O8.10 A MODEL FOR RESEARCH INITIATIVES FOR RARE CANCERS: THE COLLABORATIVE EPENDYMOMA RESEARCH NETWORK (CERN)

    PubMed Central

    Armstrong, T.S.; Aldape, K.; Gajjar, A.; Haynes, C.; Hirakawa, D.; Gilbertson, R.; Gilbert, M.R.

    2014-01-01

    Ependymoma represents less than 5% of adult central nervous system (CNS) tumors and a higher percentage of pediatric CNS tumors, but it remains an orphan disease. The majority of laboratory-based research and clinical trials have been conducted in the pediatric setting, a reflection of the relative incidence and funding opportunities. CERN, created in 2006, was designed to establish a collaborative effort between laboratory and clinical research and between pediatric and adult investigators. The organization of CERN is based on integration and collaboration among five projects. Project 1 contains the clinical trials network, encompassing both adult and pediatric centers; this group has completed 2 clinical trials with more underway. Project 2 is focused on the molecular classification of human ependymoma tumor tissues and also contains the tumor repository, which has now collected over 600 fully clinically annotated CNS ependymomas from adults and children. Project 3 is focused on drug discovery, utilizing robust laboratory models of ependymoma to perform high-throughput screening of drug libraries and then taking promising agents through extensive preclinical testing, including monitoring of drug delivery to the tumor using state-of-the-art microdialysis. Project 4 contains the basic research efforts evaluating the molecular pathogenesis of ependymoma and has successfully translated these findings by generating the first mouse models of ependymoma, which are employed in preclinical drug development in Project 3. Project 5 studies patient outcomes, including the incorporation of these measures in the clinical trials; this project also contains an online Ependymoma Outcomes survey, collecting data on the consequences of the disease and its treatment. These projects have been highly successful and collaborative. 
For example, the serial measurement of symptom burden (Project 5) has greatly contributed to the evaluation of treatment efficacy in a clinical trial (Project 1), and investigators from Project 2 are evaluating potential predictive markers from tumor tissue from the same clinical trial. Results from genomic and molecular discoveries generated by Project 4 were evaluated using clinical material from the tumor repository (Project 2). Agents identified from the high-throughput screening in Project 3 are being used to create novel clinical trials (Project 1). As a complementary effort, CERN's community outreach efforts provide a major gateway to patients, families, caregivers and healthcare providers, contributing to greater awareness of ependymoma and supporting clinical trial accrual in Project 1. In summary, CERN has successfully created a collaborative, multi-national, integrated effort combining pediatric- and adult-focused investigators, spanning from basic science to patient outcome measures. This research paradigm may be an effective approach for other rare cancers.

  12. NEWS: A trip to CERN

    NASA Astrophysics Data System (ADS)

    Ellison, A. D.

    2000-07-01

    Two years ago John Kinchin and myself were lucky enough to attend the Goldsmith's particle physics course. As well as many interesting lectures and activities, this course included a visit to CERN. To most physics teachers CERN is Mecca, a hallowed place where gods manipulate and manufacture matter. The experience of being there was even better. Alison Wright was an enthusiastic and very knowledgeable host who ensured the visit went smoothly and we all learned a lot. While we were there, John and I discussed the possibility of bringing a party of A-level students to see real physics in action. In February of this year we managed it. 33 students from two schools, Boston Grammar School and Northampton School for Boys, and four staff left England and caught the 2 am ferry to France. Many hours and a few `short cuts' later we arrived at our hotel in St Genis, not far from CERN. The first day was spent sight-seeing in Lausanne and Geneva. The Olympic museum in Lausanne is well worth a visit. Unfortunately, the famous fountain in Geneva was turned off, but then you can't have everything. The following morning we turned up at CERN late due to the coach's brakes being iced up! We were met once again by Alison Wright who forgave us and introduced the visit by giving an excellent talk on CERN, its background and its reason for existing. At this point we met another member of our Goldsmith's course and his students so we joined forces once again. We then piled back into the coach to re-cross the border and visit ALEPH. ALEPH is a monster of a detector 150 m below ground. We divided into four groups, each with a very able and knowledgeable guide, and toured the site. The size and scale of the detector are awesome and the students were suitably impressed. 
We repeated the speed of sound experiment of two years ago at the bottom of a 150 m concrete shaft (320 m s⁻¹), posed for a group photo in front of the detector (figure 1) and returned to the main site for lunch in the canteen. Over lunch we mixed with physicists of many different nationalities and backgrounds. In the afternoon we visited Microcosm, the CERN visitors' centre, the LEP control room and also the SPS. Here the students learned new applications for much of the physics of standing waves and resonance that they had been taught in the classroom. Later that night we visited a bowling alley, where momentum and collision theory were put into practice. The following morning we returned to CERN and visited the large magnet testing facility. Here again physics was brought to life: we saw superconducting magnets being assembled and tested, and the students gained a real appreciation of the problems and principles involved. The afternoon was rounded off by a visit to a science museum in Geneva - well worth a visit, as some of us still use some of the apparatus on display. Friday was our last full day, so we visited Chamonix in the northern Alps. In the morning we ascended the Aiguille du Midi by cable car. Twenty minutes and 3842 m later we emerged into 50 km h⁻¹ winds and a temperature of -10 °C, not counting a further -10 °C of wind chill. A crisp packet provided an unusual demonstration of the effects of air pressure (figure 2). The views from the summit were very spectacular, though a few people experienced mild altitude sickness. That afternoon the party went to the Mer de Glace. Being inside a 3 million year-old structure moving down a mountain at 3 cm per day was an interesting experience, as was a tot of whisky with 3 million year-old water. Once again the local scenery was very photogenic and the click and whirr of cameras was a constant background noise. Saturday morning saw an early start for the long drive home. 
Most students - and some staff - took the opportunity to catch up on their sleep. Thanks are due to many people without whom the trip would never have taken place. Anne Craige, Stuart Williams, Christine Sutton and Andrew Morrison of PPARC, but most especially Alison Wright of CERN and John Kinchin of Boston Grammar School who did all the hard work and organization. The week gave students a unique chance to see the principles of physics being applied in many different ways and I am sure this has reinforced their knowledge and understanding. Some students also took the opportunity to practise their language skills. The only remaining question is: what next? I'll have to think about it in the summer when I have some slack time. Hmm, SLAC, that gives me an idea....

  13. Physics in the Spotlight

    NASA Astrophysics Data System (ADS)

    2000-10-01

    CERN, ESA and ESO Put Physics On Stage [1]. Can you imagine how much physics is in a simple match of ping-pong, in throwing a boomerang, or in a musical concert? Physics is all around us and governs our lives. The World-Wide Web and mobile communication are only two examples of technologies that have rapidly found their way from science into everyday life. But who is going to maintain these technologies and develop new ones in the future? Probably not young Europeans, as recent surveys show a frightening decline of interest in physics and technology among Europe's citizens, especially schoolchildren. Fewer and fewer young people enrol in physics courses at university. The project "Physics on Stage" tackles this problem head on. An international festival bringing together 400 physics educators from 22 European countries [2] gathers at CERN in Geneva from 6 to 10 November to show how fascinating and entertaining physics can be. In this week-long event, innovative methods of teaching physics and demonstrations of the fun that lies in physics are presented in a fair, in 10 spectacular performances, and in presentations. Workshops on 14 key themes will give the delegates - teachers, professors, artists and other physics educators - the chance to discuss and come up with solutions for the worrying disenchantment with science in Europe. "Physics on Stage" is a joint project organised by the European Organisation for Nuclear Research (CERN), the European Space Agency (ESA) and the European Southern Observatory (ESO), Europe's leading physics research organisations. This is the first time that these three organisations have worked together in such close collaboration to catalyse a change in attitude towards science and technology education. 
Physics on Stage is funded in part by the European Commission and takes place as an event of the European Science and Technology Week 2000, an initiative of the EC to raise public awareness of science and technology. Other partners are the European Physical Society (EPS) and the European Association for Astronomy Education (EAAE). European Commissioner Busquin to Visit Physics On Stage: On Thursday, November 9, Philippe Busquin, Commissioner for Research, European Commission; Prof. Luciano Maiani, Director-General of CERN; Antonio Rodota, Director-General of ESA; Dr. Catherine Cesarsky, Director-General of ESO; and Dr. Achilleas Mitsos, Director-General of the Research DG in the European Commission, will participate in the activities of the Physics on Stage festival. On this occasion, Commissioner Busquin will address conference delegates and the media on the importance of science and of innovative science and technology education. The Festival: Each of the more than 400 delegates of the festival has been selected during the course of the year by committees in each of the 22 countries for outstanding projects promoting science. For example, a group of Irish physics teachers and their students will give a concert on instruments made exclusively of plumbing material, explaining the physics of sound at the same time. A professional theatre company from Switzerland stages a play on antimatter. And two young Germans invite spectators to their interactive physics show, where they juggle, eat fire and perform stunning physics experiments on stage. The colourful centrepiece of the week is the Physics Fair, where every country has its own stands at which delegates show their projects, programmes or experiments and gain inspiration from the exhibits of other countries. Physics on Stage is a unique event: nothing like it has ever happened in terms of international exchange, international collaboration and state-of-the-art science and technology education methods. 
The Nobel prizewinners of 2030 are at school today. What ideas can Europe's teachers put forward to boost their interest in science? An invitation to the media: We invite journalists to take part in this both politically and visually interesting event. We expect many useful results from this exchange of experience; there will be a large choice of potential interview partners and, of course, countless images and impressions. Please fill in the form below and fax it back to CERN at +41 22 7850247. Go to the webpage http://www.cern.ch/pos to find out all about the Physics on Stage festival at CERN. The main "Physics on Stage" web address is: http://www.estec.esa.nl/outreach/pos. There is also a Physics On Stage webpage at ESO. Notes: [1] This is a joint press release by the European Organization for Nuclear Research (CERN), the European Space Agency (ESA) and the European Southern Observatory (ESO). [2] The 22 countries are the member countries of at least one of the participating organisations or the European Union: Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Ireland, Italy, Luxembourg, the Netherlands, Norway, Poland, Portugal, the Slovak Republic, Spain, Sweden, Switzerland, United Kingdom.

  14. Muon Bundles as a Sign of Strangelets from the Universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kankiewicz, P.; Rybczyński, M.; Włodarczyk, Z.

    Recently, the CERN ALICE experiment observed muon bundles of very high multiplicities in its dedicated cosmic ray (CR) run, thereby confirming similar findings from the LEP era at CERN (in the CosmoLEP project). Originally, it was argued that they apparently stem from primary CRs with heavy masses. We propose an alternative possibility, arguing that muon bundles of the highest multiplicity are produced by strangelets, hypothetical stable lumps of strange quark matter infiltrating our universe. We also address the possibility of additionally deducing their directionality, which could be of astrophysical interest. Significant evidence for anisotropy of the arrival directions of the observed high-multiplicity muon bundles is found. The estimated directionality suggests their possible extragalactic provenance.

  15. Path to AWAKE: Evolution of the concept

    DOE PAGES

    Caldwell, A.; Adli, E.; Amorim, L.; ...

    2016-01-02

    This study describes the conceptual steps in reaching the design of the AWAKE experiment currently under construction at CERN. We start with an introduction to plasma wakefield acceleration and the motivation for using proton drivers. We then describe the self-modulation instability – a key to an early realization of the concept. This is followed by the historical development of the experimental design, where the critical issues that arose and their solutions are described. We conclude with the design of the experiment as it is being realized at CERN and some words on the future outlook. A summary of the AWAKE design and construction status as presented at this conference is given in Gschwendtner et al. [1].

  16. The beam and detector of the NA62 experiment at CERN

    DOE PAGES

    Gil, E. Cortina; Albarrán, E. Martín; Minucci, E.; ...

    2017-05-31

    NA62 is a fixed-target experiment at the CERN SPS dedicated to measurements of rare kaon decays. Such measurements, like the branching fraction of the K+ → π+νν̄ decay, have the potential to bring significant insights into new physics processes when compared with precise theoretical predictions. For this purpose, innovative techniques have been developed, in particular in the domain of low-mass tracking devices. Detector construction spanned several years, from 2009 to 2014. The collaboration started detector commissioning in 2014 and will collect data until the end of 2018. The beam line and detector components are described together with their early performance obtained from 2014 and 2015 data.

  17. Ian Hinchliffe Answers Your Higgs Boson Questions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hinchliffe, Ian

    contingent with the ATLAS experiment at CERN, answers many of your questions about the Higgs boson. Ian invited viewers to send in questions about the Higgs via email, Twitter, Facebook, or YouTube in an "Ask a Scientist" video posted July 3: http://youtu.be/xhuA3wCg06s CERN's July 4 announcement that the ATLAS and CMS experiments at the Large Hadron Collider have discovered a particle "consistent with the Higgs boson" has raised questions about what scientists have found and what still remains to be found -- and what it all means. If you have suggestions for future "Ask a Scientist" videos, post them below or send ideas to askascientist@lbl.gov.

  18. Study of muon-induced neutron production using accelerator muon beam at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakajima, Y.; Lin, C. J.; Ochoa-Ricoux, J. P.

    2015-08-17

    Cosmogenic muon-induced neutrons are one of the most problematic backgrounds for various underground experiments searching for rare events. In order to accurately understand such backgrounds, experimental data with high statistics and well-controlled systematics are essential. We performed a test experiment to measure the muon-induced neutron production yield and energy spectrum using a high-energy accelerator muon beam at CERN. We successfully observed neutrons from 160 GeV/c muon interactions on lead, and measured kinetic energy distributions for various production angles. Work towards the evaluation of the absolute neutron production yield is underway. This work also demonstrates that the setup is feasible for a future large-scale experiment for a more comprehensive study of muon-induced neutron production.

  19. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garbil, Roger

    2010-11-09

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean.

  20. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, J.N.

    2010-11-09

    Part 7 of the Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean.

  1. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lantz, Mattias; Neudecker, Denise

    2010-11-09

    Part 5 of the Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean.

  2. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vlachoudis, Vasilis

    2010-11-09

    Part 8 of the Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean.

  3. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-11-09

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean.

  4. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-11-09

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean.

  5. Final Scientific EFNUDAT Workshop

    ScienceCinema

    None

    2017-12-09

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: Data evaluation; Cross section measurements; Experimental techniques; Uncertainties and covariances; Fission properties; Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean.

  6. Angular distributions for high-mass jet pairs and a limit on the energy scale of compositeness for quarks from the CERN pp̄ collider

    NASA Astrophysics Data System (ADS)

    Arnison, G.; Albajar, C.; Albrow, M. G.; Allkofer, O. C.; Astbury, A.; Aubert, B.; Axon, T.; Bacci, C.; Bacon, T.; Batley, J. R.; Bauer, G.; Bellinger, J.; Bettini, A.; Bézaguet, A.; Bock, R. K.; Bos, K.; Buckley, E.; Busetto, G.; Catz, P.; Cennini, P.; Centro, S.; Ceradini, F.; Ciapetti, G.; Cittolin, S.; Clarke, D.; Cline, D.; Cochet, C.; Colas, J.; Colas, P.; Corden, M.; Coughlan, J. A.; Cox, G.; Dau, D.; Debeer, M.; Debrion, J. P.; Degiorgi, M.; Della Negra, M.; Demoulin, M.; Denby, B.; Denegri, D.; Diciaccio, A.; Dobrzynski, L.; Dorenbosch, J.; Dowell, J. D.; Duchovni, E.; Edgecock, R.; Eggert, K.; Eisenhandler, E.; Ellis, N.; Erhard, P.; Faissner, H.; Keeler, M. Fincke; Flynn, P.; Fontaine, G.; Frey, R.; Frühwirth, R.; Garvey, J.; Gee, D.; Geer, S.; Ghesquière, C.; Ghez, P.; Ghio, F.; Giacomelli, P.; Gibson, W. R.; Giraud-Héraud, Y.; Givernaud, A.; Gonidec, A.; Goodman, M.; Grassmann, H.; Grayer, G.; Guryn, W.; Hansl-Kozanecka, T.; Haynes, W.; Haywood, S. J.; Hoffmann, H.; Holthuizen, D. J.; Homer, R. J.; Honma, A.; Ikeda, M.; Jank, W.; Jimack, M.; Jorat, G.; Kalmus, P. I. P.; Karimäki, V.; Keeler, R.; Kenyon, I.; Kernan, A.; Kienzle, W.; Kinnunen, R.; Kozanecki, W.; Krammer, M.; Kroll, J.; Kryn, D.; Kyberd, P.; Lacava, F.; Laugier, J. P.; Lees, J. P.; Leuchs, R.; Levegrun, S.; Lévêque, A.; Levi, M.; Linglin, D.; Locci, E.; Long, K.; Markiewicz, T.; Markytan, M.; Martin, T.; Maurin, G.; McMahon, T.; Mendiburu, J.-P.; Meneguzzo, A.; Meyer, O.; Meyer, T.; Minard, M.-N.; Mohammad, M.; Morgan, K.; Moricca, M.; Moser, H.; Mours, B.; Muller, Th.; Nandi, A.; Naumann, L.; Norton, A.; Pascoli, D.; Pauss, F.; Perault, C.; Petrolo, E.; Mortari, G. Piano; Pietarinen, E.; Pigot, C.; Pimiä, M.; Pitman, D.; Placci, A.; Porte, J.-P.; Radermacher, E.; Ransdell, J.; Redelberger, T.; Reithler, H.; Revol, J. 
P.; Richman, J.; Rijssenbeek, M.; Robinson, D.; Rohlf, J.; Rossi, P.; Ruhm, W.; Rubbia, C.; Sajot, G.; Salvini, G.; Sass, J.; Sadoulet, B.; Samyn, D.; Savoy-Navarro, A.; Schinzel, D.; Schwartz, A.; Scott, W.; Shah, T. P.; Sheer, I.; Siotis, I.; Smith, D.; Sobie, R.; Sphicas, P.; Strauss, J.; Streets, J.; Stubenrauch, C.; Summers, D.; Sumorok, K.; Szoncso, F.; Tao, C.; Taurok, A.; Have, I. Ten; Tether, S.; Thompson, G.; Tscheslog, E.; Tuominiemi, J.; Van Eijk, B.; Verecchia, P.; Vialle, J. P.; Villasenor, L.; Virdee, T. S.; Von der Schmitt, H.; Von Schlippe, W.; Vrana, J.; Vuillemin, V.; Wahl, H. D.; Watkins, P.; Wildish, A.; Wilke, R.; Wilson, J.; Wingerter, I.; Wimpenny, S. J.; Wulz, C. E.; Wyatt, T.; Yvert, M.; Zaccardelli, C.; Zacharov, I.; Zaganidis, N.; Zanello, L.; Zotto, P.; UA1 Collaboration

    1986-09-01

    Angular distributions of high-mass jet pairs (180 < m_2J < 350 GeV) have been measured in the UA1 experiment at the CERN pp̄ Collider (√s = 630 GeV). We show that the angular distributions are independent of the subprocess centre-of-mass (CM) energy over this range, and use the data to put constraints on the definition of the Q^2 scale. The distribution for the very high mass jet pairs (240 < m_2J < 300 GeV) has also been used to obtain a lower limit on the energy scale Λ_c of compositeness of quarks. We find Λ_c > 415 GeV at the 95% confidence level.

  7. Brightness and uniformity measurements of plastic scintillator tiles at the CERN H2 test beam

    DOE PAGES

    Chatrchyan, S.; Sirunyan, A. M.; Tumasyan, A.; ...

    2018-01-05

    Here, we study the light output, light collection efficiency and signal timing of a variety of organic scintillators that are being considered for the upgrade of the hadronic calorimeter of the CMS detector. The experimental data are collected at the H2 test-beam area at CERN, using a 150 GeV muon beam. In particular, we investigate the usage of over-doped and green-emitting plastic scintillators, two solutions that have not been extensively considered. We present a study of the energy distribution in plastic-scintillator tiles, the hit efficiency as a function of the hit position, and a study of the signal timing for blue and green scintillators.

  9. The beam and detector of the NA62 experiment at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gil, E. Cortina; Albarrán, E. Martín; Minucci, E.

    NA62 is a fixed-target experiment at the CERN SPS dedicated to measurements of rare kaon decays. Such measurements, like the branching fraction of the K+ → π+ ν ν̄ decay, have the potential to bring significant insights into new physics processes when comparison is made with precise theoretical predictions. For this purpose, innovative techniques have been developed, in particular in the domain of low-mass tracking devices. Detector construction spanned several years from 2009 to 2014. The collaboration started detector commissioning in 2014 and will collect data until the end of 2018. The beam line and detector components are described together with their early performance obtained from 2014 and 2015 data.

  10. Photoproduction of vector mesons in proton-proton ultraperipheral collisions at the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Xie, Ya-Ping; Chen, Xurong

    2018-05-01

    Photoproduction of vector mesons is computed with the dipole model in proton-proton ultraperipheral collisions (UPCs) at the CERN Large Hadron Collider (LHC). The dipole model framework is employed in the calculation of vector meson production in diffractive processes. Parameters of the bCGC model are refitted with the latest inclusive deep inelastic scattering experimental data. Employing the bCGC model and the boosted Gaussian light-cone wave function for vector mesons, we obtain predictions for the rapidity distributions of J/ψ and ψ(2S) mesons in proton-proton ultraperipheral collisions at the LHC. The predictions give a good description of the LHCb experimental data. Predictions for ϕ and ω mesons are also evaluated in this paper.

  11. Prospects for K+ →π+ ν ν ‾ observation at CERN in NA62

    NASA Astrophysics Data System (ADS)

    Khoriauli, G.; Aglieri Rinella, G.; Aliberti, R.; Ambrosino, F.; Angelucci, B.; Antonelli, A.; Anzivino, G.; Arcidiacono, R.; Azhinenko, I.; Balev, S.; Bendotti, J.; Biagioni, A.; Biino, C.; Bizzeti, A.; Blazek, T.; Blik, A.; Bloch-Devaux, B.; Bolotov, V.; Bonaiuto, V.; Bragadireanu, M.; Britton, D.; Britvich, G.; Bucci, F.; Butin, F.; Capitolo, E.; Capoccia, C.; Capussela, T.; Carassiti, V.; Cartiglia, N.; Cassese, A.; Catinaccio, A.; Cecchetti, A.; Ceccucci, A.; Cenci, P.; Cerny, V.; Cerri, C.; Checcucci, B.; Chikilev, O.; Ciaranfi, R.; Collazuol, G.; Conovaloff, A.; Cooke, P.; Cooper, P.; Corradi, G.; Cortina Gil, E.; Costantini, F.; Cotta Ramusino, A.; Coward, D.; D'Agostini, G.; Dainton, J.; Dalpiaz, P.; Danielsson, H.; Degrange, J.; De Simone, N.; Di Filippo, D.; Di Lella, L.; Dixon, N.; Doble, N.; Duk, V.; Elsha, V.; Engelfried, J.; Enik, T.; Falaleev, V.; Fantechi, R.; Fascianelli, V.; Federici, L.; Fiorini, M.; Fry, J.; Fucci, A.; Fulton, L.; Gallorini, S.; Gamberini, E.; Gatignon, L.; Georgiev, G.; Gianoli, A.; Giorgi, M.; Giudici, S.; Glonti, L.; Goncalves Martins, A.; Gonnella, F.; Goudzovski, E.; Guida, R.; Gushchin, E.; Hahn, F.; Hallgren, B.; Heath, H.; Herman, F.; Hutchcroft, D.; Iacopini, E.; Imbergamo, E.; Jamet, O.; Jarron, P.; Kampf, K.; Kaplon, J.; Karjavin, V.; Kekelidze, V.; Kholodenko, S.; Khoriauli, G.; Khudyakov, A.; Kiryushin, Yu.; Kleinknecht, K.; Kluge, A.; Koval, M.; Kozhuharov, V.; Krivda, M.; Kudenko, Y.; Kunze, J.; Lamanna, G.; Lazzeroni, C.; Lenci, R.; Lenti, M.; Leonardi, E.; Lichard, P.; Lietava, R.; Litov, L.; Lomidze, D.; Lonardo, A.; Lurkin, N.; Madigozhin, D.; Maire, G.; Makarov, A.; Mandeiro, C.; Mannelli, I.; Mannocchi, G.; Mapelli, A.; Marchetto, F.; Marchevski, R.; Martellotti, S.; Massarotti, P.; Massri, K.; Matak, P.; Maurice, E.; Menichetti, E.; Mila, G.; Minucci, E.; Mirra, M.; Misheva, M.; Molokanova, N.; Morant, J.; Morel, M.; Moulson, M.; Movchan, S.; Munday, D.; Napolitano, M.; Neri, I.; Newson, F.; Norton, 
A.; Noy, M.; Nuessle, G.; Obraztsov, V.; Ostankov, A.; Padolski, S.; Page, R.; Palladino, V.; Pardons, A.; Parkinson, C.; Pedreschi, E.; Pepe, M.; Perez Gomez, F.; Perrin-Terrin, M.; Peruzzo, L.; Petrov, P.; Petrucci, F.; Piandani, R.; Piccini, M.; Pietreanu, D.; Pinzino, J.; Pivanti, M.; Polenkevich, I.; Popov, I.; Potrebenikov, Yu.; Protopopescu, D.; Raffaelli, F.; Raggi, M.; Riedler, P.; Romano, A.; Rubin, P.; Ruggiero, G.; Russo, V.; Ryjov, V.; Salamon, A.; Salina, G.; Samsonov, V.; Santoni, C.; Santovetti, E.; Saracino, G.; Sargeni, F.; Schifano, S.; Semenov, V.; Sergi, A.; Serra, M.; Shkarovskiy, S.; Soldi, D.; Sotnikov, A.; Sougonyaev, V.; Sozzi, M.; Spadaro, T.; Spinella, F.; Staley, R.; Statera, M.; Sutcliffe, P.; Szilasi, N.; Tagnani, D.; Valdata-Nappi, M.; Valente, P.; Vasile, M.; Vassilieva, T.; Velghe, B.; Veltri, M.; Venditti, S.; Volpe, R.; Vormstein, M.; Wahl, H.; Wanke, R.; Wertelaers, P.; Winhart, A.; Winston, R.; Wrona, B.; Yushchenko, O.; Zamkovsky, M.; Zinchenko, A.; NA62 Collaboration

    2016-01-01

    The main physics goal of the NA62 experiment at CERN is to precisely measure the branching ratio of the rare kaon decay K+ → π+ ν ν̄. This decay is strongly suppressed in the Standard Model, yet its branching ratio is calculated with high accuracy. NA62 is designed to measure the K+ → π+ ν ν̄ decay rate with an uncertainty better than 10%. The measurement can serve as a probe of new physics phenomena that could alter the decay rate. The NA62 experiment was successfully launched in October 2014. The theory framework, the NA62 detector and the preliminary results are reviewed in this article.

  12. A Tony Thomas-Inspired Guide to INSPIRE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Connell, Heath B.; /Fermilab

    2010-04-01

    The SPIRES database was created in the late 1960s to catalogue the high energy physics preprints received by the SLAC Library. In the early 1990s it became the first database on the web and the first website outside of Europe. Although indispensable to the HEP community, its aging software infrastructure is becoming a serious liability. In a joint project involving CERN, DESY, Fermilab and SLAC, a new database, INSPIRE, is being created to replace SPIRES using CERN's modern, open-source Invenio database software. INSPIRE will maintain the content and functionality of SPIRES plus many new features. I describe this evolution from the birth of SPIRES to the current day, noting that the career of Tony Thomas spans this timeline.

  13. How can we turn a science exhibition into a really successful outreach activity?

    NASA Astrophysics Data System (ADS)

    Farrona, A. M. M.; Vilar, R.

    2016-04-01

    In April 2013, the CERN exhibition "The largest scientific instrument ever built" was shown in Santander. Several activities were organized around the exhibition: guided tours for children, young people and adults, workshops, film screenings… In this way, the exhibition was visited by more than two thousand people. We must keep in mind that Santander is a small city whose population does not usually take part in outreach activities. With this contribution, we want to show how it is possible to take full advantage of science exhibitions. The exhibition made it possible to present the Large Hadron Collider experiment at CERN to a large part of the Santander population, and to awaken their interest in and enthusiasm for science.

  14. Protonium production in ATHENA

    NASA Astrophysics Data System (ADS)

    Venturelli, L.; Amoretti, M.; Amsler, C.; Bonomi, G.; Carraro, C.; Cesar, C. L.; Charlton, M.; Doser, M.; Fontana, A.; Funakoshi, R.; Genova, P.; Hayano, R. S.; Jørgensen, L. V.; Kellerbauer, A.; Lagomarsino, V.; Landua, R.; Rizzini, E. Lodi; Macrì, M.; Madsen, N.; Manuzio, G.; Mitchard, D.; Montagna, P.; Posada, L. G.; Pruys, H.; Regenfus, C.; Rotondi, A.; Testera, G.; van der Werf, D. P.; Variola, A.; Yamazaki, Y.; Zurlo, N.; Athena Collaboration

    2007-08-01

    The ATHENA experiment at CERN, after producing cold antihydrogen atoms for the first time in 2002, has synthesised protonium atoms in vacuum at very low energies. Protonium, i.e. the antiproton-proton bound system, is of interest for testing fundamental physical theories. In the nested Penning trap of the ATHENA apparatus, protonium has been produced as the result of a chemical reaction between an antiproton and the simplest matter molecule, H2+. The protonium atoms are formed with kinetic energies in the range 40-700 meV and are metastable, with mean lifetimes of the order of 1 μs. Our result shows that it will be possible to start measurements on protonium at low-energy antiproton facilities, such as the AD at CERN or FLAIR at GSI.

  15. Towards the high-accuracy determination of the 238U fission cross section at the threshold region at CERN - n_TOF

    NASA Astrophysics Data System (ADS)

    Diakaki, M.; Audouin, L.; Berthoumieux, E.; Calviani, M.; Colonna, N.; Dupont, E.; Duran, I.; Gunsing, F.; Leal-Cidoncha, E.; Le Naour, C.; Leong, L. S.; Mastromarco, M.; Paradela, C.; Tarrio, D.; Tassan-Got, L.; Aerts, G.; Altstadt, S.; Alvarez, H.; Alvarez-Velarde, F.; Andriamonje, S.; Andrzejewski, J.; Badurek, G.; Barbagallo, M.; Baumann, P.; Becares, V.; Becvar, F.; Belloni, F.; Berthier, B.; Billowes, J.; Boccone, V.; Bosnar, D.; Brugger, M.; Calvino, F.; Cano-Ott, D.; Capote, R.; Carrapiço, C.; Cennini, P.; Cerutti, F.; Chiaveri, E.; Chin, M.; Cortes, G.; Cortes-Giraldo, M. A.; Cosentino, L.; Couture, A.; Cox, J.; David, S.; Dillmann, I.; Domingo-Pardo, C.; Dressler, R.; Dridi, W.; Eleftheriadis, C.; Embid-Segura, M.; Ferrant, L.; Ferrari, A.; Finocchiaro, P.; Fraval, K.; Fujii, K.; Furman, W.; Ganesan, S.; Garcia, A. R.; Giubrone, G.; Gomez-Hornillos, M. B.; Goncalves, I. F.; Gonzalez-Romero, E.; Goverdovski, A.; Gramegna, F.; Griesmayer, E.; Guerrero, C.; Gurusamy, P.; Haight, R.; Heil, M.; Heinitz, S.; Igashira, M.; Isaev, S.; Jenkins, D. G.; Jericha, E.; Kadi, Y.; Kaeppeler, F.; Karadimos, D.; Karamanis, D.; Kerveno, M.; Ketlerov, V.; Kivel, N.; Kokkoris, M.; Konovalov, V.; Krticka, M.; Kroll, J.; Lampoudis, C.; Langer, C.; Lederer, C.; Leeb, H.; Lo Meo, S.; Losito, R.; Lozano, M.; Manousos, A.; Marganiec, J.; Martinez, T.; Marrone, S.; Massimi, C.; Mastinu, P.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Mingrone, F.; Mirea, M.; Mondelaers, W.; Moreau, C.; Mosconi, M.; Musumarra, A.; O'Brien, S.; Pancin, J.; Patronis, N.; Pavlik, A.; Pavlopoulos, P.; Perkowski, J.; Perrot, L.; Pigni, M. T.; Plag, R.; Plompen, A.; Plukis, L.; Poch, A.; Pretel, C.; Praena, J.; Quesada, J.; Rauscher, T.; Reifarth, R.; Riego, A.; Roman, F.; Rudolf, G.; Rubbia, C.; Rullhusen, P.; Salgado, J.; Santos, C.; Sarchiapone, L.; Sarmento, R.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Stephan, C.; Tagliente, G.; Tain, J. 
L.; Tavora, L.; Terlizzi, R.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Versaci, R.; Vermeulen, M. J.; Villamarin, D.; Vincente, M. C.; Vlachoudis, V.; Vlastou, R.; Voss, F.; Wallner, A.; Walter, S.; Ware, T.; Weigand, M.; Weiß, C.; Wiesher, M.; Wisshak, K.; Wright, T.; Zugec, P.

    2016-03-01

    The 238U fission cross section is an international standard above 2 MeV, where the fission plateau starts. However, due to its importance in fission reactors, this cross section should also be known very accurately in the threshold region below 2 MeV. The 238U fission cross section has been measured relative to the 235U fission cross section at CERN - n_TOF with different detection systems. These datasets have been collected and suitably combined to increase the counting statistics in the threshold region, from about 300 keV up to 3 MeV. The results are compared with other experimental data, evaluated libraries, and the IAEA standards.
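
    The relative (ratio) measurement described above can be sketched numerically. Everything in this sketch — the function name, the counts, the atom numbers — is hypothetical, and the real analysis applies efficiency, dead-time and background corrections:

```python
def sigma_from_ratio(counts_238: float, counts_235: float,
                     atoms_238: float, atoms_235: float,
                     sigma_235_standard: float) -> float:
    """Infer the 238U(n,f) cross section from fission counts measured
    simultaneously on 238U and 235U samples in the same neutron flux,
    using the 235U cross section as the reference standard.

    Illustrative only: equal detection efficiencies and zero
    background are assumed here.
    """
    return (counts_238 / counts_235) * (atoms_235 / atoms_238) * sigma_235_standard

# hypothetical numbers, purely for illustration
print(sigma_from_ratio(50.0, 1000.0, 1.0e20, 1.0e20, 1.2))  # ≈ 0.06 barn
```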

  16. The high Beta cryo-modules and the associated cryogenic system for the HIE-ISOLDE upgrade at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delruelle, N.; Leclercq, Y.; Pirotte, O.

    2014-01-29

    The major upgrade of the energy and intensity of the existing ISOLDE and REX-ISOLDE radioactive ion beam facilities at CERN requires the replacement of most of the existing ISOLDE post-acceleration equipment by a superconducting linac based on quarter-wave resonators, housed together with superconducting solenoids in a series of four high-β and two low-β cryo-modules. As well as providing optimum conditions for physics, the cryo-modules need to function under stringent vacuum and cryogenic conditions. We present the detailed design and expected cryogenic performance of the high-β cryo-module, together with the cryogenic supply and distribution system destined to service the complete superconducting linac.

  17. Proton enhancement at large pT at the CERN large hadron collider without structure in associated-particle distribution.

    PubMed

    Hwa, Rudolph C; Yang, C B

    2006-07-28

    The production of pions and protons in the pT range between 10 and 20 GeV/c in Pb+Pb collisions at the CERN LHC is studied in the recombination model. It is shown that the dominant mechanism for hadronization is the recombination of shower partons from neighboring jets when the jet density is high. Protons are more copiously produced than pions in that pT range because the coalescing partons can have lower momentum fractions, but no thermal partons are involved. The proton-to-pion ratio can be as high as 20. When such high-pT hadrons are used as trigger particles, there will not be any associated particles that are not in the background.

  18. Neutron capture cross section measurement of 151Sm at the CERN neutron time of flight facility (n_TOF).

    PubMed

    Abbondanno, U; Aerts, G; Alvarez-Velarde, F; Alvarez-Pol, H; Andriamonje, S; Andrzejewski, J; Badurek, G; Baumann, P; Becvár, F; Benlliure, J; Berthoumieux, E; Calviño, F; Cano-Ott, D; Capote, R; Cennini, P; Chepel, V; Chiaveri, E; Colonna, N; Cortes, G; Cortina, D; Couture, A; Cox, J; Dababneh, S; Dahlfors, M; David, S; Dolfini, R; Domingo-Pardo, C; Duran, I; Embid-Segura, M; Ferrant, L; Ferrari, A; Ferreira-Marques, R; Frais-Koelbl, H; Furman, W; Goncalves, I; Gallino, R; Gonzalez-Romero, E; Goverdovski, A; Gramegna, F; Griesmayer, E; Gunsing, F; Haas, B; Haight, R; Heil, M; Herrera-Martinez, A; Isaev, S; Jericha, E; Käppeler, F; Kadi, Y; Karadimos, D; Kerveno, M; Ketlerov, V; Koehler, P; Konovalov, V; Krticka, M; Lamboudis, C; Leeb, H; Lindote, A; Lopes, I; Lozano, M; Lukic, S; Marganiec, J; Marrone, S; Martinez-Val, J; Mastinu, P; Mengoni, A; Milazzo, P M; Molina-Coballes, A; Moreau, C; Mosconi, M; Neves, F; Oberhummer, H; O'Brien, S; Pancin, J; Papaevangelou, T; Paradela, C; Pavlik, A; Pavlopoulos, P; Perlado, J M; Perrot, L; Pignatari, M; Plag, R; Plompen, A; Plukis, A; Poch, A; Policarpo, A; Pretel, C; Quesada, J; Raman, S; Rapp, W; Rauscher, T; Reifarth, R; Rosetti, M; Rubbia, C; Rudolf, G; Rullhusen, P; Salgado, J; Soares, J C; Stephan, C; Tagliente, G; Tain, J; Tassan-Got, L; Tavora, L; Terlizzi, R; Vannini, G; Vaz, P; Ventura, A; Villamarin, D; Vincente, M C; Vlachoudis, V; Voss, F; Wendler, H; Wiescher, M; Wisshak, K

    2004-10-15

    The 151Sm(n,γ)152Sm cross section has been measured at the spallation neutron facility n_TOF at CERN in the energy range from 1 eV to 1 MeV. The new facility combines excellent resolution in neutron time-of-flight, low repetition rates, and an unsurpassed instantaneous luminosity, resulting in rather favorable signal/background ratios. The 151Sm cross section is of importance for characterizing neutron capture nucleosynthesis in asymptotic giant branch stars. At a thermal energy of kT = 30 keV the Maxwellian-averaged cross section of this unstable isotope (t1/2 = 93 yr) was determined to be 3100 ± 160 mb, significantly larger than theoretical predictions.
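
    The Maxwellian-averaged cross section (MACS) quoted here is, by the standard definition, ⟨σ⟩ = (2/√π) ∫ σ(E) E e^(−E/kT) dE / ∫ E e^(−E/kT) dE. A minimal numerical sketch with a toy 1/v cross section (illustrative only, not the measured 151Sm data):

```python
import math

def macs(sigma, kT: float, n: int = 200_000, emax_factor: float = 30.0) -> float:
    """Maxwellian-averaged cross section,
    <σ> = (2/√π) ∫ σ(E) E e^{-E/kT} dE / ∫ E e^{-E/kT} dE,
    evaluated with a simple midpoint rule (illustrative only).
    The common step size cancels in the ratio, so it is omitted.
    """
    emax = emax_factor * kT
    de = emax / n
    num = den = 0.0
    for i in range(n):
        e = (i + 0.5) * de          # midpoint of each energy bin
        w = e * math.exp(-e / kT)   # Maxwellian flux weight
        num += sigma(e) * w
        den += w
    return (2.0 / math.sqrt(math.pi)) * num / den

# For a 1/v cross section σ(E) = a/√E, the MACS reduces to σ(kT).
a, kT = 100.0, 30.0  # toy normalization; kT in keV as in the 151Sm measurement
print(macs(lambda e: a / math.sqrt(e), kT))  # ≈ a/√kT ≈ 18.26
```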

  19. Black holes in many dimensions at the CERN Large Hadron Collider: testing critical string theory.

    PubMed

    Hewett, JoAnne L; Lillie, Ben; Rizzo, Thomas G

    2005-12-31

    We consider black hole production at the CERN Large Hadron Collider (LHC) in a generic scenario with many extra dimensions where the standard model fields are confined to a brane. With approximately 20 dimensions the hierarchy problem is shown to be naturally solved without the need for large compactification radii. We find that in such a scenario the properties of black holes can be used to determine the number of extra dimensions. In particular, we demonstrate that measurements of the decay distributions of such black holes at the LHC can determine, with high confidence, whether the number of extra dimensions is significantly larger than 6 or 7, and thus can probe one of the critical properties of string theory compactifications.

  20. Measurement of the radiative capture cross section of the s-process branching points 204Tl and 171Tm at the n_TOF facility (CERN)

    NASA Astrophysics Data System (ADS)

    Casanovas, A.; Domingo-Pardo, C.; Guerrero, C.; Lerendegui-Marco, J.; Calviño, F.; Tarifeño-Saldivia, A.; Dressler, R.; Heinitz, S.; Kivel, N.; Quesada, J. M.; Schumann, D.; Aberle, O.; Alcayne, V.; Andrzejewski, J.; Audouin, L.; Bécares, V.; Bacak, M.; Barbagallo, M.; Bečvář, F.; Bellia, G.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Busso, M.; Caamaño, M.; Caballero-Ontanaya, L.; Calviani, M.; Cano-Ott, D.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Colonna, N.; Cortés, G.; Cortés-Giraldo, M. A.; Cosentino, L.; Cristallo, S.; Damone, L. A.; Diakaki, M.; Dietz, M.; Dupont, E.; Durán, I.; Eleme, Z.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Furman, V.; Göbel, K.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González-Romero, E.; Gunsing, F.; Heyse, J.; Jenkins, D. G.; Käppeler, F.; Kadi, Y.; Katabuchi, T.; Kimura, A.; Kokkoris, M.; Kopatch, Y.; Krtička, M.; Kurtulgil, D.; Ladarescu, I.; Lederer-Woods, C.; Meo, S. Lo; Lonsdale, S. J.; Macina, D.; Martínez, T.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Michalopoulou, V.; Milazzo, P. M.; Mingrone, F.; Musumarra, A.; Negret, A.; Nolte, R.; Ogállar, F.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Persanti, L.; Porras, I.; Praena, J.; Radeck, D.; Ramos, D.; Rauscher, T.; Reifarth, R.; Rochman, D.; Sabaté-Gilarte, M.; Saxena, A.; Schillebeeckx, P.; Simone, S.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Talip, T.; Tassan-Got, L.; Tsinganis, A.; Ulrich, J.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Vlachoudis, V.; Vlastou, R.; Wallner, A.; Woods, P. J.; Wright, T.; Žugec, P.; Köster, U.

    2018-05-01

    The neutron capture cross section of some unstable nuclei is especially relevant for s-process nucleosynthesis studies. This quantity is crucial to determine the local abundance pattern, which can yield valuable information on the s-process stellar environment. In this work we describe the neutron capture (n,γ) measurements on two of these nuclei of interest, 204Tl and 171Tm, from target production to the final measurement, performed successfully at the n_TOF facility at CERN in 2014 and 2015. Preliminary results of the ongoing experimental data analysis are also shown. These results include the first ever experimental observation of capture resonances for these two nuclei.

  1. Search for Invisible Decays of Sub-GeV Dark Photons in Missing-Energy Events at the CERN SPS.

    PubMed

    Banerjee, D; Burtsev, V; Cooke, D; Crivelli, P; Depero, E; Dermenev, A V; Donskov, S V; Dubinin, F; Dusaev, R R; Emmenegger, S; Fabich, A; Frolov, V N; Gardikiotis, A; Gninenko, S N; Hösgen, M; Kachanov, V A; Karneyeu, A E; Ketzer, B; Kirpichnikov, D V; Kirsanov, M M; Kovalenko, S G; Kramarenko, V A; Kravchuk, L V; Krasnikov, N V; Kuleshov, S V; Lyubovitskij, V E; Lysan, V; Matveev, V A; Mikhailov, Yu V; Myalkovskiy, V V; Peshekhonov, V D; Peshekhonov, D V; Petuhov, O; Polyakov, V A; Radics, B; Rubbia, A; Samoylenko, V D; Tikhomirov, V O; Tlisov, D A; Toropin, A N; Trifonov, A Yu; Vasilishin, B; Vasquez Arenas, G; Ulloa, P; Zhukov, K; Zioutas, K

    2017-01-06

    We report on a direct search for sub-GeV dark photons (A′), which might be produced in the reaction e⁻Z → e⁻ZA′ via kinetic mixing with photons by 100 GeV electrons incident on an active target in the NA64 experiment at the CERN SPS. The dark photons would decay invisibly into dark matter particles, resulting in events with large missing energy. No evidence for such decays was found with 2.75×10⁹ electrons on target. We set new limits on the γ–A′ mixing strength and exclude an invisible A′ with a mass ≲ 100 MeV as an explanation of the muon g−2 anomaly.
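
    Setting a limit from a null search of this kind reduces, in the simplest background-free case, to the classical Poisson upper limit for a counting experiment. The sketch below is a generic statistics illustration, not the actual NA64 procedure, which includes background estimates and systematic uncertainties:

```python
import math

def poisson_upper_limit(n_obs: int, cl: float = 0.90) -> float:
    """Classical upper limit on a signal mean s for a background-free
    counting experiment: the value of s for which
    P(k <= n_obs | s) = 1 - cl. Solved by bisection.
    """
    def cdf(s: float) -> float:
        # cumulative Poisson probability of observing <= n_obs events
        return sum(s**k * math.exp(-s) / math.factorial(k)
                   for k in range(n_obs + 1))

    lo, hi = 0.0, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if cdf(mid) > 1.0 - cl:
            lo = mid  # too probable: the limit must be larger
        else:
            hi = mid
    return 0.5 * (lo + hi)

# zero observed events -> s_up = ln(10) ≈ 2.30 at 90% CL
print(round(poisson_upper_limit(0), 2))  # -> 2.3
```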

  2. Landscape of supersymmetric particle mass hierarchies and their signature space at the CERN Large Hadron Collider.

    PubMed

    Feldman, Daniel; Liu, Zuowei; Nath, Pran

    2007-12-21

    The minimal supersymmetric standard model with soft breaking has a large landscape of supersymmetric particle mass hierarchies. This number is reduced significantly in well-motivated scenarios such as minimal supergravity and alternatives. We carry out an analysis of the landscape for the first four lightest particles, identify at least 16 mass patterns, and provide benchmarks for each. We study the signature space for the patterns at the CERN Large Hadron Collider by analyzing the leptons + (≥2 jets) + missing P_T signals with 0, 1, 2, and 3 leptons. Correlations in missing P_T are also analyzed. It is found that even with 10 fb⁻¹ of data a significant discrimination among patterns emerges.

  3. HR Presentation - New Contract Policy

    ScienceCinema

    None

    2018-06-21

    Presentation on CERN's contract policy as of 2009. Topics covered include: staff member survey, work environment, career development, financial and social benefits, HR department activities and policy analysis.

  4. Handbook of LHC Higgs Cross Sections: 4. Deciphering the Nature of the Higgs Sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Florian, D.

    This Report summarizes the results of the activities of the LHC Higgs Cross Section Working Group in the period 2014-2016. The main goal of the working group was to present the state-of-the-art of Higgs physics at the LHC, integrating all new results that have appeared in the last few years. The first part compiles the most up-to-date predictions of Higgs boson production cross sections and decay branching ratios, parton distribution functions, and off-shell Higgs boson production and interference effects. The second part discusses the recent progress in Higgs effective field theory predictions, followed by the third part on pseudo-observables, simplified template cross section and fiducial cross section measurements, which give the baseline framework for Higgs boson property measurements. The fourth part deals with the beyond-the-Standard-Model predictions of various benchmark scenarios of the Minimal Supersymmetric Standard Model, extended scalar sector, Next-to-Minimal Supersymmetric Standard Model and exotic Higgs boson decays. This report follows three previous working-group reports: Handbook of LHC Higgs Cross Sections: 1. Inclusive Observables (CERN-2011-002), Handbook of LHC Higgs Cross Sections: 2. Differential Distributions (CERN-2012-002), and Handbook of LHC Higgs Cross Sections: 3. Higgs properties (CERN-2013-004). The current report serves as the baseline reference for Higgs physics in LHC Run 2 and beyond.

  5. How hadron collider experiments contributed to the development of QCD: from hard-scattering to the perfect liquid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tannenbaum, M. J.

    A revolution in elementary particle physics occurred during the period from ICHEP1968 to ICHEP1982 with the advent of the parton model from discoveries in deeply inelastic electron-proton scattering at SLAC, neutrino experiments, hard-scattering observed in p+p collisions at the CERN ISR, the development of QCD, the discovery of the J/Ψ at BNL and SLAC and the clear observation of high transverse momentum jets at the CERN SPS p̄+p collider. These and other discoveries in this period led to the acceptance of QCD as the theory of the strong interactions. The desire to understand nuclear physics at high density, such as in neutron stars, led to the application of QCD to this problem and to the prediction of a Quark-Gluon Plasma (QGP) in nuclei at high energy density and temperatures. This eventually led to the construction of the Relativistic Heavy Ion Collider (RHIC) at BNL to observe superdense nuclear matter in the laboratory. This article discusses how experimental methods and results which confirmed QCD at the first hadron collider, the CERN ISR, played an important role in experiments at the first heavy ion collider, RHIC, leading to the discovery of the QGP as a perfect liquid, as well as discoveries at RHIC and the LHC which continue to the present day.

  7. A free-jet Hg target operating in a high magnetic field intersecting a high-power proton beam

    NASA Astrophysics Data System (ADS)

    Graves, Van; Spampinato, Philip; Gabriel, Tony; Kirk, Harold; Simos, Nicholas; Tsang, Thomas; McDonald, Kirk; Peter Titus; Fabich, Adrian; Haseroth, Helmut; Lettry, Jacques

    2006-06-01

    A proof-of-principle experiment to investigate the interaction of a proton beam, a high magnetic field, and a high-Z target is planned to take place at CERN in early 2007. This experiment is part of the Muon Collider Collaboration, with participants from Brookhaven National Laboratory, Princeton University, Massachusetts Institute of Technology, the European Organization for Nuclear Research (CERN), Rutherford Appleton Laboratory, and Oak Ridge National Laboratory. An unconstrained mercury jet target system that interacts with a high-power (1 MW) proton beam in a high magnetic field (15 T) is being designed. The Hg jet diameter is 1 cm, with a velocity up to 20 m/s. A laser optical diagnostic system will be incorporated into the target design to permit observation of the dispersal of the jet resulting from interaction with a 24 GeV proton beam with up to 20×10¹² protons per pulse. The target system includes instruments for sensing mercury vapor, temperature, flow rate, and sump tank level, and the means to position the jet relative to the magnetic axis of a solenoid and the proton beam. The design considerations for the system include all issues dealing with safely handling approximately 23 l of Hg, transporting the target system and the mercury to CERN, decommissioning the experiment, and returning the mildly activated equipment and Hg to the US.

  9. How hadron collider experiments contributed to the development of QCD: from hard-scattering to the perfect liquid

    NASA Astrophysics Data System (ADS)

    Tannenbaum, M. J.

    2018-05-01

    A revolution in elementary particle physics occurred during the period from ICHEP 1968 to ICHEP 1982, with the advent of the parton model from discoveries in deeply inelastic electron-proton scattering at SLAC, neutrino experiments, hard-scattering observed in p+p collisions at the CERN ISR, the development of QCD, the discovery of the J/ψ at BNL and SLAC, and the clear observation of high transverse momentum jets at the CERN SPS p̄+p collider. These and other discoveries in this period led to the acceptance of QCD as the theory of the strong interactions. The desire to understand nuclear physics at high density, such as in neutron stars, led to the application of QCD to this problem and to the prediction of a Quark-Gluon Plasma (QGP) in nuclei at high energy density and temperature. This eventually led to the construction of the Relativistic Heavy Ion Collider (RHIC) at BNL to observe superdense nuclear matter in the laboratory. This article discusses how the experimental methods and results which confirmed QCD at the first hadron collider, the CERN ISR, played an important role in experiments at the first heavy-ion collider, RHIC, leading to the discovery of the QGP as a perfect liquid, as well as discoveries at RHIC and the LHC which continue to the present day.

  10. SHiP: a new facility with a dedicated detector to search for new neutral particles and to study tau neutrino properties

    NASA Astrophysics Data System (ADS)

    Shevchenko, V.

    2017-12-01

    SHiP (Search for Hidden Particles) is a new general-purpose fixed-target facility, whose Technical Proposal has recently been reviewed by the CERN SPS Committee and by the CERN Research Board. The two boards recommended that the experiment proceed to a Comprehensive Design phase in the context of the new CERN working group "Physics Beyond Colliders", aiming at presenting a CERN strategy for the European Strategy meeting of 2019. In the initial phase of SHiP, the 400 GeV proton beam extracted from the SPS will be dumped on a heavy target with the aim of integrating 2×10²⁰ protons on target in 5 years. A dedicated detector, based on a long vacuum tank followed by a spectrometer and particle identification detectors, will allow probing a variety of models with light long-lived exotic particles and masses below O(10) GeV/c². The main focus will be the physics of the so-called Hidden Portals, i.e. the search for Dark Photons, light scalars and pseudo-scalars, and Heavy Neutrinos. The sensitivity to Heavy Neutrinos will make it possible to probe, for the first time, in the mass range between the kaon and the charm meson mass, a coupling range for which Baryogenesis and active neutrino masses could also be explained. Another dedicated detector will allow the study of neutrino cross-sections and angular distributions.

  11. The new ALICE DQM client: a web access to ROOT-based objects

    NASA Astrophysics Data System (ADS)

    von Haller, B.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Delort, C.; Dénes, E.; Diviá, R.; Fuchs, U.; Niedziela, J.; Simonetti, G.; Soós, C.; Telesca, A.; Vande Vyvre, P.; Wegrzynek, A.

    2015-12-01

    A Large Ion Collider Experiment (ALICE) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The online Data Quality Monitoring (DQM) plays an essential role in the experiment's operation by providing shifters with immediate feedback on the data being recorded, in order to quickly identify and overcome problems. Immediate access to the DQM results is needed not only by shifters in the control room but also by detector experts worldwide. As a consequence, a new web application has been developed to dynamically display and manipulate the ROOT-based objects produced by the DQM system in a flexible and user-friendly interface. The architecture and design of the tool, its main features and the technologies that were used, both on the server and the client side, are described. In particular, we detail how we took advantage of the most recent ROOT JavaScript I/O and web server library to give interactive access to ROOT objects stored in a database. We also describe the use of modern web techniques and packages such as AJAX, DHTMLX and jQuery, which have been instrumental in the successful implementation of a reactive and efficient application. We finally present the resulting application and how code quality was ensured. We conclude with a roadmap for future technical and functional developments.

  12. Insight into acid-base nucleation experiments by comparison of the chemical composition of positive, negative, and neutral clusters.

    PubMed

    Bianchi, Federico; Praplan, Arnaud P; Sarnela, Nina; Dommen, Josef; Kürten, Andreas; Ortega, Ismael K; Schobesberger, Siegfried; Junninen, Heikki; Simon, Mario; Tröstl, Jasmin; Jokinen, Tuija; Sipilä, Mikko; Adamov, Alexey; Amorim, Antonio; Almeida, Joao; Breitenlechner, Martin; Duplissy, Jonathan; Ehrhart, Sebastian; Flagan, Richard C; Franchin, Alessandro; Hakala, Jani; Hansel, Armin; Heinritzi, Martin; Kangasluoma, Juha; Keskinen, Helmi; Kim, Jaeseok; Kirkby, Jasper; Laaksonen, Ari; Lawler, Michael J; Lehtipalo, Katrianne; Leiminger, Markus; Makhmutov, Vladimir; Mathot, Serge; Onnela, Antti; Petäjä, Tuukka; Riccobono, Francesco; Rissanen, Matti P; Rondo, Linda; Tomé, António; Virtanen, Annele; Viisanen, Yrjö; Williamson, Christina; Wimmer, Daniela; Winkler, Paul M; Ye, Penglin; Curtius, Joachim; Kulmala, Markku; Worsnop, Douglas R; Donahue, Neil M; Baltensperger, Urs

    2014-12-02

    We investigated the nucleation of sulfuric acid together with two bases (ammonia and dimethylamine) in the CLOUD chamber at CERN. The chemical composition of positive, negative, and neutral clusters was studied using three Atmospheric Pressure interface Time-Of-Flight (APi-TOF) mass spectrometers: two were operated in positive and negative mode to detect the chamber ions, while the third was equipped with a nitrate ion chemical ionization source allowing detection of neutral clusters. After taking into account the fragmentation that can occur during the charging of the ions or within the first stage of the mass spectrometer, cluster formation was found to proceed via essentially one-to-one acid-base addition for all of the clusters, independent of the type of the base. For the positive clusters, the charge is carried by one excess protonated base, while for the negative clusters it is carried by a deprotonated acid; the same is true for the neutral clusters after these have been ionized. During the experiments involving sulfuric acid and dimethylamine, it was possible to study the appearance time of all the clusters (positive, negative, and neutral). After the formation of clusters containing three molecules of sulfuric acid, the clusters grew at a similar rate, independent of their charge. The growth rate is then probably limited by the arrival rate of sulfuric acid or by cluster-cluster collisions.

  13. Beam experiments with the Grenoble test electron cyclotron resonance ion source at iThemba LABS.

    PubMed

    Thomae, R; Conradie, J; Fourie, D; Mira, J; Nemulodi, F; Kuechler, D; Toivanen, V

    2016-02-01

    At the iThemba Laboratory for Accelerator Based Sciences (iThemba LABS) an electron cyclotron resonance ion source was installed and commissioned. This source is a copy of the Grenoble Test Source (GTS) for the production of highly charged ions. The source is similar to the GTS-LHC at CERN and is named GTS2. A collaboration between the Accelerators and Beam Physics Group of CERN and the Accelerator and Engineering Department of iThemba LABS was proposed, in which the development of high intensity argon and xenon beams is envisaged. In this paper we present beam experiments with the GTS2 at iThemba LABS, including results of continuous-wave and afterglow operation of xenon ion beams with oxygen as supporting gas.

  14. Charm and the rise of the pp-bar total cross section

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, S.T.; Dash, J.W.

    We give a detailed description of the pp-bar forward amplitude through CERN SPS collider energies, using the flavored Pomeron model as an effective parametrization of nonperturbative QCD. We show that the rise in the total cross section between CERN ISR and SPS collider energies is consistent with the onset of charmed-particle production up to the level of a few millibarns, along with other processes, and in agreement with available data. In contrast with our estimates of charm production, perturbative QCD charm-production calculations are well below the data. We give estimates of the p-bar and K± multiplicities at SPS collider energies. We also present a simplified version of the flavoring model in order to facilitate comparisons between it and other parametrizations.

  15. Performance and advantages of a soft-core based parallel architecture for energy peak detection in the calorimeter Level 0 trigger for the NA62 experiment at CERN

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Barbanera, M.; Bizzarri, M.; Bonaiuto, V.; Ceccucci, A.; Checcucci, B.; De Simone, N.; Fantechi, R.; Federici, L.; Fucci, A.; Lupi, M.; Paoluzzi, G.; Papi, A.; Piccini, M.; Ryjov, V.; Salamon, A.; Salina, G.; Sargeni, F.; Venditti, S.

    2017-03-01

    The NA62 experiment at the CERN SPS has started its data-taking. Its aim is to measure the branching ratio of the ultra-rare decay K+ → π+ν ν̅. In this context, rejecting the background is a crucial topic. One of the main backgrounds to the measurement is the K+ → π+π0 decay. In the 1-8.5 mrad decay region this background is rejected by the calorimetric trigger processor (Cal-L0). In this work we present the performance of a soft-core based parallel architecture built on FPGAs for the energy peak reconstruction, as an alternative to an implementation written entirely in VHDL.
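
    The energy-peak search itself can be illustrated schematically. The sketch below is not the NA62 firmware (which runs on FPGA soft-cores); it is a minimal Python illustration of finding local maxima above a threshold in a sampled waveform, with invented names and numbers:

```python
def find_energy_peaks(samples, threshold):
    """Return (index, value) pairs at local maxima above threshold.

    A sample counts as a peak if it exceeds the threshold, is not smaller
    than its left neighbour, and is strictly larger than its right one.
    """
    peaks = []
    for i in range(1, len(samples) - 1):
        if samples[i] > threshold and samples[i - 1] <= samples[i] > samples[i + 1]:
            peaks.append((i, samples[i]))
    return peaks

# Toy calorimeter waveform (invented numbers): two energy deposits.
waveform = [0, 2, 10, 25, 14, 3, 1, 8, 30, 12, 2]
print(find_energy_peaks(waveform, threshold=5))  # → [(3, 25), (8, 30)]
```

    In the real trigger this kind of search runs in parallel over many channels per bunch; the sketch only conveys the per-channel logic.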

  16. 25th Birthday CERN - Restaurant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2006-05-05

    Ceremony for CERN's 25th anniversary, with several speakers and the presence of numerous Geneva cantonal and communal authorities and personalities, directors-general, ministers and researchers. Federal Councillor and head of the Swiss Federal Department of Foreign Affairs Mr Pierre Aubert takes the floor to celebrate both the remarkable results of international scientific cooperation and the political will of the European states to pool their resources to build for the future. A great tribute is also paid to the two late directors, Professors Bakker and Gregory.

  17. Extreme Energy Events Project: Construction of the detectors and installation in Italian High Schools

    NASA Astrophysics Data System (ADS)

    Abbrescia, M.; An, S.; Antolini, R.; Badalà, A.; Baldini Ferroli, R.; Bencivenni, G.; Blanco, F.; Bressan, E.; Chiavassa, A.; Chiri, C.; Cifarelli, L.; Cindolo, F.; Coccia, E.; de Pasquale, S.; di Giovanni, A.; D'Incecco, M.; Fabbri, F. L.; Frolov, V.; Garbini, M.; Gustavino, C.; Hatzifotiadou, D.; Imponente, G.; Kim, J.; La Rocca, P.; Librizzi, F.; Maggiora, A.; Menghetti, H.; Miozzi, S.; Moro, R.; Panareo, M.; Pappalardo, G. S.; Piragino, G.; Riggi, F.; Romano, F.; Sartorelli, G.; Sbarra, C.; Selvi, M.; Serci, S.; Williams, C.; Zichichi, A.; Zuyeuski, R.

    2008-04-01

    The EEE Project, conceived by its leader Antonino Zichichi, aims to detect Extreme Energy Events of cosmic rays with an array of muon telescopes distributed over the Italian territory. The Project involves Italian high schools in order to introduce young people to physics, and also to counter the recent decline in university science enrolments. The detectors for the EEE telescopes are Multigap Resistive Plate Chambers (MRPC) and have been constructed by teams of high-school students who took shifts at the CERN laboratories. The mechanics and the electronics were developed by groups of researchers from CERN, the Italian Centro Fermi and INFN. The first group of schools of the EEE Project has recently inaugurated their telescopes. A status report of the Project and preliminary results are presented.

  18. Pixelsex or Cosmic Revelation – how art & science can meet in public space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Otto Roth, Tim

    2009-10-28

    Tim Otto Roth is known for his large projects in public space linking art & science. In his presentation the German artist and media theorist demonstrates some of his latest projects, among others Cosmic Revelation, which changed the KASCADE detector field for cosmic rays at the Karlsruhe Institute of Technology into a giant flashing light field. The Pixelsex project leads him to the question of whether the universe might be digital. On the occasion of his one-week residency at CERN, Tim Otto Roth explores the material culture of particle physics and its ways of finding pictorial representations. Above all he is interested in methods like Monte Carlo simulation, but also in CERN as a giant collaborative institution and consequently as the birthplace of the World Wide Web.

  19. Space charge problems in high intensity RFQs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weiss, M.

    1996-06-01

    Measurements were made to check the performance of the CERN high intensity RFQs (RFQ2A and RFQ2B) and assess the validity of the design approach; the study of space charge effects was undertaken in this context. RFQ2A and RFQ2B are 200 mA, 750 keV proton accelerators, operating at 202.56 MHz. Since the beginning of 1993, RFQ2B has served as injector to the CERN 50 MeV Alvarez linac (Linac 2). In 1992, both RFQs were on the test stand to undergo a series of beam measurements, which were compared with computations. The studies concerning the RFQ2A were more detailed and are reported in this paper. © 1996 American Institute of Physics.

  20. Space Radiation Effects Laboratory

    NASA Technical Reports Server (NTRS)

    1969-01-01

    The SREL User's Handbook is designed to provide information needed by those who plan experiments involving the accelerators at this laboratory. Thus the Handbook will contain information on the properties of the machines, the beam parameters, the facilities and services provided for experimenters, etc. This information will be brought up to date as new equipment is added and modifications accomplished. This Handbook is influenced by the many excellent models prepared at other accelerator laboratories. In particular, the CERN Synchrocyclotron User's Handbook (November 1967) is closely followed in some sections, since the SREL Synchrocyclotron is a duplicate of the CERN machine. We wish to thank Dr. E. G. Michaelis for permission to draw so heavily on his work, particularly in Section II of this Handbook. We hope that the Handbook will prove useful, and will welcome suggestions and criticism.

  1. Pixelsex or Cosmic Revelation – how art & science can meet in public space

    ScienceCinema

    Otto Roth, Tim

    2018-05-18

    Tim Otto Roth is known for his large projects in public space linking art & science. In his presentation the German artist and media theorist demonstrates some of his latest projects, among others Cosmic Revelation, which changed the KASCADE detector field for cosmic rays at the Karlsruhe Institute of Technology into a giant flashing light field. The Pixelsex project leads him to the question of whether the universe might be digital. On the occasion of his one-week residency at CERN, Tim Otto Roth explores the material culture of particle physics and its ways of finding pictorial representations. Above all he is interested in methods like Monte Carlo simulation, but also in CERN as a giant collaborative institution and consequently as the birthplace of the World Wide Web.

  2. SLHC, the High-Luminosity Upgrade (public event)

    ScienceCinema

    None

    2017-12-09

    On the morning of June 23rd a public event is organised in CERN's Council Chamber with the aim of providing the particle physics community with up-to-date information about the strategy for the LHC luminosity upgrade and of describing the current status of preparation work. The presentations will provide an overview of the various accelerator sub-projects, the LHC physics prospects and the upgrade plans of ATLAS and CMS. This event is organised in the framework of the SLHC-PP project, which receives funding from the European Commission for the preparatory phase of the LHC High Luminosity Upgrade project. Informing the public is among the objectives of this EU-funded project. A simultaneous transmission of the meeting will be broadcast at the following address: http://webcast.cern.ch/

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    campbell, myron

    To create a research and study abroad program that would allow U.S. undergraduate students access to the world-leading research facilities at the European Organization for Nuclear Research (CERN), the World Health Organization, various operations of the United Nations and other international organizations based in Geneva. The proposal is based on the unique opportunities currently existing in Geneva. The Large Hadron Collider (LHC) is now operational at CERN, data are being collected, and research results are already beginning to emerge. At the same time, a related reduction of activity at U.S. facilities devoted to particle physics is expected. In addition, the U.S. higher-education community has an ever-increasing focus on international organizations dealing with world health pandemics, arms control and human rights, a nexus also centered in Geneva.

  4. A front-end read out chip for the OPERA scintillator tracker

    NASA Astrophysics Data System (ADS)

    Lucotte, A.; Bondil, S.; Borer, K.; Campagne, J. E.; Cazes, A.; Hess, M.; de La Taille, C.; Martin-Chassard, G.; Raux, L.; Repellin, J. P.

    2004-04-01

    Multi-anode photomultipliers H7546 are used to read out the signal from the OPERA Scintillator Tracker (CERN/SPSC 2000-028, SPSC/P318, LNGSP 25/2000; CERN/SPSC 2001-025, SPSC/M668, LNGS-EXP30/2001). A 32-channel front-end Read Out Chip prototype accommodating the H7546 has been designed at LAL. This device features a low-noise, variable-gain preamplifier to correct for multi-anode non-uniformity, an auto-trigger capability that is 100% efficient at a 0.3 photo-electron threshold, and a charge measurement extending over a large dynamic range of 0-100 photo-electrons. In this article we describe the ASIC architecture that is being implemented for the Target Tracker in OPERA, with special emphasis on the design and the measured performance.

  5. Preparation and characterization of 33S samples for 33S(n, α)30Si cross-section measurements at the n_TOF facility at CERN

    NASA Astrophysics Data System (ADS)

    Praena, J.; Ferrer, F. J.; Vollenberg, W.; Sabaté-Gilarte, M.; Fernández, B.; García-López, J.; Porras, I.; Quesada, J. M.; Altstadt, S.; Andrzejewski, J.; Audouin, L.; Bécares, V.; Barbagallo, M.; Bečvář, F.; Belloni, F.; Berthoumieux, E.; Billowes, J.; Boccone, V.; Bosnar, D.; Brugger, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Carrapiço, C.; Cerutti, F.; Chiaveri, E.; Chin, M.; Colonna, N.; Cortés, G.; Cortés-Giraldo, M. A.; Diakaki, M.; Dietz, M.; Domingo-Pardo, C.; Dressler, R.; Durán, I.; Eleftheriadis, C.; Ferrari, A.; Fraval, K.; Furman, V.; Göbel, K.; Gómez-Hornillos, M. B.; Ganesan, S.; García, A. R.; Giubrone, G.; Gonçalves, I. F.; González-Romero, E.; Goverdovski, A.; Griesmayer, E.; Guerrero, C.; Gunsing, F.; Heftrich, T.; Hernández-Prieto, A.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Karadimos, D.; Katabuchi, T.; Ketlerov, V.; Khryachkov, V.; Kivel, N.; Koehler, P.; Kokkoris, M.; Kroll, J.; Krtička, M.; Lampoudis, C.; Langer, C.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Leong, L. S.; Lerendegui-Marco, J.; Losito, R.; Mallick, A.; Manousos, A.; Marganiec, J.; Martínez, T.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Mingrone, F.; Mirea, M.; Mondelaers, W.; Paradela, C.; Pavlik, A.; Perkowski, J.; Plompen, A. J. M.; Rauscher, T.; Reifarth, R.; Riego-Perez, A.; Robles, M.; Rubbia, C.; Ryan, J. A.; Sarmento, R.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Sedyshev, P.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tarrío, D.; Tassan-Got, L.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Vermeulen, M. J.; Vlachoudis, V.; Vlastou, R.; Wallner, A.; Ware, T.; Weigand, M.; Weiss, C.; Wright, T.; Žugec, P.; n TOF Collaboration

    2018-05-01

    Thin 33S samples for the study of the 33S(n, α)30Si cross-section at the n_TOF facility at CERN were made by thermal evaporation of 33S powder onto a dedicated substrate made of Kapton covered with thin layers of copper, chromium and titanium. This method has provided, for the first time, bare sulfur samples a few centimeters in diameter. The samples have shown excellent adherence, with no mass loss after a few years and no sublimation in vacuum at room temperature. The mass thickness of 33S was determined by means of Rutherford backscattering spectrometry. The samples have been successfully tested under neutron irradiation.

  6. Shielding design for the front end of the CERN SPL.

    PubMed

    Magistris, Matteo; Silari, Marco; Vincke, Helmut

    2005-01-01

    CERN is designing a 2.2-GeV Superconducting Proton Linac (SPL) with a beam power of 4 MW, to be used for the production of a neutrino superbeam. The SPL front end will initially accelerate 2×10¹⁴ negative hydrogen ions per second up to an energy of 120 MeV. The FLUKA Monte Carlo code was employed for shielding design. The proposed shielding is a combined iron-concrete structure, which also takes into consideration the required RF wave-guide ducts and access labyrinths to the machine. Two beam-loss scenarios were investigated: (1) constant beam loss of 1 W m⁻¹ over the whole accelerator length and (2) full beam loss occurring at various locations. A comparison with results based on simplified approaches is also presented.
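
    As a consistency check on the quoted figures, the front-end beam power follows directly from the ion rate and energy:

```latex
P \approx \left(2\times10^{14}\ \mathrm{s^{-1}}\right) \times 120\ \mathrm{MeV}
  \times 1.602\times10^{-13}\ \mathrm{J/MeV} \approx 3.8\ \mathrm{kW},
```

    far below the 4 MW of the full 2.2-GeV linac, as expected for the 120-MeV front end.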

  7. [The CERN and the megascience].

    PubMed

    Aguilar Peris, José

    2006-01-01

    In this work we analyse the biggest particle accelerator in the world: the LHC (Large Hadron Collider). The ring-shaped tunnel is 27 km long and is buried over 110 meters underground, straddling the border between France and Switzerland at the CERN laboratory near Geneva. Its mission is to recreate the conditions that existed shortly after the Big Bang and to look for the hypothesised Higgs particle. The LHC will accelerate protons to near the speed of light and collide them head-on at an energy of 14 TeV (1 TeV = 10¹² eV). Keeping such high energy in the proton beams requires enormous magnetic fields, which are generated by superconducting electromagnets chilled to less than two degrees above absolute zero. It is expected that the LHC will be inaugurated in summer 2007.
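
    "Near the speed of light" can be made quantitative. At the design energy of 7 TeV per beam, each proton's Lorentz factor is

```latex
\gamma = \frac{E}{m_p c^2} \approx \frac{7000\ \mathrm{GeV}}{0.938\ \mathrm{GeV}} \approx 7500,
\qquad
\frac{v}{c} = \sqrt{1-\frac{1}{\gamma^2}} \approx 1-\frac{1}{2\gamma^2} \approx 0.999999991.
```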

  8. A Bonner Sphere Spectrometer for pulsed fields

    PubMed Central

    Aza, E.; Dinar, N.; Manessi, G. P.; Silari, M.

    2016-01-01

    The use of conventional Bonner Sphere Spectrometers (BSS) in pulsed neutron fields (PNF) is limited by the fact that proportional counters, usually employed as the thermal neutron detectors, suffer from dead-time losses and show severe underestimation of the neutron interaction rate, which leads to strong distortion of the calculated spectrum. In order to avoid these limitations, an innovative BSS, called BSS-LUPIN, has been developed for measuring in PNF. This paper describes the physical characteristics of the device and its working principle, together with the results of Monte Carlo simulations of its response matrix. The BSS-LUPIN has been tested in the stray neutron field at the CERN Proton Synchrotron, by comparing the spectra obtained with the new device, with the conventional CERN BSS, and via Monte Carlo simulations. PMID:25948828
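
    The role of the response matrix mentioned above can be sketched in a few lines. The following is a toy illustration with invented numbers, not the simulated BSS-LUPIN matrix: folding a group-wise neutron spectrum through a response matrix gives the expected sphere counts, and unfolding inverts that relation.

```python
import numpy as np

# Invented 3-sphere, 3-energy-group response matrix R: element R[i, j] is
# the count rate in sphere i per unit fluence in energy group j.
R = np.array([
    [0.8, 0.3, 0.1],
    [0.4, 0.7, 0.3],
    [0.1, 0.4, 0.8],
])

phi = np.array([100.0, 50.0, 20.0])   # assumed group fluences
counts = R @ phi                      # folding: expected counts per sphere

# Unfolding (trivial here, since this R is square and well-conditioned;
# real BSS unfolding is under-determined and uses regularised methods):
phi_rec = np.linalg.solve(R, counts)
print(counts, phi_rec)
```

    In practice the measured counts carry statistical and dead-time errors, which is exactly why distorted count rates in pulsed fields ruin the unfolded spectrum.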

  9. Lectures from the European RTN Winter School on Strings, Supergravity and Gauge Theories, CERN, 16-20 January 2006

    NASA Astrophysics Data System (ADS)

    Derendinger, J.-P.; Scrucca, C. A.; Uranga, A. M.

    2006-11-01

    This special issue is devoted to the proceedings of the conference 'Winter School on Strings, Supergravity and Gauge Theories', which took place at CERN, the European Organization for Nuclear Research, in Geneva, Switzerland, from 16 to 20 January 2006. This event was organized in the framework of the European Mobility Research and Training Network entitled 'Constituents, Fundamental Forces and Symmetries of the Universe'. It is part of a yearly series of scientific schools which have become a traditional rendezvous for young researchers of the community. The previous one was held at SISSA, in Trieste, Italy, in February 2005, and the next one will take place again at CERN, in January 2007. The school was primarily meant for young doctoral students and postdoctoral researchers working in the area of string theory. It consisted of five general lectures of four hours each, whose notes are published in the present proceedings, and five working group discussion sessions, focused on specific topics of the network research program. It was attended by approximately 250 participants. The topics of the lectures were chosen to provide an introduction to some of the areas of recent progress and to the open problems in string theory. String theory is expected to provide insights into the description of systems where the role of gravity is crucial. One prominent example of such systems is time-dependent backgrounds with big bang singularities, whose status in string theory is reviewed in the lecture notes by Ben Craps. In another main problem in quantum gravity, string theory gives a fascinating microscopic description of black holes and their properties. The lectures by Shiraz Minwalla review the thermal properties of black holes from their microscopic description in terms of a holographically dual large N field theory.
Progress in the description of black hole microstates, and its interplay with the macroscopic description in terms of supergravity solutions via the attractor mechanism, are covered by the lectures by Atish Dabholkar and Boris Pioline. A final important mainstream topic in string theory, being a higher-dimensional theory, is its compactification to four dimensions, and the computation of four-dimensional physical properties in terms of the properties of the internal space. The lectures by Mariana Graña review recent progress in the classification of the most general supersymmetric backgrounds describing the compactified dimensions, and their role in determining the number of massless scalar moduli fields in four dimensions. The conference was financially supported by the European Commission under contract MRTN-CT-2004-005104 and by CERN. It was jointly organized by the Physics Institute of the University of Neuchâtel and the Theory Unit of the Physics Division of CERN. It is a great pleasure for us to warmly thank the Theory Unit of CERN for its very kind hospitality and for the high quality of the services and infrastructure that it has provided. We also acknowledge helpful administrative assistance from the Physics Institute of the University of Neuchâtel. Special thanks go finally to Denis Frank for his very valuable help in preparing the conference web pages, and to J Rostant, A-M Perrin and M-S Vascotto for their continuous and very reliable assistance.

  10. "Life in the Universe" Final Event Video Now Available

    NASA Astrophysics Data System (ADS)

    2002-02-01

    ESO Video Clip 01/02 is issued on the web in conjunction with the release of a 20-min documentary video from the Final Event of the "Life in the Universe" programme. This unique event took place in November 2001 at CERN in Geneva, as part of the 2001 European Science and Technology Week, an initiative by the European Commission to raise the public awareness of science in Europe. The "Life in the Universe" programme comprised competitions in 23 European countries to identify the best projects from school students. The projects could be scientific or a piece of art, a theatrical performance, poetry or even a musical performance. The only restriction was that the final work must be based on scientific evidence. Winning teams from each country were invited to a "Final Event" at CERN on 8-11 November, 2001 to present their projects to a panel of International Experts during a special three-day event devoted to understanding the possibility of other life forms existing in our Universe. This Final Event also included a spectacular 90-min webcast from CERN with the highlights of the programme. The video describes the Final Event and the enthusiastic atmosphere when more than 200 young students and teachers from all over Europe met with some of the world's leading scientific experts of the field. The present video clip, with excerpts from the film, is available in four versions: two MPEG files and two streamer-versions of different sizes; the latter require RealPlayer software. Video Clip 01/02 may be freely reproduced. The 20-min video is available on request from ESO, for viewing in VHS and, for broadcasters, in Betacam-SP format. Please contact the ESO EPR Department for more details. Life in the Universe was jointly organised by the European Organisation for Nuclear Research (CERN) , the European Space Agency (ESA) and the European Southern Observatory (ESO) , in co-operation with the European Association for Astronomy Education (EAAE). 
Other research organisations were associated with the programme, e.g., the European Molecular Biology Laboratory (EMBL) and the European Synchrotron Radiation Facility (ESRF). Detailed information about the "Life in the Universe" programme can be found at the website http://www.lifeinuniverse.org and a webcast of this 90-min closing session in one of the large experimental halls at CERN is available on the web via that page. Most of the ESO PR Video Clips at the ESO website provide "animated" illustrations of the ongoing work and events at the European Southern Observatory. The most recent clip was ESO PR Video Clips 08a-b/01 about The Eagle's EGGs (20 December 2001). General information about ESO videos is available on the web.

  11. Web Proxy Auto Discovery for the WLCG

    NASA Astrophysics Data System (ADS)

    Dykstra, D.; Blomer, J.; Blumenfeld, B.; De Salvo, A.; Dewhurst, A.; Verguilov, V.

    2017-10-01

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home) which they direct to the nearest publicly accessible web proxy servers. 
The responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.
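
    The PAC standard referenced above defines a JavaScript function, FindProxyForURL, whose return value lists proxies to try in order, with DIRECT as a fallback. As an illustration only, a minimal PAC body of the kind such a server might return can be sketched in Python; the hostnames and the generator function are hypothetical, not the WLCG implementation:

```python
# Sketch: build a minimal Proxy Auto Configuration (PAC) file body
# from a list of registered squid proxies. Hostnames are hypothetical;
# the real WLCG WPAD service derives its lists from AGIS/SITECONF data.

def make_pac(proxies, port=3128):
    """Return PAC text that tries each proxy in order, then goes direct."""
    chain = "; ".join(f"PROXY {host}:{port}" for host in proxies)
    return (
        "function FindProxyForURL(url, host) {\n"
        f'    return "{chain}; DIRECT";\n'
        "}\n"
    )

pac = make_pac(["squid1.example-site.org", "squid2.example-site.org"])
print(pac)
```

    A client evaluating this file tries squid1 first, falls back to squid2, and finally connects directly if no proxy responds.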

  12. La supraconductivité a 100 ans !

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lebrun, Philippe

    2011-04-14

    100 years ago, on 8 April 1911, a major discovery was made: that of superconductivity. Superconductivity is the property of certain metals and alloys of losing all electrical resistance below a given temperature. This astonishing discovery, made almost by chance by Kamerlingh Onnes of Leiden University (Netherlands) and his student Gilles Holst, opened a new field of research in physics and fabulous prospects for technological applications. From a scientific point of view, superconductivity is one of the rare manifestations of quantum physics at the macroscopic scale. In terms of technical spin-offs, it carries major applications in the fields of health, communications and energy. 100 years later, physicists have still not finished exploring this phenomenon and its applications. CERN hosts applications of superconductivity on unprecedented scales. The LHC particle accelerator, with its thousands of superconducting magnets distributed over a 27-kilometre circumference, is the world's largest application of superconductivity; it could not exist without it. CERN is therefore celebrating the discovery of superconductivity with an exceptional lecture given by Philippe Lebrun. During the lecture, the historic experiment of Kamerlingh Onnes will be reproduced. Philippe Lebrun will recount the story of this astonishing discovery, placing it in the scientific context of its time, and will describe the scientific developments and applications of superconductivity's first century. Lecture in French. Please register at: +41 22 767 76 76 or cern.reception@cern.ch

  13. Web Proxy Auto Discovery for the WLCG

    DOE PAGES

    Dykstra, D.; Blomer, J.; Blumenfeld, B.; ...

    2017-11-23

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM File System (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS and CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD), which is in turn based on another standard called Proxy Auto Configuration (PAC); both the Frontier and CVMFS clients support this standard. The input to the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by the people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to HTTP requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short-term, long-distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home), which they direct to the nearest publicly accessible web proxy servers.
Furthermore, the responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.

  14. Web Proxy Auto Discovery for the WLCG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykstra, D.; Blomer, J.; Blumenfeld, B.

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM File System (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS and CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD), which is in turn based on another standard called Proxy Auto Configuration (PAC); both the Frontier and CVMFS clients support this standard. The input to the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by the people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to HTTP requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short-term, long-distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home), which they direct to the nearest publicly accessible web proxy servers.
Furthermore, the responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.

  15. Kitty for the kitty

    NASA Astrophysics Data System (ADS)

    2016-11-01

    Officials at the International Linear Collider (ILC) - a proposed successor to the Large Hadron Collider at CERN - have turned to Hello Kitty to help promote the project, which is set to be built in Japan.

  16. --No Title--

    Science.gov Websites

    CERN. Useful events will be selected by a trigger that consists of three levels (level 1, level 2 and the event filter). The Argonne HEP division is responsible for critical components of the level 2

  17. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as the central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document-oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, and the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.
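
    As a rough illustration of the kind of roll-up such an aggregation pipeline performs, the sketch below reduces job-report documents to per-site metrics. The field names and site labels are hypothetical, and the real system runs on a document-oriented database and Hadoop/Spark rather than in-memory dictionaries:

```python
# Sketch: roll up unstructured job-report documents into per-site
# metrics (job counts and mean wall-clock time). Field names are
# hypothetical; the production pipeline uses HDFS and Spark.
from collections import defaultdict

def aggregate(docs):
    """Per-site job counts and mean wall-clock time from job reports."""
    acc = defaultdict(lambda: {"jobs": 0, "wall": 0.0})
    for d in docs:
        entry = acc[d["site"]]
        entry["jobs"] += 1
        entry["wall"] += d["wall_time"]
    return {s: {"jobs": v["jobs"], "avg_wall": v["wall"] / v["jobs"]}
            for s, v in acc.items()}

docs = [{"site": "T2_XX_Example", "wall_time": 3600.0},
        {"site": "T2_XX_Example", "wall_time": 1800.0},
        {"site": "T1_YY_Example", "wall_time": 7200.0}]
print(aggregate(docs))
```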

  18. The transition radiation detector of the PAMELA space mission

    NASA Astrophysics Data System (ADS)

    Ambriola, M.; Bellotti, R.; Cafagna, F.; Circella, M.; de Marzo, C.; Giglietto, N.; Marangelli, B.; Mirizzi, N.; Romita, M.; Spinelli, P.

    2004-04-01

    The objective of the PAMELA space mission is to fly a satellite-borne magnetic spectrometer built to fulfill the primary scientific goals of detecting antiparticles (antiprotons and positrons) and measuring the spectra of particles in cosmic rays. The PAMELA telescope is composed of: a silicon tracker housed in a permanent magnet, a time-of-flight and an anticoincidence system both made of plastic scintillators, a silicon imaging calorimeter, a neutron detector and a Transition Radiation Detector (TRD). The TRD is composed of nine sensitive layers of straw tubes working in proportional mode, for a total of 1024 channels. Each layer is interleaved with a radiator plane made of carbon fibers. The TRD characteristics will be described along with its performance, studied at both the CERN-PS and CERN-SPS facilities using electrons, pions, muons and protons of different momenta.

  19. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE PAGES

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    2018-03-19

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as the central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document-oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, and the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.

  20. 750 GeV diphoton excess at CERN LHC from a dark sector assisted scalar decay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Subhaditya; Patra, Sudhanwa; Sahoo, Nirakar

    2016-06-06

    We present a simple extension of the Standard Model (SM) to explain the recent diphoton excess reported by CMS and ATLAS at the CERN LHC. The SM is extended by a dark sector including a vector-like lepton doublet and a singlet of zero electromagnetic charge, which are odd under a Z_2 symmetry. The charged particle of the vector-like lepton doublet assists the additional scalar, distinct from the SM Higgs, in decaying to diphotons with an invariant mass around 750 GeV, thus explaining the excess observed at the LHC. The admixture of the neutral component of the vector-like lepton doublet and the singlet constitutes the dark matter of the Universe. We show the relevant parameter space for the correct relic density and direct detection of dark matter.

  1. Final Scientific EFNUDAT Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kappeler, Franz

    2010-11-09

    F. Kappeler speaks about EFNUDAT synergies in astrophysics in this second session of the Final Scientific EFNUDAT Workshop. The workshop, organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration, was held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: data evaluation; cross section measurements; experimental techniques; uncertainties and covariances; fission properties; current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman), Marco Calviani, Samuel Andriamonje, Eric Berthoumieux, Carlos Guerrero, Roberto Losito, Vasilis Vlachoudis. Workshop Assistant: Geraldine Jean.

  2. Evolution of optical fibre cabling components at CERN: Performance and technology trends analysis

    NASA Astrophysics Data System (ADS)

    Shoaie, Mohammad Amin; Meroli, Stefano; Machado, Simao; Ricci, Daniel

    2018-05-01

    The CERN optical fibre infrastructure has been growing constantly over the past decade due to ever-increasing connectivity demands. The provisioning plan and fibre installation of this vast laboratory are performed by the Fibre Optics and Cabling Section of the Engineering Department. In this paper we analyze the procurement data for essential fibre cabling components over a five-year interval to extract the existing trends and anticipate future directions. The analysis predicts a high contribution of LC connectors and an increasing usage of multi-fibre connectors. It is foreseen that single-mode fibres will become the main fibre type for mid- and long-range installations, while air blowing will be the major installation technique. Performance assessment of various connectors shows that the expanded-beam ferrule is favored for emerging on-board optical interconnections thanks to its scalable density and stable return loss.
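
    A trend extraction of the kind described can be sketched as an ordinary least-squares fit to yearly procurement counts. The figures below are invented for illustration and are not CERN procurement data:

```python
# Sketch: extract a linear trend from yearly procurement counts,
# as one might for connector usage data. The counts are invented
# illustrative values, not real procurement figures.

def linear_trend(years, counts):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(years)
    mx = sum(years) / n
    my = sum(counts) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, counts))
    sxx = sum((x - mx) ** 2 for x in years)
    slope = sxy / sxx
    return slope, my - slope * mx

slope, intercept = linear_trend([2013, 2014, 2015, 2016, 2017],
                                [100, 140, 170, 220, 260])
print(f"trend: {slope:+.1f} units/year")
```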

  3. A new information architecture, website and services for the CMS experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Lucas; Rusack, Eleanor; Zemleris, Vidmantas

    2012-01-01

    The age and size of the CMS collaboration at the LHC means it now has many hundreds of inhomogeneous web sites and services, and hundreds of thousands of documents. We describe a major initiative to create a single coherent CMS internal and public web site. This uses the Drupal web Content Management System (now supported by CERN/IT) on top of a standard LAMP stack (Linux, Apache, MySQL, and php/perl). The new navigation, content and search services are coherently integrated with numerous existing CERN services (CDS, EDMS, Indico, phonebook, Twiki) as well as many CMS internal Web services. We describe the information architecture, the system design, implementation and monitoring, the document and content database, security aspects, and our deployment strategy, which ensured continual smooth operation of all systems at all times.

  4. A new Information Architecture, Website and Services for the CMS Experiment

    NASA Astrophysics Data System (ADS)

    Taylor, Lucas; Rusack, Eleanor; Zemleris, Vidmantas

    2012-12-01

    The age and size of the CMS collaboration at the LHC means it now has many hundreds of inhomogeneous web sites and services, and hundreds of thousands of documents. We describe a major initiative to create a single coherent CMS internal and public web site. This uses the Drupal web Content Management System (now supported by CERN/IT) on top of a standard LAMP stack (Linux, Apache, MySQL, and php/perl). The new navigation, content and search services are coherently integrated with numerous existing CERN services (CDS, EDMS, Indico, phonebook, Twiki) as well as many CMS internal Web services. We describe the information architecture; the system design, implementation and monitoring; the document and content database; security aspects; and our deployment strategy, which ensured continual smooth operation of all systems at all times.

  5. Performance of the Prototype Readout System for the CMS Endcap Hadron Calorimeter Upgrade

    NASA Astrophysics Data System (ADS)

    Chaverin, Nate; Dittmann, Jay; Hatakeyama, Kenichi; Pastika, Nathaniel; CMS Collaboration

    2016-03-01

    The Compact Muon Solenoid (CMS) experiment at the CERN Large Hadron Collider (LHC) will upgrade the photodetectors and readout systems of the endcap hadron calorimeter during the technical stop scheduled for late 2016 and early 2017. A major milestone for this project was a highly successful testbeam run at CERN in August 2015. The testbeam run served as a full integration test of the electronics, allowing a study of the response of the preproduction electronics to the true detector light profile, as well as a test of the light yield of various new plastic scintillator materials. We present implications for the performance of the hadron calorimeter front-end electronics based on testbeam data, and we report on the production status of various components of the system in preparation for the upgrade.

  6. Higher moments of multiplicity fluctuations in a hadron-resonance gas with exact conservation laws

    NASA Astrophysics Data System (ADS)

    Fu, Jing-Hua

    2017-09-01

    Higher moments of multiplicity fluctuations of hadrons produced in central nucleus-nucleus collisions are studied within the hadron-resonance gas model in the canonical ensemble. Exact conservation of three charges (baryon number, electric charge, and strangeness) is enforced in the large volume limit. Moments up to the fourth order of various particles are calculated at CERN Super Proton Synchrotron, BNL Relativistic Heavy Ion Collider (RHIC), and CERN Large Hadron Collider energies. The asymptotic fluctuations within a simplified model with only one conserved charge in the canonical ensemble are discussed, where simple analytical expressions for moments of multiplicity distributions can be obtained. Moment products of net-proton, net-kaon, and net-charge distributions in Au + Au collisions at RHIC energies are calculated. The pseudorapidity coverage dependence of net-charge fluctuation is discussed.
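
    The moments in question can be illustrated with a small sampling sketch. The Poisson case below is a deliberate simplification, chosen because all its cumulants equal the mean so the moment products tend to 1; it is not the hadron-resonance gas model itself:

```python
# Sketch: sample central moments of a multiplicity distribution up to
# fourth order, and the volume-independent products S*sigma and
# kappa*sigma^2 often compared between models and data.
import math
import random

def poisson_sample(lam, rng):
    """One Poisson-distributed draw via Knuth's algorithm."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def moment_products(samples):
    n = len(samples)
    mean = sum(samples) / n
    c2 = sum((x - mean) ** 2 for x in samples) / n
    c3 = sum((x - mean) ** 3 for x in samples) / n
    c4 = sum((x - mean) ** 4 for x in samples) / n
    return {"mean": mean,
            "S*sigma": c3 / c2,                 # skewness * std dev
            "kappa*sigma2": c4 / c2 - 3 * c2}   # excess kurtosis * variance

# For Poisson multiplicities both products approach 1.
rng = random.Random(42)
data = [poisson_sample(5.0, rng) for _ in range(200_000)]
print(moment_products(data))
```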

  7. Lead ions and Coulomb’s Law at the LHC (CERN)

    NASA Astrophysics Data System (ADS)

    Cid-Vidal, Xabier; Cid, Ramon

    2018-03-01

    Although for most of the time the Large Hadron Collider (LHC) at CERN collides protons, for around one month every year lead ions are collided, to expand the diversity of the LHC research programme. Furthermore, in an effort not originally foreseen, proton-lead collisions are also taking place, with results of high interest to the physics community. All the large experiments of the LHC have now joined the heavy-ion programme, including the LHCb experiment, which was not at first expected to be part of it. The aim of this article is to introduce a few simple physical calculations relating to some electrical phenomena that occur when lead-ion bunches are running in the LHC, using Coulomb’s Law, to be taken to the secondary school classroom to help students understand some important physical concepts.
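
    In the spirit of the classroom calculations the article proposes, one can estimate the total charge of a lead-ion bunch and apply Coulomb’s Law to it. The bunch intensity and the 1 m separation below are illustrative assumptions, not values quoted from the paper:

```python
# Classroom-style estimate: treat two lead-ion bunches as point
# charges and apply Coulomb's law. The bunch intensity (7e7 ions)
# and separation are illustrative assumptions.

K = 8.988e9           # Coulomb constant, N m^2 / C^2
E_CHARGE = 1.602e-19  # elementary charge, C

def bunch_charge(n_ions, z=82):
    """Total charge of a bunch of fully stripped Pb ions (Z = 82)."""
    return n_ions * z * E_CHARGE

def coulomb_force(q1, q2, r):
    """Magnitude of the force between two point charges r metres apart."""
    return K * q1 * q2 / r**2

q = bunch_charge(7e7)         # roughly 1 nC per bunch
f = coulomb_force(q, q, 1.0)  # two bunches treated as points 1 m apart
print(f"bunch charge = {q:.2e} C, force = {f:.2e} N")
```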

  8. CERN’s AFS replacement project

    NASA Astrophysics Data System (ADS)

    Iven, J.; Lamanna, M.; Pace, A.

    2017-10-01

    OpenAFS is the legacy solution for a variety of use cases at CERN, most notably home-directory services. OpenAFS has been used as the primary shared file-system for Linux (and other) clients for more than 20 years, but despite an excellent track record, the project’s age and architectural limitations are becoming more evident. We are now working to offer an alternative solution based on existing CERN storage services. The new solution will offer evolved functionality, and is expected to eventually benefit from operational synergies. In this paper we will present CERN’s usage and an analysis of our technical choices: we will focus on the alternatives chosen for the various use cases (among them EOS, CERNBox and CASTOR); on implementing the migration process over the coming years; and the challenges and opportunities of the migration.

  9. Beam experiments with the Grenoble test electron cyclotron resonance ion source at iThemba LABS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomae, R., E-mail: rthomae@tlabs.ac.za; Conradie, J.; Fourie, D.

    2016-02-15

    At the iThemba Laboratory for Accelerator Based Sciences (iThemba LABS), an electron cyclotron resonance ion source was installed and commissioned. This source is a copy of the Grenoble Test Source (GTS) for the production of highly charged ions. The source is similar to the GTS-LHC at CERN and is named GTS2. A collaboration between the Accelerators and Beam Physics Group of CERN and the Accelerator and Engineering Department of iThemba LABS was proposed, in which the development of high-intensity argon and xenon beams is envisaged. In this paper, we present beam experiments with the GTS2 at iThemba LABS, in which the results of continuous-wave and afterglow operation of xenon ion beams with oxygen as supporting gas are presented.

  10. First results from the CERN axion solar telescope.

    PubMed

    Zioutas, K; Andriamonje, S; Arsov, V; Aune, S; Autiero, D; Avignone, F T; Barth, K; Belov, A; Beltrán, B; Bräuninger, H; Carmona, J M; Cebrián, S; Chesi, E; Collar, J I; Creswick, R; Dafni, T; Davenport, M; Di Lella, L; Eleftheriadis, C; Englhauser, J; Fanourakis, G; Farach, H; Ferrer, E; Fischer, H; Franz, J; Friedrich, P; Geralis, T; Giomataris, I; Gninenko, S; Goloubev, N; Hasinoff, M D; Heinsius, F H; Hoffmann, D H H; Irastorza, I G; Jacoby, J; Kang, D; Königsmann, K; Kotthaus, R; Krcmar, M; Kousouris, K; Kuster, M; Lakić, B; Lasseur, C; Liolios, A; Ljubicić, A; Lutz, G; Luzón, G; Miller, D W; Morales, A; Morales, J; Mutterer, M; Nikolaidis, A; Ortiz, A; Papaevangelou, T; Placci, A; Raffelt, G; Ruz, J; Riege, H; Sarsa, M L; Savvidis, I; Serber, W; Serpico, P; Semertzidis, Y; Stewart, L; Vieira, J D; Villar, J; Walckiers, L; Zachariadou, K

    2005-04-01

    Hypothetical axionlike particles with a two-photon interaction would be produced in the sun by the Primakoff process. In a laboratory magnetic field ("axion helioscope"), they would be transformed into x-rays with energies of a few keV. Using a decommissioned Large Hadron Collider test magnet, the CERN Axion Solar Telescope ran for about 6 months during 2003. The first results from the analysis of these data are presented here. No signal above background was observed, implying an upper limit to the axion-photon coupling g_aγ < 1.16 × 10⁻¹⁰ GeV⁻¹ at 95% C.L. for m_a ≲ 0.02 eV. This limit, assumption-free, is comparable to the limit from stellar energy-loss arguments and considerably more restrictive than any previous experiment over a broad range of axion masses.

  11. Probing small parton densities in ultraperipheral AA and pA collisions at the CERN Large Hadron Collider.

    PubMed

    Strikman, Mark; Vogt, Ramona; White, Sebastian

    2006-03-03

    We calculate photoproduction rates for several hard processes in ultraperipheral proton-lead and lead-lead collisions at the CERN Large Hadron Collider (LHC) with √s_NN = 8.8 and 5.5 TeV, respectively, which could be triggered in the large LHC detectors. We use ATLAS as an example. The lead ion is treated as a source of (coherently produced) photons with energies and intensities greater than those of equivalent ep collisions at the DESY collider HERA. We find very large rates for both inclusive and diffractive production that will extend the HERA x range by nearly an order of magnitude for similar virtualities. We demonstrate that it is possible to reach the kinematic regime where nonlinear effects are larger than at HERA.

  12. Search for solar axions by the CERN axion solar telescope with 3He buffer gas: closing the hot dark matter gap.

    PubMed

    Arik, M; Aune, S; Barth, K; Belov, A; Borghi, S; Bräuninger, H; Cantatore, G; Carmona, J M; Cetin, S A; Collar, J I; Da Riva, E; Dafni, T; Davenport, M; Eleftheriadis, C; Elias, N; Fanourakis, G; Ferrer-Ribas, E; Friedrich, P; Galán, J; García, J A; Gardikiotis, A; Garza, J G; Gazis, E N; Geralis, T; Georgiopoulou, E; Giomataris, I; Gninenko, S; Gómez, H; Gómez Marzoa, M; Gruber, E; Guthörl, T; Hartmann, R; Hauf, S; Haug, F; Hasinoff, M D; Hoffmann, D H H; Iguaz, F J; Irastorza, I G; Jacoby, J; Jakovčić, K; Karuza, M; Königsmann, K; Kotthaus, R; Krčmar, M; Kuster, M; Lakić, B; Lang, P M; Laurent, J M; Liolios, A; Ljubičić, A; Luzón, G; Neff, S; Niinikoski, T; Nordt, A; Papaevangelou, T; Pivovaroff, M J; Raffelt, G; Riege, H; Rodríguez, A; Rosu, M; Ruz, J; Savvidis, I; Shilon, I; Silva, P S; Solanki, S K; Stewart, L; Tomás, A; Tsagri, M; van Bibber, K; Vafeiadis, T; Villar, J; Vogel, J K; Yildiz, S C; Zioutas, K

    2014-03-07

    The CERN Axion Solar Telescope has finished its search for solar axions with ³He buffer gas, covering the search range 0.64 eV ≲ m_a ≲ 1.17 eV. This closes the gap to the cosmological hot dark matter limit and actually overlaps with it. From the absence of excess x rays when the magnet was pointing to the Sun, we set a typical upper limit on the axion-photon coupling of g_aγ ≲ 3.3 × 10⁻¹⁰ GeV⁻¹ at 95% C.L., with the exact value depending on the pressure setting. Future direct solar axion searches will focus on increasing the sensitivity to smaller values of g_aγ, for example by the currently discussed next generation helioscope International AXion Observatory.

  13. Preliminary Results From The First Flight of ATIC

    NASA Technical Reports Server (NTRS)

    Seo, E. S.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The Advanced Thin Ionization Calorimeter (ATIC) instrument is designed to measure the composition and energy spectra of Z = 1 to 28 cosmic rays over the energy range approximately 10 GeV - 100 TeV. The instrument was calibrated in September 1999 at CERN using accelerated electron, proton and pion beams. ATIC was launched as a long duration balloon test flight on 12/28/00 local time from McMurdo, Antarctica. After flying successfully for about 16 days the payload was recovered in excellent condition. Absolute calibration of the detector response was made using cosmic-ray muons. The data analysis algorithm which was developed with Monte Carlo simulations and validated with the CERN beam test will be used for the flight data analysis. Preliminary results of the proton and helium spectra will be reported in this paper.

  14. Preliminary Results From the First Flight of ATIC

    NASA Technical Reports Server (NTRS)

    Seo, E. S.; Adams, James H., Jr.; Ahn, H.; Ampe, J.; Bashindzhagyan, G.; Case, G.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The Advanced Thin Ionization Calorimeter (ATIC) instrument is designed to measure the composition and energy spectra of Z = 1 to 28 cosmic rays over the energy range approximately 10 GeV - 100 TeV. The instrument was calibrated in September 1999 at CERN using accelerated electron, proton and pion beams. ATIC was launched as a long duration balloon test flight on 12/28/00 local time from McMurdo, Antarctica. After flying successfully for about 16 days the payload was recovered in excellent condition. Absolute calibration of the detector response was made using cosmic-ray muons. The data analysis algorithm, which was developed with Monte Carlo simulations and validated with the CERN beam test, will be used for the flight data analysis. Preliminary results of the proton and helium spectra will be reported in this paper.

  15. Hangout with CERN: Reaching the Public with the Collaborative Tools of Social Media

    NASA Astrophysics Data System (ADS)

    Goldfarb, S.; Kahle, K. L. M.; Rao, A.

    2014-06-01

    On 4 July 2012, particle physics became a celebrity. Around 1,000,000,000 people (yes, 1 billion) [1] saw rebroadcasts of two technical presentations announcing the discovery of a new boson. The occasion was a joint seminar of the CMS [2] and ATLAS [3] collaborations, and the target audience were particle physicists. Yet the world ate it up like a sporting event. Roughly two days later, in a parallel session of ICHEP in Melbourne, Australia [4], a group of physicists decided to explain the significance of this discovery to the public. They used a tool called "Hangout", part of the relatively new Google+ social media platform [5], to converse directly with the public via a webcast videoconference. The demand to join this Hangout [6] overloaded the server several times. In the end, a compromise involving Q&A via comments was set up, and the conversation was underway. We present a new project born shortly after this experience called Hangout with CERN [7], and discuss its success in creating an effective conversational channel between the public and particle physicists. We review earlier efforts by both CMS and ATLAS contributing to this development, and then describe the current programme, involving nearly all aspects of CERN, and some topics that go well beyond that. We conclude by discussing the potential of the programme both to improve our accountability to the public and to train our community for public communication.

  16. Simulation and measurements of the response of an air ionisation chamber exposed to a mixed high-energy radiation field.

    PubMed

    Vincke, Helmut; Forkel-Wirth, Doris; Perrin, Daniel; Theis, Chris

    2005-01-01

    CERN's radiation protection group operates a network of simple and robust ionisation chambers that are installed inside CERN's accelerator tunnels. These ionisation chambers are used for the remote reading of ambient dose equivalent rates inside the machines during beam-off periods. This Radiation Protection Monitor for dose rates due to Induced Radioactivity ('PMI', trade name: PTW, Type 34031) is a non-confined air ionisation plastic chamber which is operated under atmospheric pressure. Besides its current field of operation, it is planned to extend the use of this detector in the Large Hadron Collider to measure radiation under beam operation conditions, to obtain an indication of the machine performance. Until now, studies of the PMI detector have been limited to the response to photons. In order to evaluate its response to other radiation components, this chamber type was tested at CERF, the high-energy reference field facility at CERN. Six PMI detectors were installed around a copper target being irradiated by a mixed hadron beam with a momentum of 120 GeV/c. Each of the chosen detector positions was defined by a different radiation field, varying in type and energy of the incident particles. For all positions, detailed measurements and FLUKA simulations of the detector response were performed. This paper presents the promising comparison between the measurements and simulations and analyses the influence of the different particle types on the resulting detector response.

  17. Nuclear data activities at the n_TOF facility at CERN

    NASA Astrophysics Data System (ADS)

    Gunsing, F.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bécares, V.; Bacak, M.; Balibrea-Correa, J.; Barbagallo, M.; Barros, S.; Bečvář, F.; Beinrucker, C.; Belloni, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brugger, M.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Castelluccio, D. M.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Colonna, N.; Cortés-Giraldo, M. A.; Cortés, G.; Cosentino, L.; Damone, L. A.; Deo, K.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Frost, R. J. W.; Furman, V.; Ganesan, S.; García, A. R.; Gawlik, A.; Gheorghe, I.; Glodariu, T.; Gonçalves, I. F.; González, E.; Goverdovski, A.; Griesmayer, E.; Guerrero, C.; Göbel, K.; Harada, H.; Heftrich, T.; Heinitz, S.; Hernández-Prieto, A.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Katabuchi, T.; Kavrigin, P.; Ketlerov, V.; Khryachkov, V.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lerendegui, J.; Licata, M.; Lo Meo, S.; Lonsdale, S. J.; Losito, R.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Mingrone, F.; Mirea, M.; Montesano, S.; Musumarra, A.; Nolte, R.; Oprea, A.; Palomo-Pinto, F. R.; Paradela, C.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Quesada, J. M.; Rajeev, K.; Rauscher, T.; Reifarth, R.; Riego-Perez, A.; Robles, M.; Rout, P.; Radeck, D.; Rubbia, C.; Ryan, J. A.; Sabaté-Gilarte, M.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Sedyshev, P.; Smith, A. G.; Stamatopoulos, A.; Suryanarayana, S. V.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tarrío, D.; Tassan-Got, L.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Vlachoudis, V.; Vlastou, R.; Wallner, A.; Warren, S.; Weigand, M.; Weiss, C.; Wolf, C.; Woods, P. J.; Wright, T.; Žugec, P.

    2016-10-01

    Nuclear data in general, and neutron-induced reaction cross sections in particular, are important for a wide variety of research fields. They play a key role in the safety and criticality assessment of nuclear technology, not only for existing power reactors but also for radiation dosimetry, medical applications, the transmutation of nuclear waste, accelerator-driven systems, fuel cycle investigations and future reactor systems such as Generation IV. Applications of nuclear data are also related to research fields such as the study of nuclear level densities and stellar nucleosynthesis. Simulations and calculations of nuclear technology applications largely rely on evaluated nuclear data libraries. The evaluations in these libraries are based both on experimental data and on theoretical models. Experimental nuclear reaction data are compiled on a worldwide basis by the international network of Nuclear Reaction Data Centres (NRDC) in the EXFOR database. The EXFOR database forms an important link between nuclear data measurements and the evaluated data libraries. CERN's neutron time-of-flight facility n_TOF has produced a considerable amount of experimental data since becoming fully operational with the start of the scientific measurement programme in 2001. While for a long period a single measurement station (EAR1), located at 185 m from the neutron production target, was available, the construction of a second beam line at 20 m (EAR2) in 2014 has substantially increased the measurement capabilities of the facility. An outline of the experimental nuclear data activities at CERN's neutron time-of-flight facility n_TOF will be presented.
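    As a hedged illustration of the time-of-flight principle underlying the facility (not code from the record itself), the relativistic relation between flight path, flight time, and neutron kinetic energy can be sketched as:

    ```python
    import math

    M_N = 939.56542   # neutron rest-mass energy [MeV]
    C = 299792458.0   # speed of light [m/s]

    def neutron_energy_from_tof(path_m, tof_s):
        """Relativistic neutron kinetic energy [MeV] from flight path and time."""
        beta = path_m / (tof_s * C)
        if beta >= 1.0:
            raise ValueError("time of flight shorter than light travel time")
        gamma = 1.0 / math.sqrt(1.0 - beta * beta)
        return (gamma - 1.0) * M_N

    # Illustrative case: a neutron covering EAR1's 185 m flight path in 2 ms
    # lands in the epithermal range (a few tens of eV).
    e_mev = neutron_energy_from_tof(185.0, 2e-3)
    ```

    At the shorter 20 m EAR2 flight path, the same formula gives correspondingly shorter flight times for a given energy, which is what yields the higher neutron flux at EAR2.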

  18. Footprints of Fascination: Digital Traces of Public Engagement with Particle Physics on CERN's Social Media Platforms.

    PubMed

    Kahle, Kate; Sharon, Aviv J; Baram-Tsabari, Ayelet

    2016-01-01

    Although the scientific community increasingly recognizes that its communication with the public may shape civic engagement with science, few studies have characterized how this communication occurs online. Social media plays a growing role in this engagement, yet it is not known if or how different platforms support different types of engagement. This study sets out to explore how users engage with science communication items on different platforms of social media, and what are the characteristics of the items that tend to attract large numbers of user interactions. Here, user interactions with almost identical items on five of CERN's social media platforms were quantitatively compared over an eight-week period, including likes, comments, shares, click-throughs, and time spent on CERN's site. The most popular items were qualitatively analyzed for content features. Findings indicate that as audience size of a social media platform grows, the total rate of engagement with content tends to grow as well. However, per user, engagement tends to decline with audience size. Across all platforms, similar topics tend to consistently receive high engagement. In particular, awe-inspiring imagery tends to frequently attract high engagement across platforms, independent of newsworthiness. To our knowledge, this study provides the first cross-platform characterization of public engagement with science on social media. Findings, although focused on particle physics, have a multidisciplinary nature; they may serve to benchmark social media analytics for assessing science communication activities in various domains. Evidence-based suggestions for practitioners are also offered.

  19. Integration of XRootD into the cloud infrastructure for ALICE data analysis

    NASA Astrophysics Data System (ADS)

    Kompaniets, Mikhail; Shadura, Oksana; Svirin, Pavlo; Yurchenko, Volodymyr; Zarochentsev, Andrey

    2015-12-01

    Cloud technologies allow easy load balancing between different tasks and projects. From the viewpoint of data analysis in the ALICE experiment, the cloud makes it possible to deploy software using the CERN Virtual Machine (CernVM) and the CernVM File System (CVMFS), to run different (including outdated) versions of software for long-term data preservation, and to dynamically allocate resources for different computing activities, e.g. a grid site, the ALICE Analysis Facility (AAF), and possible usage for local projects or other LHC experiments. We present a cloud solution for Tier-3 sites based on OpenStack and Ceph distributed storage with an integrated XRootD-based storage element (SE). One of the key features of the solution is that Ceph is used as a backend for the Cinder Block Storage service of OpenStack and, at the same time, as a storage backend for XRootD, with the redundancy and availability of data preserved by the Ceph settings. For faster and easier OpenStack deployment, the Packstack solution was applied, which is based on the Puppet configuration management system. Ceph installation and configuration operations are structured, converted to Puppet manifests describing node configurations, and integrated into Packstack. This solution can be easily deployed, maintained and used even by small groups with limited computing resources and by small organizations, which usually lack IT support. The proposed infrastructure has been tested on two different clouds (SPbSU & BITP) and integrates successfully with the ALICE data analysis model.

  20. High duty factor plasma generator for CERN's Superconducting Proton Linac.

    PubMed

    Lettry, J; Kronberger, M; Scrivens, R; Chaudet, E; Faircloth, D; Favre, G; Geisser, J-M; Küchler, D; Mathot, S; Midttun, O; Paoluzzi, M; Schmitzer, C; Steyaert, D

    2010-02-01

    CERN's Linac4 is a 160 MeV linear accelerator currently under construction. It will inject negatively charged hydrogen ions into CERN's PS Booster. Its ion source is a non-cesiated, rf-driven H(-) volume source directly inspired by the DESY source and is intended to deliver pulses of 80 mA of H(-) for 0.4 ms at a 2 Hz repetition rate. The Superconducting Proton Linac (SPL) project is part of the luminosity upgrade of the Large Hadron Collider. It consists of an extension of Linac4 up to 5 GeV and is foreseen to deliver protons to a future 50 GeV synchrotron (PS2). For the SPL high-power option (HP-SPL), the ion source would deliver pulses of 80 mA of H(-) for 1.2 ms and operate at a 50 Hz repetition rate. This significant upgrade motivates the design of the new water-cooled plasma generator presented in this paper. Its engineering is based on the results of a finite-element thermal study of the Linac4 H(-) plasma generator that identified critical components and thermal barriers. A cooling system is proposed which achieves the required heat dissipation while maintaining the original functionality. Materials with higher thermal conductivity are selected and, wherever possible, thermal barriers resulting from low-pressure contacts are removed by brazing metals onto insulators. The cooling circuit of the AlN plasma chamber is inspired by the approach chosen for the cesiated, high duty factor rf H(-) source operating at SNS.

  1. A small scale remote cooling system for a superconducting cyclotron magnet

    NASA Astrophysics Data System (ADS)

    Haug, F.; Berkowitz Zamorra, D.; Michels, M.; Gomez Bosch, R.; Schmid, J.; Striebel, A.; Krueger, A.; Diez, M.; Jakob, M.; Keh, M.; Herberger, W.; Oesterle, D.

    2017-02-01

    Through a technology transfer program, CERN is involved in the R&D of a compact superconducting cyclotron for future clinical radioisotope production, a project led by the Spanish research institute CIEMAT. For the remote cooling of the low-Tc superconducting magnet operating at 4.5 K, CERN has designed a small scale refrigeration system, the Cryogenic Supply System (CSS). This refrigeration system consists of a commercial two-stage 1.5 W @ 4.2 K GM cryocooler and a separate forced-flow circuit. The forced-flow circuit extracts the cooling power of the first and second stage cold tips, respectively. Both units are installed in a common vacuum vessel and, in the final configuration, a low-loss transfer line will provide the link to the magnet cryostat for the cooling of the thermal shield with helium at 40 K and of the two superconducting coils with two-phase helium at 4.5 K. Currently the CSS is in the testing phase at CERN in stand-alone mode, without the magnet and the transfer line. We have added a "validation unit" housed in the vacuum vessel of the CSS representing the thermo-hydraulic part of the cyclotron magnet. It is equipped with electrical heaters which allow the simulation of the thermal loads of the magnet cryostat. A cooling power of 1.4 W at 4.5 K and 25 W at the thermal shield temperature level has been measured. The data produced confirm the design principle of the CSS, which could thus be validated.

  2. Numerical simulations of energy deposition caused by 50 MeV—50 TeV proton beams in copper and graphite targets

    NASA Astrophysics Data System (ADS)

    Nie, Y.; Schmidt, R.; Chetvertkova, V.; Rosell-Tarragó, G.; Burkart, F.; Wollmann, D.

    2017-08-01

    The conceptual design of the Future Circular Collider (FCC) is being carried out actively in an international collaboration hosted by CERN, for the post-Large Hadron Collider (LHC) era. The target center-of-mass energy of proton-proton collisions for the FCC is 100 TeV, nearly an order of magnitude higher than for the LHC. The existing CERN accelerators will be used to prepare the beams for the FCC. Concerning beam-related machine protection of the whole accelerator chain, it is critical to assess the consequences of beam impact on various accelerator components in cases of controlled and uncontrolled beam losses. In this paper, we study the energy deposition of protons in solid copper and graphite targets, since the two materials are widely used in magnets, beam screens, collimators, and beam absorbers. Nominal injection and extraction energies in the hadron accelerator complex at CERN were selected in the range of 50 MeV-50 TeV. Three beam sizes were studied for each energy, corresponding to typical values of the betatron function. Specifically for thin targets, comparisons between FLUKA simulations and analytical Bethe-equation calculations were carried out, which showed that the damage potential of a few-millimeter-thick graphite target and of a submillimeter-thick copper foil can be well estimated directly by the Bethe equation. The paper provides a valuable reference for the quick evaluation of potential damage to accelerator elements over a large range of beam parameters when beam loss occurs.
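    The Bethe-equation estimate mentioned above can be sketched as follows. This is a minimal implementation of the standard Bethe mean stopping power (PDG-style constants, no density or shell corrections), not the authors' FLUKA setup; the 10 GeV proton in copper is an illustrative input:

    ```python
    import math

    K = 0.307075      # 4*pi*N_A*r_e^2*m_e*c^2 [MeV mol^-1 cm^2]
    ME = 0.510999     # electron rest-mass energy [MeV]
    MP = 938.272      # proton rest-mass energy [MeV]

    def bethe_dedx(ekin_mev, Z, A, I_mev):
        """Mean mass stopping power -dE/dx [MeV cm^2/g] for a proton,
        from the Bethe equation without density/shell corrections."""
        gamma = 1.0 + ekin_mev / MP
        beta2 = 1.0 - 1.0 / gamma**2
        bg2 = beta2 * gamma**2
        ratio = ME / MP
        # Maximum energy transfer to an electron in a single collision
        tmax = 2.0 * ME * bg2 / (1.0 + 2.0 * gamma * ratio + ratio**2)
        log_term = 0.5 * math.log(2.0 * ME * bg2 * tmax / I_mev**2)
        return K * (Z / A) / beta2 * (log_term - beta2)

    # 10 GeV protons in copper (Z=29, A=63.546, mean excitation energy ~322 eV)
    dedx_cu = bethe_dedx(10_000.0, 29, 63.546, 322e-6)
    ```

    Multiplying the mass stopping power by the material density and target thickness gives the quick energy-deposition estimate that the paper compares against full simulations for thin targets.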

  3. Streamlining CASTOR to manage the LHC data torrent

    NASA Astrophysics Data System (ADS)

    Lo Presti, G.; Espinal Curull, X.; Cano, E.; Fiorini, B.; Ieri, A.; Murray, S.; Ponce, S.; Sindrilaru, E.

    2014-06-01

    This contribution describes the evolution of the main CERN storage system, CASTOR, as it manages the bulk data stream of the LHC and other CERN experiments, achieving over 90 PB of stored data by the end of LHC Run 1. This evolution was marked by the introduction of policies to optimize the tape sub-system throughput, moving towards a cold storage system where data placement is managed by the experiments' production managers. More efficient tape migrations and recalls have been implemented and deployed, where bulk meta-data operations greatly reduce the overhead due to small files. A repack facility is now integrated in the system, and it has been enhanced to automate the repacking of several tens of petabytes required in 2014 to prepare for the next LHC run. Finally, the scheduling system has been evolved to integrate the internal monitoring. To efficiently manage the service, a solid monitoring infrastructure is required, capable of analyzing the logs produced by the different components (about 1 kHz of log messages). A new system has been developed and deployed which uses a transport messaging layer provided by the CERN-IT Agile Infrastructure and exploits technologies including Hadoop and HBase. This enables efficient data mining by making use of MapReduce techniques, as well as real-time data aggregation and visualization. The outlook for the future is also presented: directions and possible evolution will be discussed in view of the restart of data-taking activities.
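    The MapReduce-style log aggregation described above can be illustrated with a minimal in-memory sketch; the log line format and component names here are hypothetical placeholders, not CASTOR's actual log schema:

    ```python
    from collections import Counter

    # Hypothetical log format for illustration: "timestamp component severity message"
    logs = [
        "2014-06-01T12:00:00 tapeserverd INFO mount ok",
        "2014-06-01T12:00:01 stagerd ERROR disk copy failed",
        "2014-06-01T12:00:02 tapeserverd INFO migration done",
    ]

    # "Map" each line to a (component, severity) key, then "reduce" by counting;
    # a Hadoop job over HDFS-stored logs performs the same two phases at scale.
    def mapper(line):
        _, component, severity, *_ = line.split()
        return component, severity

    counts = Counter(mapper(line) for line in logs)
    ```

    At ~1 kHz of messages this aggregation is trivial per node; the point of the Hadoop/HBase stack is running the same key-count pattern over the accumulated history and serving the aggregates in near real time.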

  4. Numerical simulation of electromagnetic fields and impedance of CERN LINAC4 H(-) source taking into account the effect of the plasma.

    PubMed

    Grudiev, A; Lettry, J; Mattei, S; Paoluzzi, M; Scrivens, R

    2014-02-01

    Numerical simulation of the CERN LINAC4 H(-) source 2 MHz RF system has been performed taking into account a realistic geometry from a 3D Computer Aided Design model, using a commercial FEM high-frequency simulation code. The effect of the plasma has been added to the model by the approximation of a homogeneous electrically conducting medium. Electric and magnetic fields, RF power losses, and the impedance of the circuit have been calculated for different values of the plasma conductivity. Three different regimes have been found depending on the plasma conductivity: (1) zero or low plasma conductivity results in the RF electric field induced by the RF antenna being mainly capacitive and axially directed; (2) intermediate conductivity results in the expulsion of the capacitive electric field from the plasma, and the RF power coupling, which increases linearly with the plasma conductivity, is dominated by the inductive azimuthal electric field; (3) high conductivity results in the shielding of both the electric and magnetic fields from the plasma due to the skin effect, which reduces RF power coupling to the plasma. From these simulations and measurements of the RF power coupling on the CERN source, a value of the plasma conductivity has been derived; it agrees well with an analytical estimate calculated from the measured plasma parameters. In addition, the simulated and measured impedances with and without plasma show very good agreement as well, demonstrating the validity of the plasma model used in the RF simulations.
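    The three conductivity regimes can be connected to the classical skin depth, delta = sqrt(2 / (mu0 * sigma * omega)): shielding sets in once delta drops well below the plasma radius. A minimal sketch at the source's 2 MHz drive frequency, with purely illustrative conductivity values (not the measured ones):

    ```python
    import math

    MU0 = 4e-7 * math.pi  # vacuum permeability [H/m]

    def skin_depth(sigma, freq):
        """Electromagnetic skin depth [m] for conductivity sigma [S/m]
        at frequency freq [Hz], in the good-conductor approximation."""
        return math.sqrt(1.0 / (math.pi * freq * MU0 * sigma))

    # Scan conductivities spanning the three regimes at 2 MHz: the field is
    # shielded once the skin depth is much smaller than the plasma radius
    # (of order centimetres for an ion-source plasma chamber).
    depths = {s: skin_depth(s, 2e6) for s in (1.0, 100.0, 10_000.0)}
    ```

    For the illustrative values, the skin depth falls from tens of centimetres (fields penetrate freely) through a few centimetres (inductive coupling) to millimetres (fields shielded), mirroring regimes (1)-(3).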

  5. Footprints of Fascination: Digital Traces of Public Engagement with Particle Physics on CERN's Social Media Platforms

    PubMed Central

    Baram-Tsabari, Ayelet

    2016-01-01

    Although the scientific community increasingly recognizes that its communication with the public may shape civic engagement with science, few studies have characterized how this communication occurs online. Social media plays a growing role in this engagement, yet it is not known if or how different platforms support different types of engagement. This study sets out to explore how users engage with science communication items on different platforms of social media, and what are the characteristics of the items that tend to attract large numbers of user interactions. Here, user interactions with almost identical items on five of CERN's social media platforms were quantitatively compared over an eight-week period, including likes, comments, shares, click-throughs, and time spent on CERN's site. The most popular items were qualitatively analyzed for content features. Findings indicate that as audience size of a social media platform grows, the total rate of engagement with content tends to grow as well. However, per user, engagement tends to decline with audience size. Across all platforms, similar topics tend to consistently receive high engagement. In particular, awe-inspiring imagery tends to frequently attract high engagement across platforms, independent of newsworthiness. To our knowledge, this study provides the first cross-platform characterization of public engagement with science on social media. Findings, although focused on particle physics, have a multidisciplinary nature; they may serve to benchmark social media analytics for assessing science communication activities in various domains. Evidence-based suggestions for practitioners are also offered. PMID:27232498

  6. "Infinitos"

    NASA Astrophysics Data System (ADS)

    1994-04-01

    On Friday, 22 April 1994, a new science exhibition ``Infinitos", arranged jointly by Lisboa'94, CERN and ESO, will open at the Museu de Electricidade on the waterfront of Lisbon, the capital of Portugal. In a series of spectacular displays, it illustrates man's current understanding of how the Universe works - from the tiniest structures of matter to the most far flung galaxies. On this day, it will be inaugurated by the President of Lisboa'94, Prof. Vitor Constancio, the Portuguese Science Minister, Prof. L. Valente de Oliveira, Prof. C. Llewellyn Smith, Director General of CERN [2] and Dr. P. Creola, President of ESO Council. This exhibition is part of a rich cultural programme taking place at Lisbon during 1994 in the frame of ``Lisboa 94 - European City of Culture", after which it will travel to major cities around Europe. The frontiers of our knowledge push into inner space - the structure of the smallest components of matter - and into outer space - the dramatic phenomena of distant galaxies. Two of Europe's leading science organisations are playing a crucial role in this great human adventure. The European Laboratory for Particle Physics, CERN, operates the mighty accelerators and colliding beam machines to penetrate deep into matter and recreate the conditions which prevailed in the Universe a tiny fraction of a second after the Big Bang. The European Southern Observatory, ESO, operates the largest optical observatory in the world with a range of advanced telescopes searching the sky to study the evolution and content of our Universe. The ``Infinitos'' exhibition uses many modern exhibition techniques, including sophisticated audio-visual presentations and interactive video programmes. Visitors enter through a gallery of portraits of the most celebrated scientists from the 16th to 20th centuries and an exhibition of art inspired by scientific research. 
After passing a cosmic ray detector showing the streams of particles which pour down constantly from outer space, visitors continue into a central area where they are confronted with the essential questions of astro- and particle physics, for instance "What is the Universe made of?", "How was the Universe created?", "What is in the sky?", "What is Dark Matter?", "Where does the stuff in our bodies come from?", and "Are we alone in the Universe?" A central theme of this display is "What we don't know". In the second part of the exhibition visitors are shown the instruments and techniques used in today's big science research which will help to provide the answers. There are special displays on Europe's future large research projects such as the Large Hadron Collider (LHC) at CERN, which will bring protons into head-on collision at higher energies (14 TeV) than ever before to allow scientists to penetrate still further into the structure of matter and recreate the conditions prevailing in the Universe just 10^-12 seconds after the "Big Bang", when the temperature was 10^16 degrees. Another highlight is a large interactive model of ESO's Very Large Telescope (VLT), the world's most ambitious optical telescope project, now under construction. The telescope's unequalled potential for exciting astronomical observations at the outer reaches of the Universe is clearly explained. Special emphasis is given to the contribution of Portuguese research institutes to the work of CERN and ESO, and particle physicists and astronomers from Portugal will be present at the exhibition to talk to visitors about their work. This exhibition will remain open until 12 June 1994 and will be a major attraction, also for the many tourists visiting this year's European City of Culture. 1. This is a joint Press Release of Lisboa'94, CERN and ESO. 2. CERN, the European Laboratory for Particle Physics, has its headquarters in Geneva. 
At present, its Member States are Austria, Belgium, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, Netherlands, Norway, Poland, Portugal, the Slovak Republic, Spain, Sweden, Switzerland and the United Kingdom. Israel, the Russian Federation, Turkey, Yugoslavia (status suspended after UN embargo, June 1992), the European Commission and Unesco have observer status.

  7. CLOUDCLOUD : general-purpose instrument monitoring and data managing software

    NASA Astrophysics Data System (ADS)

    Dias, António; Amorim, António; Tomé, António

    2016-04-01

    An effective experiment depends on the ability to store and deliver data and information to all participating parties, regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general-purpose data acquisition, management and instrument monitoring platform that is fast, easy to use, lightweight and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in experiments or monitoring stations of any size, from single computers to large networks of computers, to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and metadata are stored and accessed via a specially designed database architecture, and any type of instrument output is accepted by our continuously growing parsing application. Multiple databases can be used to separate different data-taking periods, or a single database can be used if, for instance, an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data, and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them. 
Automatic computer integration is achieved by a locally running Python-based parsing agent that communicates with a main server application, guaranteeing that all instruments assigned to that computer are monitored with parsing intervals as short as milliseconds. This software (server + agents + interface + database) comes in simple, ready-to-use packages that can be installed on any operating system, including Android and iOS. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods, or in large collaborations where data require homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides a performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.
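A rough sketch of the agent/server pattern described above; the endpoint URL, payload schema, and helper names are hypothetical illustrations, not the actual CLOUDCLOUD interfaces:

```python
import json
import time
import urllib.request

SERVER = "http://monitoring-server.local/api/readings"  # hypothetical endpoint

def read_instrument():
    """Placeholder for parsing one reading from an instrument's output."""
    return {"instrument": "thermometer-1", "value": 21.5, "t": time.time()}

def post_reading(reading):
    """Ship one reading to the central server as JSON over HTTP."""
    req = urllib.request.Request(
        SERVER, data=json.dumps(reading).encode(),
        headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)

def run(poll_interval_s=0.01, cycles=3):
    """Poll the instrument at millisecond-scale intervals and yield readings."""
    for _ in range(cycles):
        reading = read_instrument()
        # post_reading(reading)  # network call; left commented out in this sketch
        time.sleep(poll_interval_s)
        yield reading

readings = list(run())
```

The real agent additionally registers its host with the server so newly connected computers and instruments are discovered automatically, as described above.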

  8. Fermilab Today

    Science.gov Websites


  9. Spin and model determination of Z′-boson in lepton pair production at CERN LHC

    NASA Astrophysics Data System (ADS)

    Tsytrinov, A. V.; Pankov, A. A.; Serenkova, I. A.; Bednyakov, V. A.

    2017-12-01

    Many new physics models predict the production of heavy resonances in the Drell-Yan channel that can be observed at the CERN LHC. If a new resonance is discovered as a peak in the dilepton invariant mass distribution at the LHC, the identification of its spin and couplings can be done by measuring production rates and angular distributions of the decay products. Here we discuss the spin-1 identification of the Z′-boson for a set of representative models (SSM, E6, LR, and ALR) against the spin-2 RS graviton resonance and a spin-0 sneutrino resonance with the same mass, producing the same number of events under the resonance peak. We use the center-edge asymmetry for spin identification, as well as the total dilepton production cross section for distinguishing the considered Z′-boson models from one another.
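    The center-edge asymmetry used for spin discrimination can be sketched numerically. The leading-order angular shapes below (1 + cos^2 theta for a spin-1 Z′, 1 - cos^4 theta for a gg-produced spin-2 graviton) and the z* = 0.5 boundary are textbook illustrations, not the paper's full analysis:

    ```python
    def a_ce(dist, zstar=0.5, n=2000):
        """Center-edge asymmetry for an angular distribution dist(z), z = cos(theta):
        A_CE = (sigma_center - sigma_edge) / sigma_total, with |z| < zstar 'center'."""
        def integrate(a, b):
            # composite trapezoid rule
            h = (b - a) / n
            s = 0.5 * (dist(a) + dist(b)) + sum(dist(a + i * h) for i in range(1, n))
            return s * h
        total = integrate(-1.0, 1.0)
        center = integrate(-zstar, zstar)
        return (2.0 * center - total) / total

    spin1 = lambda z: 1.0 + z * z       # q qbar -> Z' -> l+ l-
    spin2_gg = lambda z: 1.0 - z ** 4   # g g -> G -> l+ l- (RS graviton)

    a1, a2 = a_ce(spin1), a_ce(spin2_gg)
    ```

    The two hypotheses give clearly separated asymmetries (negative for spin 1, positive for the gg-produced spin-2 shape at this z*), which is why A_CE discriminates spin independently of the overall event rate.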

  10. Upgrade of the cryogenic infrastructure of SM18, CERN main test facility for superconducting magnets and RF cavities

    NASA Astrophysics Data System (ADS)

    Perin, A.; Dhalla, F.; Gayet, P.; Serio, L.

    2017-12-01

    SM18 is CERN's main facility for testing superconducting accelerator magnets and superconducting RF cavities. Its cryogenic infrastructure will have to be significantly upgraded in the coming years, starting in 2019, to meet the testing requirements of the LHC High Luminosity project and of the R&D program for superconducting magnets and RF equipment until 2023 and beyond. This article presents the assessment of the cryogenic needs based on the foreseen test program and on past testing experience. The current configuration of the cryogenic infrastructure is presented and several possible upgrade scenarios are discussed. The chosen upgrade configuration is then described, and the characteristics of the main newly required cryogenic equipment, in particular a new 35 g/s helium liquefier, are presented. The upgrade implementation strategy and the plan to meet the required schedule are then described.

  11. Design approach for the development of a cryomodule for compact crab cavities for Hi-Lumi LHC

    NASA Astrophysics Data System (ADS)

    Pattalwar, Shrikant; Jones, Thomas; Templeton, Niklas; Goudket, Philippe; McIntosh, Peter; Wheelhouse, Alan; Burt, Graeme; Hall, Ben; Wright, Loren; Peterson, Tom

    2014-01-01

    A prototype Superconducting RF (SRF) cryomodule, comprising multiple compact crab cavities, is foreseen to realise a local crab crossing scheme for the "Hi-Lumi LHC", a project launched by CERN to increase the luminosity performance of the LHC. A cryomodule with two cavities will initially be installed and tested on the SPS drive accelerator at CERN to evaluate performance with high-intensity proton beams. A series of boundary conditions influences the design of the cryomodule prototype, arising from the complexity of the cavity design, the requirement for multiple RF couplers, the close proximity to the second LHC beam pipe, and the tight space constraints in the SPS and LHC tunnels. As a result, the design of the helium vessel and the cryomodule has become extremely challenging. This paper assesses some of the critical cryogenic and engineering design requirements and describes an optimised cryomodule solution for the evaluation tests on the SPS.

  12. PAMELA Space Mission: The Transition Radiation Detector

    NASA Astrophysics Data System (ADS)

    Ambriola, M.; Bellotti, R.; Cafagna, F.; Circella, M.; De Marzo, C.; Giglietto, N.; Marangelli, B.; Mirizzi, N.; Romita, M.; Spinelli, P.

    2003-07-01

    The PAMELA telescope is a satellite-borne magnetic spectrometer built to fulfill the primary scientific objectives of detecting antiparticles (antiprotons and positrons) in the cosmic rays and of measuring the spectra of cosmic-ray particles. The PAMELA telescope is currently under integration and is composed of: a silicon tracker housed in a permanent magnet, a time-of-flight and an anticoincidence system both made of plastic scintillators, a silicon imaging calorimeter, a neutron detector, and a Transition Radiation Detector (TRD). The TRD is composed of 9 sensitive layers of straw tubes working in proportional mode, for a total of 1024 channels. Each layer is interleaved with a radiator plane made of carbon fibers. The TRD characteristics will be described along with its performance, studied by exposing the detector to particle beams of electrons, pions, muons and protons of different momenta at both the CERN-PS and CERN-SPS facilities.

  13. Production of deuterium, tritium, and He 3 in central Pb + Pb collisions at 20 A , 30 A , 40 A , 80 A , and 158 A  GeV at the CERN Super Proton Synchrotron

    DOE PAGES

    Anticic, T.; Baatar, B.; Bartke, J.; ...

    2016-10-13

    Production of d, t, and 3He nuclei in central Pb + Pb interactions was studied at five collision energies (√(s_NN) = 6.3, 7.6, 8.8, 12.3, and 17.3 GeV) with the NA49 detector at the CERN Super Proton Synchrotron. Transverse momentum spectra, rapidity distributions, and particle ratios were measured. Yields are compared to predictions of statistical models. Phase-space distributions of light nuclei are discussed and compared to those of protons in the context of a coalescence approach. Finally, the coalescence parameters B2 and B3, as well as coalescence radii for d and 3He, were determined as a function of transverse mass at all energies.
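    The coalescence relation behind B2 is E_d d³N/dp_d³ = B2 (E_p d³N/dp_p³)², evaluated at p_d = 2 p_p. A minimal sketch of extracting B2 from measured invariant yields; the numbers are purely illustrative, not NA49 measurements:

    ```python
    def coalescence_b2(inv_yield_d, inv_yield_p):
        """Coalescence parameter B2 from the deuteron invariant yield
        E_d d3N/dp_d^3 and the proton invariant yield E_p d3N/dp_p^3,
        both evaluated at momenta related by p_d = 2 p_p."""
        return inv_yield_d / inv_yield_p ** 2

    # Illustrative inputs only (units follow whatever the yields carry):
    b2 = coalescence_b2(inv_yield_d=4.0e-4, inv_yield_p=0.5)
    ```

    For A = 3 nuclei such as 3He the same construction uses the cube of the proton yield, giving B3; since B_A scales with the inverse source volume to the power A-1, its transverse-mass dependence probes the size of the emitting source.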

  14. The CMS Tier0 goes cloud and grid for LHC Run 2

    DOE PAGES

    Hufnagel, Dirk

    2015-12-23

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. Furthermore, this contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  15. Integration of Oracle and Hadoop: Hybrid Databases Affordable at Scale

    NASA Astrophysics Data System (ADS)

    Canali, L.; Baranowski, Z.; Kothuri, P.

    2017-10-01

    This work reports on the activities aimed at integrating Oracle and Hadoop technologies for the use cases of CERN database services, and in particular on the development of solutions for offloading data and queries from Oracle databases into Hadoop-based systems. The goal and interest of this investigation are to increase the scalability and optimize the cost/performance footprint of some of our largest Oracle databases. These concepts have been applied, among others, to build offline copies of the CERN accelerator controls and logging databases. The tested solution makes it possible to run reports on the controls data offloaded into Hadoop without affecting the critical production database, providing both performance benefits and cost reduction for the underlying infrastructure. Other use cases discussed include building hybrid database solutions with Oracle and Hadoop, offering the combined advantages of a mature relational database system and a scalable analytics engine.

  16. Towards a Future Linear Collider and The Linear Collider Studies at CERN

    ScienceCinema

    Heuer, Rolf-Dieter

    2018-06-15

    During the week of 18-22 October, more than 400 physicists will meet at CERN and in the CICG (International Conference Centre Geneva) to review the global progress towards a future linear collider. The 2010 International Workshop on Linear Colliders will study the physics, detectors and accelerator complex of a linear collider, covering both the CLIC and ILC options. Among the topics presented and discussed will be the progress towards the CLIC Conceptual Design Report in 2011, the ILC Technical Design Report in 2012, physics and detector studies linked to these reports, and an increasing number of common working group activities. The seminar will give an overview of these topics as well as of CERN's linear collider studies, focusing on current activities and initial plans for the period 2011-16. N.B.: The Council Chamber is also reserved for this colloquium, with a live transmission from the Main Auditorium.

  17. Migration of the CERN IT Data Centre Support System to ServiceNow

    NASA Astrophysics Data System (ADS)

    Alvarez Alonso, R.; Arneodo, G.; Barring, O.; Bonfillou, E.; Coelho dos Santos, M.; Dore, V.; Lefebure, V.; Fedorko, I.; Grossir, A.; Hefferman, J.; Mendez Lorenzo, P.; Moller, M.; Pera Mira, O.; Salter, W.; Trevisani, F.; Toteva, Z.

    2014-06-01

    The large potential and flexibility of the ServiceNow infrastructure, based on "best practices" methods, is allowing the migration of some of the ticketing systems traditionally used for monitoring the servers and services available at the CERN IT Computer Centre. This migration enables the standardization and globalization of the ticketing and control systems, implementing a generic system extensible to other departments and users. One of the activities of the Service Management project, together with the Computing Facilities group, has been the migration of the ITCM structure based on Remedy to ServiceNow, within the context of the ITIL process called Event Management. The experience gained during the first months of operation has been instrumental in the migration to ServiceNow of other service monitoring systems and databases. The usage of this structure has also been extended to service tracking at the Wigner Centre in Budapest.

  18. Research on data from the ATLAS experiment at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purohit, Milind V.

    2015-07-31

    In this report, senior investigator Prof. Milind V. Purohit describes research done with data from the ATLAS experiment at CERN. This includes preparing papers on the performance of the CSC detector, searches for supersymmetry (SUSY) using a modern "big data" technique, and a search for SUSY using the "zero leptons razor" (0LRaz) technique. The prediction of the W/Z+jets background processes by the ATLAS simulation prior to the fit is found to be overestimated in the phase space of interest. In all new signal regions presented in this analysis, the number of events observed is consistent with the post-fit SM expectations. Assuming R-parity conservation, the limit on the gluino mass exceeds 1150 GeV at 95% confidence level, for an LSP mass smaller than 100 GeV. Other USC personnel who participated in this project during the period of this grant included graduate student Anton Kravchenko.

  19. Studies of the beam extraction system of the GTS-LHC electron cyclotron resonance ion source at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toivanen, V., E-mail: ville.aleksi.toivanen@cern.ch; Küchler, D.

    2016-02-15

    The 14.5 GHz GTS-LHC Electron Cyclotron Resonance Ion Source (ECRIS) provides multiply charged heavy ion beams for the CERN experimental program. The GTS-LHC beam formation has been studied extensively with lead, argon, and xenon beams under varied beam extraction conditions using the ion optical code IBSimu. The simulation model predicts self-consistently the formation of triangular and hollow beam structures which are often associated with ECRIS ion beams, as well as beam loss patterns which match the observed beam-induced markings in the extraction region. These studies provide a better understanding of the properties of the extracted beams and a way to diagnose the extraction system performance and limitations, which is otherwise challenging due to the lack of direct diagnostics in this region and the limited availability of the ion source for development work.

  20. Evaluation of the Huawei UDS cloud storage system for CERN specific data

    NASA Astrophysics Data System (ADS)

    Zotes Resines, M.; Heikkila, S. S.; Duellmann, D.; Adde, G.; Toebbicke, R.; Hughes, J.; Wang, L.

    2014-06-01

    Cloud storage is an emerging architecture aiming to provide increased scalability and access performance compared to more traditional solutions. CERN is evaluating this promise using Huawei UDS and OpenStack Swift storage deployments, focusing on the needs of high-energy physics. Both deployed setups implement S3, one of the protocols that are emerging as a standard in the cloud storage market. A set of client machines is used to generate I/O load patterns to evaluate the storage system performance. The presented read and write test results indicate scalability from both the metadata and data perspectives. Further, the Huawei UDS cloud storage is shown to recover from a major failure involving the loss of 16 disks. Finally, both cloud storage systems are demonstrated to function as back-end storage to a filesystem, which is used to deliver high-energy physics software.
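    The load-generation method described in this abstract (a set of clients driving concurrent reads and writes and measuring throughput) can be sketched as below. This is a hypothetical, self-contained illustration: a thread-safe in-memory dictionary stands in for the S3-compatible object store, and the keys, payload size, and client counts are invented.

```python
import threading, time

class MockObjectStore:
    """Stand-in for an S3-compatible store (put/get by key)."""
    def __init__(self):
        self._objects = {}
        self._lock = threading.Lock()

    def put(self, key, data):
        with self._lock:
            self._objects[key] = data

    def get(self, key):
        with self._lock:
            return self._objects[key]

def client_load(store, client_id, n_ops, results):
    """One client machine: write then read back n_ops objects, timing the run."""
    payload = b"x" * 1024  # 1 KiB test object
    start = time.perf_counter()
    for i in range(n_ops):
        key = f"client{client_id}/obj{i}"
        store.put(key, payload)
        assert store.get(key) == payload  # verify the round trip
    results[client_id] = n_ops / (time.perf_counter() - start)

store = MockObjectStore()
results = {}
threads = [threading.Thread(target=client_load, args=(store, c, 200, results))
           for c in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(f"aggregate ops/s: {sum(results.values()):.0f}")
```

    Against a real deployment the mock store would be replaced by an S3 client, and scalability would be judged by how the aggregate rate grows as clients are added.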

  1. The ATLAS TAGS database distribution and management - Operational challenges of a multi-terabyte distributed database

    NASA Astrophysics Data System (ADS)

    Viegas, F.; Malon, D.; Cranshaw, J.; Dimitrov, G.; Nowak, M.; Nairz, A.; Goossens, L.; Gallas, E.; Gamboa, C.; Wong, A.; Vinek, E.

    2010-04-01

    The TAG files store summary event quantities that allow a quick selection of interesting events. This data will be produced at a nominal rate of 200 Hz, and is uploaded into a relational database for access from websites and other tools. The estimated database volume is 6TB per year, making it the largest application running on the ATLAS relational databases, at CERN and at other voluntary sites. The sheer volume and high rate of production makes this application a challenge to data and resource management, in many aspects. This paper will focus on the operational challenges of this system. These include: uploading the data from files to the CERN's and remote sites' databases; distributing the TAG metadata that is essential to guide the user through event selection; controlling resource usage of the database, from the user query load to the strategy of cleaning and archiving of old TAG data.

  2. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    NASA Astrophysics Data System (ADS)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment in particular for the first steps of skimming rapidly through hundreds of TB of low relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.
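    The skim-then-aggregate pattern described in this abstract can be sketched with plain Python iterators; on the Hadoop service the same shape would run as Spark transformations over files on HDFS. The metric names and record format below are invented for illustration.

```python
import statistics

def parse(line):
    # Assumed record format: "<host> <metric> <value>".
    host, metric, value = line.split()
    return host, metric, float(value)

def skim(lines, wanted_metric):
    """First pass: discard everything except the small relevant subset."""
    for host, metric, value in map(parse, lines):
        if metric == wanted_metric:
            yield host, value

def aggregate(records):
    """Second pass: per-host summary statistics on the skimmed data."""
    by_host = {}
    for host, value in records:
        by_host.setdefault(host, []).append(value)
    return {h: statistics.mean(v) for h, v in by_host.items()}

raw = [
    "node01 cpu_load 0.9",
    "node01 disk_io 120.0",
    "node02 cpu_load 0.5",
    "node01 cpu_load 0.7",
]
print(aggregate(skim(raw, "cpu_load")))
```

    The design point is that the expensive first pass only filters, so it streams through the low-relevance bulk once, while the statistical modelling runs on the much smaller skimmed output.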

  3. Operation and performance of the EEE network array for the detection of cosmic rays

    NASA Astrophysics Data System (ADS)

    Abbrescia, M.; Avanzini, C.; Baldini, L.; Baldini Ferroli, R.; Batignani, G.; Bencivenni, G.; Bossini, E.; Chiavassa, A.; Cicalò, C.; Cifarelli, L.; Coccia, E.; Corvaglia, A.; De Gruttola, D.; De Pasquale, S.; Di Giovanni, A.; D'Incecco, M.; Dreucci, M.; Fabbri, F. L.; Fattibene, E.; Ferraro, A.; Frolov, V.; Galeotti, P.; Garbini, M.; Gemme, G.; Gnesi, I.; Grazzi, S.; Gustavino, C.; Hatzifotiadou, D.; La Rocca, P.; Licciulli, F.; Maggiora, A.; Maragoto Rodriguez, O.; Maron, G.; Martelli, B.; Mazziotta, M. N.; Miozzi, S.; Nania, R.; Noferini, F.; Nozzoli, F.; Panareo, M.; Panetta, M. P.; Paoletti, R.; Park, W.; Perasso, L.; Pilo, F.; Piragino, G.; Riggi, F.; Righini, G. C.; Sartorelli, G.; Scapparone, E.; Schioppa, M.; Scribano, A.; Selvi, M.; Serci, S.; Siddi, E.; Squarcia, S.; Stori, L.; Taiuti, M.; Terreni, G.; Visnyei, O. B.; Vistoli, M. C.; Votano, L.; Williams, M. C. S.; Zani, S.; Zichichi, A.; Zuyeuski, R.

    2017-02-01

    The EEE (Extreme Energy Events) Project is an experiment for the detection of cosmic ray muons by means of a sparse array of telescopes, each made of three Multigap Resistive Plate Chambers (MRPC), distributed across the Italian territory and at CERN. The main scientific goals of the Project are the investigation of the properties of the local muon flux, the detection of Extensive Air Showers (EAS) and the search for long-distance correlations between far telescopes. The Project is also characterized by a strong educational and outreach aspect, since the telescopes are managed by teams of students and teachers who previously constructed them at CERN. In this paper an overall description of the experiment is given, including the design, construction and performance of the telescopes. The operation of the whole array, which currently consists of more than 50 telescopes, is also presented by showing the most recent physics results.

  4. RICH upgrade in LHCb experiment

    NASA Astrophysics Data System (ADS)

    Pistone, A.; LHCb RICH Collaboration

    2017-01-01

    The LHCb experiment is dedicated to precision measurements of CP violation and rare decays of B hadrons at the Large Hadron Collider (LHC) at CERN (Geneva). The second long shutdown of the LHC is currently scheduled to begin in 2019. During this period the LHCb experiment with all its sub-detectors will be upgraded in order to run at an instantaneous luminosity of 2 × 10^{33} cm^{-2} s^{-1}, about a factor of 5 higher than the current luminosity, and to read out data at a rate of 40 MHz into a flexible software-based trigger. The Ring Imaging CHerenkov (RICH) system will require new photon detectors and modifications to the optics of the upstream detector. Tests of a prototype of the smallest constituent of the new RICH system have been performed during test-beam sessions at the North Area test beam facility at CERN in recent years.

  5. Opto-mechanical design of vacuum laser resonator for the OSQAR experiment

    NASA Astrophysics Data System (ADS)

    Hošek, Jan; Macúchová, Karolina; Nemcová, Šárka; Kunc, Štěpán.; Šulc, Miroslav

    2015-01-01

    This paper gives a short overview of the laser-based OSQAR experiment at CERN, which is focused on the search for axions and axion-like particles. The OSQAR experiment uses two experimental methods for the axion search: measurement of the ultra-fine vacuum magnetic birefringence, and a method based on the "light shining through the wall" experiment. Because both experimental methods have reached their attainable limits of sensitivity, we have focused on designing a vacuum laser resonator. The resonator will increase the number of convertible photons and their dwell time within the magnetic field. This paper presents an opto-mechanical design of a two-component transportable vacuum laser resonator. The developed mechanical design allows the resonator to be used as a 0.8 m long prototype for laboratory testing; after transportation and replacement of the mirrors, it can be mounted on the LHC magnet at CERN to form a 20 m long vacuum laser resonator.

  6. KTAG: The Kaon Identification Detector for CERN experiment NA62

    NASA Astrophysics Data System (ADS)

    Fry, J. R.; CERN NA62 Collaboration

    2016-07-01

    In the study of ultra-rare kaon decays, CERN experiment NA62 exploits an unseparated monochromatic (75 GeV/c) beam of charged particles with a flux of 800 MHz, of which 50 MHz are K+. Kaons are identified with more than 95% efficiency, a time resolution of better than 100 ps, and misidentification of less than 10^{-4} using KTAG, a differential, ring-focussed Cherenkov detector. KTAG utilises 8 sets of 48 Hamamatsu PMTs, of which 32 are of type 9880 and 16 of type 7400, with signals fed directly to the differential inputs of NINO front-end boards and then to TDC cards within the TEL62 system. Leading and trailing edges of the PMT signal are digitised, enabling slewing corrections to be made, and a mean hit rate of 5 MHz per PMT is supported. The electronics is housed within a cooled and insulated Faraday cage with environmental monitoring capabilities.

  7. Mapping remote and multidisciplinary learning barriers: lessons from challenge-based innovation at CERN

    NASA Astrophysics Data System (ADS)

    Jensen, Matilde Bisballe; Utriainen, Tuuli Maria; Steinert, Martin

    2018-01-01

    This paper presents the difficulties experienced by students participating in challenge-based innovation at CERN, a multidisciplinary, remote-collaboration engineering design course. The aim is to identify learning barriers and improve future learning experiences. We statistically analyse the rated differences between distinct design activities, educational backgrounds, and remote vs. co-located collaboration. The analysis is based on a quantitative and qualitative questionnaire (N = 37). Our analysis found significant ranking differences between remote and co-located activities. This raises the question of whether the remote factor is a barrier to the originally intended learning goals. Further, a correlation between analytical and converging design phases was identified. Hence, we suggest that future facilitators help students in the transition from one design phase to the next, rather than only teaching methods for the individual design phases. Finally, we discuss how educators can address the identified learning barriers when designing future courses involving multidisciplinary or remote collaboration.

  8. First results from a combined analysis of CERN computing infrastructure metrics

    NASA Astrophysics Data System (ADS)

    Duellmann, Dirk; Nieke, Christian

    2017-10-01

    The IT Analysis Working Group (AWG) has been formed at CERN across individual computing units and the experiments to attempt a cross-cutting analysis of computing infrastructure and application metrics. In this presentation we will describe the first results obtained using medium/long term data (1 month to 1 year) correlating box-level metrics, job-level metrics from LSF and HTCondor, IO metrics from the physics analysis disk pools (EOS), and networking and application-level metrics from the experiment dashboards. We will cover in particular the measurement of hardware performance and prediction of job duration, the latency sensitivity of different job types, and a search for bottlenecks with the production job mix in the current infrastructure. The presentation will conclude with the proposal of a small set of metrics to simplify drawing conclusions also in the more constrained environment of public cloud deployments.
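    The kind of cross-metric correlation this abstract describes (e.g. relating a box-level performance metric to observed job durations) reduces to computing a correlation coefficient over paired samples. The sketch below is a hypothetical illustration with invented toy data; the actual analysis and metric names are those of the AWG, not shown here.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data: faster machines (higher benchmark score) finish jobs sooner,
# so we expect a strong negative correlation.
benchmark = [10.0, 12.0, 8.0, 15.0, 9.0]
duration  = [520.0, 450.0, 610.0, 380.0, 575.0]
r = pearson(benchmark, duration)
print(f"r = {r:.3f}")
```

    A strong correlation like this is what makes hardware-aware prediction of job duration plausible; weak or absent correlation for a job type would instead point at other bottlenecks (IO, latency) of the kind the study searches for.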

  9. 25th Birthday CERN - Restaurant

    ScienceCinema

    None

    2017-12-09

    Ceremony for CERN's 25th anniversary, with several speakers and the presence of numerous Geneva cantonal and communal authorities, personalities, directors-general, ministers, and researchers. Federal Councillor and head of the Federal Department of Foreign Affairs of the Swiss Confederation, Mr. Pierre Aubert, speaks to celebrate both the remarkable results of international cooperation in science and the political will of the European states to pool their resources to build for the future. A great tribute is also paid to the two departed directors, Professors Bakker and Gregory.

  10. Web Based Monitoring in the CMS Experiment at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badgett, William; Borrello, Laura; Chakaberia, Irakli

    2014-09-03

    The Compact Muon Solenoid (CMS) is a large and complex general purpose experiment at the CERN Large Hadron Collider (LHC), built and maintained by many collaborators from around the world. Efficient operation of the detector requires widespread and timely access to a broad range of monitoring and status information. To this end the Web Based Monitoring (WBM) system was developed to present data to users located anywhere from many underlying heterogeneous sources, from real time messaging systems to relational databases. This system provides the power to combine and correlate data in both graphical and tabular formats of interest to the experimenters, including data such as beam conditions, luminosity, trigger rates, detector conditions, and many others, allowing for flexibility on the user side. This paper describes the WBM system architecture and how the system was used during the first major data-taking run of the LHC.

  11. High energy beam impact tests on a LHC tertiary collimator at the CERN high-radiation to materials facility

    NASA Astrophysics Data System (ADS)

    Cauchi, Marija; Aberle, O.; Assmann, R. W.; Bertarelli, A.; Carra, F.; Cornelis, K.; Dallocchio, A.; Deboy, D.; Lari, L.; Redaelli, S.; Rossi, A.; Salvachua, B.; Mollicone, P.; Sammut, N.

    2014-02-01

    The correct functioning of a collimation system is crucial to safely operate highly energetic particle accelerators, such as the Large Hadron Collider (LHC). The requirements to handle high intensity beams can be demanding. In this respect, investigating the consequences of LHC particle beams hitting tertiary collimators (TCTs) in the experimental regions is a fundamental issue for machine protection. An experimental test was designed to investigate the robustness and effects of beam accidents on a fully assembled collimator, based on accident scenarios in the LHC. This experiment, carried out at the CERN High-Radiation to Materials (HiRadMat) facility, involved 440 GeV proton beam impacts of different intensities on the jaws of a horizontal TCT. This paper presents the experimental setup and the preliminary results obtained, together with some first outcomes from visual inspection and a comparison of such results with numerical simulations.

  12. Preliminary design of CERN Future Circular Collider tunnel: first evaluation of the radiation environment in critical areas for electronics

    NASA Astrophysics Data System (ADS)

    Infantino, Angelo; Alía, Rubén García; Besana, Maria Ilaria; Brugger, Markus; Cerutti, Francesco

    2017-09-01

    As part of its post-LHC high energy physics program, CERN is conducting a study for a new proton-proton collider, called the Future Circular Collider (FCC-hh), running at center-of-mass energies of up to 100 TeV in a new 100 km tunnel. The study includes a 90-350 GeV lepton collider (FCC-ee) as well as a lepton-hadron option (FCC-he). In this work, FLUKA Monte Carlo simulations were used extensively to perform a first evaluation of the radiation environment in critical areas for electronics in the FCC-hh tunnel. The model of the tunnel was created based on the original civil engineering studies already performed and was further integrated into the existing FLUKA models of the beam line. The radiation levels in critical areas, such as the racks for electronics and cables, power converters, service areas, and local tunnel extensions, were evaluated.

  13. ATLAS Live: Collaborative Information Streams

    NASA Astrophysics Data System (ADS)

    Goldfarb, Steven; ATLAS Collaboration

    2011-12-01

    I report on a pilot project launched in 2010 focusing on facilitating communication and information exchange within the ATLAS Collaboration, through the combination of digital signage software and webcasting. The project, called ATLAS Live, implements video streams of information, ranging from detailed detector and data status to educational and outreach material. The content, including text, images, video and audio, is collected, visualised and scheduled using digital signage software. The system is robust and flexible, utilizing scripts to input data from remote sources, such as the CERN Document Server, Indico, or any available URL, and to integrate these sources into professional-quality streams, with text scrolling, transition effects, and inter- and intra-screen divisibility. Information is published via the encoding and webcasting of standard video streams, viewable on all common platforms, using a web browser or other common video tool. Authorisation is enforced at the level of the streaming and at the web portals, using the CERN SSO system.

  14. CVD diamond detectors for ionizing radiation

    NASA Astrophysics Data System (ADS)

    Friedl, M.; Adam, W.; Bauer, C.; Berdermann, E.; Bergonzo, P.; Bogani, F.; Borchi, E.; Brambilla, A.; Bruzzi, M.; Colledani, C.; Conway, J.; Dabrowski, W.; Delpierre, P.; Deneuville, A.; Dulinski, W.; van Eijk, B.; Fallou, A.; Fizzotti, F.; Foulon, F.; Gan, K. K.; Gheeraert, E.; Grigoriev, E.; Hallewell, G.; Hall-Wilton, R.; Han, S.; Hartjes, F.; Hrubec, J.; Husson, D.; Kagan, H.; Kania, D.; Kaplon, J.; Karl, C.; Kass, R.; Knöpfle, K. T.; Krammer, M.; Logiudice, A.; Lu, R.; Manfredi, P. F.; Manfredotti, C.; Marshall, R. D.; Meier, D.; Mishina, M.; Oh, A.; Pan, L. S.; Palmieri, V. G.; Pernegger, H.; Pernicka, M.; Peitz, A.; Pirollo, S.; Polesello, P.; Pretzl, K.; Re, V.; Riester, J. L.; Roe, S.; Roff, D.; Rudge, A.; Schnetzer, S.; Sciortino, S.; Speziali, V.; Stelzer, H.; Stone, R.; Tapper, R. J.; Tesarek, R.; Thomson, G. B.; Trawick, M.; Trischuk, W.; Vittone, E.; Walsh, A. M.; Wedenig, R.; Weilhammer, P.; Ziock, H.; Zoeller, M.; RD42 Collaboration

    1999-10-01

    In future HEP accelerators, such as the LHC (CERN), detectors and electronics in the vertex region of the experiments will suffer from extreme radiation. Thus radiation hardness is required for both detectors and electronics to survive in this harsh environment. CVD diamond, which is investigated by the RD42 Collaboration at CERN, can meet these requirements. Samples of up to 2×4 cm^{2} have been grown and refined for better charge collection properties, which are measured with a β source or in a testbeam. A large number of diamond samples has been irradiated with hadrons to fluences of up to 5×10^{15} cm^{-2} to study the effects of radiation. Both strip and pixel detectors were prepared in various geometries. Samples with strip metallization have been tested with both slow and fast readout electronics, and the first diamond pixel detector proved fully functional with LHC electronics.

  15. The CMS Tier0 goes Cloud and Grid for LHC Run 2

    NASA Astrophysics Data System (ADS)

    Hufnagel, Dirk

    2015-12-01

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. This contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  16. Measurements of 55Fe activity in activated steel samples with GEMPix

    NASA Astrophysics Data System (ADS)

    Curioni, A.; Dinar, N.; La Torre, F. P.; Leidner, J.; Murtas, F.; Puddu, S.; Silari, M.

    2017-03-01

    In this paper we present a novel method, based on the recently developed GEMPix detector, to measure the 55Fe content in samples of metallic material activated during operation of CERN accelerators and experimental facilities. The GEMPix, a gas detector with highly pixelated read-out, has been obtained by coupling a triple Gas Electron Multiplier (GEM) to a quad Timepix ASIC. Sample preparation, measurements performed on 45 samples, and data analysis are described. The calibration factor (counts per second per unit specific activity) has been obtained via measurements of the 55Fe activity determined by radiochemical analysis of the same samples. The detection limit and the sensitivity to the current Swiss exemption limit are calculated. Comparison with radiochemical analysis shows an inconsistency for only two samples, most likely due to underestimated uncertainties in the GEMPix analysis. An operative test phase of this technique is already planned at CERN.

  17. Measurements of π± differential yields from the surface of the T2K replica target for incoming 31 GeV/c protons with the NA61/SHINE spectrometer at the CERN SPS

    NASA Astrophysics Data System (ADS)

    Abgrall, N.; Aduszkiewicz, A.; Ajaz, M.; Ali, Y.; Andronov, E.; Antićić, T.; Antoniou, N.; Baatar, B.; Bay, F.; Blondel, A.; Blümer, J.; Bogomilov, M.; Brandin, A.; Bravar, A.; Brzychczyk, J.; Bunyatov, S. A.; Busygina, O.; Christakoglou, P.; Ćirković, M.; Czopowicz, T.; Davis, N.; Debieux, S.; Dembinski, H.; Deveaux, M.; Diakonos, F.; Di Luise, S.; Dominik, W.; Dumarchez, J.; Dynowski, K.; Engel, R.; Ereditato, A.; Feofilov, G. A.; Fodor, Z.; Garibov, A.; Gaździcki, M.; Golubeva, M.; Grebieszkow, K.; Grzeszczuk, A.; Guber, F.; Haesler, A.; Hasegawa, T.; Hervé, A. E.; Hierholzer, M.; Igolkin, S.; Ivashkin, A.; Johnson, S. R.; Kadija, K.; Kapoyannis, A.; Kaptur, E.; Kisiel, J.; Kobayashi, T.; Kolesnikov, V. I.; Kolev, D.; Kondratiev, V. P.; Korzenev, A.; Kowalik, K.; Kowalski, S.; Koziel, M.; Krasnoperov, A.; Kuich, M.; Kurepin, A.; Larsen, D.; László, A.; Lewicki, M.; Lyubushkin, V. V.; Maćkowiak-Pawłowska, M.; Maksiak, B.; Malakhov, A. I.; Manić, D.; Marcinek, A.; Marino, A. D.; Marton, K.; Mathes, H.-J.; Matulewicz, T.; Matveev, V.; Melkumov, G. L.; Messerly, B.; Mills, G. B.; Morozov, S.; Mrówczyński, S.; Nagai, Y.; Nakadaira, T.; Naskręt, M.; Nirkko, M.; Nishikawa, K.; Panagiotou, A. D.; Paolone, V.; Pavin, M.; Petukhov, O.; Pistillo, C.; Płaneta, R.; Popov, B. A.; Posiadała-Zezula, M.; Puławski, S.; Puzović, J.; Rauch, W.; Ravonel, M.; Redij, A.; Renfordt, R.; Richter-Wąs, E.; Robert, A.; Röhrich, D.; Rondio, E.; Roth, M.; Rubbia, A.; Rumberger, B. T.; Rustamov, A.; Rybczynski, M.; Sadovsky, A.; Sakashita, K.; Sarnecki, R.; Schmidt, K.; Sekiguchi, T.; Selyuzhenkov, I.; Seryakov, A.; Seyboth, P.; Sgalaberna, D.; Shibata, M.; Słodkowski, M.; Staszel, P.; Stefanek, G.; Stepaniak, J.; Ströbele, H.; Šuša, T.; Szuba, M.; Tada, M.; Taranenko, A.; Tefelska, A.; Tefelski, D.; Tereshchenko, V.; Tsenov, R.; Turko, L.; Ulrich, R.; Unger, M.; Vassiliou, M.; Veberič, D.; Vechernin, V. 
V.; Vesztergombi, G.; Vinogradov, L.; Wilczek, A.; Włodarczyk, Z.; Wojtaszek-Szwarc, A.; Wyszyński, O.; Yarritu, K.; Zambelli, L.; Zimmerman, E. D.; Friend, M.; Galymov, V.; Hartz, M.; Hiraki, T.; Ichikawa, A.; Kubo, H.; Matsuoka, K.; Murakami, A.; Nakaya, T.; Suzuki, K.; Tzanov, M.; Yu, M.

    2016-11-01

    Measurements of particle emission from a replica of the T2K 90 cm-long carbon target were performed in the NA61/SHINE experiment at the CERN SPS, using data collected during a high-statistics run in 2009. An efficient use of the long-target measurements for neutrino flux predictions in T2K requires dedicated reconstruction and analysis techniques. Fully corrected differential yields of π±-mesons from the surface of the T2K replica target for incoming 31 GeV/c protons are presented. A possible strategy to implement these results into the T2K neutrino beam predictions is discussed, and the propagation of the uncertainties of these results to the final neutrino flux is performed.

  18. Towards a Future Linear Collider and The Linear Collider Studies at CERN

    ScienceCinema

    Stapnes, Steinar

    2017-12-18

    During the week 18-22 October, more than 400 physicists will meet at CERN and in the CICG (International Conference Centre Geneva) to review the global progress towards a future linear collider. The 2010 International Workshop on Linear Colliders will study the physics, detectors and accelerator complex of a linear collider covering both the CLIC and ILC options. Among the topics presented and discussed will be the progress towards the CLIC Conceptual Design Report in 2011, the ILC Technical Design Report in 2012, physics and detector studies linked to these reports, and an increasing number of common working group activities. The seminar will give an overview of these topics and also CERN’s linear collider studies, focusing on current activities and initial plans for the period 2011-16. N.B.: The Council Chamber is also reserved for this colloquium with a live transmission from the Main Auditorium.

  19. Princeton University High Energy Physics Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marlow, Daniel R.

    This is the Final Report on research conducted by the Princeton Elementary Particles group over the approximately three-year period from May 1, 2012 to April 30, 2015. The goal of our research is to investigate the fundamental constituents of matter, their fields, and their interactions; to understand the properties of space and time; and to study the profound relationships between cosmology and particle physics. During the funding period covered by this report, the group has been organized into a subgroup concentrating on the theory of particles, strings, and cosmology, and four subgroups performing major experiments at laboratories around the world: CERN, Daya Bay, and Gran Sasso, as well as detector R&D on the Princeton campus. Highlights of this research include the discovery of the Higgs boson at CERN and the measurement of $\sin^2 2\theta_{13}$ by the Daya Bay experiment. In both cases, Princeton researchers supported by this grant played key roles.

  20. Global atmospheric particle formation from CERN CLOUD measurements.

    PubMed

    Dunne, Eimear M; Gordon, Hamish; Kürten, Andreas; Almeida, João; Duplissy, Jonathan; Williamson, Christina; Ortega, Ismael K; Pringle, Kirsty J; Adamov, Alexey; Baltensperger, Urs; Barmet, Peter; Benduhn, Francois; Bianchi, Federico; Breitenlechner, Martin; Clarke, Antony; Curtius, Joachim; Dommen, Josef; Donahue, Neil M; Ehrhart, Sebastian; Flagan, Richard C; Franchin, Alessandro; Guida, Roberto; Hakala, Jani; Hansel, Armin; Heinritzi, Martin; Jokinen, Tuija; Kangasluoma, Juha; Kirkby, Jasper; Kulmala, Markku; Kupc, Agnieszka; Lawler, Michael J; Lehtipalo, Katrianne; Makhmutov, Vladimir; Mann, Graham; Mathot, Serge; Merikanto, Joonas; Miettinen, Pasi; Nenes, Athanasios; Onnela, Antti; Rap, Alexandru; Reddington, Carly L S; Riccobono, Francesco; Richards, Nigel A D; Rissanen, Matti P; Rondo, Linda; Sarnela, Nina; Schobesberger, Siegfried; Sengupta, Kamalika; Simon, Mario; Sipilä, Mikko; Smith, James N; Stozkhov, Yuri; Tomé, Antonio; Tröstl, Jasmin; Wagner, Paul E; Wimmer, Daniela; Winkler, Paul M; Worsnop, Douglas R; Carslaw, Kenneth S

    2016-12-02

    Fundamental questions remain about the origin of newly formed atmospheric aerosol particles because data from laboratory measurements have been insufficient to build global models. In contrast, gas-phase chemistry models have been based on laboratory kinetics measurements for decades. We built a global model of aerosol formation by using extensive laboratory measurements of rates of nucleation involving sulfuric acid, ammonia, ions, and organic compounds conducted in the CERN CLOUD (Cosmics Leaving Outdoor Droplets) chamber. The simulations and a comparison with atmospheric observations show that nearly all nucleation throughout the present-day atmosphere involves ammonia or biogenic organic compounds, in addition to sulfuric acid. A considerable fraction of nucleation involves ions, but the relatively weak dependence on ion concentrations indicates that for the processes studied, variations in cosmic ray intensity do not appreciably affect climate through nucleation in the present-day atmosphere. Copyright © 2016, American Association for the Advancement of Science.
