Sample records for CERN pilot cloud

  1. Scaling the CERN OpenStack cloud

    NASA Astrophysics Data System (ADS)

    Bell, T.; Bompastor, B.; Bukowiec, S.; Castro Leon, J.; Denis, M. K.; van Eldik, J.; Fermin Lobo, M.; Fernandez Alvarez, L.; Fernandez Rodriguez, D.; Marino, A.; Moreira, B.; Noel, B.; Oulevey, T.; Takase, W.; Wiebalck, A.; Zilli, S.

    2015-12-01

    CERN has been running a production OpenStack cloud since July 2013 to support physics computing and infrastructure services for the site. In the past year, the CERN cloud infrastructure has seen a constant increase in nodes, virtual machines, users and projects. This paper presents the work done to scale out the CERN cloud infrastructure.

  2. Helix Nebula and CERN: A Symbiotic approach to exploiting commercial clouds

    NASA Astrophysics Data System (ADS)

    Barreiro Megino, Fernando H.; Jones, Robert; Kucharczyk, Katarzyna; Medrano Llamas, Ramón; van der Ster, Daniel

    2014-06-01

    The recent paradigm shift toward cloud computing in IT, and the general interest in "Big Data" in particular, have demonstrated that the computing requirements of HEP are no longer globally unique. Indeed, the CERN IT department and the LHC experiments have already made significant R&D investments in delivering and exploiting cloud computing resources. While a number of technical evaluations of interesting commercial offerings from global IT enterprises have been performed by various physics labs, further technical, security, sociological, and legal issues need to be addressed before their large-scale adoption by the research community can be envisaged. Helix Nebula - the Science Cloud is an initiative that explores these questions by joining the forces of three European research institutes (CERN, ESA and EMBL) with leading European commercial IT enterprises. The goals of Helix Nebula are to establish a cloud platform federating multiple commercial cloud providers, along with new business models that can sustain the cloud marketplace for years to come. This contribution summarizes the participation of CERN in Helix Nebula. We explain CERN's flagship use case and the model used to integrate several cloud providers with an LHC experiment's workload management system. During the first proof of concept, this project contributed over 40,000 CPU-days of Monte Carlo production throughput to the ATLAS experiment with marginal manpower required. CERN's experience, together with that of ESA and EMBL, is providing great insight into the cloud computing industry and has highlighted several challenges that are being tackled in order to ease the export of scientific workloads to cloud environments.

  3. Commissioning the CERN IT Agile Infrastructure with experiment workloads

    NASA Astrophysics Data System (ADS)

    Medrano Llamas, Ramón; Harald Barreiro Megino, Fernando; Kucharczyk, Katarzyna; Kamil Denis, Marek; Cinquilli, Mattia

    2014-06-01

    In order to ease the management of their infrastructure, most WLCG sites are adopting cloud-based strategies. CERN, the Tier-0 of the WLCG, is completely restructuring the resource and configuration management of its computing centre under the codename Agile Infrastructure. Its goal is to manage 15,000 virtual machines by means of an OpenStack middleware in order to unify all the resources in CERN's two data centres: the one located in Meyrin and the new one in Wigner, Hungary. During the commissioning of this infrastructure, CERN IT is offering a substantial amount of computing resources to the experiments (800 cores for ATLAS and CMS) through a private cloud interface. ATLAS and CMS have joined forces to exploit them by running stress tests and simulation workloads since November 2012. This work describes the experience of the first deployments of current experiment workloads on the CERN private cloud testbed. The paper is organized as follows: the first section explains the integration of the experiment workload management systems (WMS) with the cloud resources. The second section revisits the performance and stress testing performed with HammerCloud in order to evaluate and compare suitability for the experiment workloads. The third section goes deeper into dynamic provisioning techniques, such as the use of the cloud APIs directly by the WMS. The paper finishes with a review of the conclusions and the challenges ahead.

  4. Status and Roadmap of CernVM

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Blomer, J.; Buncic, P.; Charalampidis, I.; Ganis, G.; Meusel, R.

    2015-12-01

    Cloud resources nowadays contribute an essential share of the resources for computing in high-energy physics. Such resources can be provided either by private or public IaaS clouds (e.g. OpenStack, Amazon EC2, Google Compute Engine) or by volunteer computers (e.g. LHC@Home 2.0). In either case, experiments need to prepare a virtual machine image that provides the execution environment for the physics application at hand. Since version 3, the CernVM virtual machine is a minimal and versatile virtual machine image capable of booting different operating systems. The virtual machine image is less than 20 megabytes in size. The actual operating system is delivered on demand by the CernVM File System. CernVM 3 has matured from a prototype into a production environment. It is used, for instance, to run LHC applications in the cloud, to tune event generators using a network of volunteer computers, and as a container for the historic Scientific Linux 5 and Scientific Linux 4 based software environments in the course of the long-term data preservation efforts of the ALICE, CMS, and ALEPH experiments. We present experience and lessons learned from the use of CernVM at scale. We also provide an outlook on upcoming developments, including support for Scientific Linux 7, the use of container virtualization such as that provided by Docker, and the streamlining of virtual machine contextualization towards the cloud-init industry standard.

  5. CERN Computing in Commercial Clouds

    NASA Astrophysics Data System (ADS)

    Cordeiro, C.; Field, L.; Garrido Bear, B.; Giordano, D.; Jones, B.; Keeble, O.; Manzi, A.; Martelli, E.; McCance, G.; Moreno-García, D.; Traylen, S.

    2017-10-01

    By the end of 2016, more than 10 million core-hours of computing resources had been delivered by several commercial cloud providers to the four LHC experiments to run their production workloads, from simulation to full-chain processing. In this paper we describe the experience gained at CERN in procuring and exploiting commercial cloud resources for the computing needs of the LHC experiments. The mechanisms used for provisioning, monitoring, accounting, alarming and benchmarking are discussed, as well as the involvement of the LHC collaborations in managing the experiments' workflows within a multicloud environment.

  6. Helix Nebula - the Science Cloud: a public-private partnership to build a multidisciplinary cloud platform for data intensive science

    NASA Astrophysics Data System (ADS)

    Jones, Bob; Casu, Francesco

    2013-04-01

    The feasibility of using commercial cloud services for scientific research is of great interest to research organisations such as CERN, ESA and EMBL, to the suppliers of cloud-based services, and to the national and European funding agencies. Through the Helix Nebula - the Science Cloud [1] initiative and with the support of the European Commission, these stakeholders are driving a two-year pilot phase during which procurement processes and governance issues for a framework of public/private partnership will be appraised. Three initial flagship use cases from high energy physics, molecular biology and earth observation are being used to validate the approach, enable a cost-benefit analysis to be undertaken, and prepare the next stage of the Science Cloud Strategic Plan [2] to be developed and approved. The power of Helix Nebula lies in a shared set of services for, initially, three very different sciences, each supporting a global community and thus building a common e-Science platform. Of particular relevance is the ESA-sponsored flagship application SuperSites Exploitation Platform (SSEP [3]), which offers the global geo-hazard community a common platform for the correlation and processing of observation data for supersites monitoring. The US-NSF EarthCube [4] and Ocean Observatory Initiative [5] (OOI) are taking a similar approach for data intensive science. The work of Helix Nebula and its recent architecture model [6] has shown that it is technically feasible to allow publicly funded infrastructures, such as EGI [7] and GEANT [8], to interoperate with commercial cloud services. Such hybrid systems are in the interest of the existing users of publicly funded infrastructures and of funding agencies because they will provide "freedom of choice" over the type of computing resources to be consumed and the manner in which they can be obtained.
    But to offer such freedom of choice across a spectrum of suppliers, various issues such as intellectual property, legal responsibility, service quality agreements and related issues need to be addressed. Investigating these issues is one of the goals of the Helix Nebula initiative. The next generation of researchers will put aside the historical categorisation of research as a neatly defined set of disciplines and integrate the data from different sources and instruments into complex models that are as applicable to earth observation or biomedicine as they are to high-energy physics. This aggregation of datasets and development of new models will accelerate scientific development, but will only be possible if the issues of data intensive science described above are addressed. The culture of science has the possibility to develop with the availability of Helix Nebula as a "Science Cloud" because:
    • Large-scale datasets from many disciplines will be accessible
    • Scientists and others will be able to develop and contribute open source tools to expand the set of services available
    • Collaboration of scientists will take place around the on-demand availability of data, tools and services
    • Cross-domain research will advance at a faster pace due to the availability of a common platform.
    References:
    [1] http://www.helix-nebula.eu/
    [2] http://cdsweb.cern.ch/record/1374172/files/CERN-OPEN-2011-036.pdf
    [3] http://www.helix-nebula.eu/index.php/helix-nebula-use-cases/uc3.html
    [4] http://www.nsf.gov/geo/earthcube/
    [5] http://www.oceanobservatories.org/
    [6] http://cdsweb.cern.ch/record/1478364/files/HelixNebula-NOTE-2012-001.pdf
    [7] http://www.egi.eu/
    [8] http://www.geant.net/

  7. Temperature characterisation of the CLOUD chamber at CERN

    NASA Astrophysics Data System (ADS)

    Dias, A. M.; Almeida, J.; Kirkby, J.; Mathot, S.; Onnela, A.; Vogel, A.; Ehrhart, S.

    2014-12-01

    Temperature stability, uniformity and absolute scale inside the CLOUD (Cosmics Leaving OUtdoor Droplets) chamber at CERN are important for experiments on aerosol particle nucleation and ice/liquid cloud formation. In order to measure the air temperature, a comprehensive set of arrays ("strings") of platinum resistance thermometers, thermocouples and optical sensors has been installed inside the 26 m3 chamber. The thermal sensors must meet several challenging design requirements: ultra-clean materials, 0.01 K measurement sensitivity, high absolute precision (<0.1 K), a 200 K - 373 K range, the ability to operate in high electric fields (20 kV/m), and fast response in air (~1 s) in order to measure rapid changes of temperature during ice/liquid cloud formation in the chamber by adiabatic pressure reductions. This presentation will focus on the design of the thermometer strings and the thermal performance of the chamber during the CLOUD8 and CLOUD9 campaigns (2013-2014), together with the planned upgrades of the CLOUD thermal system.
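    Platinum resistance thermometers of the kind used on these strings are conventionally read out by converting resistance to temperature with the Callendar-Van Dusen equation of IEC 60751. As an illustration only (this is not the CLOUD collaboration's readout code, and a hypothetical Pt100 element is assumed), a minimal sketch of the T ≥ 0 °C branch:

```python
import math

# IEC 60751 Callendar-Van Dusen coefficients for standard platinum RTDs.
A = 3.9083e-3   # 1/degC
B = -5.775e-7   # 1/degC^2
R0 = 100.0      # ohms at 0 degC for a Pt100

def pt100_resistance(t_c):
    """Resistance (ohms) of a Pt100 at temperature t_c (degC), for t_c >= 0."""
    return R0 * (1.0 + A * t_c + B * t_c ** 2)

def pt100_temperature(r_ohm):
    """Invert the quadratic to recover temperature (degC) from resistance.

    Solves R0*B*t^2 + R0*A*t + (R0 - r) = 0, taking the physical root.
    """
    disc = A * A - 4.0 * B * (1.0 - r_ohm / R0)
    return (-A + math.sqrt(disc)) / (2.0 * B)
```

    Below 0 °C (and the chamber operates down to 200 K) IEC 60751 adds a cubic correction term, so a real readout chain would also handle that branch and the per-sensor calibration needed to reach the quoted <0.1 K absolute precision.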

  8. Evaluation of the Huawei UDS cloud storage system for CERN specific data

    NASA Astrophysics Data System (ADS)

    Zotes Resines, M.; Heikkila, S. S.; Duellmann, D.; Adde, G.; Toebbicke, R.; Hughes, J.; Wang, L.

    2014-06-01

    Cloud storage is an emerging architecture aiming to provide increased scalability and access performance compared to more traditional solutions. CERN is evaluating this promise using Huawei UDS and OpenStack SWIFT storage deployments, focusing on the needs of high-energy physics. Both deployed setups implement S3, one of the protocols emerging as a standard in the cloud storage market. A set of client machines is used to generate I/O load patterns to evaluate storage system performance. The presented read and write test results indicate scalability from both the metadata and data perspectives. Further, the Huawei UDS cloud storage is shown to be able to recover from a major failure involving the loss of 16 disks. Finally, both cloud storage systems are demonstrated to function as back-end storage for a filesystem, which is used to deliver high-energy physics software.
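    The write-then-read load pattern described above can be sketched against a generic put/get client interface. This is a hypothetical harness, not the benchmark code used at CERN; the in-memory store below merely stands in for a real S3-style client pointed at the UDS or SWIFT endpoint:

```python
import time

class InMemoryStore:
    """Stand-in for an S3-style client; a real test would use an S3 library."""
    def __init__(self):
        self._objects = {}
    def put(self, key, data):
        self._objects[key] = data
    def get(self, key):
        return self._objects[key]

def run_io_pattern(client, n_objects, payload):
    """Write then read back n_objects copies of payload.

    Returns (write_MBps, read_MBps) as seen by this single client; an
    aggregate benchmark would sum the rates over many client machines.
    """
    total_mb = n_objects * len(payload) / 1e6
    t0 = time.perf_counter()
    for i in range(n_objects):
        client.put("obj-%06d" % i, payload)
    t1 = time.perf_counter()
    for i in range(n_objects):
        assert client.get("obj-%06d" % i) == payload  # verify round-trip
    t2 = time.perf_counter()
    return total_mb / (t1 - t0), total_mb / (t2 - t1)
```

    Small objects at high counts stress the metadata path, while large payloads stress raw data throughput, which is why the abstract reports scalability from both perspectives separately.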

  9. Integrating Containers in the CERN Private Cloud

    NASA Astrophysics Data System (ADS)

    Noel, Bertrand; Michelino, Davide; Velten, Mathieu; Rocha, Ricardo; Trigazis, Spyridon

    2017-10-01

    Containers remain a hot topic in computing, with new use cases and tools appearing every day. Basic functionality such as spawning containers seems to have settled, but topics like volume support or networking are still evolving. Solutions like Docker Swarm, Kubernetes or Mesos provide similar functionality but target different use cases, exposing distinct interfaces and APIs. The CERN private cloud comprises thousands of nodes and users, with many different use cases. A single solution for container deployment would not cover all of them, and supporting multiple solutions involves repeating the same integration process multiple times for authentication services, storage services and networking. In this paper we describe OpenStack Magnum as the solution for offering container management in the CERN cloud. We cover its main functionality and some advanced use cases using Docker Swarm and Kubernetes, highlighting relevant differences between the two. We describe the most common use cases in HEP and how we integrated popular services like CVMFS or AFS in the most transparent way possible, along with some limitations found. Finally we look into ongoing work on advanced scheduling for both Swarm and Kubernetes, support for running batch-like workloads, and the integration of container networking technologies with the CERN infrastructure.

  10. CERN launches high-school internship programme

    NASA Astrophysics Data System (ADS)

    Johnston, Hamish

    2017-07-01

    The CERN particle-physics lab has hosted 22 high-school students from Hungary in a pilot programme designed to show teenagers how science, technology, engineering and mathematics is used at the particle-physics lab.

  11. Knowledge and Technology: Sharing With Society

    NASA Astrophysics Data System (ADS)

    Benvenuti, Cristoforo; Sutton, Christine; Wenninger, Horst

    The following sections are included: * A Core Mission of CERN * Medical Accelerators: A Tool for Tumour Therapy * Medipix: The Image is the Message * Crystal Clear: From Higgs to PET * Solar Collectors: When Nothing is Better * The TARC Experiment at CERN: Modern Alchemy * A CLOUD Chamber with a Silvery Lining * References

  12. Integration of XRootD into the cloud infrastructure for ALICE data analysis

    NASA Astrophysics Data System (ADS)

    Kompaniets, Mikhail; Shadura, Oksana; Svirin, Pavlo; Yurchenko, Volodymyr; Zarochentsev, Andrey

    2015-12-01

    Cloud technologies allow easy load balancing between different tasks and projects. From the viewpoint of data analysis in the ALICE experiment, a cloud makes it possible to deploy software using the CERN Virtual Machine (CernVM) and CernVM File System (CVMFS), to run different (including outdated) software versions for long-term data preservation, and to dynamically allocate resources for different computing activities, e.g. a grid site, the ALICE Analysis Facility (AAF), and possible use by local projects or other LHC experiments. We present a cloud solution for Tier-3 sites based on OpenStack and Ceph distributed storage with an integrated XRootD-based storage element (SE). A key feature of the solution is that Ceph serves both as the backend for the Cinder block storage service in OpenStack and, at the same time, as the storage backend for XRootD, with redundancy and availability of data ensured by the Ceph settings. For faster and easier OpenStack deployment, the Packstack solution, based on the Puppet configuration management system, was applied. Ceph installation and configuration operations are structured, converted into Puppet manifests describing node configurations, and integrated into Packstack. This solution can be easily deployed, maintained and used even by small groups with limited computing resources and by small organizations, which often lack IT support. The proposed infrastructure has been tested on two different clouds (SPbSU & BITP) and integrates successfully with the ALICE data analysis model.

  13. SUMMARY OF THE ECL2 WORKSHOP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FISCHER,W.

    We summarize the ECL2 workshop on electron cloud clearing, which was held at CERN in early March 2007, and highlight a number of novel ideas for electron cloud suppression, such as continuous clearing electrodes based on enamel, slotted structures, and electret inserts.

  14. Review of CERN Data Centre Infrastructure

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Bell, T.; van Eldik, J.; McCance, G.; Panzer-Steindel, B.; Coelho dos Santos, M.; Traylen, S.; Schwickerath, U.

    2012-12-01

    The CERN Data Centre is reviewing strategies for optimizing the use of the existing infrastructure and expanding to a new data centre by studying how other large sites are being operated. Over the past six months, CERN has been investigating modern and widely-used tools and procedures used for virtualisation, clouds and fabric management in order to reduce operational effort, increase agility and support unattended remote data centres. This paper gives the details on the project's motivations, current status and areas for future investigation.

  15. High fidelity 3-dimensional models of beam-electron cloud interactions in circular accelerators

    NASA Astrophysics Data System (ADS)

    Feiz Zarrin Ghalam, Ali

    The electron cloud is a low-density electron population created inside the vacuum chamber of circular machines with positively charged beams. The electron cloud limits the peak current of the beam and degrades beam quality through luminosity degradation, emittance growth, and head-tail or bunch-to-bunch instability. The adverse effects of the electron cloud on long-term beam dynamics become more and more important as beams go to higher and higher energies. This problem has become a major concern in the design of many future circular machines, such as the Large Hadron Collider (LHC) under construction at the European Center for Nuclear Research (CERN). Due to the importance of the problem, several simulation models have been developed to model the long-term beam-electron cloud interaction. These models are based on the "single kick approximation", in which the electron cloud is assumed to be concentrated in one thin slab around the ring. While this model is efficient in terms of computational cost, it does not reflect the real physical situation, as the forces from the electron cloud on the beam are non-linear, contrary to the model's assumption. To address this limitation of the existing codes, this thesis develops a new model of the continuous beam-electron cloud interaction. The code is derived from a 3-D parallel Particle-In-Cell (PIC) model (QuickPIC) originally used for plasma wakefield acceleration research. To make the original model fit the circular-machine environment, betatron and synchrotron equations of motion have been added to the code, and the effects of chromaticity and lattice structure have been included. QuickPIC is then benchmarked against one of the codes based on the single kick approximation (HEAD-TAIL) for the transverse spot size of the beam in the CERN LHC. The growth predicted by QuickPIC is less than that predicted by HEAD-TAIL.
    The code is then used to investigate the effect of electron cloud image charges on long-term beam dynamics, particularly on the transverse tune shift of the beam in the CERN Super Proton Synchrotron (SPS) ring. The force from the electron cloud image charges on the beam cancels the force due to the cloud compression formed on the beam axis, and therefore the tune shift is mainly due to the uniform electron cloud density. (Abstract shortened by UMI.)
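    The "single kick approximation" contrasted above can be sketched as a one-turn linear betatron rotation followed by a thin-lens kick applied once per turn at a single location. This is a deliberately simplified toy model (with a linearized kick, whereas the abstract stresses that real cloud forces are non-linear), not QuickPIC or HEAD-TAIL code:

```python
import math

def one_turn(x, xp, tune, kick_strength=0.0):
    """Rotate (x, x') by the betatron phase advance 2*pi*tune, then apply a
    thin-lens kick representing the electron cloud lumped at one azimuth."""
    mu = 2.0 * math.pi * tune
    c, s = math.cos(mu), math.sin(mu)
    x, xp = c * x + s * xp, -s * x + c * xp
    xp += kick_strength * x   # linearized cloud kick; real kicks are non-linear
    return x, xp

def track(x, xp, tune, n_turns, kick_strength=0.0):
    """Track a single particle for n_turns one-turn maps."""
    for _ in range(n_turns):
        x, xp = one_turn(x, xp, tune, kick_strength)
    return x, xp
```

    With the kick switched off, a fractional tune of 0.25 returns the particle to its initial coordinates after four turns; switching on a small linear kick shifts the effective tune, which is the quantity the thesis studies for the SPS.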

  16. Integration of end-user Cloud storage for CMS analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riahi, Hassen; Aimar, Alberto; Ayllon, Alejandro Alvarez

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of end-user Cloud storage for distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities, as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storage in the Grid, implemented and commissioned on the world's largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and the infrastructure changes needed in order to transparently integrate end-user Cloud storage with the CMS distributed computing model. We describe the new challenges faced in data management between Grid and Cloud and how they were addressed, along with details of the support for Cloud storage recently introduced into the WLCG data movement middleware, FTS3. Finally, the commissioning experience of CERNBox for the distributed data analysis activity is also presented.

  17. Integration of end-user Cloud storage for CMS analysis

    DOE PAGES

    Riahi, Hassen; Aimar, Alberto; Ayllon, Alejandro Alvarez; ...

    2017-05-19

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of end-user Cloud storage for distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities, as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storage in the Grid, implemented and commissioned on the world's largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and the infrastructure changes needed in order to transparently integrate end-user Cloud storage with the CMS distributed computing model. We describe the new challenges faced in data management between Grid and Cloud and how they were addressed, along with details of the support for Cloud storage recently introduced into the WLCG data movement middleware, FTS3. Finally, the commissioning experience of CERNBox for the distributed data analysis activity is also presented.

  18. Managing a tier-2 computer centre with a private cloud infrastructure

    NASA Astrophysics Data System (ADS)

    Bagnasco, Stefano; Berzano, Dario; Brunetti, Riccardo; Lusso, Stefano; Vallero, Sara

    2014-06-01

    In a typical scientific computing centre, several applications coexist and share a single physical infrastructure. An underlying private cloud infrastructure eases the management and maintenance of such heterogeneous applications (such as multipurpose or application-specific batch farms, Grid sites, interactive data analysis facilities and others), allowing dynamic allocation of resources to any application. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques. Such infrastructures are being deployed in some large centres (see e.g. the CERN Agile Infrastructure project), but with several open-source tools reaching maturity this is becoming viable for smaller sites as well. In this contribution we describe the private cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 centre, an Interactive Analysis Facility for the ALICE experiment at the CERN LHC, and several smaller scientific computing applications. The private cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem and the OpenWRT Linux distribution (used for network virtualization); future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and OCCI.

  19. Global EOS: exploring the 300-ms-latency region

    NASA Astrophysics Data System (ADS)

    Mascetti, L.; Jericho, D.; Hsu, C.-Y.

    2017-10-01

    EOS, the CERN open-source distributed disk storage system, provides the high-performance storage solution for HEP analysis and the back-end for various workflows. Recently EOS became the back-end of CERNBox, the cloud synchronisation service for CERN users. EOS can take advantage of wide-area distributed installations: for the last few years CERN EOS has used a common deployment across two computer centres (Geneva-Meyrin and Budapest-Wigner) about 1,000 km apart (∼20 ms latency) with about 200 PB of disk (JBOD). In late 2015, the CERN-IT Storage group and AARNET (Australia) set up a challenging R&D project: a single EOS instance spanning CERN and AARNET with more than 300 ms of latency (16,500 km apart). This paper reports on the successful deployment and operation of a distributed storage system spanning Europe (Geneva, Budapest), Australia (Melbourne) and later Asia (ASGC Taipei), allowing different types of data placement and data access across these four sites.
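    The jump from ∼20 ms (Meyrin-Wigner) to more than 300 ms (CERN-AARNET) matters most for chatty, serial request-response traffic, where every operation pays a full round trip. A back-of-the-envelope sketch (the RTT figures come from the abstract; the operation counts and pipelining depth are hypothetical):

```python
import math

def serial_time(n_ops, rtt_s):
    """Lower bound on wall-clock time for n_ops strictly serial
    request-response operations over a link with round-trip time rtt_s."""
    return n_ops * rtt_s

def pipelined_time(n_ops, rtt_s, in_flight):
    """Same bound when up to in_flight requests are kept on the wire."""
    return math.ceil(n_ops / in_flight) * rtt_s

# 1000 serial metadata calls: ~20 s at 20 ms RTT but ~300 s at 300 ms RTT;
# keeping 100 requests in flight brings the 300 ms case back down to ~3 s.
```

    This is why batching and pipelining of metadata operations, and latency-aware data placement, become first-order concerns at intercontinental distances.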

  20. Data Mining as a Service (DMaaS)

    NASA Astrophysics Data System (ADS)

    Tejedor, E.; Piparo, D.; Mascetti, L.; Moscicki, J.; Lamanna, M.; Mato, P.

    2016-10-01

    Data Mining as a Service (DMaaS) is a software and computing infrastructure that allows interactive mining of scientific data in the cloud. It allows users to run advanced data analyses by leveraging the widely adopted Jupyter notebook interface. Furthermore, the system makes it easier to share results and scientific code, access scientific software, produce tutorials and demonstrations as well as preserve the analyses of scientists. This paper describes how a first pilot of the DMaaS service is being deployed at CERN, starting from the notebook interface that has been fully integrated with the ROOT analysis framework, in order to provide all the tools for scientists to run their analyses. Additionally, we characterise the service backend, which combines a set of IT services such as user authentication, virtual computing infrastructure, mass storage, file synchronisation, development portals or batch systems. The added value acquired by the combination of the aforementioned categories of services is discussed, focusing on the opportunities offered by the CERNBox synchronisation service and its massive storage backend, EOS.

  1. The Cloud Area Padovana: from pilot to production

    NASA Astrophysics Data System (ADS)

    Andreetto, P.; Costa, F.; Crescente, A.; Dorigo, A.; Fantinel, S.; Fanzago, F.; Sgaravatto, M.; Traldi, S.; Verlato, M.; Zangrando, L.

    2017-10-01

    The Cloud Area Padovana has been running for almost two years. It is an OpenStack-based scientific cloud spread across two different sites: the INFN Padova Unit and the INFN Legnaro National Labs. The hardware resources have been scaled horizontally and vertically, by upgrading some hypervisors and by adding new ones: currently it provides about 1100 cores. Some in-house developments were also integrated into the OpenStack dashboard, such as a tool for user and project registration with direct support for the INFN-AAI Identity Provider as a new option for user authentication. In collaboration with the EU-funded INDIGO-DataCloud project, integration with Docker-based containers has been tested and will be available in production soon. This computing facility now satisfies the computational and storage demands of more than 70 users affiliated with about 20 research projects. We present here the architecture of this Cloud infrastructure and the tools and procedures used to operate it. We also focus on the lessons learnt in these two years, describing the problems that were found and the corrective actions that had to be applied. We discuss the chosen strategy for upgrades, which balances the need to promptly integrate new OpenStack developments, the demand to reduce infrastructure downtime, and the need to limit the effort required for such updates. We also discuss how this Cloud infrastructure is being used. In particular, we focus on two big physics experiments which are intensively exploiting this computing facility: CMS and SPES. CMS deployed on the cloud a complex computational infrastructure, composed of several user interfaces for job submission in the Grid environment/local batch queues or for interactive processes; this is fully integrated with the local Tier-2 facility.
    To avoid a static allocation of the resources, an elastic cluster based on CernVM has been configured: it automatically creates and deletes virtual machines according to user needs. SPES, using a client-server system called TraceWin, exploits INFN's virtual resources, performing a very large number of simulations on about a thousand nodes, elastically managed.
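    An elastic cluster of this kind ultimately reduces to a sizing rule that maps current demand to a target number of worker VMs. The following is a hypothetical illustration of such a rule, not the actual mechanism used by the Cloud Area Padovana; the function name and parameters are invented for the sketch:

```python
import math

def desired_workers(pending_jobs, jobs_per_worker, min_workers, max_workers):
    """Target number of worker VMs: enough capacity to drain the queue,
    clamped between a floor (keep the cluster warm) and a quota ceiling."""
    need = math.ceil(pending_jobs / jobs_per_worker) if pending_jobs else 0
    return max(min_workers, min(max_workers, need))
```

    A controller would periodically compare this target with the number of running VMs and ask the cloud API to boot or delete instances accordingly, which is exactly the create/delete behaviour the abstract describes.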

  2. The CMS Tier0 goes cloud and grid for LHC Run 2

    DOE PAGES

    Hufnagel, Dirk

    2015-12-23

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. Furthermore, this contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  3. The CMS Tier0 goes Cloud and Grid for LHC Run 2

    NASA Astrophysics Data System (ADS)

    Hufnagel, Dirk

    2015-12-01

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. This contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  4. Global atmospheric particle formation from CERN CLOUD measurements.

    PubMed

    Dunne, Eimear M; Gordon, Hamish; Kürten, Andreas; Almeida, João; Duplissy, Jonathan; Williamson, Christina; Ortega, Ismael K; Pringle, Kirsty J; Adamov, Alexey; Baltensperger, Urs; Barmet, Peter; Benduhn, Francois; Bianchi, Federico; Breitenlechner, Martin; Clarke, Antony; Curtius, Joachim; Dommen, Josef; Donahue, Neil M; Ehrhart, Sebastian; Flagan, Richard C; Franchin, Alessandro; Guida, Roberto; Hakala, Jani; Hansel, Armin; Heinritzi, Martin; Jokinen, Tuija; Kangasluoma, Juha; Kirkby, Jasper; Kulmala, Markku; Kupc, Agnieszka; Lawler, Michael J; Lehtipalo, Katrianne; Makhmutov, Vladimir; Mann, Graham; Mathot, Serge; Merikanto, Joonas; Miettinen, Pasi; Nenes, Athanasios; Onnela, Antti; Rap, Alexandru; Reddington, Carly L S; Riccobono, Francesco; Richards, Nigel A D; Rissanen, Matti P; Rondo, Linda; Sarnela, Nina; Schobesberger, Siegfried; Sengupta, Kamalika; Simon, Mario; Sipilä, Mikko; Smith, James N; Stozkhov, Yuri; Tomé, Antonio; Tröstl, Jasmin; Wagner, Paul E; Wimmer, Daniela; Winkler, Paul M; Worsnop, Douglas R; Carslaw, Kenneth S

    2016-12-02

    Fundamental questions remain about the origin of newly formed atmospheric aerosol particles because data from laboratory measurements have been insufficient to build global models. In contrast, gas-phase chemistry models have been based on laboratory kinetics measurements for decades. We built a global model of aerosol formation by using extensive laboratory measurements of rates of nucleation involving sulfuric acid, ammonia, ions, and organic compounds conducted in the CERN CLOUD (Cosmics Leaving Outdoor Droplets) chamber. The simulations and a comparison with atmospheric observations show that nearly all nucleation throughout the present-day atmosphere involves ammonia or biogenic organic compounds, in addition to sulfuric acid. A considerable fraction of nucleation involves ions, but the relatively weak dependence on ion concentrations indicates that for the processes studied, variations in cosmic ray intensity do not appreciably affect climate through nucleation in the present-day atmosphere. Copyright © 2016, American Association for the Advancement of Science.

  5. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

The increasing availability of Cloud resources is emerging as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing provides an easy way to access resources from both systems efficiently. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools that manage the complexity and heterogeneity of distributed data centre infrastructures, allowing virtual clusters to be created on demand across public, private and hybrid clouds. This approach required extending the previous DIRAC Virtual Machine engine, which was developed for Amazon EC2, to connect with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack the infrastructure has been kept more general, permitting other Virtual Machine sources and operating systems to be used. In both cases, the CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface such as the DIRAC Web Portal. The main purpose of this integration is a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC towards a new conceptual denomination as interware, integrating different middleware. Users from different communities need not care about the installation of the standard software available at the nodes, nor about the operating system of the host machine, both of which are transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach with the existing Grid solution.
License Notice: Published under licence in Journal of Physics: Conference Series by IOP Publishing Ltd.
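The VM-engine extension described in the abstract can be pictured as a common driver interface with one back-end per cloud manager. The class and method names below are illustrative assumptions, not DIRAC's actual API:

```python
# Sketch of a pluggable cloud-driver layer: the scheduler talks to one
# interface, and each back-end hides its cloud manager's specifics.

class CloudDriver:
    def start_vm(self, image):
        raise NotImplementedError

class OcciOpenNebulaDriver(CloudDriver):
    # OpenNebula reached through the OCCI standard; images are based on
    # the CernVM appliance with appropriate contextualization.
    def start_vm(self, image="cernvm"):
        return f"occi:create:{image}"

class CloudStackDriver(CloudDriver):
    # CloudStack kept more general: other VM sources and OSes can be used.
    def start_vm(self, image="generic"):
        return f"cloudstack:deploy:{image}"

def submit(driver, image):
    """What a scheduler would call, independent of the cloud manager."""
    return driver.start_vm(image)
```

The point of the design is that adding a third cloud manager means writing one more driver class, with no change to the scheduling code.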

  6. DNS load balancing in the CERN cloud

    NASA Astrophysics Data System (ADS)

    Reguero Naredo, Ignacio; Lobato Pardavila, Lorena

    2017-10-01

Load balancing is one of the technologies enabling deployment of large-scale applications on cloud resources. A DNS Load Balancer Daemon (LBD) has been developed at CERN as a cost-effective way to balance applications that accept DNS timing dynamics and do not require persistence. It currently serves over 450 load-balanced aliases with two small VMs acting as master and slave. The aliases are mapped to DNS subdomains, which are managed with DDNS according to a load metric collected from the alias member nodes with SNMP. Over the last years, several improvements have been brought to the software, for instance support for IPv6, parallelization of the status requests, reimplementation of the client in Python to allow multiple aliases with differentiated states on the same machine, and support for application state. The configuration of the Load Balancer is managed by a Puppet type, which discovers the alias member nodes and gets the alias definitions from the Ermis REST service. The Aiermis self-service GUI for the management of the LB aliases has been produced; it is based on the Ermis service, which implements a form of Load Balancing as a Service (LBaaS). The Ermis REST API has authorisation based on Foreman hostgroups. The CERN DNS LBD is open-source software under the Apache 2 license.
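The selection step of such a DNS load balancer can be sketched in a few lines: collect a load metric per alias member, drop members that flag themselves out, and publish the least-loaded ones. The sample data and the negative-metric convention here are assumptions for illustration, not the exact LBD behaviour:

```python
# Minimal sketch of DNS load balancing: choose the N best alias members
# from a load metric (as collected over SNMP) and return the hostnames a
# dynamic DNS (DDNS) update would publish for the alias.

def best_members(metrics, n=2):
    """metrics: {hostname: load}, lower is better; a negative value is
    taken here to mean the node asked to be excluded from the alias."""
    eligible = {host: load for host, load in metrics.items() if load >= 0}
    return sorted(eligible, key=eligible.get)[:n]
```

For example, `best_members({"node1": 5, "node2": 1, "node3": -1, "node4": 3})` yields `["node2", "node4"]`: node3 is excluded, and the two least-loaded survivors are published under the alias subdomain.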

  7. A practical approach to virtualization in HEP

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Aguado Sánchez, C.; Blomer, J.; Harutyunyan, A.; Mudrinic, M.

    2011-01-01

In the attempt to solve the problem of processing data coming from the LHC experiments at CERN at a rate of 15 PB per year, for almost a decade the High Energy Physics (HEP) community has focused its efforts on the development of the Worldwide LHC Computing Grid. This generated large interest and expectations, promising to revolutionize computing. Meanwhile, having initially taken part in the Grid standardization process, industry has moved in a different direction and started promoting the Cloud Computing paradigm, which aims to solve problems on a similar scale and in an equally seamless way as was expected of the idealized Grid approach. A key enabling technology behind Cloud Computing is server virtualization. In early 2008, an R&D project was established in the PH-SFT group at CERN to investigate how virtualization technology could be used to improve and simplify the daily interaction of physicists with experiment software frameworks and the Grid infrastructure. In this article we first briefly compare the Grid and Cloud Computing paradigms and then summarize the results of the R&D activity, pointing out where and how virtualization technology could be effectively used in our field to maximize practical benefits whilst avoiding potential pitfalls.

  8. Integration of cloud-based storage in BES III computing environment

    NASA Astrophysics Data System (ADS)

    Wang, L.; Hernandez, F.; Deng, Z.

    2014-06-01

We present on-going work that aims to evaluate the suitability of cloud-based storage as a supplement to the Lustre file system for storing experimental data for the BES III physics experiment, and as a backend for storing files belonging to individual members of the collaboration. In particular, we discuss our findings regarding the support of cloud-based storage in the software stack of the experiment. We report on our development work that improves the support of CERN's ROOT data analysis framework and allows efficient remote access to data through several cloud storage protocols. We also present our efforts to provide the experiment with efficient command-line tools for navigating and interacting with cloud storage-based data repositories, both from interactive sessions and grid jobs.

  9. Intercomparison study and optical asphericity measurements of small ice particles in the CERN CLOUD experiment

    NASA Astrophysics Data System (ADS)

    Nichman, Leonid; Järvinen, Emma; Dorsey, James; Connolly, Paul; Duplissy, Jonathan; Fuchs, Claudia; Ignatius, Karoliina; Sengupta, Kamalika; Stratmann, Frank; Möhler, Ottmar; Schnaiter, Martin; Gallagher, Martin

    2017-09-01

Optical probes are frequently used for the detection of microphysical cloud particle properties such as liquid and ice phase, size and morphology. These properties can eventually influence the angular light-scattering properties of cirrus clouds as well as the growth and accretion mechanisms of single cloud particles. In this study we compare four commonly used optical probes to examine their response to small cloud particles of different phase and asphericity. Cloud simulation experiments were conducted at the Cosmics Leaving OUtdoor Droplets (CLOUD) chamber at the European Organisation for Nuclear Research (CERN). The chamber was operated in a series of multi-step adiabatic expansions to produce growth and sublimation of ice particles at super- and subsaturated ice conditions and for initial temperatures of -30, -40 and -50 °C. The experiments were performed for ice cloud formation via homogeneous ice nucleation. We report the optical observations of small ice particles in deep convection and in situ cirrus simulations. Ice crystal asphericity deduced from measurements of spatially resolved single-particle light-scattering patterns by the Particle Phase Discriminator mark 2 (PPD-2K, Karlsruhe edition) was compared with Cloud and Aerosol Spectrometer with Polarisation (CASPOL) measurements and with image roundness captured by the 3View Cloud Particle Imager (3V-CPI). Averaged-path light-scattering properties of the simulated ice clouds were measured using the Scattering Intensity Measurements for the Optical detectioN of icE (SIMONE), and single-particle scattering properties were measured by the CASPOL. We show the ambiguity of several optical measurements in ice fraction determination of homogeneously frozen ice when sublimating quasi-spherical ice particles are present. Moreover, most of the instruments have difficulty producing a reliable ice fraction if small aspherical ice particles are present, and none of the instruments can separate perfectly spherical ice particles from supercooled droplets. Correlation analysis of bulk averaged-path depolarisation measurements and single-particle measurements of these clouds showed higher R² values at high concentrations and small diameters, but these results require further confirmation. We find that none of these instruments were able to determine unambiguously the phase of the small particles. These results have implications for the interpretation of atmospheric measurements and for parametrisations in modelling, particularly for clouds with low particle number concentrations.

  10. Future Approach to tier-0 extension

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Cordeiro, C.; Giordano, D.; Traylen, S.; Moreno García, D.

    2017-10-01

    The current tier-0 processing at CERN is done on two managed sites, the CERN computer centre and the Wigner computer centre. With the proliferation of public cloud resources at increasingly competitive prices, we have been investigating how to transparently increase our compute capacity to include these providers. The approach taken has been to integrate these resources using our existing deployment and computer management tools and to provide them in a way that exposes them to users as part of the same site. The paper will describe the architecture, the toolset and the current production experiences of this model.

  11. Deploying the ATLAS Metadata Interface (AMI) on the cloud with Jenkins

    NASA Astrophysics Data System (ADS)

    Lambert, F.; Odier, J.; Fulachier, J.; ATLAS Collaboration

    2017-10-01

The ATLAS Metadata Interface (AMI) is a mature application with more than 15 years of existence. Mainly used by the ATLAS experiment at CERN, it consists of a very generic tool ecosystem for metadata aggregation and cataloguing. AMI is used by the ATLAS production system, so the service must guarantee a high level of availability. We describe our monitoring and administration systems, and the Jenkins-based strategy used to dynamically test and deploy cloud OpenStack nodes on demand.

  12. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    NASA Astrophysics Data System (ADS)

    Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.

    2014-06-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.

  13. Using S3 cloud storage with ROOT and CvmFS

    NASA Astrophysics Data System (ADS)

    Arsuaga-Ríos, María; Heikkilä, Seppo S.; Duellmann, Dirk; Meusel, René; Blomer, Jakob; Couturier, Ben

    2015-12-01

Amazon S3 is a widely adopted web API for scalable cloud storage that could also fulfil the storage requirements of the high-energy physics community. CERN has been evaluating this option using key HEP applications such as ROOT and the CernVM File System (CvmFS) with S3 back-ends. In this contribution we present an evaluation of two versions of the Huawei UDS storage system stressed with a large number of clients executing HEP software applications. The performance of concurrently storing individual objects is presented alongside more complex data access patterns as produced by the ROOT data analysis framework. Both Huawei UDS generations scale successfully and support multiple byte-range requests, in contrast with Amazon S3 or Ceph, which do not support this commonly used HEP operation. We further report on the S3 integration with recent CvmFS versions and summarize the experience with CvmFS/S3 for publishing daily releases of the full LHCb experiment software stack.
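The "multiple byte-range requests" mentioned above are the HTTP form of ROOT's vector reads: many small byte ranges of one file fetched in a single request. A sketch of the headers involved, using standard HTTP Range syntax (RFC 7233); the helper names are our own:

```python
# Build the Range header for a multi-range GET (one round trip for many
# ROOT baskets), plus the per-range fallback needed for back-ends such
# as plain Amazon S3 that honour only a single range per request.

def multi_range_header(ranges):
    """ranges: iterable of (first_byte, last_byte) pairs, inclusive."""
    return "bytes=" + ", ".join(f"{a}-{b}" for a, b in ranges)

def single_range_headers(ranges):
    """One Range header per request, for single-range back-ends."""
    return [f"bytes={a}-{b}" for a, b in ranges]
```

A multi-range-capable back-end answers the first form with one multipart/byteranges response; with the fallback, the same read costs one round trip per range, which is why this operation matters for HEP access patterns.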

  14. Temperature uniformity in the CERN CLOUD chamber

    NASA Astrophysics Data System (ADS)

    Dias, António; Ehrhart, Sebastian; Vogel, Alexander; Williamson, Christina; Almeida, João; Kirkby, Jasper; Mathot, Serge; Mumford, Samuel; Onnela, Antti

    2017-12-01

The CLOUD (Cosmics Leaving OUtdoor Droplets) experiment at CERN (the European Organization for Nuclear Research) investigates the nucleation and growth of aerosol particles under atmospheric conditions and their activation into cloud droplets. A key feature of the CLOUD experiment is precise control of the experimental parameters. Temperature uniformity and stability in the chamber are important since many of the processes under study are sensitive to temperature and also to contaminants, which can be released from the stainless steel walls by upward temperature fluctuations. The 26 m³ CLOUD chamber is equipped with several arrays (strings) of high-precision, fast-response thermometers to measure the temperature of the enclosed air. Here we present a study of the air temperature uniformity inside the CLOUD chamber under various experimental conditions. Measurements were performed under calibration conditions and run conditions, which are distinguished by the flow rate of fresh air and trace gases entering the chamber: 20 and up to 210 L min⁻¹, respectively. During steady-state calibration runs between -70 and +20 °C, the air temperature uniformity is better than ±0.06 °C in the radial direction and ±0.1 °C in the vertical direction. Larger non-uniformities are present during experimental runs, depending on the temperature control of the make-up air and trace gases (since some trace gases require elevated temperatures until injection into the chamber). The temperature stability is ±0.04 °C over periods of several hours during either calibration or steady-state run conditions. During rapid adiabatic expansions to activate cloud droplets and ice particles, the chamber walls are up to 10 °C warmer than the enclosed air. This results in temperature differences of ±1.5 °C in the vertical direction and ±1 °C in the horizontal direction, while the air returns to its equilibrium temperature with a time constant of about 200 s.
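The quoted 200 s time constant implies a simple exponential return to equilibrium after an expansion. As an illustration (our own arithmetic, not a formula from the paper), an initial 1.5 °C offset decays as:

```python
import math

# T(t) = T_eq + dT0 * exp(-t / tau): relaxation of the air temperature
# toward equilibrium with a time constant tau of about 200 s.

def air_temperature(t, t_eq, dt0, tau=200.0):
    """Air temperature (deg C) at time t (s) after the expansion."""
    return t_eq + dt0 * math.exp(-t / tau)
```

After one time constant (200 s) the offset has fallen to about 37% of its initial value; after 10 minutes (three time constants) a 1.5 °C offset is below 0.1 °C.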

  15. Helix Nebula: Enabling federation of existing data infrastructures and data services to an overarching cross-domain e-infrastructure

    NASA Astrophysics Data System (ADS)

    Lengert, Wolfgang; Farres, Jordi; Lanari, Riccardo; Casu, Francesco; Manunta, Michele; Lassalle-Balier, Gerard

    2014-05-01

Helix Nebula has established a growing public-private partnership of more than 30 commercial cloud providers, SMEs, and publicly funded research organisations and e-infrastructures. The Helix Nebula strategy is to establish a federated cloud service across Europe. Three high-profile flagships, sponsored by CERN (high energy physics), EMBL (life sciences) and ESA/DLR/CNES/CNR (earth science), have been deployed and extensively tested within this federated environment. The commitments behind these initial flagships have created a critical mass that attracts suppliers and users to the initiative, to work together towards an "Information as a Service" market place. Significant progress has been achieved in implementing the following four programmatic goals (as outlined in the Strategic Plan, Ref. 1):
- Goal #1: Establish a Cloud Computing Infrastructure for the European Research Area (ERA), serving as a platform for innovation and evolution of the overall infrastructure.
- Goal #2: Identify and adopt suitable policies for trust, security and privacy that can be provided at a European level by the European Cloud Computing framework and infrastructure.
- Goal #3: Create a light-weight governance structure for the future European Cloud Computing Infrastructure that involves all the stakeholders and can evolve over time as the infrastructure, services and user base grow.
- Goal #4: Define a funding scheme involving the three stakeholder groups (service suppliers, users, EC and national funding agencies) in a Public-Private-Partnership model to implement a Cloud Computing Infrastructure that delivers a sustainable business environment adhering to European-level policies.
Now, in 2014, a first version of this generic cross-domain e-infrastructure is ready to go into operations, building on a federation of European industry and contributors (data, tools, knowledge, ...). This presentation describes how Helix Nebula is being used in the domain of earth science, focusing on geohazards.
The so-called "Supersite Exploitation Platform" (SSEP) provides scientists with an overarching federated e-infrastructure offering very fast access to (i) large volumes of data (EO/non-space data), (ii) computing resources (e.g. hybrid cloud/grid), (iii) processing software (e.g. toolboxes, RTMs, retrieval baselines, visualization routines), and (iv) general platform capabilities (e.g. user management and access control, accounting, information portal, collaborative tools, social networks, etc.). In this federation each data provider remains in full control of the implementation of its data policy. This presentation outlines the architecture (technical and services) supporting very heterogeneous science domains, as well as the procedures for newcomers to join the Helix Nebula Market Place. Ref. 1: http://cds.cern.ch/record/1374172/files/CERN-OPEN-2011-036.pdf

  16. Self-service for software development projects and HPC activities

    NASA Astrophysics Data System (ADS)

    Husejko, M.; Høimyr, N.; Gonzalez, A.; Koloventzos, G.; Asbury, D.; Trzcinska, A.; Agtzidis, I.; Botrel, G.; Otto, J.

    2014-05-01

This contribution describes how CERN has implemented several essential tools for agile software development processes, ranging from version control (Git) to issue tracking (Jira) and documentation (wikis). Running such services in a large organisation like CERN requires many administrative actions by both users and service providers, such as creating software projects, managing access rights, users and groups, and performing tool-specific customisation. Dealing with these requests manually would be a time-consuming task. Another area of CERN's computing services that has required dedicated manual support has been clusters for specific user communities with special needs. Our aim is to move all our services to a layered approach, with the server infrastructure running on CERN's internal cloud computing infrastructure. This contribution illustrates how we plan to optimise the management of our services by means of an end-user-facing platform acting as a portal into all the related services for software projects, inspired by popular portals for open-source development such as SourceForge, GitHub and others. Furthermore, the contribution discusses recent activities with tests and evaluations of High Performance Computing (HPC) applications on different hardware and software stacks, and plans to offer a dynamically scalable HPC service at CERN, based on affordable hardware.

  17. CERN automatic audio-conference service

    NASA Astrophysics Data System (ADS)

    Sierra Moral, Rodrigo

    2010-04-01

Scientists from all over the world need to collaborate with CERN on a daily basis. They must be able to communicate effectively on their joint projects at any time, so telephone conferences have become indispensable and widely used. Managed by 6 operators, CERN already runs more than 5700 audio-conferences, totalling over 20000 hours, per year. However, the traditional telephone-based audio-conference system needed to be modernized in three ways: firstly, to give participants more autonomy in the organization of their conferences; secondly, to eliminate the constraints of manual intervention by operators; and thirdly, to integrate the audio-conferences into a collaborative working framework. The large number, and hence cost, of the conferences prohibited externalization, so the CERN telecommunications team drew up a specification to implement a new system. It was decided to use a new commercial collaborative audio-conference solution based on the SIP protocol. The system was tested as the first European pilot, and several improvements (such as billing, security and redundancy) were implemented based on CERN's recommendations. The new automatic conference system has been operational since the second half of 2006. It is very popular with users, and the number of conferences has doubled in the past two years.

  18. The diverse use of clouds by CMS

    DOE PAGES

    Andronis, Anastasios; Bauer, Daniela; Chaze, Olivier; ...

    2015-12-23

The resources CMS is using are increasingly being offered as clouds. In Run 2 of the LHC, the majority of CMS CERN resources, both in Meyrin and at the Wigner Computing Centre, will be presented as cloud resources on which CMS will have to build its own infrastructure. This infrastructure will need to run all of the CMS workflows, including Tier 0, production and user analysis. In addition, the CMS High Level Trigger will provide a compute resource comparable in scale to the total offered by the CMS Tier 1 sites when it is not running as part of the trigger system. During these periods a cloud infrastructure will be overlaid on this resource, making it accessible for general CMS use. Finally, CMS is starting to utilise cloud resources offered by individual institutes and is gaining experience to facilitate the use of opportunistically available cloud resources. We present a snapshot of this infrastructure and its operation at the time of the CHEP2015 conference.

  19. A pilot study of distributed knowledge management and clinical decision support in the cloud.

    PubMed

    Dixon, Brian E; Simonaitis, Linas; Goldberg, Howard S; Paterno, Marilyn D; Schaeffer, Molly; Hongsermeier, Tonya; Wright, Adam; Middleton, Blackford

    2013-09-01

Implement and perform pilot testing of web-based clinical decision support services using a novel framework for creating and managing clinical knowledge in a distributed fashion using the cloud. The pilot sought to (1) develop and test connectivity to an external clinical decision support (CDS) service, (2) assess the exchange of data to, and knowledge from, the external CDS service, and (3) capture lessons to guide expansion to more practice sites and users. The Clinical Decision Support Consortium created a repository of shared CDS knowledge for managing hypertension, diabetes, and coronary artery disease in a community cloud hosted by Partners HealthCare. A limited data set for primary care patients at a separate health system was securely transmitted to a CDS rules engine hosted in the cloud. Preventive care reminders triggered by the limited data set were returned for display to clinician end users for review. During the pilot study, we (1) monitored connectivity and system performance, (2) studied the exchange of data and decision support reminders between the two health systems, and (3) captured lessons. During the six-month pilot study, there were 1339 patient encounters in which information was successfully exchanged. Preventive care reminders were displayed during 57% of patient visits, most often reminding physicians to monitor blood pressure for hypertensive patients (29%) and order eye exams for patients with diabetes (28%). Lessons learned were grouped into five themes: performance, governance, semantic interoperability, ongoing adjustments, and usability. Remote, asynchronous cloud-based decision support performed reasonably well, although issues concerning governance, semantic interoperability, and usability remain key challenges for successful adoption and use of cloud-based CDS; addressing them will require collaboration between the biomedical informatics and computer science disciplines.
Decision support in the cloud is feasible and may be a reasonable path toward achieving better support of clinical decision-making across the widest range of health care providers. Published by Elsevier B.V.
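The exchange described above (a limited data set in, preventive-care reminders out) can be caricatured as a single rules-engine function. The field names and rule logic here are invented for illustration and are not the Consortium's knowledge base:

```python
# Toy CDS rules engine: map a limited patient data set to the kinds of
# preventive-care reminders reported in the pilot.

def evaluate(patient):
    """patient: dict with a 'conditions' list; returns reminder strings."""
    reminders = []
    conditions = patient.get("conditions", [])
    if "hypertension" in conditions:
        reminders.append("Monitor blood pressure")
    if "diabetes" in conditions:
        reminders.append("Order eye exam")
    return reminders
```

In the real pilot, logic of this kind sits behind a cloud-hosted web service: the calling site transmits the limited data set and displays whatever reminders come back.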

  20. Methods for Discerning Cloud Reflectivity Changes due to the Indirect Effect of Aerosol: A Pilot-study for Triana

    NASA Technical Reports Server (NTRS)

    Kinne, S.; Wiscombe, Warren; Einaudi, Franco (Technical Monitor)

    2001-01-01

    Understanding the effect of aerosol on cloud systems is one of the major challenges in atmospheric and climate research. Local studies suggest a multitude of influences on cloud properties. Yet the overall effect on cloud albedo, a critical parameter in climate simulations, remains uncertain. NASA's Triana mission will provide, from its EPIC multi-spectral imager, simultaneous data on aerosol properties and cloud reflectivity. With Triana's unique position in space these data will be available not only globally but also over the entire daytime, well suited to accommodate the often short lifetimes of aerosol and investigations around diurnal cycles. This pilot study explores the ability to detect relationships between aerosol properties and cloud reflectivity with sophisticated statistical methods. Sample results using data from the EOS Terra platform to simulate Triana are presented.

  1. High Performance Computing (HPC) Innovation Service Portal Pilots Cloud Computing (HPC-ISP Pilot Cloud Computing)

    DTIC Science & Technology

    2011-08-01

Figure 4: Architectural diagram of running Blender on Amazon EC2 through Nimbis. Also covered: classification of streaming data, with example input images and all digit prototypes (cluster centers) found, shown with size proportional to frequency.

  2. PREFACE: International Conference on Computing in High Energy and Nuclear Physics (CHEP 2010)

    NASA Astrophysics Data System (ADS)

    Lin, Simon C.; Shen, Stella; Neufeld, Niko; Gutsche, Oliver; Cattaneo, Marco; Fisk, Ian; Panzer-Steindel, Bernd; Di Meglio, Alberto; Lokajicek, Milos

    2011-12-01

The International Conference on Computing in High Energy and Nuclear Physics (CHEP) was held at Academia Sinica in Taipei from 18-22 October 2010. CHEP is a major series of international conferences for physicists and computing professionals from the worldwide High Energy and Nuclear Physics community, Computer Science, and Information Technology. The CHEP conference provides an international forum to exchange information on computing progress and needs for the community, and to review recent, ongoing and future activities. CHEP conferences are held at roughly 18-month intervals, alternating between Europe, Asia, America and other parts of the world. Recent CHEP conferences have been held in Prague, Czech Republic (2009); Victoria, Canada (2007); Mumbai, India (2006); Interlaken, Switzerland (2004); San Diego, California (2003); Beijing, China (2001); and Padova, Italy (2000). CHEP 2010 was organized by the Academia Sinica Grid Computing Centre. There was an International Advisory Committee (IAC) setting the overall themes of the conference, a Programme Committee (PC) responsible for the content, and a Conference Secretariat responsible for the conference infrastructure. There were over 500 attendees, with a program that included plenary sessions of invited speakers, a number of parallel sessions comprising around 260 oral and 200 poster presentations, and industrial exhibitions. We thank all the presenters for the excellent scientific content of their contributions to the conference. Conference tracks covered topics on Online Computing; Event Processing; Software Engineering, Data Stores, and Databases; Distributed Processing and Analysis; Computing Fabrics and Networking Technologies; Grid and Cloud Middleware; and Collaborative Tools.
The conference included excursions to various attractions in Northern Taiwan, including Sanhsia Tsu Shih Temple, Yingko, Chiufen Village, the Northeast Coast National Scenic Area, Keelung, Yehliu Geopark, and Wulai Aboriginal Village, as well as two banquets held at the Grand Hotel and Grand Formosa Regent in Taipei. The next CHEP conference will be held in New York, the United States, on 21-25 May 2012. We would like to thank the National Science Council of Taiwan, the EU ACEOLE project, commercial sponsors, and the International Advisory Committee and the Programme Committee members for all their support and help. Special thanks to the Programme Committee members for their careful choice of conference contributions and enormous effort in reviewing and editing about 340 post-conference proceedings papers.
Simon C Lin, CHEP 2010 Conference Chair and Proceedings Editor, Taipei, Taiwan, November 2011
Track Editors / Programme Committee
Chair: Simon C Lin, Academia Sinica, Taiwan
Online Computing Track: Y H Chang, National Central University, Taiwan; Harry Cheung, Fermilab, USA; Niko Neufeld, CERN, Switzerland
Event Processing Track: Fabio Cossutti, INFN Trieste, Italy; Oliver Gutsche, Fermilab, USA; Ryosuke Itoh, KEK, Japan
Software Engineering, Data Stores, and Databases Track: Marco Cattaneo, CERN, Switzerland; Gang Chen, Chinese Academy of Sciences, China; Stefan Roiser, CERN, Switzerland
Distributed Processing and Analysis Track: Kai-Feng Chen, National Taiwan University, Taiwan; Ulrik Egede, Imperial College London, UK; Ian Fisk, Fermilab, USA; Fons Rademakers, CERN, Switzerland; Torre Wenaus, BNL, USA
Computing Fabrics and Networking Technologies Track: Harvey Newman, Caltech, USA; Bernd Panzer-Steindel, CERN, Switzerland; Antonio Wong, BNL, USA; Ian Fisk, Fermilab, USA; Niko Neufeld, CERN, Switzerland
Grid and Cloud Middleware Track: Alberto Di Meglio, CERN, Switzerland; Markus Schulz, CERN, Switzerland
Collaborative Tools Track: Joao Correia Fernandes, CERN, Switzerland; Philippe Galvez, Caltech, USA; Milos Lokajicek, FZU Prague, Czech Republic
International Advisory Committee
Chair: Simon C. Lin, Academia Sinica, Taiwan
Members: Mohammad Al-Turany, FAIR, Germany; Sunanda Banerjee, Fermilab, USA; Dario Barberis, CERN & Genoa University/INFN, Switzerland; Lothar Bauerdick, Fermilab, USA; Ian Bird, CERN, Switzerland; Amber Boehnlein, US Department of Energy, USA; Kors Bos, CERN, Switzerland; Federico Carminati, CERN, Switzerland; Philippe Charpentier, CERN, Switzerland; Gang Chen, Institute of High Energy Physics, China; Peter Clarke, University of Edinburgh, UK; Michael Ernst, Brookhaven National Laboratory, USA; David Foster, CERN, Switzerland; Gonzalo Merino, CIEMAT, Spain; John Gordon, STFC-RAL, UK; Volker Guelzow, Deutsches Elektronen-Synchrotron DESY, Hamburg, Germany; John Harvey, CERN, Switzerland; Frederic Hemmer, CERN, Switzerland; Hafeez Hoorani, NCP, Pakistan; Viatcheslav Ilyin, Moscow State University, Russia; Matthias Kasemann, DESY, Germany; Nobuhiko Katayama, KEK, Japan; Milos Lokajicek, FZU Prague, Czech Republic; David Malon, ANL, USA; Pere Mato Vila, CERN, Switzerland; Mirco Mazzucato, INFN CNAF, Italy; Richard Mount, SLAC, USA; Harvey Newman, Caltech, USA; Mitsuaki Nozaki, KEK, Japan; Farid Ould-Saada, University of Oslo, Norway; Ruth Pordes, Fermilab, USA; Hiroshi Sakamoto, The University of Tokyo, Japan; Alberto Santoro, UERJ, Brazil; Jim Shank, Boston University, USA; Alan Silverman, CERN, Switzerland; Randy Sobie, University of Victoria, Canada; Dongchul Son, Kyungpook National University, South Korea; Reda Tafirout, TRIUMF, Canada; Victoria White, Fermilab, USA; Guy Wormser, LAL, France; Frank Wuerthwein, UCSD, USA; Charles Young, SLAC, USA

  3. Cloud Computing for DoD

    DTIC Science & Technology

    2012-05-01

    Excerpted slide fragments: NASA Nebula Platform — a cloud computing pilot program at NASA Ames that integrates open-source components into a seamless, self... mission support, and education and public outreach (NASA Nebula, 2010). NSF Supported Cloud Research — support for cloud computing in... References: Mell, P. & Grance, T. (2011). The NIST Definition of Cloud Computing. NIST Special Publication 800-145; NASA Nebula (2010). Retrieved from

  4. Chemical Processing of Organics within Clouds: Pilot Study at Whiteface Mountain in Upstate NY

    NASA Astrophysics Data System (ADS)

    Lance, S.; Carlton, A. G.; Barth, M. C.; Schwab, J. J.; Minder, J. R.; Freedman, J. M.; Zhang, J.; Brandt, R. E.; Casson, P.; Brewer, M.; Orlowski, D.; Christiansen, A.

    2017-12-01

    Aqueous chemical processing within cloud and fog water has been identified as a key process in the formation of secondary organic aerosol (SOA) mass, which is found abundantly throughout the troposphere. Yet, significant uncertainty remains regarding the organic chemical reactions taking place within clouds and the conditions under which those reactions occur. Routine long-term measurements from the Whiteface Mountain (WFM) Research Observatory in upstate NY provide a unique and broad view of regional air quality relevant to the formation of particulate matter within clouds, largely because the summit of WFM is within non-precipitating clouds 30-50% of the time in summertime and the site is undisturbed by local sources. An NSF-funded Cloud Chemistry Workshop in Sept 2016 brought together key researchers at WFM to lay out the most pertinent scientific questions relevant to heterogeneous chemistry occurring within fogs and clouds and to discuss preliminary model intercomparisons. The workshop culminated in a plan to coordinate chemical analyses of cloud water samples focused on the chemical constituents thought to be most relevant for SOA formation. Workshop participants also recommended that a pilot study be conducted at WFM to better characterize the meteorological conditions, airflow patterns, and clouds intercepting the site, in preparation for future intensive field operations focused on the chemical processing of organics within clouds. This presentation will highlight the experimental design and preliminary observations from the pilot study taking place at WFM in August 2017. Upwind below-cloud measurements of aerosol CCN activation efficiency, size distribution, and chemical composition will be compared with similar measurements made at the summit.
Under certain conditions, we anticipate that aerosols measured at the summit between cloud events will be representative of cloud droplet residuals recently detrained from the frequent shallow cumulus intercepting the summit. Wind LIDAR and radiosonde observations will be used to link the below-cloud and summit observations. These pre- and post- `cloud processed' aerosols will also be compared with the chemical composition of cloud water samples to evaluate changes to the organic partitioning in the aqueous and aerosol phases.

  5. The Experimental Cloud Lidar Pilot Study (ECLIPS) for cloud-radiation research

    NASA Technical Reports Server (NTRS)

    Platt, C. M.; Young, S. A.; Carswell, A. I.; Pal, S. R.; Mccormick, M. P.; Winker, D. M.; Delguasta, M.; Stefanutti, L.; Eberhard, W. L.; Hardesty, M.

    1994-01-01

    The Experimental Cloud Lidar Pilot Study (ECLIPS) was initiated to obtain statistics on cloud-base height, extinction, optical depth, cloud brokenness, and surface fluxes. Two observational phases have taken place, in October-December 1989 and April-July 1991, with intensive 30-day periods being selected within the two time intervals. Data are being archived at NASA Langley Research Center and, once there, are readily available to the international scientific community. This article describes the scale of the study in terms of its international involvement and the range of data being recorded. Lidar observations of cloud height and backscatter coefficient have been taken from a number of ground-based stations spread around the globe. Solar shortwave and infrared longwave fluxes and infrared beam radiance have been measured at the surface wherever possible. The observations have been tailored to occur around the overpass times of the NOAA weather satellites. This article describes in some detail the various retrieval methods used to obtain results on cloud-base height, extinction coefficient, and infrared emittance, paying particular attention to the uncertainties involved.
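The retrieval chain described above can be illustrated with a minimal numerical sketch: cloud optical depth as the vertical integral of a lidar-derived extinction profile, and infrared emittance modelled as 1 - exp(-k·τ). This is not the ECLIPS retrieval itself; the profile values and the IR-to-visible absorption ratio are invented placeholders.

```python
import math

def optical_depth(heights_m, extinction_per_m):
    """Trapezoidal integration of the extinction profile over the cloud layer."""
    tau = 0.0
    for i in range(1, len(heights_m)):
        dz = heights_m[i] - heights_m[i - 1]
        tau += 0.5 * (extinction_per_m[i] + extinction_per_m[i - 1]) * dz
    return tau

def infrared_emittance(tau_visible, ir_to_visible=0.5):
    """Emittance from visible optical depth; the 0.5 ratio is a placeholder."""
    return 1.0 - math.exp(-ir_to_visible * tau_visible)

# Invented profile: cloud base at 1000 m, extinction in m^-1.
z = [1000.0, 1100.0, 1200.0, 1300.0]
sigma = [0.0, 2e-3, 3e-3, 1e-3]
tau = optical_depth(z, sigma)
eps = infrared_emittance(tau)
```

In a real retrieval, the extinction profile itself comes from inverting attenuated backscatter, which is where most of the uncertainty discussed in the article enters.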

  6. ATLAS computing on Swiss Cloud SWITCHengines

    NASA Astrophysics Data System (ADS)

    Haug, S.; Sciacca, F. G.; ATLAS Collaboration

    2017-10-01

    Delivering more computing at flat budgets, beyond what pure chip technology can offer, is a requirement for the full scientific exploitation of the future data from the Large Hadron Collider at CERN in Geneva. One consolidation measure is to exploit cloud infrastructures whenever they are financially competitive. We report on the technical solutions used and the performance achieved when running simulation tasks for the ATLAS experiment on SWITCHengines, a new infrastructure-as-a-service offering to Swiss academia from the National Research and Education Network SWITCH. While the solutions and performance are general, the financial considerations and policies, on which we also report, are country specific.

  7. 14 CFR 91.129 - Operations in Class D airspace.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Unless required by the applicable distance-from-cloud criteria, each pilot operating a large or turbine... applicable distance-from-cloud criteria requires glide path interception closer in, operate that airplane at... required by the prescribed departure procedure for that airport or the applicable distance from clouds...

  8. 14 CFR 91.129 - Operations in Class D airspace.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Unless required by the applicable distance-from-cloud criteria, each pilot operating a large or turbine... applicable distance-from-cloud criteria requires glide path interception closer in, operate that airplane at... required by the prescribed departure procedure for that airport or the applicable distance from clouds...

  9. 14 CFR 91.129 - Operations in Class D airspace.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Unless required by the applicable distance-from-cloud criteria, each pilot operating a large or turbine... applicable distance-from-cloud criteria requires glide path interception closer in, operate that airplane at... required by the prescribed departure procedure for that airport or the applicable distance from clouds...

  10. 14 CFR 91.129 - Operations in Class D airspace.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) Unless required by the applicable distance-from-cloud criteria, each pilot operating a large or turbine... applicable distance-from-cloud criteria requires glide path interception closer in, operate that airplane at... required by the prescribed departure procedure for that airport or the applicable distance from clouds...

  11. The kaon identification system in the NA62 experiment at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romano, A.

    2015-07-01

    The main goal of the NA62 experiment at CERN is to measure the branching ratio of the ultra-rare K{sup +} → π{sup +} ν ν-bar decay with 10% accuracy. NA62 will use a 750 MHz high-energy un-separated charged hadron beam, with kaons corresponding to ∼6% of the beam, and a kaon decay-in-flight technique. The positive identification of kaons is performed with a differential Cherenkov detector (CEDAR), filled with nitrogen gas and placed in the incoming beam. To sustain the kaon rate (45 MHz average) and meet the performance required in NA62, the Cherenkov detector has been upgraded (KTAG) with new photon detectors, readout, mechanics, and cooling systems. The KTAG provides fast identification of kaons with an efficiency of at least 95% and precise time information with a resolution below 100 ps. A half-equipped KTAG detector was commissioned during a technical run at CERN in 2012, while the fully equipped detector, its readout and front-end were commissioned during a pilot run at CERN in October 2014. The measured time resolution and efficiency are within the required performance. (authors)

  12. The pointing errors of geosynchronous satellites

    NASA Technical Reports Server (NTRS)

    Sikdar, D. N.; Das, A.

    1971-01-01

    A study of the correlation between cloud motion and the wind field was initiated. Cloud heights and displacements were obtained from a ceilometer and movie pictures, while winds were measured from pilot balloon observations on a near-simultaneous basis. Cloud motion vectors were obtained from time-lapse cloud pictures, using the WINDCO program, for 27-28 July 1969 in the Atlantic. The relationship between observed features of cloud clusters and the ambient wind field derived from cloud trajectories on a wide range of space and time scales is discussed.

  13. Interoperating Cloud-based Virtual Farms

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Colamaria, F.; Colella, D.; Casula, E.; Elia, D.; Franco, A.; Lusso, S.; Luparello, G.; Masera, M.; Miniello, G.; Mura, D.; Piano, S.; Vallero, S.; Venaruzzo, M.; Vino, G.

    2015-12-01

    The present work aims at optimizing the use of computing resources available at the Italian grid Tier-2 sites of the ALICE experiment at the CERN LHC by making them accessible to interactive distributed analysis, thanks to modern solutions based on cloud computing. The scalability and elasticity of the computing resources via dynamic (“on-demand”) provisioning is essentially limited by the size of the computing site, reaching the theoretical optimum only in the asymptotic case of infinite resources. The main challenge of the project is to overcome this limitation by federating different sites through a distributed cloud facility. The storage capacities of the participating sites are seen as a single federated storage area, removing the need to mirror data across them: high data-access efficiency is guaranteed by location-aware analysis software and storage interfaces, transparently from the end-user perspective. Moreover, interactive analysis on the federated cloud reduces the execution time with respect to grid batch jobs. Tests of the investigated solutions for both cloud computing and distributed storage over the wide area network will be presented.
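The location-aware selection idea described above can be sketched as choosing, for each file, the replica local to the job's site, falling back to the replica with the lowest inter-site latency. The site names, latency table, and replica catalogue below are invented for illustration, not taken from the actual project.

```python
# Toy replica catalogue and inter-site latency table (all values invented).
REPLICAS = {"/data/run1.root": ["Torino", "Bari"]}
LATENCY_MS = {("Torino", "Bari"): 12,
              ("Torino", "Trieste"): 9,
              ("Bari", "Trieste"): 14}

def link_latency(a, b):
    """Symmetric latency lookup; zero for a local read."""
    if a == b:
        return 0
    return LATENCY_MS.get((a, b), LATENCY_MS.get((b, a), float("inf")))

def pick_replica(path, job_site):
    """Return the site to read `path` from: local replica first, then nearest."""
    return min(REPLICAS[path], key=lambda site: link_latency(job_site, site))

local_choice = pick_replica("/data/run1.root", "Bari")      # local replica wins
remote_choice = pick_replica("/data/run1.root", "Trieste")  # nearest remote wins
```

The point of the sketch is that the policy lives entirely in the storage-access layer, which is what makes it transparent to the end user.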

  14. Monitoring Evolution at CERN

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Fiorini, B.; Murphy, S.; Pigueiras, L.; Santos, M.

    2015-12-01

    Over the past two years, the operation of the CERN Data Centres went through significant changes with the introduction of new mechanisms for hardware procurement, new services for cloud provisioning and configuration management, among other improvements. These changes resulted in an increase of resources being operated in a more dynamic environment. Today, the CERN Data Centres provide over 11000 multi-core processor servers, 130 PB disk servers, 100 PB tape robots, and 150 high performance tape drives. To cope with these developments, an evolution of the data centre monitoring tools was also required. This modernisation was based on a number of guiding rules: sustain the increase of resources, adapt to the new dynamic nature of the data centres, make monitoring data easier to share, give more flexibility to Service Managers on how they publish and consume monitoring metrics and logs, establish a common repository of monitoring data, optimise the handling of monitoring notifications, and replace the previous toolset with new open-source technologies with large adoption and community support. This contribution describes how these improvements were delivered, presents the architecture and technologies of the new monitoring tools, and reviews the experience of their production deployment.

  15. Factors Influencing the Adoption of Cloud Storage by Information Technology Decision Makers

    ERIC Educational Resources Information Center

    Wheelock, Michael D.

    2013-01-01

    This dissertation uses a survey methodology to determine the factors behind the decision to adopt cloud storage. The dependent variable in the study is the intent to adopt cloud storage. Four independent variables are utilized including need, security, cost-effectiveness and reliability. The survey includes a pilot test, field test and statistical…

  16. Context-aware distributed cloud computing using CloudScheduler

    NASA Astrophysics Data System (ADS)

    Seuster, R.; Leavett-Brown, CR; Casteels, K.; Driemel, C.; Paterson, M.; Ring, D.; Sobie, RJ; Taylor, RP; Weldon, J.

    2017-10-01

    The distributed cloud using the CloudScheduler VM provisioning service is one of the longest-running systems for HEP workloads. It has run millions of jobs for ATLAS and Belle II over the past few years using private and commercial clouds around the world. Our goal is to scale the distributed cloud to the 10,000-core level, with the ability to run any type of application (low I/O, high I/O and high memory) on any cloud. To achieve this goal, we have been implementing changes that utilize context-aware computing designs that are currently employed in the mobile communication industry. Context-awareness makes use of real-time and archived data to respond to user or system requirements. In our distributed cloud, we have many opportunistic clouds with no local HEP services, software or storage repositories. A context-aware design significantly improves the reliability and performance of our system by locating the nearest instance of the required services. We describe how we are collecting and managing contextual information from our workload management systems, the clouds, the virtual machines and our services. This information is used not only to monitor the system but also to carry out automated corrective actions. We are incrementally adding new alerting and response services to our distributed cloud. This will enable us to scale the number of clouds and virtual machines. Further, a context-aware design will enable us to run analysis or high I/O applications on opportunistic clouds. We envisage an open-source HTTP data federation (for example, the DynaFed system at CERN) as a service that would provide us access to existing storage elements used by the HEP experiments.
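The automated corrective actions mentioned above can be sketched as a simple decision rule over one cloud's contextual record. The field names and thresholds below are assumptions for illustration, not CloudScheduler's actual logic.

```python
def corrective_action(ctx):
    """Decide an action for one cloud from its contextual record.
    ctx fields (all hypothetical): error_rate, job_queue, idle_vms."""
    if ctx["error_rate"] > 0.25:
        return "stop-provisioning"   # cloud is misbehaving: back off
    if ctx["job_queue"] > 0 and ctx["idle_vms"] == 0:
        return "boot-vm"             # work waiting, no capacity: grow
    if ctx["idle_vms"] > 0 and ctx["job_queue"] == 0:
        return "retire-vm"           # capacity idle, no work: shrink
    return "no-op"

# A queue of jobs and no idle VMs on a healthy cloud should trigger growth.
action = corrective_action({"error_rate": 0.02, "job_queue": 7, "idle_vms": 0})
```

Real context-aware designs would additionally weigh archived history (e.g. a cloud's past failure pattern) rather than only the instantaneous snapshot.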

  17. Genomic Data Commons and Genomic Cloud Pilots - Google Hangout

    Cancer.gov

    Join us for a live, moderated discussion about two NCI efforts to expand access to cancer genomics data: the Genomic Data Commons and the Genomic Cloud Pilots. NCI subject matter experts will include Louis M. Staudt, M.D., Ph.D., Director, Center for Cancer Genomics, and Warren Kibbe, Ph.D., Director, NCI Center for Biomedical Informatics and Information Technology; the discussion will be moderated by Anthony Kerlavage, Ph.D., Chief, Cancer Informatics Branch, Center for Biomedical Informatics and Information Technology. We welcome your questions before and during the Hangout on Twitter using the hashtag #AskNCI.

  18. "Black cloud" vs. "white cloud" physicians - Myth or reality in apheresis medicine?

    PubMed

    Pham, Huy P; Raju, Dheeraj; Jiang, Ning; Williams, Lance A

    2017-08-01

    Many practitioners believe in the phenomenon of being labeled either a "black cloud" or a "white cloud" while on call. A "white-cloud" physician is one who usually gets fewer cases; a "black-cloud" physician is one who often has more. It is unclear whether the designation is merely superstition or has some merit. Our aim was to objectively assess this phenomenon in apheresis medicine at our center. A one-year prospective study from 12/2014 to 11/2015 was designed to evaluate the number of times apheresis physicians and nurses were involved with emergent apheresis procedures between 10 PM and 7 AM. Other parameters collected included the name of the physician, the apheresis nurse, the type of emergent apheresis procedure, the day of the week, and the season of the year. During the study period, 32 emergent procedures (or "black-cloud" events) occurred. The median time between two consecutive events was 8 days (range: 1-34 days). We found no statistically significant association between the "black-cloud" events and attending physicians, nurses, day of the week, or season of the year by chi-square and Fisher's exact tests. However, exploratory analysis using association rule mining suggested that "black-cloud" events were more likely to happen on Thursdays (2.19 times), with attending physician 2 (1.18 times), and during winter (1.15 times). The results of this pilot study may support the common perception that some physicians or nurses are either "black clouds" or "white clouds". A larger, multi-center study is needed to validate the results of this pilot study. © 2016 Wiley Periodicals, Inc.
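The "times more likely" figures quoted above are lift values from association rule mining: the event rate under a condition divided by the overall event rate. A short sketch with invented toy counts (not the study's data) shows the computation:

```python
def lift(events_on_condition, condition_exposures, total_events, total_exposures):
    """Lift = P(event | condition) / P(event)."""
    p_event = total_events / total_exposures
    p_event_given_condition = events_on_condition / condition_exposures
    return p_event_given_condition / p_event

# Toy example: 32 events over 365 on-call days, 10 of them on the 52 Thursdays.
thursday_lift = lift(10, 52, 32, 365)
```

A lift above 1 means the condition co-occurs with events more often than chance; unlike chi-square or Fisher's exact test, it carries no significance level, which is why such results remain exploratory.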

  19. Design and performance of the virtualization platform for offline computing on the ATLAS TDAQ Farm

    NASA Astrophysics Data System (ADS)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Twomey, M. S.; Zaytsev, A.

    2014-06-01

    With the LHC collider at CERN currently going through the period of Long Shutdown 1, there is an opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is suitable for running Monte Carlo (MC) production jobs that are mostly CPU and not I/O bound. This contribution gives a thorough review of the design and deployment of a virtualized platform running on this computing resource and of its use to run large groups of CernVM-based virtual machines operating as a single CERN-P1 WLCG site. This platform has been designed to guarantee the security and the usability of the ATLAS private network, and to minimize interference with TDAQ's usage of the farm. OpenStack has been chosen to provide a cloud management layer. The experience gained in the last 3.5 months shows that the use of the TDAQ farm for the MC simulation contributes to the ATLAS data processing at the level of a large Tier-1 WLCG site, despite the opportunistic nature of the underlying computing resources being used.

  20. Cloud services for the Fermilab scientific stakeholders

    DOE PAGES

    Timm, S.; Garzoglio, G.; Mhashilkar, P.; ...

    2015-12-23

    As part of the Fermilab/KISTI cooperative research project, Fermilab has successfully run an experimental simulation workflow at scale on a federation of Amazon Web Services (AWS), FermiCloud, and local FermiGrid resources. We used the CernVM-FS (CVMFS) file system to deliver the application software. We also established Squid caching servers in AWS, using the Shoal system to let each individual virtual machine find the closest Squid server. We further developed an automatic virtual machine conversion system so that we could transition virtual machines made on FermiCloud to Amazon Web Services. We used this system to successfully run a cosmic ray simulation of the NOvA detector at Fermilab, making use of both AWS spot pricing and network bandwidth discounts to minimize the cost. On FermiCloud we were also able to run the workflow at the scale of 1000 virtual machines, using a private network routable inside Fermilab. Finally, we present in detail the technological improvements that were used to make this work a reality.
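The spot-pricing side of the cost minimisation above can be sketched as picking the cheapest spot pool whose current price is under the bid ceiling. The pool names and prices below are invented for illustration, not actual AWS quotes.

```python
# Hypothetical spot pools: availability-zone/instance-type -> current $/hour.
POOLS = {"us-east-1a.m3.xlarge": 0.064,
         "us-east-1b.m3.xlarge": 0.071,
         "us-west-2a.m3.xlarge": 0.059}

def cheapest_pool(pools, max_price):
    """Return (pool, price) for the cheapest pool at or under max_price,
    or None if no pool is currently affordable."""
    eligible = {pool: price for pool, price in pools.items() if price <= max_price}
    if not eligible:
        return None
    best = min(eligible, key=eligible.get)
    return best, eligible[best]

choice = cheapest_pool(POOLS, 0.07)
```

A production system would also account for interruption risk and data-transfer cost (the "network bandwidth discounts" in the abstract), not price alone.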

  2. First results from a combined analysis of CERN computing infrastructure metrics

    NASA Astrophysics Data System (ADS)

    Duellmann, Dirk; Nieke, Christian

    2017-10-01

    The IT Analysis Working Group (AWG) has been formed at CERN, across individual computing units and the experiments, to attempt a cross-cutting analysis of computing infrastructure and application metrics. In this presentation we will describe the first results obtained using medium- to long-term data (1 month to 1 year), correlating box-level metrics, job-level metrics from LSF and HTCondor, IO metrics from the physics analysis disk pools (EOS), and networking and application-level metrics from the experiment dashboards. We will cover in particular the measurement of hardware performance and the prediction of job duration, the latency sensitivity of different job types, and a search for bottlenecks with the production job mix in the current infrastructure. The presentation will conclude with the proposal of a small set of metrics to simplify drawing conclusions also in the more constrained environment of public cloud deployments.
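One simple reading of "measurement of hardware performance and the prediction of job duration" is a regression of job duration against a per-node performance score: for a fixed job type, duration should scale like work/score. The sketch below fits duration ≈ a/score + b by least squares on synthetic data; it is an illustration of the idea, not the AWG's actual model.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Synthetic data: node benchmark scores and job durations roughly = 1000/score.
scores = [10.0, 12.5, 8.0, 11.0]          # per-node benchmark, arbitrary units
durations = [100.0, 80.0, 125.0, 90.9]    # hours
a, b = fit_line([1.0 / s for s in scores], durations)

def predict_duration(score):
    """Predicted duration on a node with the given benchmark score."""
    return a / score + b
```

On real data the interesting part is the residuals: jobs that run much longer than this baseline predicts are candidates for the latency and bottleneck analyses the abstract mentions.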

  3. Vertical wind velocity measurements using a five-hole probe with remotely piloted aircraft to study aerosol-cloud interactions

    NASA Astrophysics Data System (ADS)

    Calmer, Radiance; Roberts, Gregory C.; Preissler, Jana; Sanchez, Kevin J.; Derrien, Solène; O'Dowd, Colin

    2018-05-01

    The importance of vertical wind velocities (in particular positive vertical wind velocities, or updrafts) in atmospheric science has motivated the need to deploy multi-hole probes developed for manned aircraft in small remotely piloted aircraft (RPA). In atmospheric research, lightweight RPAs (< 2.5 kg) are now able to accurately measure atmospheric wind vectors, even in a cloud, which provides essential observing tools for understanding aerosol-cloud interactions. The European project BACCHUS (impact of Biogenic versus Anthropogenic emissions on Clouds and Climate: towards a Holistic UnderStanding) focuses on these specific interactions. In particular, vertical wind velocity at cloud base is a key parameter for studying aerosol-cloud interactions. To measure the three components of wind, an RPA is equipped with a five-hole probe, pressure sensors, and an inertial navigation system (INS). The five-hole probe is calibrated on a multi-axis platform, and the probe-INS system is validated in a wind tunnel. Once mounted on an RPA, power spectral density (PSD) functions and turbulent kinetic energy (TKE) derived from the five-hole probe are compared with sonic anemometers on a meteorological mast. During a BACCHUS field campaign at Mace Head Atmospheric Research Station (Ireland), a fleet of RPAs was deployed to profile the atmosphere and complement ground-based and satellite observations of physical and chemical properties of aerosols, clouds, and meteorological state parameters. The five-hole probe was flown on straight-and-level legs to measure vertical wind velocities within clouds. The vertical velocity measurements from the RPA are validated against vertical velocities derived from a ground-based cloud radar by showing that both measurements yield model-simulated cloud droplet number concentrations within 10%. The updraft velocity distributions illustrate distinct relationships between vertical cloud fields in different meteorological conditions.
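At its core, the probe-plus-INS wind measurement is a vector subtraction: the wind is the aircraft's inertial (ground) velocity minus its air-relative velocity rotated into the earth frame. The sketch below keeps only the vertical component and a pitch-and-angle-of-attack rotation (a full implementation uses the complete INS attitude: roll, pitch, yaw); all numbers are invented.

```python
import math

def wind_vertical(ground_vel_ned, tas, alpha_rad, pitch_rad):
    """Vertical wind w (positive up), simplified pitch-only model, from:
    - ground_vel_ned: (north, east, down) velocity from the INS, m/s
    - tas: true airspeed from the five-hole probe, m/s
    - alpha_rad: angle of attack from the five-hole probe
    - pitch_rad: pitch angle from the INS
    """
    # Earth-frame vertical component of the air-relative velocity:
    air_up = tas * math.sin(pitch_rad - alpha_rad)
    ground_up = -ground_vel_ned[2]   # NED convention: 'down' -> 'up'
    return ground_up - air_up

# Example: aircraft climbing at 1 m/s while the probe reports a level
# air-relative path (pitch equals angle of attack), so the air itself
# must be rising at 1 m/s.
w = wind_vertical((20.0, 0.0, -1.0), 22.0, 0.05, 0.05)
```

This also shows why the calibration steps in the abstract matter: errors in alpha or pitch of a fraction of a degree translate directly into spurious vertical wind at typical airspeeds.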

  4. How to deal with petabytes of data: the LHC Grid project

    NASA Astrophysics Data System (ADS)

    Britton, D.; Lloyd, S. L.

    2014-06-01

    We review the Grid computing system developed by the international community to deal with the petabytes of data coming from the Large Hadron Collider at CERN in Geneva, with particular emphasis on the ATLAS experiment and the UK Grid project, GridPP. Although these developments were started over a decade ago, this article explains their continued relevance as part of the ‘Big Data’ problem and how the Grid has been a forerunner of today's cloud computing.

  5. Montecarlo Simulations for a Lep Experiment with Unix Workstation Clusters

    NASA Astrophysics Data System (ADS)

    Bonesini, M.; Calegari, A.; Rossi, P.; Rossi, V.

    Modular systems of RISC-CPU-based computers have been implemented for large productions of Monte Carlo simulated events for the DELPHI experiment at CERN. Starting from a pilot system based on DEC 5000 CPUs, a full-size system based on a CONVEX C3820 UNIX supercomputer and a cluster of HP 735 workstations has been put into operation as a joint effort between INFN Milano and CILEA.

  6. Agile Infrastructure Monitoring

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Ascenso, J.; Fedorko, I.; Fiorini, B.; Paladin, M.; Pigueiras, L.; Santos, M.

    2014-06-01

    At the present time, data centres are facing a massive rise in virtualisation and cloud computing. The Agile Infrastructure (AI) project is working to deliver new solutions to ease the management of CERN data centres. Part of the solution consists in a new "shared monitoring architecture" which collects and manages monitoring data from all data centre resources. In this article, we present the building blocks of this new monitoring architecture, the different open source technologies selected for each architecture layer, and how we are building a community around this common effort.

  7. Global atmospheric particle formation from CERN CLOUD measurements

    NASA Astrophysics Data System (ADS)

    Dunne, Eimear M.; Gordon, Hamish; Carslaw, Kenneth S.

    2017-04-01

    New particle formation (or nucleation) is acknowledged as a significant source of climate-relevant aerosol throughout the atmosphere. However, performing atmospherically relevant nucleation experiments in a laboratory setting is extremely challenging. As a result, until now, the parameterisations used to represent new particle formation in global aerosol models were largely based on in-situ observations or theoretical nucleation models, and usually only represented the binary H2SO4-H2O system. Several different chemicals can affect particle formation rates, even at extremely low trace concentrations, which are technically challenging to measure directly. Nucleation rates also respond to environmental changes in e.g. temperature in a highly non-linear fashion. The CERN CLOUD experiment was designed to provide the most controlled and accurate nucleation rate measurements to date, over the full range of free tropospheric temperatures and down to sulphuric acid concentrations of the order of 10^5 cm^-3. We will present a parameterisation of inorganic nucleation rates for use in global models, based on these measurements, which includes four separate nucleation pathways: binary neutral, binary ion-induced, ternary neutral, and ternary ion-induced. Both inorganic and organic nucleation parameterisations derived from CLOUD measurements have been implemented in the GLOMAP global aerosol model. The parameterisations depend on temperature and on concentrations of sulphuric acid, ammonia, organic vapours, and ions. One of CLOUD's main original goals was to determine the sensitivity of atmospheric aerosol to changes in the nucleation rate over a solar cycle. We will show that, in a present-day atmosphere, the changes in climate-relevant aerosol (in the form of cloud-level cloud condensation nuclei) over a solar cycle are on average about 0.1%, with local changes of less than 1%.
In contrast, anthropogenic changes in ammonia since pre-industrial times were estimated to have a much greater influence, resulting in a radiative forcing of between -0.62 and -0.66 W m^-2. Including ternary inorganic pathways in GLOMAP improved the model's agreement with free tropospheric observations, especially aircraft measurements. The further inclusion of an organic parameterisation, which increased nucleation in the summertime boundary layer, brought our results more in line with observations made at surface stations. We therefore believe that, while the addition of other nucleation pathways (such as amine-induced nucleation) will doubtless improve agreement with local in-situ measurements, this model set-up provides a good representation of the global atmosphere as a whole. By presenting this novel parameterisation at EGU, we hope to encourage its uptake among the aerosol modelling community.
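The structure of a four-pathway parameterisation like the one described can be sketched as a sum of power laws in the precursor concentrations, with the ion-induced channels scaled by the ion density. The coefficients and exponents below are placeholders chosen for illustration only, not the fitted CLOUD values (which are also temperature-dependent).

```python
def pathway_rate(k, h2so4, p, nh3=1.0, q=0.0, ions=1.0):
    """One pathway: J_i = k * [H2SO4]^p * [NH3]^q, scaled by the ion
    density for ion-induced channels (ions=1 for neutral channels).
    All coefficients/exponents here are illustrative placeholders."""
    return k * (h2so4 ** p) * (nh3 ** q) * ions

def total_rate(h2so4, nh3, ions):
    """Total nucleation rate J as the sum of the four pathways."""
    j_bn = pathway_rate(1e-14, h2so4, 2.0)                  # binary neutral
    j_bi = pathway_rate(1e-13, h2so4, 1.5, ions=ions)       # binary ion-induced
    j_tn = pathway_rate(1e-16, h2so4, 2.0, nh3, 1.0)        # ternary neutral
    j_ti = pathway_rate(1e-15, h2so4, 1.5, nh3, 1.0, ions)  # ternary ion-induced
    return j_bn + j_bi + j_tn + j_ti

j = total_rate(1e6, 1e8, 2.0)  # invented concentrations in cm^-3, ion pairs cm^-3
```

The additive form is what lets a model switch individual pathways on or off, which is exactly the GLOMAP experiment described above (binary-only versus including the ternary channels).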

  8. NIH Data Commons Pilot Phase | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    The NIH, under the BD2K program, will be launching a Data Commons Pilot Phase to test ways to store, access and share Findable, Accessible, Interoperable and Reusable (FAIR) biomedical data and associated tools in the cloud. The NIH Data Commons Pilot Phase is expected to span fiscal years 2017-2020, with an estimated total budget of approximately $55.5 Million, pending available funds.

  9. 14 CFR Appendix I to Part 141 - Additional Aircraft Category and/or Class Rating Course

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... (b) For a private pilot certificate, the following aeronautical knowledge areas must be included in a... Aviation Administration for private pilot privileges, limitations, and flight operations; (2) Safe and..., including knowledge and effects of fronts, frontal characteristics, cloud formations, icing, and upper-air...

  10. 14 CFR Appendix I to Part 141 - Additional Aircraft Category and/or Class Rating Course

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... (b) For a private pilot certificate, the following aeronautical knowledge areas must be included in a... Aviation Administration for private pilot privileges, limitations, and flight operations; (2) Safe and..., including knowledge and effects of fronts, frontal characteristics, cloud formations, icing, and upper-air...

  11. x509-free access to WLCG resources

    NASA Astrophysics Data System (ADS)

    Short, H.; Manzi, A.; De Notaris, V.; Keeble, O.; Kiryanov, A.; Mikkonen, H.; Tedesco, P.; Wartel, R.

    2017-10-01

    Access to WLCG resources is authenticated using x509 certificates and a PKI infrastructure. Even though HEP users have always been exposed to certificates directly, the development of modern Web Applications by the LHC experiments calls for simplified authentication processes while keeping the underlying software unmodified. In this work we will show a solution with the goal of providing access to WLCG resources using the user’s home organisation’s credentials, without the need for user-acquired x509 certificates. In particular, we focus on identity providers within eduGAIN, which interconnects research and education organisations worldwide, and enables the trustworthy exchange of identity-related information. eduGAIN has been integrated into the CERN SSO infrastructure so that users can authenticate without the need for a CERN account. This solution achieves x509-free access to Grid resources with the help of two services: STS and an online CA. The STS (Security Token Service) allows credential translation from the SAML2 format used by Identity Federations to the VOMS-enabled x509 used by most of the Grid. The IOTA CA (Identifier-Only Trust Assurance Certification Authority) is responsible for the automatic issuing of short-lived x509 certificates. The IOTA CA deployed at CERN has been accepted by EUGridPMA as the CERN LCG IOTA CA, included in the IGTF trust anchor distribution and installed by the sites in WLCG. We will also describe the first pilot projects which are integrating the solution.
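
    The two-service flow described above (the STS translating a federated identity, the IOTA CA issuing a short-lived certificate) can be sketched schematically. The DN layout, attribute format and lifetime below are invented placeholders, not the CERN implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Saml2Token:
    subject: str   # eduGAIN identity, e.g. "jdoe@university.example" (invented)
    issuer: str    # home-organisation identity provider

@dataclass
class X509Proxy:
    subject_dn: str
    vo_attributes: list
    not_after: datetime

def iota_ca_issue(subject: str) -> X509Proxy:
    # Schematic IOTA-style CA: issues only short-lived credentials
    # (the 7-day lifetime is chosen arbitrarily for this sketch).
    return X509Proxy(
        subject_dn=f"/DC=example/CN={subject}",
        vo_attributes=[],
        not_after=datetime.now() + timedelta(days=7),
    )

def sts_translate(token: Saml2Token, vo: str) -> X509Proxy:
    # Schematic STS step: swap the federated SAML2 identity for a
    # VOMS-style x509 credential carrying the VO membership.
    cert = iota_ca_issue(token.subject)
    cert.vo_attributes.append(f"/{vo}/Role=NULL")
    return cert
```

    The essential design point is the separation of concerns: the STS performs identity translation, while the CA's only trust statement is the binding of an identifier to a short-lived key.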

  12. ATLAS Live: Collaborative Information Streams

    NASA Astrophysics Data System (ADS)

    Goldfarb, Steven; ATLAS Collaboration

    2011-12-01

    I report on a pilot project launched in 2010 focusing on facilitating communication and information exchange within the ATLAS Collaboration, through the combination of digital signage software and webcasting. The project, called ATLAS Live, implements video streams of information, ranging from detailed detector and data status to educational and outreach material. The content, including text, images, video and audio, is collected, visualised and scheduled using digital signage software. The system is robust and flexible, utilizing scripts to input data from remote sources, such as the CERN Document Server, Indico, or any available URL, and to integrate these sources into professional-quality streams, including text scrolling, transition effects, and inter- and intra-screen divisibility. Information is published via the encoding and webcasting of standard video streams, viewable on all common platforms, using a web browser or other common video tool. Authorisation is enforced at the level of the streaming and at the web portals, using the CERN SSO system.
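
    A signage scheduler of this kind boils down to cycling through content items with per-item durations until the requested air time is filled. The following toy version (item names and durations invented, not the ATLAS Live configuration) illustrates the idea:

```python
import itertools

def signage_schedule(items, total_seconds):
    """Build a (start_time, item_name) timeline for a signage stream by
    cycling through content items -- e.g. detector status pages, an Indico
    agenda, outreach slides -- each with its configured duration."""
    timeline, t = [], 0
    for name, duration in itertools.cycle(items):
        if t >= total_seconds:
            break
        timeline.append((t, name))
        t += duration
    return timeline
```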

  13. CERNBox + EOS: end-user storage for science

    NASA Astrophysics Data System (ADS)

    Mascetti, L.; Gonzalez Labrador, H.; Lamanna, M.; Mościcki, JT; Peters, AJ

    2015-12-01

    CERNBox is a cloud synchronisation service for end-users: it allows syncing and sharing files on all major mobile and desktop platforms (Linux, Windows, MacOSX, Android, iOS), aiming to provide offline availability for any data stored in the CERN EOS infrastructure. The successful beta phase of the service confirmed the high demand in the community for an easily accessible cloud storage solution such as CERNBox. Integration of the CERNBox service with the EOS storage back-end is the next step towards providing “sync and share” capabilities for scientific and engineering use-cases. In this report we will present lessons learnt in offering the CERNBox service, key technical aspects of CERNBox/EOS integration and new, emerging usage possibilities. These include the ongoing integration of “sync and share” capabilities with the LHC data analysis tools and transfer services.

  14. Causes of General Aviation Weather-Related, Non-Fatal Incidents: Analysis Using NASA Aviation Safety Reporting System Data

    DTIC Science & Technology

    2010-09-01

    first, followed by detailed analysis, finishing with a recap of the same conclusions. In technical terms, this cognitively primes the reader and...lowering ceiling, clouds, fog, rain, rising cloud tops, merging cloud layers) b) icing c) thunderstorms d) turbulence... stylistic differences in the way pilots tend to handle weather. In fact, each group seems to have problems with the exact worst category of weather with

  15. The SHiP experiment at CERN SPS

    NASA Astrophysics Data System (ADS)

    Di Crescenzo, A.; SHiP Collaboration

    2016-01-01

    SHiP is a new general purpose fixed target facility, whose Technical Proposal has recently been submitted to the CERN SPS Committee. In its initial phase, the 400 GeV proton beam extracted from the SPS will be dumped on a heavy target with the aim of integrating 2×10^20 protons on target in 5 years. A dedicated detector located downstream of the target, based on a long vacuum tank followed by a spectrometer and particle identification detectors, will allow probing a variety of models with light long-lived exotic particles and masses below a few GeV/c^2. The beam dump is also an ideal source of tau neutrinos, the least known particle in the Standard Model. Another dedicated detector, based on the Emulsion Cloud Chamber technology already used in the OPERA experiment, will make it possible to perform the first measurements of the tau neutrino deep inelastic scattering cross section. Tau neutrinos will be distinguished from tau anti-neutrinos, thus providing the first observation of the tau anti-neutrino.

  16. Global positioning system supported pilot's display

    NASA Technical Reports Server (NTRS)

    Scott, Marshall M., Jr.; Erdogan, Temel; Schwalb, Andrew P.; Curley, Charles H.

    1991-01-01

    The hardware, software, and operation of the Microwave Scanning Beam Landing System (MSBLS) Flight Inspection System Pilot's Display is discussed. The Pilot's Display is used in conjunction with flight inspection tests that certify the Microwave Scanning Beam Landing System used at Space Shuttle landing facilities throughout the world. The Pilot's Display was developed for the pilot of test aircraft to set up and fly a given test flight path determined by the flight inspection test engineers. This display also aids the aircraft pilot when hazy or cloud cover conditions exist that limit the pilot's visibility of the Shuttle runway during the flight inspection. The aircraft position is calculated using the Global Positioning System and displayed in the cockpit on a graphical display.

  17. Integrating multiple scientific computing needs via a Private Cloud infrastructure

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Brunetti, R.; Lusso, S.; Vallero, S.

    2014-06-01

    In a typical scientific computing centre, diverse applications coexist and share a single physical infrastructure. An underlying Private Cloud facility eases the management and maintenance of heterogeneous use cases such as multipurpose or application-specific batch farms, Grid sites catering to different communities, parallel interactive data analysis facilities and others. It makes it possible to allocate resources to any application dynamically and efficiently, and to tailor the virtual machines according to the applications' requirements. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques; for example, rolling updates can be performed easily and with minimal downtime. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 site, a dynamically expandable PROOF-based Interactive Analysis Facility for the ALICE experiment at the CERN LHC, and several smaller scientific computing applications. The Private Cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem (used in two different configurations for worker- and service-class hypervisors) and the OpenWRT Linux distribution (used for network virtualization). A future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and by using mainstream contextualization tools like CloudInit.
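
    Contextualization of the kind mentioned above typically works by handing the hypervisor a user-data document that the virtual machine executes on first boot; EC2-style APIs expect it base64-encoded in the request. The script contents and helper names below are a hypothetical sketch, not the INFN-Torino setup:

```python
import base64

def build_user_data(role: str, wlcg_site: str) -> str:
    """Build a minimal cloud-init user-data document that contextualises a
    freshly booted VM for a given role (e.g. a batch worker). The hostname
    scheme and the configure-node helper are invented for illustration."""
    return "\n".join([
        "#cloud-config",
        f"hostname: {role}-node",
        "runcmd:",
        f"  - /usr/local/sbin/configure-node --role {role} --site {wlcg_site}",
    ])

def encode_for_ec2(user_data: str) -> str:
    # EC2-style APIs carry user data base64-encoded in the launch request.
    return base64.b64encode(user_data.encode()).decode()
```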

  18. Design of a small laser ceilometer and visibility measuring device for helicopter landing sites

    NASA Astrophysics Data System (ADS)

    Streicher, Jurgen; Werner, Christian; Dittel, Walter

    2004-01-01

    Hardware development for remote sensing costs a great deal of time and money. A virtual instrument based on software modules was therefore developed to optimise a small visibility and cloud base height sensor. Visibility is the parameter describing the turbidity of the atmosphere. It can be determined either as a mean value over a path, measured by a transmissometer, or for each point of the atmosphere, from the backscattered intensity of a range-resolved lidar measurement. A standard ceilometer detects the altitude of clouds by using the runtime of the laser pulse and the increasing intensity of the backscattered light when it hits the boundary of a cloud. This corresponds to hard-target range finding, but with more sensitive detection. In case of cloud coverage, the output of a standard ceilometer is the altitude of one or more cloud layers. Commercial cloud sensors are specified to track cloud altitude at rather large distances (100 m up to 10 km) and are therefore big and expensive. A virtual instrument was used to calculate the system parameters for a small system for heliports at hospitals and landing platforms operating under visual flight rules (VFR). Helicopter pilots need information about cloud altitude (base not below 500 feet) and/or the visibility conditions (visual range not below 600 m) at the designated landing point. Private pilots need this information too when approaching a non-commercial airport. Both values can be measured automatically with the small, compact prototype developed here, which is the size of a shoebox and reasonably priced.
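
    The ceilometer's core computation, and the VFR minima quoted above, are simple to state: the range to the cloud base is c·t/2 for a pulse round-trip time t. A minimal sketch:

```python
C = 299_792_458.0  # speed of light, m/s
FT_PER_M = 3.28084

def cloud_base_m(pulse_round_trip_s: float) -> float:
    """Cloud base range from the laser-pulse round-trip time, as in a
    standard ceilometer: range = c * t / 2."""
    return C * pulse_round_trip_s / 2.0

def vfr_landing_ok(cloud_base_height_m: float, visibility_m: float) -> bool:
    """Check the helicopter VFR minima quoted in the abstract:
    cloud base not below 500 ft and visual range not below 600 m."""
    return cloud_base_height_m * FT_PER_M >= 500.0 and visibility_m >= 600.0
```

    For example, a 2 µs round trip corresponds to a cloud base near 300 m, comfortably above the 500 ft minimum.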

  19. Pilot Judgment Training and Evaluation. Volume 3.

    DTIC Science & Technology

    1982-06-01

    Information Manual. ... Flight computer. ... Basic navigation: aeronautical charts (sectional and world aeronautical charts); airspace... clouds, traffic, etc., when you needed to and still maintained the course. ... INSTRUCTOR LESSON PLAN PART I...maintain basic VFR. PART III Observable Behavior Sought: The student will make proper diversions from clouds to maintain basic VFR. PART IV Reinforcements

  20. A world-wide databridge supported by a commercial cloud provider

    NASA Astrophysics Data System (ADS)

    Tat Cheung, Kwong; Field, Laurence; Furano, Fabrizio

    2017-10-01

    Volunteer computing has the potential to provide significant additional computing capacity for the LHC experiments. One of the challenges with exploiting volunteer computing is to support a global community of volunteers that provides heterogeneous resources. However, high energy physics applications require more data input and output than the CPU-intensive applications typically run by other volunteer computing projects. While the so-called databridge has already been successfully proposed as a method to span the untrusted and trusted domains of volunteer computing and Grid computing respectively, globally transferring data between potentially poor-performing residential networks and CERN could be unreliable, leading to wasted resource usage. The expectation is that by placing a storage endpoint that is part of a wider, flexible geographical databridge deployment closer to the volunteers, the transfer success rate and the overall performance can be improved. This contribution investigates the provision of a globally distributed databridge implemented upon a commercial cloud provider.
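
    The endpoint-placement idea reduces to steering each volunteer to the "closest" storage endpoint. A toy selector based on measured round-trip times (endpoint names invented, not the actual deployment):

```python
def pick_endpoint(endpoints, rtt_ms):
    """Choose the storage endpoint closest to a volunteer, here measured by
    round-trip time; endpoints with no measurement are treated as farthest.
    This illustrates the placement idea only."""
    return min(endpoints, key=lambda ep: rtt_ms.get(ep, float("inf")))
```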

  1. Electron cloud buildup driving spontaneous vertical instabilities of stored beams in the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Romano, Annalisa; Boine-Frankenheim, Oliver; Buffat, Xavier; Iadarola, Giovanni; Rumolo, Giovanni

    2018-06-01

    At the beginning of the 2016 run, an anomalous beam instability was systematically observed at the CERN Large Hadron Collider (LHC). Its main characteristic was that it spontaneously appeared after beams had been stored for several hours in collision at 6.5 TeV to provide data for the experiments, despite large chromaticity values and high strength of the Landau-damping octupole magnet. The instability exhibited several features characteristic of those induced by the electron cloud (EC). Indeed, when LHC operates with 25 ns bunch spacing, an EC builds up in a large fraction of the beam chambers, as revealed by several independent indicators. Numerical simulations have been carried out in order to investigate the role of the EC in the observed instabilities. It has been found that the beam intensity decay is unfavorable for the beam stability when LHC operates in a strong EC regime.

  2. A cloud based tool for knowledge exchange on local scale flood risk.

    PubMed

    Wilkinson, M E; Mackay, E; Quinn, P F; Stutter, M; Beven, K J; MacLeod, C J A; Macklin, M G; Elkhatib, Y; Percy, B; Vitolo, C; Haygarth, P M

    2015-09-15

    There is an emerging and urgent need for new approaches for the management of environmental challenges such as flood hazard in the broad context of sustainability. This requires a new way of working which bridges disciplines and organisations, and that breaks down science-culture boundaries. With this, there is growing recognition that the appropriate involvement of local communities in catchment management decisions can result in multiple benefits. However, new tools are required to connect organisations and communities. The growth of cloud based technologies offers a novel way to facilitate this process of exchange of information in environmental science and management; however, stakeholders need to be engaged with as part of the development process from the beginning rather than being presented with a final product at the end. Here we present the development of a pilot Local Environmental Virtual Observatory Flooding Tool. The aim was to develop a cloud based learning platform for stakeholders, bringing together fragmented data, models and visualisation tools that will enable these stakeholders to make scientifically informed environmental management decisions at the local scale. It has been developed by engaging with different stakeholder groups in three catchment case studies in the UK and a panel of national experts in relevant topic areas. However, these case study catchments are typical of many northern latitude catchments. The tool was designed to communicate flood risk in locally impacted communities whilst engaging with landowners/farmers about the risk of runoff from the farmed landscape. It has been developed iteratively to reflect the needs, interests and capabilities of a wide range of stakeholders. The pilot tool combines cloud based services, local catchment datasets, a hydrological model and bespoke visualisation tools to explore real time hydrometric data and the impact of flood risk caused by future land use changes. 
The novel aspects of the pilot tool are: the co-evolution of tools on a cloud based platform with stakeholders, policy makers and scientists; encouraging different science disciplines to work together; a wealth of information that is accessible and understandable to a range of stakeholders; and a framework for how to approach the development of such a cloud based tool in the future. Above all, stakeholders saw the tool and the potential of cloud technologies as an effective means of taking a whole-systems approach to solving environmental issues. This sense of community ownership is essential in order to facilitate future appropriate and acceptable land use management decisions being co-developed by local catchment communities. The development processes and the resulting pilot tool could be applied to local catchments globally to facilitate bottom-up catchment management approaches. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Integration of Cloud Technologies for Data Stewardship at the NOAA National Centers for Environmental Information (NCEI)

    NASA Astrophysics Data System (ADS)

    Casey, K. S.; Hausman, S. A.

    2016-02-01

    In the last year, the NOAA National Oceanographic Data Center (NODC) and its siblings, the National Climatic Data Center and National Geophysical Data Center, were merged into one organization, the NOAA National Centers for Environmental Information (NCEI). Combining its expertise under one management has helped NCEI accelerate its efforts to embrace and integrate private, public, and hybrid cloud environments into its range of data stewardship services. These services span a range of tiers, from basic, long-term preservation and access, through enhanced access and scientific quality control, to authoritative product development and international-level services. Throughout these tiers of stewardship, partnerships and pilot projects have been launched to identify technological and policy-oriented challenges, to establish solutions to these problems, and to highlight success stories for emulation during operational integration of the cloud into NCEI's data stewardship activities. Some of these pilot activities include data storage, access, and reprocessing in Amazon Web Services; the OneStop data discovery and access framework project; and a set of Cooperative Research and Development Agreements under the Big Data Project with Amazon, Google, IBM, Microsoft, and the Open Cloud Consortium. Progress in these efforts will be highlighted along with a future vision of how NCEI could leverage hybrid cloud deployments and federated systems across NOAA to enable effective data stewardship for its oceanographic, atmospheric, climatic, and geophysical Big Data.

  4. Diagnosing turbulence for research aircraft safety using open source toolkits

    NASA Astrophysics Data System (ADS)

    Lang, T. J.; Guy, N.

    Open source software toolkits have been developed and applied to diagnose in-cloud turbulence in the vicinity of Earth science research aircraft, via analysis of ground-based Doppler radar data. Based on multiple retrospective analyses, these toolkits show promise for detecting significant turbulence well prior to cloud penetrations by research aircraft. A pilot study demonstrated the ability to provide mission scientists turbulence estimates in near real time during an actual field campaign, and thus these toolkits are recommended for usage in future cloud-penetrating aircraft field campaigns.
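
    As a minimal illustration of such a diagnostic (not the authors' toolkits): in-cloud turbulence can be flagged where the Doppler spectrum width exceeds a threshold, with missing gates left unflagged. The 4 m/s threshold here is an arbitrary placeholder:

```python
def flag_turbulence(spectrum_width_ms, threshold_ms=4.0):
    """Flag radar gates whose Doppler spectrum width (m/s) exceeds a
    threshold, a simple proxy for in-cloud turbulence. None marks a gate
    with no valid measurement and is never flagged."""
    return [w is not None and w >= threshold_ms for w in spectrum_width_ms]
```

    A real diagnostic would work on gridded radar volumes and combine several moments, but the thresholding step is the core of turning radar data into a go/no-go signal for a flight track.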

  5. Managing virtual machines with Vac and Vcycle

    NASA Astrophysics Data System (ADS)

    McNab, A.; Love, P.; MacMahon, E.

    2015-12-01

    We compare the Vac and Vcycle virtual machine lifecycle managers and our experiences in providing production job execution services for ATLAS, CMS, LHCb, and the GridPP VO at sites in the UK, France and at CERN. In both the Vac and Vcycle systems, the virtual machines are created outside of the experiment's job submission and pilot framework. In the case of Vac, a daemon runs on each physical host which manages a pool of virtual machines on that host, and a peer-to-peer UDP protocol is used to achieve the desired target shares between experiments across the site. In the case of Vcycle, a daemon manages a pool of virtual machines on an Infrastructure-as-a-Service cloud system such as OpenStack, and has within itself enough information to create the types of virtual machines to achieve the desired target shares. Both systems allow unused shares for one experiment to be temporarily taken up by other experiments with work to be done. The virtual machine lifecycle is managed with a minimum of information, gathered from the virtual machine creation mechanism (such as libvirt or OpenStack) and using the proposed Machine/Job Features API from WLCG. We demonstrate that the same virtual machine designs can be used to run production jobs on Vac and Vcycle/OpenStack sites for ATLAS, CMS, LHCb, and GridPP, and that these technologies allow sites to be operated in a reliable and robust way.
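
    The target-share logic common to both systems can be sketched as picking, at each VM creation opportunity, the experiment with the largest deficit between its target share and its current share of running VMs; this also lets idle shares flow to experiments with work. Shares and experiment names below are illustrative, not a site's configuration:

```python
def next_vm_type(target_share, running):
    """Return the experiment whose share deficit is largest.

    target_share: {experiment: fraction summing to 1.0}
    running:      {experiment: number of VMs currently running}
    """
    total = sum(running.values()) or 1  # avoid division by zero on an idle site
    def deficit(exp):
        return target_share[exp] - running.get(exp, 0) / total
    return max(target_share, key=deficit)
```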

  6. Recent results and prospects for NA62 experiment

    NASA Astrophysics Data System (ADS)

    Martellotti, Silvia; Aglieri Rinella, G.; Aliberti, R.; Ambrosino, F.; Ammendola, R.; Angelucci, B.; Antonelli, A.; Anzivino, G.; Arcidiacono, R.; Azhinenko, I.; Balev, S.; Barbanera, M.; Bendotti, J.; Biagioni, A.; Bician, L.; Biino, C.; Bizzeti, A.; Blazek, T.; Blik, A.; Bloch-Devaux, B.; Bolotov, V.; Bonaiuto, V.; Boretto, M.; Bragadireanu, M.; Britton, D.; Britvich, G.; Brunetti, M. B.; Bryman, D.; Bucci, F.; Butin, F.; Capitolo, E.; Capoccia, C.; Capussela, T.; Cassese, A.; Catinaccio, A.; Cecchetti, A.; Ceccucci, A.; Cenci, P.; Cerny, V.; Cerri, C.; Checcucci, B.; Chikilev, O.; Chiozzi, S.; Ciaranfi, R.; Collazuol, G.; Conovaloff, A.; Cooke, P.; Cooper, P.; Corradi, G.; Cortina Gil, E.; Costantini, F.; Cotorobai, F.; Cotta Ramusino, A.; Coward, D.; D'Agostini, G.; Dainton, J.; Dalpiaz, P.; Danielsson, H.; Degrange, J.; De Simone, N.; Di Filippo, D.; Di Lella, L.; Di Lorenzo, S.; Dixon, N.; Doble, N.; Dobrich, B.; Duk, V.; Elsha, V.; Engelfried, J.; Enik, T.; Estrada, N.; Falaleev, V.; Fantechi, R.; Fascianelli, V.; Federici, L.; Fedotov, S.; Fiorini, M.; Fry, J.; Fu, J.; Fucci, A.; Fulton, L.; Gallorini, S.; Galeotti, S.; Gamberini, E.; Gatignon, L.; Georgiev, G.; Gianoli, A.; Giorgi, M.; Giudici, S.; Glonti, L.; Goncalves Martins, A.; Gonnella, F.; Goudzovski, E.; Guida, R.; Gushchin, E.; Hahn, F.; Hallgren, B.; Heath, H.; Herman, F.; Husek, T.; Hutanu, O.; Hutchcroft, D.; Iacobuzio, L.; Iacopini, E.; Imbergamo, E.; Jamet, O.; Jarron, P.; Jones, E.; Kampf, K.; Kaplon, J.; Kekelidze, V.; Kholodenko, S.; Khoriauli, G.; Khotyantsev, A.; Khudyakov, A.; Kiryushin, Yu.; Kleimenova, A.; Kleinknecht, K.; Kluge, A.; Koval, M.; Kozhuharov, V.; Krivda, M.; Kucerova, Z.; Kudenko, Y.; Kunze, J.; Lamanna, G.; Latino, G.; Lazzeroni, C.; Lehmann-Miotto, G.; Lenci, R.; Lenti, M.; Leonardi, E.; Lichard, P.; Lietava, R.; Litov, L.; Lollini, R.; Lomidze, D.; Lonardo, A.; Lupi, M.; Lurkin, N.; McCormick, K.; Madigozhin, D.; Maire, G.; Mandeiro, C.; Mannelli, I.; Mannocchi, 
G.; Mapelli, A.; Marchetto, F.; Marchevski, R.; Martellotti, S.; Massarotti, P.; Massri, K.; Matak, P.; Maurice, E.; Mefodev, A.; Menichetti, E.; Minucci, E.; Mirra, M.; Misheva, M.; Molokanova, N.; Morant, J.; Morel, M.; Moulson, M.; Movchan, S.; Munday, D.; Napolitano, M.; Neri, I.; Newson, F.; Norton, A.; Noy, M.; Nuessle, G.; Numao, T.; Obraztsov, V.; Ostankov, A.; Padolski, S.; Page, R.; Palladino, V.; Paoluzzi, G.; Parkinson, C.; Pedreschi, E.; Pepe, M.; Perez Gomez, F.; Perrin-Terrin, M.; Peruzzo, L.; Petrov, P.; Petrucci, F.; Piandani, R.; Piccini, M.; Pietreanu, D.; Pinzino, J.; Polenkevich, I.; Pontisso, L.; Potrebenikov, Yu.; Protopopescu, D.; Raffaelli, F.; Raggi, M.; Riedler, P.; Romano, A.; Rubin, P.; Ruggiero, G.; Russo, V.; Ryjov, V.; Salamon, A.; Salina, G.; Samsonov, V.; Santoni, C.; Saracino, G.; Sargeni, F.; Semenov, V.; Sergi, A.; Serra, M.; Shaikhiev, A.; Shkarovskiy, S.; Skillicorn, I.; Soldi, D.; Sotnikov, A.; Sougonyaev, V.; Sozzi, M.; Spadaro, T.; Spinella, F.; Staley, R.; Sturgess, A.; Sutcliffe, P.; Szilasi, N.; Tagnani, D.; Trilov, S.; Valdata-Nappi, M.; Valente, P.; Vasile, M.; Vassilieva, T.; Velghe, B.; Veltri, M.; Venditti, S.; Vicini, P.; Volpe, R.; Vormstein, M.; Wahl, H.; Wanke, R.; Wertelaers, P.; Winhart, A.; Winston, R.; Wrona, B.; Yushchenko, O.; Zamkovsky, M.; Zinchenko, A.; NA62 Collaboration

    2017-04-01

    The K+ → π+ νν̄ decay is theoretically one of the cleanest meson decays and so a good place to look for indirect effects of new physics complementary to LHC searches. The NA62 experiment at CERN is designed to measure the branching ratio of this decay with 10% precision. NA62 was commissioned in October 2014 and took data in pilot runs in 2014 and 2015. The NA62 experimental setup is illustrated and the data quality is reported.

  7. Integration of Satellite-Derived Cloud Phase, Cloud Top Height, and Liquid Water Path into an Operational Aircraft Icing Nowcasting System

    NASA Technical Reports Server (NTRS)

    Haggerty, Julie; McDonough, Frank; Black, Jennifer; Landott, Scott; Wolff, Cory; Mueller, Steven; Minnis, Patrick; Smith, William, Jr.

    2008-01-01

    Operational products used by the U.S. Federal Aviation Administration to alert pilots of hazardous icing provide nowcast and short-term forecast estimates of the potential for the presence of supercooled liquid water and supercooled large droplets. The Current Icing Product (CIP) system employs basic satellite-derived information, including a cloud mask and cloud top temperature estimates, together with multiple other data sources to produce a gridded, three-dimensional, hourly depiction of icing probability and severity. Advanced satellite-derived cloud products developed at the NASA Langley Research Center (LaRC) provide a more detailed description of cloud properties (primarily at cloud top) compared to the basic satellite-derived information used currently in CIP. Cloud hydrometeor phase, liquid water path, cloud effective temperature, and cloud top height as estimated by the LaRC algorithms are integrated into the CIP fuzzy logic scheme and a confidence value is determined. Examples of CIP products before and after the integration of the LaRC satellite-derived products will be presented at the conference.
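
    A fuzzy-logic combination of this kind is, at its core, a weighted blend of per-field interest values in [0, 1]. The following generic sketch illustrates the idea; the field names and weighting scheme are illustrative, not CIP's actual membership functions:

```python
def combine_interest(interest_maps, weights=None):
    """Weighted fuzzy-logic combination of per-field icing interest values,
    each already mapped into [0, 1] (e.g. from cloud phase, liquid water
    path, cloud top temperature). Returns a combined interest in [0, 1]."""
    fields = list(interest_maps)
    if weights is None:
        weights = {f: 1.0 for f in fields}   # equal weighting by default
    total_w = sum(weights[f] for f in fields)
    score = sum(weights[f] * interest_maps[f] for f in fields) / total_w
    return max(0.0, min(1.0, score))
```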

  8. A novel mobile-cloud system for capturing and analyzing wheelchair maneuvering data: A pilot study.

    PubMed

    Fu, Jicheng; Jones, Maria; Liu, Tao; Hao, Wei; Yan, Yuqing; Qian, Gang; Jan, Yih-Kuen

    2016-01-01

    The purpose of this pilot study was to provide a new approach for capturing and analyzing wheelchair maneuvering data, which are critical for evaluating wheelchair users' activity levels. We proposed a mobile-cloud (MC) system, which incorporated the emerging mobile and cloud computing technologies. The MC system employed smartphone sensors to collect wheelchair maneuvering data and transmit them to the cloud for storage and analysis. A k-nearest neighbor (KNN) machine-learning algorithm was developed to mitigate the impact of sensor noise and recognize wheelchair maneuvering patterns. We conducted 30 trials in an indoor setting, where each trial contained 10 bouts (i.e., periods of continuous wheelchair movement). We also verified our approach in a different building. Different from existing approaches that require sensors to be attached to wheelchairs' wheels, we placed the smartphone into a smartphone holder attached to the wheelchair. Experimental results illustrate that our approach correctly identified all 300 bouts. Compared to existing approaches, our approach was easier to use while achieving similar accuracy in analyzing the accumulated movement time and maximum period of continuous movement (p > 0.8). Overall, the MC system provided a feasible way to ease the data collection process and generated accurate analysis results for evaluating activity levels.
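
    The KNN step can be illustrated with a minimal classifier over simple accelerometer-derived features (say, mean magnitude and variance of a window); this is a generic sketch, not the study's trained model, and the feature choices are assumptions:

```python
from collections import Counter

def knn_classify(train, query, k=3):
    """Minimal k-nearest-neighbour classifier: 'train' is a list of
    (feature_vector, label) pairs, e.g. (mean accel magnitude, variance)
    labelled 'moving' or 'still'. Majority vote over the k nearest points
    smooths out individual noisy samples."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda fv_lab: dist2(fv_lab[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

    The majority vote is what gives KNN its noise robustness here: a single spurious sensor reading near the decision boundary is outvoted by its neighbours.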

  9. A Novel Mobile-Cloud System for Capturing and Analyzing Wheelchair Maneuvering Data: A Pilot Study

    PubMed Central

    Fu, Jicheng; Jones, Maria; Liu, Tao; Hao, Wei; Yan, Yuqing; Qian, Gang; Jan, Yih-Kuen

    2016-01-01

    The purpose of this pilot study was to provide a new approach for capturing and analyzing wheelchair maneuvering data, which are critical for evaluating wheelchair users’ activity levels. We proposed a mobile-cloud (MC) system, which incorporated the emerging mobile and cloud computing technologies. The MC system employed smartphone sensors to collect wheelchair maneuvering data and transmit them to the cloud for storage and analysis. A K-Nearest-Neighbor (KNN) machine-learning algorithm was developed to mitigate the impact of sensor noise and recognize wheelchair maneuvering patterns. We conducted 30 trials in an indoor setting, where each trial contained 10 bouts (i.e., periods of continuous wheelchair movement). We also verified our approach in a different building. Different from existing approaches that require sensors to be attached to wheelchairs’ wheels, we placed the smartphone into a smartphone holder attached to the wheelchair. Experimental results illustrate that our approach correctly identified all 300 bouts. Compared to existing approaches, our approach was easier to use while achieving similar accuracy in analyzing the accumulated movement time and maximum period of continuous movement (p > 0.8). Overall, the MC system provided a feasible way to ease the data collection process, and generated accurate analysis results for evaluating activity levels. PMID:26479684

  10. Space-charge effects in Penning ion traps

    NASA Astrophysics Data System (ADS)

    Porobić, T.; Beck, M.; Breitenfeldt, M.; Couratin, C.; Finlay, P.; Knecht, A.; Fabian, X.; Friedag, P.; Fléchard, X.; Liénard, E.; Ban, G.; Zákoucký, D.; Soti, G.; Van Gorp, S.; Weinheimer, Ch.; Wursten, E.; Severijns, N.

    2015-06-01

    The influence of space-charge on ion cyclotron resonances and magnetron eigenfrequency in a gas-filled Penning ion trap has been investigated. Off-line measurements with 39K+ using the cooling trap of the WITCH retardation spectrometer-based setup at ISOLDE/CERN were performed. Experimental ion cyclotron resonances were compared with ab initio Coulomb simulations and found to be in agreement. As an important systematic effect of the WITCH experiment, the magnetron eigenfrequency of the ion cloud was studied under increasing space-charge conditions. Finally, the helium buffer gas pressure in the Penning trap was determined by comparing experimental cooling rates with simulations.
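
    For reference, the unperturbed single-ion eigenfrequencies of an ideal Penning trap, which space charge shifts, are the textbook expressions

```latex
\omega_c = \frac{qB}{m}, \qquad
\omega_\pm = \frac{\omega_c}{2} \pm \sqrt{\frac{\omega_c^2}{4} - \frac{\omega_z^2}{2}},
```

    where \omega_z is the axial frequency, \omega_+ the reduced cyclotron frequency and \omega_- the magnetron frequency, with the invariance \omega_+ + \omega_- = \omega_c. The measured shifts of \omega_- under increasing ion number quantify the space-charge systematic.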

  11. The analysis of polar clouds from AVHRR satellite data using pattern recognition techniques

    NASA Technical Reports Server (NTRS)

    Smith, William L.; Ebert, Elizabeth

    1990-01-01

    The cloud cover in a set of summertime and wintertime AVHRR data from the Arctic and Antarctic regions was analyzed using a pattern recognition algorithm. The data were collected by the NOAA-7 satellite on 6 to 13 Jan. and 1 to 7 Jul. 1984 between 60 deg and 90 deg north and south latitude in 5 spectral channels, at the Global Area Coverage (GAC) resolution of approximately 4 km. This data embodied a Polar Cloud Pilot Data Set which was analyzed by a number of research groups as part of a polar cloud algorithm intercomparison study. This study was intended to determine whether the additional information contained in the AVHRR channels (beyond the standard visible and infrared bands on geostationary satellites) could be effectively utilized in cloud algorithms to resolve some of the cloud detection problems caused by low visible and thermal contrasts in the polar regions. The analysis described makes use of a pattern recognition algorithm which estimates the surface and cloud classification, cloud fraction, and surface and cloudy visible (channel 1) albedo and infrared (channel 4) brightness temperatures on a 2.5 x 2.5 deg latitude-longitude grid. In each grid box several spectral and textural features were computed from the calibrated pixel values in the multispectral imagery, then used to classify the region into one of eighteen surface and/or cloud types using the maximum likelihood decision rule. A slightly different version of the algorithm was used for each season and hemisphere because of differences in categories and because of the lack of visible imagery during winter. The classification of the scene is used to specify the optimal AVHRR channel for separating clear and cloudy pixels using a hybrid histogram-spatial coherence method. This method estimates values for cloud fraction, clear and cloudy albedos and brightness temperatures in each grid box. 
The choice of a class-dependent AVHRR channel allows for better separation of clear and cloudy pixels than does a global choice of a visible and/or infrared threshold. The classification also prevents erroneous estimates of large fractional cloudiness in areas of cloudfree snow and sea ice. The hybrid histogram-spatial coherence technique and the advantages of first classifying a scene in the polar regions are detailed. The complete Polar Cloud Pilot Data Set was analyzed and the results are presented and discussed.
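The maximum likelihood decision rule named in this record can be sketched generically: describe each grid box by a feature vector, fit a Gaussian model per class, and assign the class with the highest log-likelihood. This is a minimal illustration only; the class names, features, and Gaussian assumption below are placeholders, not the paper's actual eighteen-category scheme or its textural feature definitions.

```python
import numpy as np

def train_class_stats(features_by_class):
    """Estimate a mean vector and covariance matrix per class from training samples."""
    stats = {}
    for name, samples in features_by_class.items():
        x = np.asarray(samples, dtype=float)
        stats[name] = (x.mean(axis=0), np.cov(x, rowvar=False))
    return stats

def classify(feature_vec, stats):
    """Assign the class whose Gaussian model maximizes the log-likelihood."""
    x = np.asarray(feature_vec, dtype=float)
    best_name, best_ll = None, -np.inf
    for name, (mu, cov) in stats.items():
        d = x - mu
        _, logdet = np.linalg.slogdet(cov)
        # Gaussian log-likelihood up to a class-independent constant.
        ll = -0.5 * (d @ np.linalg.solve(cov, d) + logdet)
        if ll > best_ll:
            best_name, best_ll = name, ll
    return best_name
```

With well-separated class statistics (say, a bright cold "snow" class versus a darker warmer "cloud" class in albedo/brightness-temperature space), feature vectors near a class mean are assigned to that class.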

  12. Experience from the 1st Year running a Massive High Quality Videoconferencing Service for the LHC

    NASA Astrophysics Data System (ADS)

    Fernandes, Joao; Baron, Thomas; Bompastor, Bruno

    2014-06-01

    In the last few years, we have witnessed an explosion of visual collaboration initiatives in the industry. Several advances in video services and also in their underlying infrastructure are currently improving the way people collaborate globally. These advances are creating new usage paradigms: any device in any network can be used to collaborate, in most cases with an overall high quality. To keep pace with this technological progression, the CERN IT Department launched a service based on the Vidyo product. This new service architecture introduces Adaptive Video Layering, which dynamically optimizes the video for each endpoint by leveraging the H.264 Scalable Video Coding (SVC)-based compression technology. It combines intelligent AV routing techniques with the flexibility of H.264 SVC video compression, in order to achieve resilient video collaboration over the Internet, 3G and WiFi. We present an overview of the results that have been achieved after this major change. In particular, the first year of operation of the CERN Vidyo service will be described in terms of performance and scale: the service became part of the daily activity of the LHC collaborations, reaching a monthly usage of more than 3200 meetings with a peak of 750 simultaneous connections. We also present some key features such as the integration with CERN Indico: LHC users can now join a Vidyo meeting either from their personal computer or a CERN videoconference room simply from an Indico event page, with the ease of a single click. The roadmap for future improvements, service extensions and core infrastructure tendencies such as cloud based services and virtualization of system components will also be discussed. Vidyo's strengths allowed us to build a universal service (accessible from PCs, but also videoconference rooms, traditional phones, tablets and smartphones), developed with three key ideas in mind: ease of use, full integration and high quality.

  13. STS-41G earth observations

    NASA Image and Video Library

    1984-10-07

    41G-39-044 (5-13 Oct 1984) --- "Flatirons", cumulonimbus clouds that have flattened out at a high altitude, the result of rapidly rising moist air. At a given altitude, depending on the temperature, wind, and humidity, the cloud mass can no longer rise and the wind aloft shears the cloud. Central Nigeria, an area in which tropical rain forest gives way to drier savannah lands, lies beneath a layer of heavy haze and smoke. The crew consisted of astronauts Robert L. Crippen, commander; Jon A. McBride, pilot; mission specialists Kathryn D. Sullivan, Sally K. Ride, and David D. Leestma; Canadian astronaut Marc Garneau; and payload specialist Paul D. Scully-Power.

  14. Cloud access to interoperable IVOA-compliant VOSpace storage

    NASA Astrophysics Data System (ADS)

    Bertocco, S.; Dowler, P.; Gaudet, S.; Major, B.; Pasian, F.; Taffoni, G.

    2018-07-01

    Handling, processing and archiving the huge amount of data produced by the new generation of experiments and instruments in Astronomy and Astrophysics are among the most exciting challenges to address in designing the future data management infrastructures and computing services. We investigated the feasibility of a data management and computation infrastructure, available world-wide, with the aim of merging the FAIR data management provided by IVOA standards with the efficiency and reliability of a cloud approach. Our work involved the Canadian Advanced Network for Astronomy Research (CANFAR) infrastructure and the European EGI federated cloud (EFC). We designed and deployed a pilot data management and computation infrastructure that provides IVOA-compliant VOSpace storage resources and wide access to interoperable federated clouds. In this paper, we detail the main user requirements covered, the technical choices and the implemented solutions, and we describe the resulting worldwide hybrid-cloud infrastructure, its benefits and limitations.

  15. Weather in the Cockpit: Priorities, Sources, Delivery, and Needs in the Next Generation Air Transportation System

    DTIC Science & Technology

    2012-07-01

    discomfort. Extreme turbulence could cause physical injuries to pilot/passengers who are not wearing seat belts. Clear Air Turbulence (CAT) CAT... passengers who are not wearing seat belts. Generally caused by wind shear in the atmosphere where no clouds are present. Mountain Waves Fast...ways in which our analyses could inform the design of information systems. NextGen, in its mature state, envisions pilots having control over

  16. PREFACE: International Conference on Computing in High Energy and Nuclear Physics (CHEP 2012)

    NASA Astrophysics Data System (ADS)

    Ernst, Michael; Düllmann, Dirk; Rind, Ofer; Wong, Tony

    2012-12-01

    The International Conference on Computing in High Energy and Nuclear Physics (CHEP) was held at New York University on 21-25 May 2012. CHEP is a major series of international conferences for physicists and computing professionals from the High Energy and Nuclear Physics community and related scientific and technical fields. The CHEP conference provides a forum to exchange information on computing progress and needs for the community, and to review recent, ongoing and future activities. CHEP conferences are held at roughly 18-month intervals, alternating between Europe, Asia, the Americas and other parts of the world. Recent CHEP conferences have been held in Taipei, Taiwan (2010); Prague, Czech Republic (2009); Victoria, Canada (2007); Mumbai, India (2006); Interlaken, Switzerland (2004); San Diego, United States (2003); Beijing, China (2001); Padova, Italy (2000). CHEP 2012 was organized by Brookhaven National Laboratory (BNL) and co-sponsored by New York University. The organizational structure for CHEP consists of an International Advisory Committee (IAC) which sets the overall themes of the conference, a Program Organizing Committee (POC) that oversees the program content, and a Local Organizing Committee (LOC) that is responsible for local arrangements (lodging, transportation and social events) and conference logistics (registration, program scheduling, conference site selection and conference proceedings). There were over 500 attendees with a program that included plenary sessions of invited speakers, a number of parallel sessions comprising around 125 oral and 425 poster presentations, and industrial exhibitions. We thank all the presenters for the excellent scientific content of their contributions to the conference.
    Conference tracks covered topics on Online Computing, Event Processing, Distributed Processing and Analysis on Grids and Clouds, Computer Facilities, Production Grids and Networking, Software Engineering, Data Stores and Databases and Collaborative Tools. We would like to thank Brookhaven Science Associates, New York University, Blue Nest Events, the International Advisory Committee, the Program Committee and the Local Organizing Committee members for all their support and assistance. We also would like to acknowledge the support provided by the following sponsors: ACEOLE, Data Direct Networks, Dell, the European Middleware Initiative and Nexsan. Special thanks to the Program Committee members for their careful choice of conference contributions and enormous effort in reviewing and editing the conference proceedings. The next CHEP conference will be held in Amsterdam, the Netherlands on 14-18 October 2013.
    Conference Chair: Michael Ernst (BNL)
    Program Committee: Daniele Bonacorsi, University of Bologna, Italy; Simone Campana, CERN, Switzerland; Philippe Canal, Fermilab, United States; Sylvain Chapeland, CERN, Switzerland; Dirk Düllmann, CERN, Switzerland; Johannes Elmsheuser, Ludwig Maximilian University of Munich, Germany; Maria Girone, CERN, Switzerland; Steven Goldfarb, University of Michigan, United States; Oliver Gutsche, Fermilab, United States; Benedikt Hegner, CERN, Switzerland; Andreas Heiss, Karlsruhe Institute of Technology, Germany; Peter Hristov, CERN, Switzerland; Tony Johnson, SLAC, United States; David Lange, LLNL, United States; Adam Lyon, Fermilab, United States; Remigius Mommsen, Fermilab, United States; Axel Naumann, CERN, Switzerland; Niko Neufeld, CERN, Switzerland; Rolf Seuster, TRIUMF, Canada
    Local Organizing Committee: Maureen Anderson, John De Stefano, Mariette Faulkner, Ognian Novakov, Ofer Rind, Tony Wong (BNL); Kyle Cranmer (NYU)
    International Advisory Committee: Mohammad Al-Turany, GSI, Germany; Lothar Bauerdick, Fermilab, United States; Ian Bird, CERN, Switzerland; Dominique Boutigny, IN2P3, France; Federico Carminati, CERN, Switzerland; Marco Cattaneo, CERN, Switzerland; Gang Chen, Institute of High Energy Physics, China; Peter Clarke, University of Edinburgh, United Kingdom; Sridhara Dasu, University of Wisconsin-Madison, United States; Günter Duckeck, Ludwig Maximilian University of Munich, Germany; Richard Dubois, SLAC, United States; Michael Ernst, BNL, United States; Ian Fisk, Fermilab, United States; Gonzalo Merino, PIC, Spain; John Gordon, STFC-RAL, United Kingdom; Volker Gülzow, DESY, Germany; Frederic Hemmer, CERN, Switzerland; Viatcheslav Ilyin, Moscow State University, Russia; Nobuhiko Katayama, KEK, Japan; Alexei Klimentov, BNL, United States; Simon C. Lin, Academia Sinica, Taiwan; Milos Lokajícek, FZU Prague, Czech Republic; David Malon, ANL, United States; Pere Mato Vila, CERN, Switzerland; Mauro Morandin, INFN CNAF, Italy; Harvey Newman, Caltech, United States; Farid Ould-Saada, University of Oslo, Norway; Ruth Pordes, Fermilab, United States; Hiroshi Sakamoto, University of Tokyo, Japan; Alberto Santoro, UERJ, Brazil; Jim Shank, Boston University, United States; Dongchul Son, Kyungpook National University, South Korea; Reda Tafirout, TRIUMF, Canada; Stephen Wolbers, Fermilab, United States; Frank Wuerthwein, UCSD, United States

  17. A multi-year data set on aerosol-cloud-precipitation-meteorology interactions for marine stratocumulus clouds.

    PubMed

    Sorooshian, Armin; MacDonald, Alexander B; Dadashazar, Hossein; Bates, Kelvin H; Coggon, Matthew M; Craven, Jill S; Crosbie, Ewan; Hersey, Scott P; Hodas, Natasha; Lin, Jack J; Negrón Marty, Arnaldo; Maudlin, Lindsay C; Metcalf, Andrew R; Murphy, Shane M; Padró, Luz T; Prabhakar, Gouri; Rissman, Tracey A; Shingler, Taylor; Varutbangkul, Varuntida; Wang, Zhen; Woods, Roy K; Chuang, Patrick Y; Nenes, Athanasios; Jonsson, Haflidi H; Flagan, Richard C; Seinfeld, John H

    2018-02-27

    Airborne measurements of meteorological, aerosol, and stratocumulus cloud properties have been harmonized from six field campaigns during July-August months between 2005 and 2016 off the California coast. A consistent set of core instruments was deployed on the Center for Interdisciplinary Remotely-Piloted Aircraft Studies Twin Otter for 113 flight days, amounting to 514 flight hours. A unique aspect of the compiled data set is detailed measurements of aerosol microphysical properties (size distribution, composition, bioaerosol detection, hygroscopicity, optical), cloud water composition, and different sampling inlets to distinguish between clear air aerosol, interstitial in-cloud aerosol, and droplet residual particles in cloud. Measurements and data analysis follow documented methods for quality assurance. The data set is suitable for studies associated with aerosol-cloud-precipitation-meteorology-radiation interactions, especially owing to sharp aerosol perturbations from ship traffic and biomass burning. The data set can be used for model initialization and synergistic application with meteorological models and remote sensing data to improve understanding of the very interactions that comprise the largest uncertainty in the effect of anthropogenic emissions on radiative forcing.

  18. A multi-year data set on aerosol-cloud-precipitation-meteorology interactions for marine stratocumulus clouds

    PubMed Central

    Sorooshian, Armin; MacDonald, Alexander B.; Dadashazar, Hossein; Bates, Kelvin H.; Coggon, Matthew M.; Craven, Jill S.; Crosbie, Ewan; Hersey, Scott P.; Hodas, Natasha; Lin, Jack J.; Negrón Marty, Arnaldo; Maudlin, Lindsay C.; Metcalf, Andrew R.; Murphy, Shane M.; Padró, Luz T.; Prabhakar, Gouri; Rissman, Tracey A.; Shingler, Taylor; Varutbangkul, Varuntida; Wang, Zhen; Woods, Roy K.; Chuang, Patrick Y.; Nenes, Athanasios; Jonsson, Haflidi H.; Flagan, Richard C.; Seinfeld, John H.

    2018-01-01

    Airborne measurements of meteorological, aerosol, and stratocumulus cloud properties have been harmonized from six field campaigns during July-August months between 2005 and 2016 off the California coast. A consistent set of core instruments was deployed on the Center for Interdisciplinary Remotely-Piloted Aircraft Studies Twin Otter for 113 flight days, amounting to 514 flight hours. A unique aspect of the compiled data set is detailed measurements of aerosol microphysical properties (size distribution, composition, bioaerosol detection, hygroscopicity, optical), cloud water composition, and different sampling inlets to distinguish between clear air aerosol, interstitial in-cloud aerosol, and droplet residual particles in cloud. Measurements and data analysis follow documented methods for quality assurance. The data set is suitable for studies associated with aerosol-cloud-precipitation-meteorology-radiation interactions, especially owing to sharp aerosol perturbations from ship traffic and biomass burning. The data set can be used for model initialization and synergistic application with meteorological models and remote sensing data to improve understanding of the very interactions that comprise the largest uncertainty in the effect of anthropogenic emissions on radiative forcing. PMID:29485627

  19. A multi-year data set on aerosol-cloud-precipitation-meteorology interactions for marine stratocumulus clouds

    NASA Astrophysics Data System (ADS)

    Sorooshian, Armin; MacDonald, Alexander B.; Dadashazar, Hossein; Bates, Kelvin H.; Coggon, Matthew M.; Craven, Jill S.; Crosbie, Ewan; Hersey, Scott P.; Hodas, Natasha; Lin, Jack J.; Negrón Marty, Arnaldo; Maudlin, Lindsay C.; Metcalf, Andrew R.; Murphy, Shane M.; Padró, Luz T.; Prabhakar, Gouri; Rissman, Tracey A.; Shingler, Taylor; Varutbangkul, Varuntida; Wang, Zhen; Woods, Roy K.; Chuang, Patrick Y.; Nenes, Athanasios; Jonsson, Haflidi H.; Flagan, Richard C.; Seinfeld, John H.

    2018-02-01

    Airborne measurements of meteorological, aerosol, and stratocumulus cloud properties have been harmonized from six field campaigns during July-August months between 2005 and 2016 off the California coast. A consistent set of core instruments was deployed on the Center for Interdisciplinary Remotely-Piloted Aircraft Studies Twin Otter for 113 flight days, amounting to 514 flight hours. A unique aspect of the compiled data set is detailed measurements of aerosol microphysical properties (size distribution, composition, bioaerosol detection, hygroscopicity, optical), cloud water composition, and different sampling inlets to distinguish between clear air aerosol, interstitial in-cloud aerosol, and droplet residual particles in cloud. Measurements and data analysis follow documented methods for quality assurance. The data set is suitable for studies associated with aerosol-cloud-precipitation-meteorology-radiation interactions, especially owing to sharp aerosol perturbations from ship traffic and biomass burning. The data set can be used for model initialization and synergistic application with meteorological models and remote sensing data to improve understanding of the very interactions that comprise the largest uncertainty in the effect of anthropogenic emissions on radiative forcing.

  20. Historic AVHRR Processing in the Eumetsat Climate Monitoring Satellite Application Facility (cmsaf) (Invited)

    NASA Astrophysics Data System (ADS)

    Karlsson, K.

    2010-12-01

    The EUMETSAT CMSAF project (www.cmsaf.eu) compiles climatological datasets from various satellite sources, with emphasis on the use of EUMETSAT-operated satellites. However, since climate monitoring primarily has a global scope, datasets merging data from multiple satellites and satellite operators are also prepared. One such dataset is the CMSAF historic GAC (Global Area Coverage) dataset, which is based on AVHRR data from the full historic series of NOAA satellites and the European METOP satellite in mid-morning orbit launched in October 2006. The CMSAF GAC dataset consists of three groups of products: macroscopic cloud products (cloud amount, cloud type and cloud top), cloud physical products (cloud phase, cloud optical thickness and cloud liquid water path) and surface radiation products (including surface albedo). Results will be presented and discussed for all product groups, including some preliminary inter-comparisons with other datasets (e.g., PATMOS-X, MODIS and CloudSat/CALIPSO datasets). A background will also be given describing the basic methodology behind the derivation of all products, including a short historical review of AVHRR cloud processing and resulting AVHRR applications at SMHI. Historic GAC processing is one of five pilot projects selected by the SCOPE-CM (Sustained Co-Ordinated Processing of Environmental Satellite data for Climate Monitoring) project organised by the WMO Space Programme. The pilot project is carried out jointly between CMSAF and NOAA with the purpose of finding an optimal GAC processing approach. The initial activity is to inter-compare results of the CMSAF GAC dataset and the NOAA PATMOS-X dataset for the case when both datasets have been derived using the same inter-calibrated AVHRR radiance dataset. The aim is to gain further knowledge of, e.g., the most useful multispectral methods and the impact of ancillary datasets (for example, from meteorological reanalysis datasets from NCEP and ECMWF). 
The CMSAF project is currently defining plans for another five years (2012-2017) of operations and development. New GAC reprocessing efforts are planned and new methodologies will be tested. Central questions here will be how to increase the quantitative use of the products through improved error and uncertainty estimates, and how to compile the information in a way that allows meaningful and efficient use of the data, e.g. for validation of climate model output.

  1. [Sensory illusions in hang-gliding].

    PubMed

    Bousquet, F; Bizeau, A; Resche-Rigon, P; Taillemite, J P; De Rotalier

    1997-01-01

    Sensory illusions in hang-gliding and para-gliding. Hang-gliding and para-gliding are booming sports at the moment. Sensory illusions are physiological phenomena involving a false perception of the pilot's real position in space. These phenomena are very familiar to aeroplane pilots, and they can also be observed under certain conditions with hang-gliding pilots. There are many and various sensory illusions, but only illusions of vestibular origin are dealt with in this article. Vestibular physiology is reviewed, along with the working principle of a semicircular canal. Physiology and the laws of physics explain several sensory illusions, especially when the pilot loses his visual landmarks: flying through a cloud, the Coriolis effect. Certain specific stages of hang-gliding also foster these phenomena: spiraling downwards, self-rotation, asymmetric closure of the parachute, and spinning on oneself. A prior briefing for pilots therefore seems necessary.

  2. CLOUDCLOUD : general-purpose instrument monitoring and data managing software

    NASA Astrophysics Data System (ADS)

    Dias, António; Amorim, António; Tomé, António

    2016-04-01

    An effective experiment is dependent on the ability to store and deliver data and information to all participant parties regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general purpose data acquisition, management and instrument monitoring platform that is fast, easy to use, lightweight and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in any size of experiment or monitoring station, from single computers to large networks of computers, to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and metadata are stored and accessed via a specially designed database architecture, and any type of instrument output is accepted using our continuously growing parsing application. Multiple databases can be used to separate different data taking periods, or a single database can be used if, for instance, an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them. 
Automatic computer integration is achieved by a locally running python-based parsing agent that communicates with a main server application guaranteeing that all instruments assigned to that computer are monitored with parsing intervals as fast as milliseconds. This software (server+agents+interface+database) comes in easy and ready-to-use packages that can be installed in any operating system, including Android and iOS systems. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods or in large collaborations, where data requires homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.
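The record above describes a python-based parsing agent that turns raw instrument output into records for a central server. A minimal sketch of that idea follows; the record fields, the comma-separated input format, and the `post_record` hook are assumptions for illustration, not the CLOUD software's actual schema or protocol.

```python
import json
import time

def parse_line(instrument_id, line):
    """Turn one line of raw comma-separated instrument output into a timestamped record."""
    fields = line.strip().split(",")
    return {
        "instrument": instrument_id,
        "timestamp": time.time(),
        "values": [float(f) for f in fields],
    }

def monitor(instrument_id, stream, post_record):
    """Parse each non-empty line from an instrument stream and hand the
    JSON-encoded record to the server hook, without touching the instrument's own DAQ."""
    for line in stream:
        if line.strip():
            post_record(json.dumps(parse_line(instrument_id, line)))
```

In a real agent, `stream` would be a file or socket being tailed and `post_record` would transmit to the main server; here any callable (even `list.append`) works, which keeps the parsing logic testable in isolation.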

  3. Estimates of the fraction of precipitation seedable under application of the Wyoming weather modification pilot project seeding criteria

    NASA Astrophysics Data System (ADS)

    Ritzman, Jaclyn M.

    The objective of the Wyoming Weather Modification Pilot Project is to evaluate the effect of glaciogenic seeding on wintertime precipitation over two co-located barriers in southeast Wyoming. Orographic clouds are to be targeted if they meet strict criteria. An analysis of the impact of seeding requires knowledge of the amount of precipitation that fell from seedable clouds. This amount was determined by applying the strict seeding criteria to an eight-year simulation from the Weather Research and Forecasting model at 4-km horizontal resolution. Results from the model analysis suggest that the fraction of seedable precipitation was 35.1% (35.5%) over the Sierra Madre (Medicine Bow) mountain range from 2000-2008. This fraction decreases to 23.2% (23.0%) under a warmer, future climate scenario over the Sierra Madres (Medicine Bows).
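The bookkeeping behind a "seedable fraction" can be sketched simply: apply the seeding criteria to each model time step and sum the precipitation that fell while all criteria were met. The threshold values below are illustrative placeholders, not the actual Wyoming pilot-project criteria.

```python
import numpy as np

def seedable_fraction(precip, temp_c, wind_ms, lwc):
    """Fraction of total precipitation falling while all (hypothetical) seeding
    criteria are simultaneously satisfied, over aligned per-timestep arrays."""
    precip = np.asarray(precip, dtype=float)
    seedable = (
        (np.asarray(temp_c) <= -8.0)     # cold enough for glaciogenic seeding (placeholder)
        & (np.asarray(wind_ms) >= 5.0)   # sufficient cross-barrier flow (placeholder)
        & (np.asarray(lwc) >= 0.01)      # supercooled liquid water present (placeholder)
    )
    total = precip.sum()
    return float(precip[seedable].sum() / total) if total > 0 else 0.0
```

Applied to gridded model output, the same masking would be done per grid column over each barrier before aggregating, which is how a single percentage per mountain range emerges from an eight-year simulation.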

  4. LHCb experience with running jobs in virtual machines

    NASA Astrophysics Data System (ADS)

    McNab, A.; Stagni, F.; Luzzi, C.

    2015-12-01

    The LHCb experiment has been running production jobs in virtual machines since 2013 as part of its DIRAC-based infrastructure. We describe the architecture of these virtual machines and the steps taken to replicate the WLCG worker node environment expected by user and production jobs. This relies on the uCernVM system for providing root images for virtual machines. We use the CernVM-FS distributed filesystem to supply the root partition files, the LHCb software stack, and the bootstrapping scripts necessary to configure the virtual machines for us. Using this approach, we have been able to minimise the amount of contextualisation which must be provided by the virtual machine managers. We explain the process by which the virtual machine is able to receive payload jobs submitted to DIRAC by users and production managers, and how this differs from payloads executed within conventional DIRAC pilot jobs on batch queue based sites. We describe our operational experiences in running production on VM based sites managed using Vcycle/OpenStack, Vac, and HTCondor Vacuum. Finally we show how our use of these resources is monitored using Ganglia and DIRAC.

  5. Testing as a Service with HammerCloud

    NASA Astrophysics Data System (ADS)

    Medrano Llamas, Ramón; Barrand, Quentin; Elmsheuser, Johannes; Legger, Federica; Sciacca, Gianfranco; Sciabà, Andrea; van der Ster, Daniel

    2014-06-01

    HammerCloud was designed to meet the needs of the grid community: to test resources and automate operations from a user perspective. Recent developments in the IT space propose a shift to software-defined data centres, in which every layer of the infrastructure can be offered as a service. Testing and monitoring are an integral part of the development, validation and operation of big systems like the grid. This area is not escaping the paradigm shift, and we are starting to perceive Testing as a Service (TaaS) offerings as natural; these allow testing any infrastructure service, such as the Infrastructure as a Service (IaaS) platforms being deployed in many grid sites, from both functional and stress-testing perspectives. This work will review the recent developments in HammerCloud and its evolution toward a TaaS model, in particular its deployment on the Agile Infrastructure platform at CERN and the testing of many IaaS providers across Europe in the context of experiment requirements. The first section will review the architectural changes that a service running in the cloud needs, such as an orchestration service or new storage requirements, in order to provide functional and stress testing. The second section will review the first tests of infrastructure providers, focusing on the challenges discovered from the architectural point of view. Finally, the third section will evaluate future requirements of scalability and features to increase testing productivity.

  6. Evidence for Natural Variability in Marine Stratocumulus Cloud Properties Due to Cloud-Aerosol

    NASA Technical Reports Server (NTRS)

    Albrecht, Bruce; Sharon, Tarah; Jonsson, Haf; Minnis, Patrick; Ayers, J. Kirk; Khaiyer, Mandana M.

    2004-01-01

    In this study, aircraft observations from the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS) Twin Otter are used to characterize the variability in drizzle, cloud, and aerosol properties associated with cloud rifts and the surrounding solid clouds observed off the coast of California. A flight made on 16 July 1999 provided measurements directly across an interface between solid and rift cloud conditions. Aircraft instrumentation allowed for measurements of aerosol, cloud droplet, and drizzle spectra. CCN concentrations were measured in addition to standard thermodynamic variables and the winds. A Forward Scatter Spectrometer Probe (FSSP) measured the size distribution of cloud-sized droplets. A Cloud Imaging Probe (CIP) was used to measure distributions of drizzle-sized droplets. Aerosol distributions were obtained from a Cloud Aerosol Scatterprobe (CAS). The CAS probe can measure aerosols, cloud droplets, and drizzle-sized drops; for this study, it was used to measure aerosols in the size range of 0.5-1 micron. Smaller aerosols were characterized using an Ultrafine Condensation Particle Counter (CPC) sensor. The CPC was used to measure particles with diameters greater than 0.003 micron. By subtracting the different count concentrations measured with the CPC, this probe was capable of identifying ultrafine particles, those falling in the size range of 3-7 nanometers, that are believed to be associated with new particle production.

  7. multi-dimensional Cloud-aERosol Exploratory Study using RPAS (mCERES): Bottom-up and top-down closure of aerosol-cloud interactions

    NASA Astrophysics Data System (ADS)

    Roberts, Greg; Calmer, Radiance; Sanchez, Kevin; Cayez, Grégoire; Nicoll, Kerianne; Hashimshoni, Eyal; Rosenfeld, Daniel; Ansmann, Albert; Sciare, Jean; Ovadneite, Jurgita; Bronz, Murat; Hattenberger, Gautier; Preissler, Jana; Buehl, Johannes; Ceburnis, Darius; O'Dowd, Colin

    2016-04-01

    Clouds are omnipresent in earth's atmosphere and play an important role in regulating the radiative budget of the planet. However, the response of clouds to climate change remains uncertain, in particular with respect to aerosol-cloud interactions and feedback mechanisms between the biosphere and atmosphere. Aerosol-cloud interactions and their feedbacks are the main themes of the European project FP7 BACCHUS (Impact of Biogenic versus Anthropogenic Emissions on Clouds and Climate: towards a Holistic Understanding). The National Center for Meteorological Research (CNRM-GAME, Toulouse, France) conducted airborne experiments in Cyprus and Ireland in March and August 2015, respectively, to link ground-based and satellite observations. Multiple RPAS (remotely piloted aircraft systems) were each instrumented for a specific scientific focus to characterize the vertical distribution of aerosol, cloud microphysical properties, radiative fluxes, 3D wind vectors and meteorological state parameters. Flights below and within clouds were coordinated with satellite overpasses to perform 'top-down' closure of cloud microphysical properties. Measurements of cloud condensation nuclei spectra at the ground-based site have been used to determine cloud microphysical properties using wind vectors and meteorological parameters measured by the RPAS at cloud base. These derived cloud properties have been validated by in-situ RPAS measurements in the cloud and compared to those derived by the Suomi-NPP satellite. In addition, RPAS profiles in Cyprus observed layers of dust originating from the Arabian Peninsula and the Sahara Desert. These profiles generally show a well-mixed boundary layer and compare well with ground-based LIDAR observations.

  8. Virtual machine provisioning, code management, and data movement design for the Fermilab HEPCloud Facility

    NASA Astrophysics Data System (ADS)

    Timm, S.; Cooper, G.; Fuess, S.; Garzoglio, G.; Holzman, B.; Kennedy, R.; Grassano, D.; Tiradani, A.; Krishnamurthy, R.; Vinayagam, S.; Raicu, I.; Wu, H.; Ren, S.; Noh, S.-Y.

    2017-10-01

    The goal of the Fermilab HEPCloud Facility Project is to extend the current Fermilab facility interface to provide transparent access to disparate resources, including commercial and community clouds, grid federations, and HPC centers. This facility enables experiments to perform the full spectrum of computing tasks, including data-intensive simulation and reconstruction. We have evaluated the use of the commercial cloud to provide elasticity to respond to peaks of demand without overprovisioning local resources. Full-scale data-intensive workflows have been successfully completed on Amazon Web Services for two High Energy Physics experiments, CMS and NOνA, at the scale of 58000 simultaneous cores. This paper describes the significant improvements that were made to the virtual machine provisioning system, code caching system, and data movement system to accomplish this work. The virtual image provisioning and contextualization service was extended to multiple AWS regions and to support experiment-specific data configurations. A prototype Decision Engine was written to determine the optimal availability zone and instance type to run on, minimizing cost and job interruptions. We have deployed a scalable on-demand caching service to deliver code and database information to jobs running on the commercial cloud. It uses the frontier-squid server and CERN VM File System (CVMFS) clients on EC2 instances and utilizes various services provided by AWS to build the infrastructure (stack). We discuss the architecture and load-testing benchmarks on the squid servers. We also describe various approaches that were evaluated to transport experimental data to and from the cloud, and the optimal solutions that were used for the bulk of the data transport. Finally, we summarize lessons learned from this scale test, and our future plans to expand and improve the Fermilab HEPCloud Facility.
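The kind of choice a Decision Engine makes can be illustrated with a toy scorer: among candidate (availability zone, instance type) offers, pick the lowest effective cost per core-hour, penalizing offers with a high spot-interruption probability. The offer fields and the penalty weighting below are assumptions for the sketch, not the Fermilab prototype's actual model.

```python
def pick_offer(offers, interruption_penalty=2.0):
    """Pick the offer with the lowest expected cost per usable core-hour.

    offers: list of dicts with keys zone, itype, cores, price_per_hour,
    p_interrupt (probability of a spot interruption, hypothetical field names).
    """
    def effective_cost(offer):
        # Cost per core-hour, inflated by expected rework caused by interruptions.
        return (offer["price_per_hour"] / offer["cores"]) * (
            1.0 + interruption_penalty * offer["p_interrupt"]
        )
    return min(offers, key=effective_cost)
```

This captures the trade-off named in the record: a nominally cheaper zone can lose to a slightly pricier one once the cost of re-running interrupted jobs is folded in.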

  9. Virtual Machine Provisioning, Code Management, and Data Movement Design for the Fermilab HEPCloud Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timm, S.; Cooper, G.; Fuess, S.

The Fermilab HEPCloud Facility Project has as its goal to extend the current Fermilab facility interface to provide transparent access to disparate resources including commercial and community clouds, grid federations, and HPC centers. This facility enables experiments to perform the full spectrum of computing tasks, including data-intensive simulation and reconstruction. We have evaluated the use of the commercial cloud to provide elasticity to respond to peaks of demand without overprovisioning local resources. Full-scale data-intensive workflows have been successfully completed on Amazon Web Services for two High Energy Physics experiments, CMS and NOνA, at the scale of 58,000 simultaneous cores. This paper describes the significant improvements that were made to the virtual machine provisioning system, code caching system, and data movement system to accomplish this work. The virtual image provisioning and contextualization service was extended to multiple AWS regions, and to support experiment-specific data configurations. A prototype Decision Engine was written to determine the optimal availability zone and instance type to run on, minimizing cost and job interruptions. We have deployed a scalable on-demand caching service to deliver code and database information to jobs running on the commercial cloud. It uses the frontier-squid server and CERN VM File System (CVMFS) clients on EC2 instances and utilizes various services provided by AWS to build the infrastructure (stack). We discuss the architecture and load-testing benchmarks on the squid servers. We also describe various approaches that were evaluated to transport experimental data to and from the cloud, and the optimal solutions that were used for the bulk of the data transport. Finally, we summarize lessons learned from this scale test, and our future plans to expand and improve the Fermilab HEPCloud Facility.
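The zone-and-instance-type selection the abstract attributes to the prototype Decision Engine can be sketched as a cost minimization over candidate resources. This is an illustrative reconstruction, not the actual HEPCloud code: the instance types, spot prices, interruption rates, and the penalty weight below are all invented for the example.

```python
# Hedged sketch of a spot-market decision engine: pick the availability
# zone and instance type minimizing cost per core-hour, with a penalty
# for expected job interruptions. All numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Candidate:
    zone: str              # availability zone
    instance_type: str
    cores: int
    spot_price: float      # USD per hour (hypothetical)
    interrupt_rate: float  # expected interruptions per hour (hypothetical)

def score(c: Candidate, interrupt_penalty: float = 0.5) -> float:
    """Cost per core-hour, inflated by an interruption penalty."""
    effective_price = c.spot_price * (1.0 + interrupt_penalty * c.interrupt_rate)
    return effective_price / c.cores

def choose(candidates):
    """Select the candidate with the lowest penalized cost per core-hour."""
    return min(candidates, key=score)

candidates = [
    Candidate("us-east-1a", "c4.8xlarge", 36, 0.90, 0.02),
    Candidate("us-east-1b", "c4.8xlarge", 36, 0.80, 0.10),
    Candidate("us-west-2a", "m4.10xlarge", 40, 1.10, 0.01),
]
best = choose(candidates)
```

A real engine would refresh the price and interruption estimates from the provider's spot-market feed rather than hard-coding them.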

  10. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Little, M. M.

    2013-12-01

NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. NASA science computing is therefore a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and it potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS aims to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been used extensively by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must be evaluated by integrating it with real-world operational needs across NASA, along with the associated maturity that would come with that experience. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications were demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Science Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective, specifically using a processing scenario involving the Clouds and the Earth's Radiant Energy System (CERES) project.

  11. Analysis of the Meteorology Associated with the 1998 NASA Glenn Twin Otter Icing Flights

    NASA Technical Reports Server (NTRS)

    Bernstein, Ben C.

    2000-01-01

    This document contains a basic analysis of the meteorology associated with the NASA Glenn Twin Otter icing encounters between December 1997 and March 1998. The purpose of this analysis is to provide a meteorological context for the aircraft data collected during these flights. For each case, the following data elements are presented: (1) A brief overview of the Twin Otter encounter, including locations, liquid water contents, temperatures and microphysical makeup of the clouds and precipitation aloft, (2) Upper-air charts, providing hand-analyzed locations of lows, troughs, ridges, saturated/unsaturated air, temperatures, warm/cold advection, and jet streams, (3) Balloon-borne soundings, providing vertical profiles of temperature, moisture and winds, (4) Infrared and visible satellite data, providing cloud locations and cloud top temperature, (5) 3-hourly surface charts, providing hand-analyzed locations of lows, highs, fronts, precipitation (including type) and cloud cover, (6) Hourly, regional radar mosaics, providing fine resolution of the locations of precipitation (including intensity and type), pilot reports of icing (including intensity and type), surface observations of precipitation type and Twin Otter tracks for a one hour window centered on the time of the radar data, and (7) Hourly plots of icing pilot reports, providing the icing intensity, icing type, icing altitudes and aircraft type. Outages occurred in nearly every dataset at some point. All relevant data that was available is presented here. All times are in UTC and all heights are in feet above mean sea level (MSL).

  12. The ISB Cancer Genomics Cloud: A Flexible Cloud-Based Platform for Cancer Genomics Research.

    PubMed

    Reynolds, Sheila M; Miller, Michael; Lee, Phyliss; Leinonen, Kalle; Paquette, Suzanne M; Rodebaugh, Zack; Hahn, Abigail; Gibbs, David L; Slagel, Joseph; Longabaugh, William J; Dhankani, Varsha; Reyes, Madelyn; Pihl, Todd; Backus, Mark; Bookman, Matthew; Deflaux, Nicole; Bingham, Jonathan; Pot, David; Shmulevich, Ilya

    2017-11-01

The ISB Cancer Genomics Cloud (ISB-CGC) is one of three pilot projects funded by the National Cancer Institute to explore new approaches to computing on large cancer datasets in a cloud environment. With a focus on Data as a Service, the ISB-CGC offers multiple avenues for accessing and analyzing The Cancer Genome Atlas, TARGET, and other important references such as GENCODE and COSMIC using the Google Cloud Platform. The open approach allows researchers to choose the approach best suited to the task at hand: analyzing terabytes of data using complex workflows, developing new analysis methods in common languages such as Python, R, and SQL, or using an interactive web application to create synthetic patient cohorts and to explore the wealth of available genomic data. Links to resources and documentation can be found at www.isb-cgc.org. Cancer Res; 77(21); e7-10. ©2017 AACR.
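The cohort-creation idea mentioned in the abstract amounts to filtering case records by clinical attributes, whether expressed in SQL or a scripting language. The sketch below illustrates that pattern on a toy record set; the field names, project labels, and values are invented, not the actual ISB-CGC schema.

```python
# Toy sketch of patient-cohort creation: select case IDs whose records
# match a project and a minimum age. Records and fields are invented.
records = [
    {"case": "A1", "project": "TCGA-BRCA", "age": 54, "vital": "Alive"},
    {"case": "B2", "project": "TCGA-LUAD", "age": 67, "vital": "Dead"},
    {"case": "C3", "project": "TCGA-BRCA", "age": 71, "vital": "Alive"},
]

def build_cohort(records, project, min_age=0):
    """Return the case IDs matching the project and minimum-age filter."""
    return [r["case"] for r in records
            if r["project"] == project and r["age"] >= min_age]

cohort = build_cohort(records, "TCGA-BRCA", min_age=60)
```

On a real platform the same filter would run as a SQL WHERE clause against the hosted clinical tables instead of an in-memory list.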

  13. The Namibia Early Flood Warning System, A CEOS Pilot Project

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Frye, Stuart; Cappelaere, Pat; Sohlberg, Robert; Handy, Matthew; Grossman, Robert

    2012-01-01

Over the past few years, an international collaboration has developed a pilot project under the auspices of the Committee on Earth Observation Satellites (CEOS) Disasters team. The overall team consists of civilian satellite agencies. For this pilot effort, the development team consists of NASA, the Canadian Space Agency, the Univ. of Maryland, the Univ. of Colorado, the Univ. of Oklahoma, the Ukraine Space Research Institute and the Joint Research Centre (JRC) of the European Commission. This development team collaborates with regional, national and international agencies to deliver end-to-end disaster coverage. In particular, the team is collaborating with the Namibia Department of Hydrology to begin work in Namibia; however, the ultimate goal is to expand the functionality to provide early warning over the southern Africa region. The initial collaboration was initiated by the United Nations Office for Outer Space Affairs and the CEOS Working Group on Information Systems and Services (WGISS). The initial driver was to demonstrate international interoperability using various space agency sensors and models along with regional in-situ ground sensors. In 2010, the team created a preliminary, semi-manual system to demonstrate moving and combining key data streams and delivering the data to the Namibia Department of Hydrology during its flood season, which typically runs January through April. In this pilot, a variety of moderate-resolution and high-resolution satellite flood imagery was rapidly delivered and used in conjunction with flood predictive models in Namibia. This was collected in conjunction with ground measurements and was used to examine how to create a customized flood early warning system. During the first year, the team made use of SensorWeb technology to gather various sensor data which were used to monitor flood waves traveling down basins originating in Angola, but eventually flooding villages in Namibia.
The team made use of standardized interfaces such as those articulated under the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) set of web services [1][2]. However, it was discovered that in order to make a system like this functional, there were many performance issues to overcome. Data sets were large and located in a variety of locations behind firewalls, and had to be accessed across open networks, so security was an issue. Furthermore, network access acted as a bottleneck to transferring map products to where they were needed. Finally, during disasters, many users and computer processes act in parallel, and thus it was very easy to overload the single string of computers stitched together in the virtual system that was initially developed. To address some of these performance issues, the team partnered with the Open Cloud Consortium (OCC), which supplied a Computation Cloud located at the University of Illinois at Chicago and some manpower to administer this Cloud. The Flood SensorWeb [3] system was interfaced to the Cloud to provide a high-performance user interface and product development engine. Figure 1 shows the functional diagram of the Flood SensorWeb. Figure 2 shows some of the functionality of the Computation Cloud that was integrated. A significant portion of the original system was ported to the Cloud and, during the past year, technical issues were resolved, including web access to the Cloud, security over the open Internet, initial experiments on handling surge capacity by using the virtual machines in the cloud in parallel, tiling techniques to render large data sets as layers on a map, interfaces to allow users to customize the data processing/product chain, and other performance-enhancing techniques. The conclusion reached from this effort, and presented here, is that defining the interoperability standards is only a small fraction of the work.
For example, once open web service standards were defined, many users could not make use of the standards due to security restrictions. Furthermore, once an interoperable system is functional, a surge of users can render it unusable, especially in the disaster domain.

  14. The Awareness and Challenges of Cloud Computing Adoption on Tertiary Education in Malaysia

    NASA Astrophysics Data System (ADS)

    Hazreeni Hamzah, Nor; Mahmud, Maziah; Zukri, Shamsunarnie Mohamed; Yaacob, Wan Fairos Wan; Yacob, Jusoh

    2017-09-01

This preliminary study aims to investigate the awareness of the adoption of cloud computing among academicians in tertiary education in Malaysia. This study also aims to explore the possible challenges faced by academicians while adopting this new technology. The pilot study was done on 40 lecturers in Universiti Teknologi MARA Kampus Kota Bharu (UiTMKB) using a self-administered questionnaire. The results found that almost half (40 percent) were not aware of the existence of cloud computing in the teaching and learning (T&L) process. The challenges confronting the adoption of cloud computing are data insecurity, unsolicited advertisement, lock-in, reluctance to eliminate staff positions, privacy concerns, reliability challenges, regulatory compliance concerns/user control and institutional culture/resistance to change in technology. These possible challenges can be grouped into two major factors: a security and dependency factor, and a user control and mentality factor.

  15. The Environmental Virtual Observatory (EVO) local exemplar: A cloud based local landscape learning visualisation tool for communicating flood risk to catchment stakeholders

    NASA Astrophysics Data System (ADS)

    Wilkinson, Mark; Beven, Keith; Brewer, Paul; El-khatib, Yehia; Gemmell, Alastair; Haygarth, Phil; Mackay, Ellie; Macklin, Mark; Marshall, Keith; Quinn, Paul; Stutter, Marc; Thomas, Nicola; Vitolo, Claudia

    2013-04-01

Today's world is dominated by a wide range of informatics tools that are readily available to a wide range of stakeholders. There is growing recognition that the appropriate involvement of local communities in land and water management decisions can result in multiple environmental, economic and social benefits. Therefore, local stakeholder groups are increasingly being asked to participate in decision making alongside policy makers, government agencies and scientists. As such, addressing flooding issues requires new ways of engaging with the catchment and its inhabitants at a local level. To support this, new tools and approaches are required. The growth of cloud-based technologies offers novel ways to facilitate this exchange of information in the earth sciences. The Environmental Virtual Observatory pilot project (EVOp) is a new initiative from the UK Natural Environment Research Council (NERC) designed to deliver proof of concept for new tools and approaches to address the challenges outlined above (http://www.evo-uk.org/). The long-term vision of the Environmental Virtual Observatory is to: • Make environmental data more visible and accessible to a wide range of potential users, including public-good applications; • Provide tools to facilitate the integrated analysis of data, greater access to added knowledge and expert analysis, and visualisation of the results; • Develop new, added-value knowledge from public and private sector data assets to help tackle environmental challenges. As part of the EVO pilot, an interactive cloud-based tool has been developed with local stakeholders. The Local Landscape Visualisation Tool attempts to communicate flood risk in affected local communities. The tool has been developed iteratively to reflect the needs, interests and capabilities of a wide range of stakeholders.
This tool (accessible via a web portal) combines numerous cloud-based tools and services, local catchment datasets, hydrological models and novel visualisation techniques. This pilot tool has been developed by engaging with different stakeholder groups in three catchments in the UK: the Afon Dyfi (Wales), the River Tarland (Scotland) and the River Eden (England). Stakeholders were interested in accessing live data in their catchments and in exploring the effect of different land use change scenarios on flood peaks. Visualisation tools have been created which offer access to real-time data (such as river level, rainfall and webcam images). Other tools allow land owners to use cloud-based models (the example presented here uses Topmodel, a rainfall-runoff model, on a custom virtual machine image on Amazon Web Services) and local datasets to explore future land use scenarios, allowing them to understand the associated flood risk. Different ways to communicate model uncertainty are currently being investigated and discussed with stakeholders. In summary, the pilot project has had positive feedback and has evolved into two unique parts: a web-based map tool and a model interface tool. Users can view live data from different sources, combine different data types together (data mash-ups), develop local scenarios for land use and flood risk, and exploit the dynamic, elastic cloud modelling capability. This local toolkit will reside within a wider EVO platform that will include national and global datasets, models and state-of-the-art cloud computing systems.
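The kind of rainfall-runoff calculation a stakeholder might run through such a cloud-hosted model interface can be illustrated with a toy linear-reservoir scheme. This is a stand-in for the class of model involved (the project itself uses Topmodel, which is considerably more sophisticated); the recession constant and the rainfall series are invented for the example.

```python
# Toy linear-reservoir rainfall-runoff sketch (not Topmodel): at each
# time step the storage gains the rainfall, then releases a fixed
# fraction k as runoff. Units are mm per step; values are illustrative.
def simulate_runoff(rain_mm, k=0.2, storage0=0.0):
    """Return the runoff series for a rainfall series, given recession k."""
    storage = storage0
    runoff = []
    for r in rain_mm:
        storage += r          # rainfall enters the reservoir
        q = k * storage       # a fraction drains as runoff
        storage -= q
        runoff.append(q)
    return runoff

# A single 10 mm rain pulse followed by dry steps and a smaller pulse.
hydrograph = simulate_runoff([10, 0, 0, 5, 0])
```

Land use scenarios could be compared by adjusting k (or the effective rainfall) and inspecting how the peak of the hydrograph changes.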

  16. The ALICE Software Release Validation cluster

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Krzewicki, M.

    2015-12-01

One of the most important steps of the software lifecycle is Quality Assurance: this process comprises both automatic tests and manual reviews, and all of them must pass successfully before the software is approved for production. Some tests, such as source code static analysis, are executed on a single dedicated service; in High Energy Physics, a full simulation and reconstruction chain on a distributed computing environment, backed with a sample “golden” dataset, is also necessary for the quality sign-off. The ALICE experiment uses dedicated and virtualized computing infrastructures for the Release Validation in order not to taint the production environment (i.e. CVMFS and the Grid) with non-validated software and validation jobs: the ALICE Release Validation cluster is a disposable virtual cluster appliance based on CernVM and the Virtual Analysis Facility, capable of deploying on demand, and with a single command, a dedicated virtual HTCondor cluster with an automatically scalable number of virtual workers on any cloud supporting the standard EC2 interface. Input and output data are externally stored on EOS, and a dedicated CVMFS service is used to provide the software to be validated. We will show how the Release Validation cluster deployment and disposal are completely transparent for the Release Manager, who simply triggers the validation from the ALICE build system's web interface. CernVM 3, based entirely on CVMFS, makes it possible to boot any past snapshot of the operating system: we will show how this allows us to certify each ALICE software release for an exact CernVM snapshot, addressing the problem of Long-Term Data Preservation by ensuring a consistent environment for software execution and data reprocessing in the future.
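The "automatically scalable number of virtual workers" idea behind the Virtual Analysis Facility can be reduced to a simple sizing rule: provision enough workers for the queued validation jobs, up to a cap. This is an illustrative sketch, not ALICE code; the jobs-per-worker ratio and the cap are invented parameters.

```python
# Hedged sketch of elastic worker sizing: the target number of virtual
# workers tracks the queued jobs, capped so the cloud tenancy stays
# bounded. Parameter values are hypothetical.
def workers_needed(queued_jobs: int, jobs_per_worker: int = 4,
                   max_workers: int = 50) -> int:
    """Ceiling-divide queued jobs by per-worker slots, capped at max_workers."""
    needed = -(-queued_jobs // jobs_per_worker)  # ceiling division
    return min(needed, max_workers)
```

A controller loop would call this periodically, starting or terminating EC2-interface instances until the running count matches the target.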

  17. Status of the DIRAC Project

    NASA Astrophysics Data System (ADS)

    Casajus, A.; Ciba, K.; Fernandez, V.; Graciani, R.; Hamar, V.; Mendez, V.; Poss, S.; Sapunov, M.; Stagni, F.; Tsaregorodtsev, A.; Ubeda, M.

    2012-12-01

The DIRAC Project was initiated to provide a data processing system for the LHCb experiment at CERN. It provides all the necessary functionality and performance to satisfy the current and projected future requirements of the LHCb Computing Model. A considerable restructuring of the DIRAC software was undertaken in order to turn it into a general-purpose framework for building distributed computing systems that can be used by various user communities in High Energy Physics and other scientific application domains. The CLIC and ILC-SID detector projects have started to use DIRAC for their data production systems. The Belle Collaboration at KEK, Japan, has adopted a Computing Model based on the DIRAC system for its second phase starting in 2015. The CTA Collaboration uses DIRAC for data analysis tasks. A large number of other experiments are starting to use DIRAC or are evaluating this solution for their data processing tasks. DIRAC services are included as part of the production infrastructure of the GISELA Latin American grid. Similar services are provided for the users of the France-Grilles and IBERGrid National Grid Initiatives in France and Spain, respectively. The new communities using DIRAC have started to provide important contributions to its functionality. Recent additions include support for Amazon EC2 computing resources as well as other cloud management systems; a versatile File Replica Catalog with file metadata capabilities; and support for running MPI jobs in the pilot-based Workload Management System. Integration with existing application web portals, like WS-PGRADE, is demonstrated. In this paper we describe the current status of the DIRAC Project, recent developments of its framework and functionality, as well as the status of the rapidly evolving community of DIRAC users.

  18. Stratocumulus Precipitation and Entrainment Experiment (SPEE) Field Campaign Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Albrecht, Bruce; Ghate, Virendra; Cadeddu, Maria

    2016-06-01

The scientific focus of this project was to examine precipitation and entrainment processes in marine stratocumulus clouds. The entrainment studies focused on characterizing cloud-top turbulence using Doppler cloud radar observations. The precipitation studies focused on characterizing the precipitation and the macroscopic properties (cloud thickness and liquid water path) of the clouds. This project will contribute to the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility’s overall objective of providing the remote-sensing observations needed to improve the representation of key cloud processes in climate models. It is of direct relevance to the components of ARM dealing with entrainment and precipitation processes in stratiform clouds. Further, the radar observing techniques used in this study were developed using ARM Southern Great Plains (SGP) facility observations under Atmospheric System Research (ASR) support. The observing systems, operating autonomously from a site located just north of the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS) aircraft hangar in Marina, California during the period of 1 May to 4 November 2015, included: 1. Microwave radiometer: ARM Microwave Radiometer, 3-Channel (MWR3C) with channels centered at 23.834, 30, and 89 GHz; supported by Dr. Maria Cadeddu. 2. Cloud radar: CIRPAS 95 GHz Frequency Modulated Continuous Wave (FMCW) cloud radar (Centroid Frequency Chirp Rate [CFCR]); operations overseen by Drs. Ghate and Albrecht. 3. Ceilometer: Vaisala CK-14; operations overseen by Drs. Ghate and Albrecht.

  19. Proceedings of the Second Pilot Climate Data System Workshop

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The proceedings of the workshop held on January 29 and 30, 1986 are discussed. Data management, satellite radiance data, clouds, ultraviolet flux variations in the upper atmosphere, rainfall during El Nino events, and the use of optical disks are among the topics covered.

  20. Astronaut John Young in shadow of Lunar Module behind ultraviolet camera

    NASA Image and Video Library

    1972-04-22

AS16-114-18439 (22 April 1972) --- Astronaut Charles M. Duke Jr., lunar module pilot, stands in the shadow of the Lunar Module (LM) behind the ultraviolet (UV) camera which is in operation. This photograph was taken by astronaut John W. Young, commander, during the mission's second extravehicular activity (EVA). The UV camera's gold surface is designed to maintain the correct temperature. The astronauts set the prescribed angles of azimuth and elevation (here 14 degrees for photography of the Large Magellanic Cloud) and pointed the camera. Over 180 photographs and spectra in far-ultraviolet light were obtained showing clouds of hydrogen and other gases and several thousand stars. The United States flag and Lunar Roving Vehicle (LRV) are in the left background. While astronauts Young and Duke descended in the Apollo 16 Lunar Module (LM) "Orion" to explore the Descartes highlands landing site on the moon, astronaut Thomas K. Mattingly II, command module pilot, remained with the Command and Service Modules (CSM) "Casper" in lunar orbit.

  1. Astronaut Charles M. Duke, Jr., in shadow of Lunar Module behind ultraviolet camera

    NASA Technical Reports Server (NTRS)

    1972-01-01

Astronaut Charles M. Duke, Jr., lunar module pilot, stands in the shadow of the Lunar Module (LM) behind the ultraviolet (UV) camera which is in operation. This photograph was taken by astronaut John W. Young, mission commander, during the mission's second extravehicular activity (EVA-2). The UV camera's gold surface is designed to maintain the correct temperature. The astronauts set the prescribed angles of azimuth and elevation (here 14 degrees for photography of the Large Magellanic Cloud) and pointed the camera. Over 180 photographs and spectra in far-ultraviolet light were obtained showing clouds of hydrogen and other gases and several thousand stars. The United States flag and Lunar Roving Vehicle (LRV) are in the left background. While astronauts Young and Duke descended in the Apollo 16 Lunar Module (LM) 'Orion' to explore the Descartes highlands landing site on the Moon, astronaut Thomas K. Mattingly II, command module pilot, remained with the Command and Service Modules (CSM) 'Casper' in lunar orbit.

  2. STS-111 Flight Day 2 Highlights

    NASA Technical Reports Server (NTRS)

    2002-01-01

    On Flight Day 2 of STS-111, the crew of Endeavour (Kenneth Cockrell, Commander; Paul Lockhart, Pilot; Franklin Chang-Diaz, Mission Specialist; Philippe Perrin, Mission Specialist) and the Expedition 5 crew (Valery Korzun, Commander; Peggy Whitson, Flight Engineer; Sergei Treschev, Flight Engineer), having successfully entered orbit around the Earth, begin to maneuver towards the International Space Station (ISS), where the Expedition 5 crew will replace the Expedition 4 crew. Live video is shown of the Earth from several vantage points aboard the Shuttle. The center-line camera, which will allow Shuttle pilots to align the docking apparatus with that on the ISS, provides footage of the Earth. Chang-Diaz participates in an interview, in Spanish, conducted from the ground via radio communications, with Cockrell also appearing. Footage of the Earth includes: Daytime video of the Eastern United States with some cloud cover as Endeavour passes over the Florida panhandle, Georgia, and the Carolinas; Daytime video of Lake Michigan unobscured by cloud cover; Nighttime low-light camera video of Madrid, Spain.

  3. Analysis of the Meteorology Associated with the 1997 NASA Glenn Twin Otter Icing Events

    NASA Technical Reports Server (NTRS)

    Bernstein, Ben C.

    2000-01-01

    This part of the document contains an analysis of the meteorology associated with the premier icing encounters from the January-March 1997 NASA Twin Otter dataset. The purpose of this analysis is to provide a meteorological context for the aircraft data collected during these flights. For each case, the following data elements are presented: (1) A detailed discussion of the Twin Otter encounter, including locations, liquid water contents, temperatures and microphysical makeup of the clouds and precipitation aloft, (2) Upper-air charts, providing hand-analyzed locations of lows, troughs, ridges, saturated/unsaturated air, temperatures, warm/cold advection, and jet streams, (3) Balloon-borne soundings, providing vertical profiles of temperature, moisture and winds, (4) Infrared satellite data, providing cloud locations and cloud top temperature, (5) 3-hourly surface charts, providing hand-analyzed locations of lows, highs, fronts, precipitation (including type) and cloud cover, (6) Hourly plots of icing pilot reports, providing the icing intensity, icing type, icing altitudes and aircraft type, (7) Hourly, regional radar mosaics, providing fine resolution of the locations of precipitation (including intensity and type), pilot reports of icing (including intensity and type), surface observations of precipitation type and Twin Otter tracks for a one hour window centered on the time of the radar data, and (8) Plots of data from individual NEXRAD radars at times and elevation angles that have been matched to Twin Otter flight locations. Outages occurred in nearly every dataset at some point. All relevant data that was available is presented here. All times are in UTC and all heights are in feet above mean sea level (MSL).

  4. Clouds, Aerosols, and Precipitation in the Marine Boundary Layer: An Arm Mobile Facility Deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Robert; Wyant, Matthew; Bretherton, Christopher S.

The Clouds, Aerosol, and Precipitation in the Marine Boundary Layer (CAP-MBL) deployment at Graciosa Island in the Azores generated a 21-month (April 2009-December 2010) comprehensive dataset documenting clouds, aerosols and precipitation using the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF). The scientific aim of the deployment is to gain improved understanding of the interactions of clouds, aerosols and precipitation in the marine boundary layer. Graciosa Island straddles the boundary between the subtropics and midlatitudes in the Northeast Atlantic Ocean, and consequently experiences a great diversity of meteorological and cloudiness conditions. Low clouds are the dominant cloud type, with stratocumulus and cumulus occurring regularly. Approximately half of all clouds contained precipitation detectable as radar echoes below the cloud base. Radar and satellite observations show that clouds with tops from 1-11 km contribute more or less equally to surface-measured precipitation at Graciosa. A wide range of aerosol conditions was sampled during the deployment, consistent with the diversity of sources as indicated by back-trajectory analysis. Preliminary findings suggest important two-way interactions between aerosols and clouds at Graciosa, with aerosols affecting light precipitation and cloud radiative properties while being controlled in part by precipitation scavenging. The data from Graciosa are being compared with short-range forecasts made with a variety of models. A pilot analysis with two climate and two weather forecast models shows that they reproduce the observed time-varying vertical structure of lower-tropospheric cloud fairly well, but the cloud-nucleating aerosol concentrations less well. The Graciosa site has been chosen to be a permanent fixed ARM site that became operational in October 2013.

  5. Clouds, aerosol, and precipitation in the Marine Boundary Layer: An ARM mobile facility deployment

    DOE PAGES

    Wood, Robert; Luke, Ed; Wyant, Matthew; ...

    2014-04-27

The Clouds, Aerosol, and Precipitation in the Marine Boundary Layer (CAP-MBL) deployment at Graciosa Island in the Azores generated a 21-month (April 2009-December 2010) comprehensive dataset documenting clouds, aerosols, and precipitation using the Atmospheric Radiation Measurement Program (ARM) Mobile Facility (AMF). The scientific aim of the deployment is to gain improved understanding of the interactions of clouds, aerosols, and precipitation in the marine boundary layer. Graciosa Island straddles the boundary between the subtropics and midlatitudes in the Northeast Atlantic Ocean and consequently experiences a great diversity of meteorological and cloudiness conditions. Low clouds are the dominant cloud type, with stratocumulus and cumulus occurring regularly. Approximately half of all clouds contained precipitation detectable as radar echoes below the cloud base. Radar and satellite observations show that clouds with tops from 1-11 km contribute more or less equally to surface-measured precipitation at Graciosa. A wide range of aerosol conditions was sampled during the deployment, consistent with the diversity of sources as indicated by back-trajectory analysis. Preliminary findings suggest important two-way interactions between aerosols and clouds at Graciosa, with aerosols affecting light precipitation and cloud radiative properties while being controlled in part by precipitation scavenging. The data from Graciosa are being compared with short-range forecasts made with a variety of models. A pilot analysis with two climate and two weather forecast models shows that they reproduce the observed time-varying vertical structure of lower-tropospheric cloud fairly well but the cloud-nucleating aerosol concentrations less well. The Graciosa site has been chosen to be a permanent fixed ARM site that became operational in October 2013.

  6. Clouds, Aerosols, and Precipitation in the Marine Boundary Layer: An Arm Mobile Facility Deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Robert; Wyant, Matthew; Bretherton, Christopher S.

    The Clouds, Aerosol, and Precipitation in the Marine Boundary Layer (CAP-MBL) deployment at Graciosa Island in the Azores generated a 21-month (April 2009-December 2010) comprehensive dataset documenting clouds, aerosols and precipitation using the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF). The scientific aim of the deployment is to gain improved understanding of the interactions of clouds, aerosols and precipitation in the marine boundary layer. Graciosa Island straddles the boundary between the subtropics and midlatitudes in the Northeast Atlantic Ocean, and consequently experiences a great diversity of meteorological and cloudiness conditions. Low clouds are the dominant cloud type, with stratocumulus and cumulus occurring regularly. Approximately half of all clouds contained precipitation detectable as radar echoes below the cloud base. Radar and satellite observations show that clouds with tops from 1-11 km contribute more or less equally to surface-measured precipitation at Graciosa. A wide range of aerosol conditions was sampled during the deployment, consistent with the diversity of sources as indicated by back-trajectory analysis. Preliminary findings suggest important two-way interactions between aerosols and clouds at Graciosa, with aerosols affecting light precipitation and cloud radiative properties while being controlled in part by precipitation scavenging. The data from Graciosa are being compared with short-range forecasts made with a variety of models. A pilot analysis with two climate and two weather forecast models shows that they reproduce the observed time-varying vertical structure of lower-tropospheric cloud fairly well, but the cloud-nucleating aerosol concentrations less well. The Graciosa site has been chosen to be a long-term ARM site that became operational in October 2013.

  7. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    PubMed

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.
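    The diagnosis step described above can be pictured as a questionnaire whose answers score candidate business areas for SaaS adoption. The sketch below is purely illustrative; the tool's actual questions, business areas, and scoring scheme are not given in this summary:

```python
# Hypothetical questionnaire: each candidate SaaS area is backed by
# yes/no diagnostic questions (area names and questions invented here).
QUESTIONS = {
    "crm":    ["Do you track customer contacts?",
               "Is sales data shared across staff?"],
    "backup": ["Is business data stored on a single machine?",
               "Have you lost files before?"],
    "email":  ["Do you run your own mail server?"],
}

def cloud_road(answers):
    """Score each area by its fraction of 'yes' answers and return the
    areas ranked by relevance, a toy stand-in for generating a 'Cloud Road'."""
    scores = {area: sum(answers.get(q, False) for q in qs) / len(qs)
              for area, qs in QUESTIONS.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

    An enterprise answering "yes" to both CRM questions, for instance, would see CRM ranked first on its Cloud Road.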

  8. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    PubMed Central

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  9. Integration of High-Performance Computing into Cloud Computing Services

    NASA Astrophysics Data System (ADS)

    Vouk, Mladen A.; Sills, Eric; Dreher, Patrick

    High-Performance Computing (HPC) projects span a spectrum of computer hardware implementations ranging from peta-flop supercomputers, high-end tera-flop facilities running a variety of operating systems and applications, to mid-range and smaller computational clusters used for HPC application development, pilot runs and prototype staging clusters. What they all have in common is that they operate as a stand-alone system rather than a scalable and shared user re-configurable resource. The advent of cloud computing has changed the traditional HPC implementation. In this article, we will discuss a very successful production-level architecture and policy framework for supporting HPC services within a more general cloud computing infrastructure. This integrated environment, called Virtual Computing Lab (VCL), has been operating at NC State since fall 2004. Nearly 8,500,000 HPC CPU-Hrs were delivered by this environment to NC State faculty and students during 2009. In addition, we present and discuss operational data that show that integration of HPC and non-HPC (or general VCL) services in a cloud can substantially reduce the cost of delivering cloud services (down to cents per CPU hour).

  10. Learning and Design with Online Real-Time Collaboration

    ERIC Educational Resources Information Center

    Stevenson, Michael; Hedberg, John G.

    2013-01-01

    This paper explores the use of emerging Cloud technologies that support real-time online collaboration. It considers the extent to which these technologies can be leveraged to develop complex skillsets supporting interaction between multiple learners in online spaces. In a pilot study that closely examines how groups of learners translate two…

  11. Photogrammetric retrieval of volcanic ash cloud top height from SEVIRI and MODIS

    NASA Astrophysics Data System (ADS)

    Zakšek, Klemen; Hort, Matthias; Zaletelj, Janez; Langmann, Bärbel

    2013-04-01

    Even when erupting in remote areas, volcanoes can have a significant impact on modern society through the dispersion of volcanic ash in the atmosphere. The ash affects more than air traffic: its transport in the atmosphere and its deposition on land and in the oceans may also significantly influence the climate through modifications of atmospheric CO2. The emphasis of this contribution is the retrieval of volcanic ash cloud top height (ACTH). ACTH is important information especially for air traffic, but also for predicting ash transport and estimating the mass flux of the ejected material. ACTH is usually estimated from ground measurements, pilot reports, or satellite remote sensing. However, ground-based instruments are often not available at remote volcanoes, and pilot reports are a matter of chance. Satellite remote sensing allows ACTH to be monitored at the global level. The most frequently used method compares the brightness temperature of the cloud with an atmospheric temperature profile. Because of the uncertainties of this method (unknown emissivity of the ash cloud, the tropopause, etc.), we propose photogrammetric methods based on the parallax between data retrieved from geostationary (SEVIRI) and polar-orbiting (MODIS) satellites. The parallax is estimated using automatic image matching in three-level image pyramids. The procedure works well if the data from both satellites are retrieved nearly simultaneously; however, MODIS does not retrieve data at exactly the same time as SEVIRI. To compensate for advection, we use two sequential SEVIRI images (one before and one after the MODIS retrieval) and interpolate the cloud position from the SEVIRI data to the time of the MODIS retrieval. ACTH is then estimated by intersecting the corresponding lines-of-view from MODIS and the interpolated SEVIRI data. The proposed method was tested using MODIS band 1 and the SEVIRI HRV band for the Eyjafjallajökull eruption of April 2010.
    The parallax between MODIS and SEVIRI data can exceed 30 km, which implies an ACTH of more than 12 km. The accuracy of the ACTH retrieval was estimated at 0.6 km. The limitation of this procedure is that automatic image matching becomes difficult when the ash cloud is not opaque.
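    The advection compensation and parallax-to-height conversion can be sketched as follows. This is a simplified flat-Earth, common-plane illustration, not the paper's actual 3-D line-of-view intersection; the function names and geometry inputs are assumptions:

```python
def interp_position(pos_before, pos_after, t_before, t_after, t_modis):
    """Linearly interpolate the cloud position seen in two sequential
    SEVIRI images to the MODIS acquisition time, compensating for
    advection between the two SEVIRI slots."""
    w = (t_modis - t_before) / (t_after - t_before)
    return pos_before + w * (pos_after - pos_before)

def height_from_parallax(parallax_km, tan_zenith_geo, tan_zenith_leo):
    """Flat-Earth sketch: a cloud at height h appears displaced on the
    ground by h*tan(viewing zenith) for each satellite, so for same-side
    viewing the offset between the geolocated images is
    h * |tan(z_geo) - tan(z_leo)|."""
    return parallax_km / abs(tan_zenith_geo - tan_zenith_leo)
```

    With a tangent difference of 2.5, for example, a 30 km parallax corresponds to a 12 km cloud top, consistent with the figures quoted above.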

  12. Toward Measuring Galactic Dense Molecular Gas Properties and 3D Distribution with Hi-GAL

    NASA Astrophysics Data System (ADS)

    Zetterlund, Erika; Glenn, Jason; Maloney, Phil

    2016-01-01

    The Herschel Space Observatory's submillimeter dust continuum survey Hi-GAL provides a powerful new dataset for characterizing the structure of the dense interstellar medium of the Milky Way. Hi-GAL observed a 2° wide strip covering the entire 360° of the Galactic plane in broad bands centered at 70, 160, 250, 350, and 500 μm, with angular resolution ranging from 10 to 40 arcseconds. We are adapting a molecular cloud clump-finding algorithm and a distance probability density function distance-determination method developed for the Bolocam Galactic Plane Survey (BGPS) to the Hi-GAL data. Using these methods we expect to generate a database of 10^5 cloud clumps, derive distance information for roughly half the clumps, and derive precise distances for approximately 20% of them. With five-color photometry and distances, we will measure the cloud clump properties, such as luminosities, physical sizes, and masses, and construct a three-dimensional map of the Milky Way's dense molecular gas distribution. The cloud clump properties and the dense gas distribution will provide critical ground truths for comparison to theoretical models of molecular cloud structure formation and galaxy evolution models that seek to emulate spiral galaxies. For example, such models cannot resolve star formation and use prescriptive recipes, such as converting a fixed fraction of interstellar gas to stars at a specified interstellar medium density threshold. The models should be compared to observed dense molecular gas properties and galactic distributions. As a pilot survey to refine the clump-finding and distance measurement algorithms developed for BGPS, we have identified molecular cloud clumps in six 2° × 2° patches of the Galactic plane, including one in the inner Galaxy along the line of sight through the Molecular Ring and the termination of the Galactic bar and one toward the outer Galaxy.
Distances have been derived for the inner Galaxy clumps and compared to Bolocam Galactic Plane Survey results. We present the pilot survey clump catalog, distances, clump properties, and a comparison to BGPS.
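    Deriving masses from the five-band photometry and distances relies, in the standard optically thin dust-emission treatment (the survey's specific opacity law and temperature fits are not given in this summary), on

        M = S_ν d² / (κ_ν B_ν(T_d)),

    where S_ν is the clump flux density, d its distance, κ_ν the dust opacity, and B_ν(T_d) the Planck function at dust temperature T_d. The quadratic dependence on d is why the distance probability density functions are critical to the clump mass estimates.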

  13. Electron Cloud in Steel Beam Pipe vs Titanium Nitride Coated and Amorphous Carbon Coated Beam Pipes in Fermilab's Main Injector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backfish, Michael

    This paper documents the use of four retarding field analyzers (RFAs) to measure electron cloud signals created in Fermilab’s Main Injector during 120 GeV operations. The first data set was taken from September 11, 2009 to July 4, 2010. This data set is used to compare two different types of beam pipe that were installed in the accelerator. Two RFAs were installed in a normal steel beam pipe like the rest of the Main Injector, while another two were installed in a one meter section of beam pipe that was coated on the inside with titanium nitride (TiN). A second data run started on August 23, 2010 and ended on January 10, 2011, when Main Injector beam intensities were reduced, thus eliminating the electron cloud. This second run uses the same RFA setup, but the TiN coated beam pipe was replaced by a one meter section coated with amorphous carbon (aC). This section of beam pipe was provided by CERN in an effort to better understand how an aC coating will perform over time in an accelerator. The research consists of three basic parts: (a) continuously monitoring the conditioning of the three different types of beam pipe over both time and absorbed electrons; (b) measurement of the characteristics of the surrounding magnetic fields in the Main Injector in order to better relate actual data observed in the Main Injector with that of simulations; (c) measurement of the energy spectrum of the electron cloud signals using retarding field analyzers in all three types of beam pipe.
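    The energy-spectrum measurement in part (c) rests on the fact that an RFA yields an integral spectrum: the collector receives only electrons energetic enough to pass the retarding grid, so the differential energy distribution follows from the negative derivative of collected current with respect to grid voltage. A minimal sketch (the function name and the idealized, noise-free sweep are illustrative assumptions):

```python
import numpy as np

def rfa_energy_spectrum(retarding_volts, collector_current):
    """An RFA collector current I(V) sums all electrons with energy
    above e*V, so the energy distribution is f(E) ~ -dI/dV evaluated
    at E = e*V (here in per-volt units)."""
    v = np.asarray(retarding_volts, dtype=float)
    i = np.asarray(collector_current, dtype=float)
    return -np.gradient(i, v)
```

    In practice the measured sweep is noisy and is smoothed before differentiation.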

  14. Molecular understanding of sulphuric acid-amine particle nucleation in the atmosphere.

    PubMed

    Almeida, João; Schobesberger, Siegfried; Kürten, Andreas; Ortega, Ismael K; Kupiainen-Määttä, Oona; Praplan, Arnaud P; Adamov, Alexey; Amorim, Antonio; Bianchi, Federico; Breitenlechner, Martin; David, André; Dommen, Josef; Donahue, Neil M; Downard, Andrew; Dunne, Eimear; Duplissy, Jonathan; Ehrhart, Sebastian; Flagan, Richard C; Franchin, Alessandro; Guida, Roberto; Hakala, Jani; Hansel, Armin; Heinritzi, Martin; Henschel, Henning; Jokinen, Tuija; Junninen, Heikki; Kajos, Maija; Kangasluoma, Juha; Keskinen, Helmi; Kupc, Agnieszka; Kurtén, Theo; Kvashin, Alexander N; Laaksonen, Ari; Lehtipalo, Katrianne; Leiminger, Markus; Leppä, Johannes; Loukonen, Ville; Makhmutov, Vladimir; Mathot, Serge; McGrath, Matthew J; Nieminen, Tuomo; Olenius, Tinja; Onnela, Antti; Petäjä, Tuukka; Riccobono, Francesco; Riipinen, Ilona; Rissanen, Matti; Rondo, Linda; Ruuskanen, Taina; Santos, Filipe D; Sarnela, Nina; Schallhart, Simon; Schnitzhofer, Ralf; Seinfeld, John H; Simon, Mario; Sipilä, Mikko; Stozhkov, Yuri; Stratmann, Frank; Tomé, Antonio; Tröstl, Jasmin; Tsagkogeorgas, Georgios; Vaattovaara, Petri; Viisanen, Yrjo; Virtanen, Annele; Vrtala, Aron; Wagner, Paul E; Weingartner, Ernest; Wex, Heike; Williamson, Christina; Wimmer, Daniela; Ye, Penglin; Yli-Juuti, Taina; Carslaw, Kenneth S; Kulmala, Markku; Curtius, Joachim; Baltensperger, Urs; Worsnop, Douglas R; Vehkamäki, Hanna; Kirkby, Jasper

    2013-10-17

    Nucleation of aerosol particles from trace atmospheric vapours is thought to provide up to half of global cloud condensation nuclei. Aerosols can cause a net cooling of climate by scattering sunlight and by leading to smaller but more numerous cloud droplets, which makes clouds brighter and extends their lifetimes. Atmospheric aerosols derived from human activities are thought to have compensated for a large fraction of the warming caused by greenhouse gases. However, despite its importance for climate, atmospheric nucleation is poorly understood. Recently, it has been shown that sulphuric acid and ammonia cannot explain particle formation rates observed in the lower atmosphere. It is thought that amines may enhance nucleation, but until now there has been no direct evidence for amine ternary nucleation under atmospheric conditions. Here we use the CLOUD (Cosmics Leaving OUtdoor Droplets) chamber at CERN and find that dimethylamine above three parts per trillion by volume can enhance particle formation rates more than 1,000-fold compared with ammonia, sufficient to account for the particle formation rates observed in the atmosphere. Molecular analysis of the clusters reveals that the faster nucleation is explained by a base-stabilization mechanism involving acid-amine pairs, which strongly decrease evaporation. The ion-induced contribution is generally small, reflecting the high stability of sulphuric acid-dimethylamine clusters and indicating that galactic cosmic rays exert only a small influence on their formation, except at low overall formation rates. Our experimental measurements are well reproduced by a dynamical model based on quantum chemical calculations of binding energies of molecular clusters, without any fitted parameters. These results show that, in regions of the atmosphere near amine sources, both amines and sulphur dioxide should be considered when assessing the impact of anthropogenic activities on particle formation.

  15. New Particle Formation in an Urban Atmosphere: The Role of Various Ingredients Investigated in the CLOUD Chamber

    NASA Astrophysics Data System (ADS)

    Baltensperger, U.; Xiao, M.; Hoyle, C.; Dada, L.; Garmash, O.; Stolzenburg, D.; Molteni, U.; Lehtipalo, K.; El-Haddad, I.; Dommen, J.

    2017-12-01

    Atmospheric aerosols play an important role in climate via aerosol-radiation and aerosol-cloud interactions. The latter is strongly influenced by new particle formation (NPF). The physical and chemical mechanisms behind the NPF process are still under investigation. Great advances were made in resolving the chemical and physical mechanisms of NPF with a series of experiments conducted at the CLOUD (Cosmics Leaving Outdoor Droplets) chamber facility at CERN (Geneva, Switzerland), including binary nucleation of sulfuric acid and water, ternary nucleation of sulfuric acid and water with ammonia or dimethylamine, and nucleation of oxidation products (highly oxygenated molecules, HOMs) from biogenic precursors with and without the presence of sulfuric acid. Here, we investigate possible NPF mechanisms in urban atmospheres, where large populations are exposed to high aerosol concentrations; an understanding of these mechanisms is still missing and urgently needed. Urban atmospheres are highly polluted, with high concentrations of SO2, ammonia, NOx and volatile organic vapors from anthropogenic activity, as well as high particle concentrations, which provide a large condensation sink for condensable gases. Aromatic hydrocarbons from industrial activities, traffic and residential combustion are present at high concentrations and contribute significantly to photochemical smog in the urban environment. The experiments were conducted at the CLOUD chamber facility during the CLOUD11 campaign in fall 2016. Three aromatic hydrocarbons were selected: toluene, 1,2,4-trimethylbenzene (1,2,4-TMB) and naphthalene (NPT). Experiments were also conducted with mixtures of the three aromatic hydrocarbons, to better represent the urban atmosphere. All experiments were conducted in the presence of sulfuric acid, with or without the addition of ammonia and NOx.
    New particle formation rates and early growth rates derived for each precursor and for their mixture, together with sulfuric acid and with or without the addition of ammonia and NOx, will be reported.

  16. Clouds, Aerosol, and Precipitation in the Marine Boundary Layer: An ARM Mobile Facility Deployment

    NASA Technical Reports Server (NTRS)

    Wood, Robert; Wyant, Matthew; Bretherton, Christopher S.; Remillard, Jasmine; Kollias, Pavlos; Fletcher, Jennifer; Stemmler, Jayson; de Szoeke, Simone; Yuter, Sandra; Miller, Matthew; hide

    2015-01-01

    Capsule: A 21-month deployment to Graciosa Island in the northeastern Atlantic Ocean is providing an unprecedented record of the clouds, aerosols and meteorology in a poorly-sampled remote marine environment. The Clouds, Aerosol, and Precipitation in the Marine Boundary Layer (CAP-MBL) deployment at Graciosa Island in the Azores generated a 21-month (April 2009-December 2010) comprehensive dataset documenting clouds, aerosols and precipitation using the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF). The scientific aim of the deployment is to gain improved understanding of the interactions of clouds, aerosols and precipitation in the marine boundary layer. Graciosa Island straddles the boundary between the subtropics and midlatitudes in the Northeast Atlantic Ocean, and consequently experiences a great diversity of meteorological and cloudiness conditions. Low clouds are the dominant cloud type, with stratocumulus and cumulus occurring regularly. Approximately half of all clouds contained precipitation detectable as radar echoes below the cloud base. Radar and satellite observations show that clouds with tops from 1-11 km contribute more or less equally to surface-measured precipitation at Graciosa. A wide range of aerosol conditions was sampled during the deployment, consistent with the diversity of sources as indicated by back-trajectory analysis. Preliminary findings suggest important two-way interactions between aerosols and clouds at Graciosa, with aerosols affecting light precipitation and cloud radiative properties while being controlled in part by precipitation scavenging. The data from Graciosa are being compared with short-range forecasts made with a variety of models. A pilot analysis with two climate and two weather forecast models shows that they reproduce the observed time-varying vertical structure of lower-tropospheric cloud fairly well, but the cloud-nucleating aerosol concentrations less well.
The Graciosa site has been chosen to be a permanent fixed ARM site that became operational in October 2013.

  17. Near-Real Time Satellite-Retrieved Cloud and Surface Properties for Weather and Aviation Safety Applications

    NASA Technical Reports Server (NTRS)

    Minnis, Patrick; Smith, William L., Jr.; Bedka, Kristopher M.; Nguyen, Louis; Palikonda, Rabindra; Hong, Gang; Trepte, Qing Z.; Chee, Thad; Scarino, Benjamin; Spangenberg, Douglas A.; hide

    2014-01-01

    Cloud properties determined from satellite imager radiances provide a valuable source of information for nowcasting and weather forecasting. In recent years, it has been shown that assimilation of cloud top temperature, optical depth, and total water path can increase the accuracies of weather analyses and forecasts. Aircraft icing conditions can be accurately diagnosed in near-real time (NRT) retrievals of cloud effective particle size, phase, and water path, providing valuable data for pilots. NRT retrievals of surface skin temperature can also be assimilated in numerical weather prediction models to provide more accurate representations of solar heating and longwave cooling at the surface, where convective initiation occurs. These and other applications are being exploited more frequently as the value of NRT cloud data becomes recognized. At NASA Langley, cloud properties and surface skin temperature are being retrieved in near-real time globally from both geostationary (GEO) and low-earth orbiting (LEO) satellite imagers for weather model assimilation and nowcasting for hazards such as aircraft icing. Cloud data from GEO satellites over North America are disseminated through NCEP, while those data and global LEO and GEO retrievals are disseminated from a Langley website. This paper presents an overview of the various available datasets, provides examples of their application, and discusses the use of the various datasets downstream. Future challenges and areas of improvement are also presented.

  18. Near-Real Time Satellite-Retrieved Cloud and Surface Properties for Weather and Aviation Safety Applications

    NASA Astrophysics Data System (ADS)

    Minnis, P.; Smith, W., Jr.; Bedka, K. M.; Nguyen, L.; Palikonda, R.; Hong, G.; Trepte, Q.; Chee, T.; Scarino, B. R.; Spangenberg, D.; Sun-Mack, S.; Fleeger, C.; Ayers, J. K.; Chang, F. L.; Heck, P. W.

    2014-12-01

    Cloud properties determined from satellite imager radiances provide a valuable source of information for nowcasting and weather forecasting. In recent years, it has been shown that assimilation of cloud top temperature, optical depth, and total water path can increase the accuracies of weather analyses and forecasts. Aircraft icing conditions can be accurately diagnosed in near-real time (NRT) retrievals of cloud effective particle size, phase, and water path, providing valuable data for pilots. NRT retrievals of surface skin temperature can also be assimilated in numerical weather prediction models to provide more accurate representations of solar heating and longwave cooling at the surface, where convective initiation occurs. These and other applications are being exploited more frequently as the value of NRT cloud data becomes recognized. At NASA Langley, cloud properties and surface skin temperature are being retrieved in near-real time globally from both geostationary (GEO) and low-earth orbiting (LEO) satellite imagers for weather model assimilation and nowcasting for hazards such as aircraft icing. Cloud data from GEO satellites over North America are disseminated through NCEP, while those data and global LEO and GEO retrievals are disseminated from a Langley website. This paper presents an overview of the various available datasets, provides examples of their application, and discusses the use of the various datasets downstream. Future challenges and areas of improvement are also presented.

  19. 4-D cloud properties from passive satellite data and applications to resolve the flight icing threat to aircraft

    NASA Astrophysics Data System (ADS)

    Smith, William L., Jr.

    Aircraft icing in clouds is a significant hazard that routinely impacts aviation operations. Accurate diagnoses and forecasts of aircraft icing conditions require identifying the location and vertical distribution of clouds with super-cooled liquid water (SLW) droplets, as well as the characteristics of the droplet size distribution. Traditional forecasting methods rely on guidance from numerical models and conventional observations, neither of which currently resolves cloud properties adequately on the optimal scales needed for aviation. Satellite imagers provide measurements over large areas with high spatial resolution that can be interpreted to identify the locations and characteristics of clouds, including features associated with adverse weather and storms. This thesis develops new techniques for interpreting cloud products derived from satellite data to infer the flight icing threat to aircraft in a wide range of cloud conditions. For unobscured low clouds, the icing threat is determined using empirical relationships developed from correlations between satellite imager retrievals of liquid water path and droplet size with icing conditions reported by pilots (PIREPS). For deep ice-over-water cloud systems, ice and liquid water content profiles are derived by using the imager cloud properties to constrain climatological information on cloud vertical structure and water phase obtained a priori from radar and lidar observations, and from cloud model analyses. Retrievals of the SLW content embedded within overlapping clouds are mapped to the icing threat using guidance from an airfoil modeling study. Compared to PIREPS, the satellite icing detection and intensity accuracies are found to be about 90% and 70%, respectively. Mean differences between the imager IWC retrievals and those from CloudSat and Calipso are less than 30%.
This level of closure in the cloud water budget can only be achieved by correcting for errors in the imager retrievals due to the simplifying but poor assumption that deep optically thick clouds are single-phase and vertically homogeneous. When applied to geostationary satellite data, the profiling method provides a real-time characterization of clouds in 4-D. This research should improve the utility of satellite imager data for quantitatively diagnosing and predicting clouds and their effects in weather and climate applications.

  20. Comparison of Satellite and Aircraft Measurements of Cloud Microphysical Properties in Icing Conditions During ATREC/AIRS-II

    NASA Technical Reports Server (NTRS)

    Nguyen, Louis; Minnis, Patrick; Spangenberg, Douglas A.; Nordeen, Michele L.; Palikonda, Rabindra; Khaiyer, Mandana M.; Gultepe, Ismail; Reehorst, Andrew L.

    2004-01-01

    Satellites are ideal for continuous monitoring of aircraft icing conditions in many situations over extensive areas. The satellite imager data are used to diagnose a number of cloud properties that can be used to develop icing intensity indices. Developing and validating these indices requires comparison with objective "cloud truth" data in addition to conventional pilot reports (PIREPS) of icing conditions. Minnis et al. examined the relationships between PIREPS icing and satellite-derived cloud properties. The Atlantic-THORPEX Regional Campaign (ATReC) and the second Alliance Icing Research Study (AIRS-II) field programs were conducted over the northeastern USA and southeastern Canada during late 2003 and early 2004. The aircraft and surface measurements are concerned primarily with the icing characteristics of clouds and, thus, are ideal for providing some validation information for the satellite remote sensing product. This paper starts the process of comparing cloud properties and icing indices derived from the Geostationary Operational Environmental Satellite (GOES) with the aircraft in situ measurements of several cloud properties taken during the campaigns. The comparisons include cloud phase, particle size, icing intensity, base and top altitudes, temperatures, and liquid water path. The results of this study are crucial for developing a more reliable and objective icing product from satellite data. This icing product, currently being derived from GOES data over the USA, is an important complement to more conventional products based on forecasts and PIREPS.

  1. Pilot Tests of Satellite Snowcover/Runoff Forecasting Systems. [Arizona, Sierra Nevada Mountains (CA), Colorado, Rocky Mountains (North America)]

    NASA Technical Reports Server (NTRS)

    Rango, A.

    1978-01-01

    Major snow zones of the western U.S. were selected to test the capability of satellite systems for mapping snowcover in various snow, cloud, climatic, and vegetation regimes. Different satellite snowcover analysis methods used in each area are described along with results.

  2. New experimental measurements of electron clouds in ion beams with large tune depression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molvik, A W; Covo, M K; Cohen, R H

    We study electron clouds in high perveance beams (K = 8E-4) with a large tune depression of 0.2 (defined as the ratio of a single particle oscillation response to the applied focusing fields, with and without space charge). These 1 MeV, 180 mA, K+ beams have a beam potential of +2 kV when electron clouds are minimized. Simulation results are discussed in a companion paper [J-L. Vay, this Conference]. We have developed the first diagnostics that quantitatively measure the accumulation of electrons in a beam [1]. This, together with measurements of electron sources, will enable the electron particle balance to be measured, and electron-trapping efficiencies determined. We, along with colleagues from GSI and CERN, have also measured the scaling of gas desorption with beam energy and dE/dx [2]. Experiments where the heavy-ion beam is transported with solenoid magnetic fields, rather than with quadrupole magnetic or electrostatic fields, are being initiated. We will discuss initial results from experiments using electrode sets (in the middle and at the ends of magnets) to either expel or trap electrons within the magnets. We observe electron oscillations in the last quadrupole magnet when we flood the beam with electrons from an end wall. These oscillations, of order 10 MHz, are observed to grow from the center of the magnet while drifting upstream against the beam, in good agreement with simulations.

  3. Variability of Aerosol and its Impact on Cloud Properties Over Different Cities of Pakistan

    NASA Astrophysics Data System (ADS)

    Alam, Khan

    Interaction between aerosols and clouds is the subject of considerable scientific research, owing to the importance of clouds in controlling climate. Aerosols vary in time and space and can lead to variations in cloud microphysics. This paper is a pilot study examining the temporal and spatial variation of aerosol particles and their impact on different cloud optical properties over Pakistan, using data from the Moderate Resolution Imaging Spectroradiometer (MODIS) on board NASA's Terra satellite and from the Multi-angle Imaging Spectroradiometer (MISR). We also use the Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model for trajectory analysis to obtain the origin of air masses, in order to understand the spatial and temporal variability of aerosol concentrations. We validate the MODIS and MISR data against each other using linear correlation and regression analysis, which shows excellent agreement between the two instruments. A seasonal study of Aerosol Optical Depth (AOD) shows that maximum values occur in the monsoon season (June-August) over all study areas. We analyze the relationships between AOD and cloud parameters such as water vapor (WV), cloud fraction (CF), cloud top temperature (CTT) and cloud top pressure (CTP). We construct regional correlation maps and time series plots for aerosol and cloud parameters, which are necessary for a better understanding of aerosol-cloud interaction. Our analyses show a strong positive correlation between AOD and water vapor in all cities. The correlation between AOD and CF is positive for cities where the air masses are moist, and negative for cities where the air masses are relatively dry and aerosol abundance is lower, showing that these correlations depend on meteorological conditions. Similarly, as AOD increases, CTP decreases while CTT increases. 
Key Words: MODIS, MISR, HYSPLIT, AOD, CF, CTP, CTT
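The MODIS/MISR cross-validation step described above can be sketched as a Pearson correlation plus a least-squares fit; the collocated AOD arrays below are synthetic stand-ins, not the study's data:

```python
import numpy as np

# Synthetic "collocated" retrievals: MISR modeled as a noisy linear function of MODIS.
rng = np.random.default_rng(0)
aod_modis = rng.uniform(0.1, 1.2, 200)
aod_misr = 0.9 * aod_modis + 0.02 + rng.normal(0, 0.05, 200)

r = np.corrcoef(aod_modis, aod_misr)[0, 1]             # Pearson correlation
slope, intercept = np.polyfit(aod_modis, aod_misr, 1)  # fit MISR = a*MODIS + b
print(f"r = {r:.2f}, slope = {slope:.2f}, intercept = {intercept:.3f}")
```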

  4. Recent observations of lightning discharges from the top of a thundercloud into the clear air above

    NASA Technical Reports Server (NTRS)

    Vaughan, O. H., Jr.; Vonnegut, B.

    1988-01-01

    A letter of inquiry to a magazine read by airplane pilots has elicited 15 new observations of a rare form of lightning that comes out of the cloud top, goes up vertically, and terminates in the clear air above. These confirm previous observations showing that this phenomenon usually occurs above very large and energetic thunderclouds. However, small clouds with tops at 15,000 feet have also been observed to exhibit this rare form of lightning. One of the more spectacular observations was made over the severe storm that produced the devastating Xenia, Ohio, tornadoes of April 1974.

  5. Federated data storage and management infrastructure

    NASA Astrophysics Data System (ADS)

    Zarochentsev, A.; Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Hristov, P.

    2016-10-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific exploration. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. Computing models for the High Luminosity LHC era anticipate storage needs growing by orders of magnitude, which will require new approaches to data storage organization and data handling. In our project we address the fundamental problem of designing an architecture that integrates distributed heterogeneous disk resources for LHC experiments and other data-intensive science applications, and that provides access to data from heterogeneous computing facilities. We have prototyped a federated storage for the Russian T1 and T2 centers located in Moscow, St. Petersburg and Gatchina, as well as a Russian/CERN federation. We have conducted extensive tests of the underlying network infrastructure and storage endpoints with synthetic performance measurement tools as well as with HENP-specific workloads, including ones running on supercomputing, cloud computing and Grid platforms for the ALICE and ATLAS experiments. We will present our current accomplishments with running LHC data analysis remotely and locally to demonstrate our ability to efficiently use federated data storage experiment-wide within national academic facilities for High Energy and Nuclear Physics, as well as for other data-intensive science applications such as bio-informatics.

  6. The NASA Thunderstorm Observations and Research (ThOR) Mission: Lightning Mapping from Space to Improve the Short-term Forecasting of Severe Storms

    NASA Technical Reports Server (NTRS)

    Goodman, S. J.; Christian, H. J.; Boccippio, D. J.; Koshak, W. J.; Cecil, D. J.; Arnold, James E. (Technical Monitor)

    2002-01-01

    The ThOR mission uses a lightning mapping sensor in geostationary Earth orbit to provide continuous observations of thunderstorm activity over the Americas and nearby oceans. The link between lightning activity and cloud updrafts is the basis for total lightning observations indicating the evolving convective intensification and decay of storms. ThOR offers a national operational demonstration of the utility of real-time total lightning mapping for earlier and more reliable identification of potentially severe and hazardous storms. Regional pilot projects have already demonstrated that a dominance of in-cloud lightning and increasing in-cloud flash rates precede severe weather at the surface by tens of minutes. ThOR is currently planned for launch in 2005 on a commercial or research satellite. Real-time data will be provided to selected NWS Weather Forecast Offices and National Centers (EMC/AWC/SPC) for evaluation.
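The flash-rate precursor signal described above is often cast as a "2-sigma lightning jump" test: flag a storm when the rate of change of total flash rate exceeds twice the standard deviation of its recent history. The sketch below uses illustrative thresholds and synthetic rates, not the ThOR algorithm itself:

```python
import statistics

def lightning_jump(flash_rates, window=5, n_sigma=2.0):
    """flash_rates: total flashes/min per time step; returns flagged indices."""
    jumps = []
    for i in range(window + 1, len(flash_rates)):
        # recent history of rate changes (DFRDT)
        history = [flash_rates[j] - flash_rates[j - 1] for j in range(i - window, i)]
        dfrdt = flash_rates[i] - flash_rates[i - 1]
        sigma = statistics.stdev(history)
        if sigma > 0 and dfrdt > n_sigma * sigma:
            jumps.append(i)
    return jumps

rates = [2, 3, 2, 4, 3, 4, 5, 18, 25, 24]   # synthetic intensifying storm
print(lightning_jump(rates))                # flags the sharp increase at index 7
```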

  7. Oceanic Weather Decision Support for Unmanned Global Hawk Science Missions into Hurricanes with Tailored Satellite Derived Products

    NASA Astrophysics Data System (ADS)

    Feltz, Wayne; Griffin, Sarah; Velden, Christopher; Zipser, Ed; Cecil, Daniel; Braun, Scott

    2017-04-01

    The purpose of this presentation is to identify in-flight hazards to high-altitude aircraft, namely the Global Hawk. The Global Hawk was used during Septembers 2012-2016 as part of two NASA funded Hurricane Sentinel-3 field campaigns to over-fly hurricanes in the Atlantic Ocean. This talk identifies the cause of severe turbulence experienced over Hurricane Emily (2005) and how a combination of NOAA funded GOES-R algorithm derived cloud top heights/tropical overshooting tops using GOES-13/SEVIRI imager radiances, and lightning information are used to identify areas of potential turbulence for near real-time navigation decision support. Several examples will demonstrate how the Global Hawk pilots remotely received and used real-time satellite derived cloud and lightning detection information to keep the aircraft safely above clouds and avoid regions of potential turbulence.

  8. Lidar Studies of Extinction in Clouds in the ECLIPS Project

    NASA Technical Reports Server (NTRS)

    Martin, C.; Platt, R.; Young, Stuart A.; Patterson, Graeme P.

    1992-01-01

    The Experimental Cloud Lidar Pilot Study (ECLIPS) project has now had two active phases, in 1989 and 1991. A number of laboratories around the world have taken part in the study. The observations have yielded new data on cloud height and structure, and some useful new information on the retrieval of cloud optical properties, together with the uncertainties involved. Clouds have a major impact on the climate of the earth. They have the effect of reducing the mean surface temperature from 30 C for a cloudless planet to about 15 C for present cloud conditions. However, it is not at all certain how clouds would react to a change in planetary temperature in the event of climate change due to radiative forcing from greenhouse gases. Clouds both reflect sunlight (negative feedback) and enhance the greenhouse effect (positive feedback), but the ultimate sign of cloud feedback is unknown. Because of these uncertainties, campaigns to study clouds intensively were initiated; the International Satellite Cloud Climatology Project (ISCCP) and the FIRE campaigns (cirrus and stratocumulus) are examples. ECLIPS was set up similarly to the above experiments to obtain information specifically on cloud base, but also on cloud top (where possible), optical properties, and cloud structure. ECLIPS was designed to allow as many laboratories as possible globally to take part, to capture the largest possible range of clouds. It involves observations with elastic backscatter lidar, supported by infrared fluxes at the ground and radiosonde data, as basic instrumentation. Laboratories running more complex experiments, using beam filter radiometers, solar pyranometers, and satellite data, and often associated with other campaigns, were also encouraged to join ECLIPS. Two periods for observation were chosen, Sep. - Dec. 1989 and Apr. - Jul. 1992, into which investigators were requested to fit 30 days of observations. 
These would be either continuous, or arranged to coincide with NOAA satellite overpasses to obtain AVHRR data. The distribution of the ECLIPS international effort as in 1991 is shown. The main gaps in the global distribution are in the tropics and the Southern Hemisphere.
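One standard lidar route from measurements to cloud optical properties, of the kind ECLIPS pursued, relates the integrated attenuated backscatter γ' to the cloud optical depth via γ' = (1 − T²)/(2S), i.e. τ = −0.5 ln(1 − 2Sγ'), neglecting multiple scattering. The lidar ratio S and backscatter value below are illustrative assumptions, not ECLIPS results:

```python
import math

def cloud_optical_depth(gamma_prime, S=20.0):
    """Visible optical depth from integrated attenuated backscatter (sr^-1).

    S is the extinction-to-backscatter (lidar) ratio in sr; multiple
    scattering is neglected.
    """
    arg = 1.0 - 2.0 * S * gamma_prime
    if arg <= 0:
        raise ValueError("beam fully attenuated: gamma' at its asymptotic limit 1/(2S)")
    return -0.5 * math.log(arg)

tau = cloud_optical_depth(0.02, S=20.0)   # illustrative integrated backscatter
print(f"tau = {tau:.2f}")
```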

  9. Dark Clouds and Deadly Skies: Assessing the Strategic Effectiveness of Using Remotely Piloted Aircraft Outside of Designated Combat Zones

    DTIC Science & Technology

    2015-06-01

    embassy bombings in Kenya and Tanzania that killed 225 people. An Islamist spokesman claimed that many nomadic tribesmen, including children, were...remained in the single digits . In September of 2011, Press TV, which is an Iranian news organization, claimed that there had been over 80 drone strikes

  10. Report on the Radar/PIREP Cloud Top Discrepancy Study

    NASA Technical Reports Server (NTRS)

    Wheeler, Mark M.

    1997-01-01

    This report documents the results of the Applied Meteorology Unit's (AMU) investigation of inconsistencies between pilot reported cloud top heights and weather radar indicated echo top heights (assumed to be cloud tops) as identified by the 45 Weather Squadron (45WS). The objective for this study is to document and understand the differences in echo top characteristics as displayed on both the WSR-88D and WSR-74C radars and cloud top heights reported by the contract weather aircraft in support of space launch operations at Cape Canaveral Air Station (CCAS), Florida. These inconsistencies are of operational concern since various Launch Commit Criteria (LCC) and Flight Rules (FR) in part describe safe and unsafe conditions as a function of cloud thickness. Some background radar information was presented. Scan strategies for the WSR-74C and WSR-88D were reviewed along with a description of normal radar beam propagation influenced by the Effective Earth Radius Model. Atmospheric conditions prior to and leading up to both launch operations were detailed. Through the analysis of rawinsonde and radar data, atmospheric refraction or bending of the radar beam was identified as the cause of the discrepancies between reported cloud top heights by the contract weather aircraft and those as identified by both radars. The atmospheric refraction caused the radar beam to be further bent toward the Earth than normal. This radar beam bending causes the radar target to be displayed erroneously, with higher cloud top heights and a very blocky or skewed appearance.
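The beam-height geometry behind the discrepancy can be sketched with the effective-Earth-radius formula: heights are assigned assuming standard refraction (k = 4/3), so when the atmosphere bends the beam toward the Earth more strongly, the displayed echo tops are too high. The range, elevation, and super-refractive k value below are illustrative, not the study's soundings:

```python
import math

R_E = 6371e3  # Earth radius, m

def beam_height(range_m, elev_deg, k=4.0 / 3.0, antenna_m=0.0):
    """Height AGL of the beam center at slant range range_m (effective-radius model)."""
    Re = k * R_E
    th = math.radians(elev_deg)
    return math.sqrt(range_m**2 + Re**2 + 2 * range_m * Re * math.sin(th)) - Re + antenna_m

h_std = beam_height(150e3, 0.5)           # height assumed under standard refraction
h_super = beam_height(150e3, 0.5, k=2.0)  # same ray under stronger downward bending
print(f"standard: {h_std:.0f} m, super-refractive: {h_super:.0f} m")
```

The several-hundred-meter gap between the two cases is the sense of the reported cloud-top overestimate.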

  11. Aviation response to a widely dispersed volcanic ash and gas cloud from the August 2008 eruption of Kasatochi, Alaska, USA

    USGS Publications Warehouse

    Guffanti, Marianne; Schneider, David J.; Wallace, Kristi L.; Hall, Tony; Bensimon, Dov R.; Salinas, Leonard J.

    2010-01-01

    The extensive volcanic cloud from Kasatochi's 2008 eruption caused widespread disruptions to aviation operations along Pacific oceanic, Canadian, and U.S. air routes. Based on aviation hazard warnings issued by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, the Federal Aviation Administration, and Meteorological Service of Canada, air carriers largely avoided the volcanic cloud over a 5 day period by route modifications and flight cancellations. Comparison of time coincident GOES thermal infrared (TIR) data for ash detection with Ozone Monitoring Instrument (OMI) ultraviolet data for SO2 detection shows congruent areas of ash and gas in the volcanic cloud in the 2 days following onset of ash production. After about 2.5 days, the area of SO2 detected by OMI was more extensive than the area of ash indicated by TIR data, indicating significant ash depletion by fall out had occurred. Pilot reports of visible haze at cruise altitudes over Canada and the northern United States suggested that SO2 gas had converted to sulfate aerosols. Uncertain about the hazard potential of the aging cloud, airlines coped by flying over, under, or around the observed haze layer. Samples from a nondamaging aircraft encounter with Kasatochi's nearly 3 day old cloud contained volcanic silicate particles, confirming that some fine ash is present in predominantly gas clouds. The aircraft's exposure to ash was insufficient to cause engine damage; however, slightly damaging encounters with volcanic clouds from eruptions of Reventador in 2002 and Hekla in 2000 indicate the possibility of lingering hazards associated with old and/or diffuse volcanic clouds.

  12. Aerosols, clouds, and precipitation in the North Atlantic trades observed during the Barbados aerosol cloud experiment - Part 1: Distributions and variability

    NASA Astrophysics Data System (ADS)

    Jung, Eunsil; Albrecht, Bruce A.; Feingold, Graham; Jonsson, Haflidi H.; Chuang, Patrick; Donaher, Shaunna L.

    2016-07-01

    Shallow marine cumulus clouds are by far the most frequently observed cloud type over the Earth's oceans, but they are poorly understood and have not been investigated as extensively as stratocumulus clouds. This study describes and discusses the properties and variations of aerosol, cloud, and precipitation associated with shallow marine cumulus clouds observed in the North Atlantic trades during a field campaign (Barbados Aerosol Cloud Experiment, BACEX; March-April 2010), which took place off Barbados, where African dust periodically affects the region. The principal observing platform was the Center for Interdisciplinary Remotely Piloted Aircraft Studies (CIRPAS) Twin Otter (TO) research aircraft, which was equipped with standard meteorological instruments, a zenith-pointing cloud radar, and probes that measured aerosol, cloud, and precipitation characteristics. The temporal variation and vertical distribution of aerosols observed from the 15 flights, which included the most intense African dust event during all of 2010 in Barbados, showed a wide range of aerosol conditions. During dusty periods, aerosol concentrations increased substantially in the size range between 0.5 and 10 µm (diameter), particles that are large enough to be effective giant cloud condensation nuclei (CCN). The 10-day back trajectories showed three distinct air masses with distinct vertical structures, associated with air originating in the Atlantic (typical maritime air mass with relatively low aerosol concentrations in the marine boundary layer), Africa (Saharan air layer), and the mid-latitudes (continental pollution plumes). Despite the large differences in the total mass loading and the origin of the aerosols, the overall shapes of the aerosol particle size distributions were consistent, with the exception of the transition period. The TO was able to sample many clouds at various phases of growth. 
Maximum cloud depth observed was less than ~3 km, while most clouds were less than 1 km deep. Clouds tended to precipitate when thicker than 500-600 m. Distributions of cloud field characteristics (depth, radar reflectivity, Doppler velocity, precipitation) were well identified in the reflectivity-velocity diagram from the cloud radar observations. Two types of precipitation features were observed for shallow marine cumulus clouds that may impact the boundary layer differently: first, classic cloud-base precipitation, where precipitation shafts were observed to emanate from the cloud base; second, cloud-top precipitation, where precipitation shafts emanated mainly near the cloud tops, sometimes accompanied by precipitation near the cloud base. The second type of precipitation was more frequently observed during the experiment. Only 42-44 % of the clouds sampled were non-precipitating throughout the entire cloud layer; the rest showed precipitation somewhere in the cloud, predominantly closer to the cloud top.

  13. New directions in the CernVM file system

    NASA Astrophysics Data System (ADS)

    Blomer, Jakob; Buncic, Predrag; Ganis, Gerardo; Hardi, Nikola; Meusel, Rene; Popescu, Radu

    2017-10-01

    The CernVM File System today is commonly used to host and distribute application software stacks. In addition to this core task, recent developments expand the scope of the file system into two new areas. Firstly, CernVM-FS emerges as a good match for container engines to distribute container image contents. Compared to native container image distribution (e.g. through the “Docker registry”), CernVM-FS massively reduces the network traffic for image distribution. This has been shown, for instance, by a prototype integration of CernVM-FS into Mesos developed by Mesosphere, Inc. We present a path for a smooth integration of CernVM-FS and Docker. Secondly, CernVM-FS has recently raised new interest as an option for the distribution of experiment conditions data. Here, the focus is on improved versioning capabilities of CernVM-FS that allow linking the conditions data of a run period to the state of a CernVM-FS repository. Lastly, CernVM-FS has been extended to provide a name space for physics data for the LIGO and CMS collaborations. Searching through a data namespace is often done by a central, experiment-specific database service. A name space on CernVM-FS can particularly benefit from an existing, scalable infrastructure and from the POSIX file system interface.

  14. New optical package and algorithms for accurate estimation and interactive recording of the cloud cover information over land and sea

    NASA Astrophysics Data System (ADS)

    Krinitskiy, Mikhail; Sinitsyn, Alexey; Gulev, Sergey

    2014-05-01

    Cloud fraction is a critical parameter for the accurate estimation of short-wave and long-wave radiation - among the most important surface fluxes over sea and land. Massive estimates of total cloud cover, as well as cloud amount for different layers of clouds, are available from visual observations, satellite measurements and reanalyses. However, these data are subject to different uncertainties and need continuous validation against highly accurate in-situ measurements. Sky imaging with a high-resolution fish-eye camera provides an excellent opportunity for collecting cloud cover data supplemented with additional characteristics hardly available from routine visual observations (e.g. the structure of cloud cover under broken cloud conditions, or parameters of the distribution of cloud dimensions). We present an operational automatic observational package based on a fish-eye camera taking sky images at high temporal resolution (up to 1 Hz) and a spatial resolution of 968x648 px. This spatial resolution has been justified as optimal by several sensitivity experiments. For use of the package aboard a research vessel, where horizontal positioning becomes critical, a special hardware and software extension to the package has been developed. These modules provide the explicit detection of the optimal moment for shooting. For the post-processing of sky images we developed software implementing an algorithm that filters the sunburn effect in cases of small and moderate cloud cover and broken cloud conditions. The same algorithm accurately quantifies the cloud fraction by analyzing the color mixture at each point, introducing a so-called "grayness rate index" for every pixel. The accuracy of the algorithm has been tested using data collected during several campaigns in 2005-2011 in the North Atlantic Ocean. The collection included more than 3000 images for different cloud conditions, supplemented with observations of standard parameters. 
The system is fully autonomous and has a block for digital data collection on a hard disk. The system has been tested for a wide range of open-ocean cloud conditions, and we will demonstrate some pilot results of data processing and physical interpretation of fractional cloud cover estimation.
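The abstract does not define the "grayness rate index" itself; as a stand-in, here is a minimal sketch of the common red/blue-ratio pixel classification for all-sky images (clear sky is strongly blue, so R/B is low; gray-white cloud has R/B near 1). The threshold is an assumption:

```python
import numpy as np

def cloud_fraction(rgb, threshold=0.75):
    """rgb: HxWx3 float array in [0,1]; returns the cloudy-pixel fraction."""
    ratio = rgb[..., 0] / np.clip(rgb[..., 2], 1e-6, None)  # R/B per pixel
    return float(np.mean(ratio > threshold))

# Synthetic half-clear, half-overcast image
img = np.zeros((100, 100, 3))
img[:50] = [0.2, 0.4, 0.9]   # blue-sky pixels (R/B ~ 0.22)
img[50:] = [0.8, 0.8, 0.8]   # gray cloud pixels (R/B = 1.0)
print(cloud_fraction(img))    # -> 0.5
```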

  15. Alpha 2 LASSO Data Bundles

    DOE Data Explorer

    Gustafson, William Jr; Vogelmann, Andrew; Endo, Satoshi; Toto, Tami; Xiao, Heng; Li, Zhijin; Cheng, Xiaoping; Kim, Jinwon; Krishna, Bhargavi

    2015-08-31

    The Alpha 2 release is the second release from the LASSO Pilot Phase and builds upon the Alpha 1 release. Alpha 2 contains additional diagnostics in the data bundles and focuses on cases from spring-summer 2016. A data bundle is a unified package consisting of LASSO LES input and output, observations, evaluation diagnostics, and model skill scores. LES inputs include model configuration information and forcing data. LES outputs include profile statistics and full-domain fields of cloud and environmental variables. Model evaluation data consist of LES output and ARM observations co-registered on the same grid and sampling frequency. Model performance is quantified by skill scores and diagnostics in terms of cloud and environmental variables.
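The exact LASSO metrics are documented with the bundles; purely as an illustration of scoring co-registered model output against observations, a normalized-RMSE skill score might look like the following (the variable values are made up):

```python
import numpy as np

def skill_score(model, obs):
    """1 minus RMSE normalized by the observed variability, floored at 0."""
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    return max(0.0, 1.0 - rmse / np.std(obs))

obs = np.array([0.1, 0.3, 0.5, 0.4, 0.2])       # e.g. observed cloud fraction
model = np.array([0.12, 0.28, 0.45, 0.42, 0.25])  # co-registered LES output
print(f"{skill_score(model, obs):.2f}")
```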

  16. The effects of moon illumination, moon angle, cloud cover, and sky glow on night vision goggle flight performance

    NASA Astrophysics Data System (ADS)

    Loro, Stephen Lee

    This study was designed to examine moon illumination, moon angle, cloud cover, sky glow, and Night Vision Goggle (NVG) flight performance to determine possible effects. The research was a causal-comparative design. The sample consisted of 194 Fort Rucker Initial Entry Rotary Wing NVG flight students being observed by 69 NVG Instructor Pilots. The students participated in NVG flight training from September 1992 through January 1993. Data were collected using a questionnaire. Observations were analyzed using a Kruskal-Wallis one-way analysis of variance and a Wilcoxon matched-pairs signed-ranks test for difference. Correlations were analyzed using Pearson's r. The results indicated that performance at high moon illumination levels is superior to performance at zero moon illumination and, in most task maneuvers, superior to >0%--50% moon illumination. No differences were found in performance at moon illumination levels above 50%. Moon angle had no effect on night vision goggle flight performance. Cloud cover and sky glow have selective effects on different maneuvers. For most task maneuvers, cloud cover does not affect performance; overcast cloud cover had a significant effect on seven of the 14 task maneuvers. Sky glow did not affect eight of the 14 task maneuvers at any level of sky glow.
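The Kruskal-Wallis statistic used in the analysis can be computed directly from pooled ranks (shown here without the tie-correction factor that library routines such as scipy.stats.kruskal apply); the maneuver scores are hypothetical:

```python
def kruskal_h(*groups):
    """Kruskal-Wallis H from pooled ranks (no tie correction)."""
    pooled = sorted(x for g in groups for x in g)
    ranks = {}
    i = 0
    while i < len(pooled):          # assign average ranks to tied values
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2   # mean of ranks i+1 .. j
        i = j
    N = len(pooled)
    h = 0.0
    for g in groups:
        r_mean = sum(ranks[x] for x in g) / len(g)
        h += len(g) * (r_mean - (N + 1) / 2) ** 2
    return 12 / (N * (N + 1)) * h

full_moon = [8, 9, 7, 9, 8]   # hypothetical maneuver scores, high illumination
no_moon = [5, 6, 4, 5, 6]     # hypothetical scores, zero illumination
print(f"H = {kruskal_h(full_moon, no_moon):.2f}")
```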

  17. A Taxonomy for Choosing, Evaluating, and Integrating In-the-Cloud Resources in a University Environment

    ERIC Educational Resources Information Center

    Kahn, Russell L.

    2013-01-01

    This article develops and applies an analytic matrix for searching and using Web 2.0 resources along a learning continuum based on learning styles. This continuum applies core concepts of cognitive psychology, which places an emphasis on internal processes, such as motivation, thinking, attitudes, and reflection. A pilot study found that access to…

  18. Top-down and bottom-up aerosol-cloud closure: towards understanding sources of uncertainty in deriving cloud shortwave radiative flux

    NASA Astrophysics Data System (ADS)

    Sanchez, Kevin J.; Roberts, Gregory C.; Calmer, Radiance; Nicoll, Keri; Hashimshoni, Eyal; Rosenfeld, Daniel; Ovadnevaite, Jurgita; Preissler, Jana; Ceburnis, Darius; O'Dowd, Colin; Russell, Lynn M.

    2017-08-01

    Top-down and bottom-up aerosol-cloud shortwave radiative flux closures were conducted at the Mace Head Atmospheric Research Station in Galway, Ireland, in August 2015. This study is part of the BACCHUS (Impact of Biogenic versus Anthropogenic emissions on Clouds and Climate: towards a Holistic UnderStanding) European collaborative project, with the goal of understanding key processes affecting aerosol-cloud shortwave radiative flux closures to improve future climate predictions and develop sustainable policies for Europe. Instrument platforms include ground-based, unmanned aerial vehicle (UAV)1, and satellite measurements of aerosols, clouds and meteorological variables. The ground-based and airborne measurements of aerosol size distributions and cloud condensation nuclei (CCN) concentration were used to initiate a 1-D microphysical aerosol-cloud parcel model (ACPM). UAVs were equipped for a specific science mission, with an optical particle counter for aerosol distribution profiles, a cloud sensor to measure cloud extinction or a five-hole probe for 3-D wind vectors. UAV cloud measurements are rare and have only become possible in recent years through the miniaturization of instrumentation. These are the first UAV measurements at Mace Head. ACPM simulations are compared to in situ cloud extinction measurements from UAVs to quantify closure in terms of cloud shortwave radiative flux. Two out of seven cases exhibit sub-adiabatic vertical temperature profiles within the cloud, which suggests that entrainment processes affect cloud microphysical properties and lead to an overestimate of simulated cloud shortwave radiative flux. Including an entrainment parameterization and explicitly calculating the entrainment fraction in the ACPM simulations both improved cloud-top radiative closure. Entrainment reduced the difference between simulated and observation-derived cloud-top shortwave radiative flux (δRF) by between 25 and 60 W m-2. 
After accounting for entrainment, satellite-derived cloud droplet number concentrations (CDNCs) were within 30 % of simulated CDNC. In cases with a well-mixed boundary layer, δRF is no greater than 20 W m-2 after accounting for cloud-top entrainment and up to 50 W m-2 when entrainment is not taken into account. In cases with a decoupled boundary layer, cloud microphysical properties are inconsistent with ground-based aerosol measurements, as expected, and δRF is as high as 88 W m-2, remaining high (> 30 W m-2) even after accounting for cloud-top entrainment. This work demonstrates the need to take in situ measurements of aerosol properties for cases where the boundary layer is decoupled, as well as to consider cloud-top entrainment, to accurately model stratocumulus cloud radiative flux. 1The regulatory term for UAV is remotely piloted aircraft (RPA).
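The sub-adiabaticity that signals entrainment can be quantified as an adiabatic fraction: the ratio of observed liquid water path to the adiabatic value, which grows quadratically with cloud depth. The condensation-rate constant and LWP values below are typical/illustrative assumptions, not the paper's:

```python
# Adiabatic-fraction diagnostic (a sketch; values are illustrative).
c_w = 2.0e-3          # adiabatic condensation rate, g m^-3 per m of depth (typical)
depth = 300.0         # cloud depth, m
lwp_adiabatic = 0.5 * c_w * depth**2      # g m^-2, integral of c_w * z over depth
lwp_observed = 60.0                       # g m^-2, e.g. from in situ/radiometer data

f_ad = lwp_observed / lwp_adiabatic       # f_ad < 1 indicates entrainment dilution
print(f"adiabatic LWP = {lwp_adiabatic:.0f} g m^-2, f_ad = {f_ad:.2f}")
```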

  19. Section Editors

    NASA Astrophysics Data System (ADS)

    Groep, D. L.; Bonacorsi, D.

    2014-06-01

    1. Data Acquisition, Trigger and Controls: Niko Neufeld (CERN) niko.neufeld@cern.ch; Tassos Belias (Demokritos) belias@inp.demokritos.gr; Andrew Norman (FNAL) anorman@fnal.gov; Vivian O'Dell (FNAL) odell@fnal.gov
    2. Event Processing, Simulation and Analysis: Rolf Seuster (TRIUMF) seuster@cern.ch; Florian Uhlig (GSI) f.uhlig@gsi.de; Lorenzo Moneta (CERN) Lorenzo.Moneta@cern.ch; Pete Elmer (Princeton) peter.elmer@cern.ch
    3. Distributed Processing and Data Handling: Nurcan Ozturk (U Texas Arlington) nurcan@uta.edu; Stefan Roiser (CERN) stefan.roiser@cern.ch; Robert Illingworth (FNAL); Davide Salomoni (INFN CNAF) Davide.Salomoni@cnaf.infn.it; Jeff Templon (Nikhef) templon@nikhef.nl
    4. Data Stores, Data Bases, and Storage Systems: David Lange (LLNL) lange6@llnl.gov; Wahid Bhimji (U Edinburgh) wbhimji@staffmail.ed.ac.uk; Dario Barberis (Genova) Dario.Barberis@cern.ch; Patrick Fuhrmann (DESY) patrick.fuhrmann@desy.de; Igor Mandrichenko (FNAL) ivm@fnal.gov; Mark van de Sanden (SURF SARA) sanden@sara.nl
    5. Software Engineering, Parallelism & Multi-Core: Solveig Albrand (LPSC/IN2P3) solveig.albrand@lpsc.in2p3.fr; Francesco Giacomini (INFN CNAF) francesco.giacomini@cnaf.infn.it; Liz Sexton (FNAL) sexton@fnal.gov; Benedikt Hegner (CERN) benedikt.hegner@cern.ch; Simon Patton (LBNL) SJPatton@lbl.gov; Jim Kowalkowski (FNAL) jbk@fnal.gov
    6. Facilities, Infrastructures, Networking and Collaborative Tools: Maria Girone (CERN) Maria.Girone@cern.ch; Ian Collier (STFC RAL) ian.collier@stfc.ac.uk; Burt Holzman (FNAL) burt@fnal.gov; Brian Bockelman (U Nebraska) bbockelm@cse.unl.edu; Alessandro de Salvo (Roma 1) Alessandro.DeSalvo@ROMA1.INFN.IT; Helge Meinhard (CERN) Helge.Meinhard@cern.ch; Ray Pasetes (FNAL) rayp@fnal.gov; Steven Goldfarb (U Michigan) Steven.Goldfarb@cern.ch

  20. Secondary organic aerosol production from pinanediol, a semi-volatile surrogate for first-generation oxidation products of monoterpenes

    NASA Astrophysics Data System (ADS)

    Ye, Penglin; Zhao, Yunliang; Chuang, Wayne K.; Robinson, Allen L.; Donahue, Neil M.

    2018-05-01

    We have investigated the production of secondary organic aerosol (SOA) from pinanediol (PD), a precursor chosen as a semi-volatile surrogate for first-generation oxidation products of monoterpenes. Observations at the CLOUD facility at CERN have shown that oxidation of organic compounds such as PD can be an important contributor to new-particle formation. Here we focus on SOA mass yields and chemical composition from PD photo-oxidation in the CMU smog chamber. To determine the SOA mass yields from this semi-volatile precursor, we had to address partitioning of both the PD and its oxidation products to the chamber walls. After correcting for these losses, we found OA-loading-dependent SOA mass yields from PD oxidation that ranged between 0.1 and 0.9 for SOA concentrations between 0.02 and 20 µg m-3; these mass yields are 2-3 times larger than is typical of the much more volatile monoterpenes. The average carbon oxidation state measured with an aerosol mass spectrometer was around -0.7. We modeled the chamber data using a dynamical two-dimensional volatility basis set and found that a significant fraction of the SOA comprises low-volatility organic compounds that could drive new-particle formation and growth, which is consistent with the CLOUD observations.
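Absorptive partitioning in a volatility basis set, which underlies the loading-dependent yields described above, can be sketched in one dimension: products in saturation-concentration bins C* condense with fraction (1 + C*/C_OA)^-1, solved self-consistently for the organic aerosol mass C_OA. The paper fits a dynamical 2-D VBS; the bins and masses here are hypothetical:

```python
def partition(total_product, cstar, seed=0.0, iters=100):
    """total_product: ug/m3 of condensable mass per C* bin; returns C_OA."""
    c_oa = seed + sum(total_product)          # initial guess: everything condenses
    for _ in range(iters):                    # fixed-point iteration
        c_oa = seed + sum(m / (1 + c / max(c_oa, 1e-12))
                          for m, c in zip(total_product, cstar))
    return c_oa

cstar = [0.1, 1.0, 10.0, 100.0]    # saturation concentrations, ug/m3 (hypothetical)
products = [0.5, 1.0, 2.0, 4.0]    # condensable mass in each bin (hypothetical)
c_oa = partition(products, cstar)
print(f"C_OA = {c_oa:.2f} ug/m3")  # the SOA mass yield is C_OA / precursor reacted
```

Because low-C* bins condense almost entirely while high-C* bins barely do, the apparent yield rises with OA loading, which is the loading dependence the abstract reports.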

  1. Cloud flexibility using DIRAC interware

    NASA Astrophysics Data System (ADS)

    Fernandez Albor, Víctor; Seco Miguelez, Marcos; Fernandez Pena, Tomas; Mendez Muñoz, Victor; Saborido Silva, Juan Jose; Graciani Diaz, Ricardo

    2014-06-01

    Communities at different locations run their computing jobs on dedicated infrastructures without needing to worry about software, hardware or even the site where their programs will be executed. Nevertheless, this usually means that they are restricted to certain types or versions of an operating system, either because their software needs a specific version of a system library or because the collaboration to which they belong requires a specific platform. In this scenario, if a data center wants to serve incompatible communities, it has to split its physical resources among them. This splitting inevitably leads to an underuse of resources, because the data center is bound to have periods in which one or more of its subclusters sit idle. It is in this situation that cloud computing provides the flexibility and reduction in computational cost that data centers are searching for. This paper describes a set of realistic tests that we ran on one such implementation. The tests comprise software from three different HEP communities (Auger, LHCb and QCD phenomenologists) and the PARSEC Benchmark Suite, running on one or more of three Linux flavors (SL5, Ubuntu 10.04 and Fedora 13). The implemented infrastructure has, at the cloud level, CloudStack managing the virtual machines (VMs) and the hosts on which they run, and, at the user level, the DIRAC framework along with a VM extension that submits, monitors and keeps track of the user jobs, and that requests CloudStack to start or stop the necessary VMs. In this infrastructure, the community software is distributed via CernVM-FS, which has proven to be a reliable and scalable software distribution system. With the resulting infrastructure, users can send their jobs transparently to the data center.
The main purpose of this system is the creation of a flexible, multiplatform cluster with a scalable method of software distribution for several VOs. Users from different communities need not care about the installation of the standard software available at the nodes, nor about the operating system of the host machine, which is transparent to the user.
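The elastic behaviour described above, where the framework asks the cloud manager to start or stop VMs as the job queue grows and drains, can be sketched as a toy provisioning rule. This is illustrative only: `JOBS_PER_VM`, `MAX_VMS` and the function name are assumptions, not DIRAC or CloudStack API calls.

```python
# Toy model of the elastic VM provisioning loop described in the abstract:
# start VMs when jobs queue up, stop surplus VMs when the queue drains.
# Purely illustrative; a real deployment goes through the DIRAC and
# CloudStack APIs, which are not modelled here.

JOBS_PER_VM = 4      # assumed packing factor (jobs one VM can serve)
MAX_VMS = 10         # assumed cap imposed by the data center

def vms_needed(queued_jobs: int, running_vms: int) -> int:
    """Return how many VMs to start (positive) or stop (negative)."""
    target = min(MAX_VMS, -(-queued_jobs // JOBS_PER_VM))  # ceil division
    return target - running_vms

# Example: 10 queued jobs and 1 running VM -> start 2 more VMs.
delta = vms_needed(10, 1)
```

The late-binding nature of the system means the rule only controls capacity; which job lands on which VM is decided when a VM pulls work.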

  2. Electron-cloud build-up in hadron machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furman, M.A.

    2004-08-09

    The first observations of the electron-proton coupling effect for coasting beams and for long-bunch beams were made at the earliest proton storage rings at the Budker Institute of Nuclear Physics (BINP) in the mid-60's [1]. The effect was mainly a form of the two-stream instability. This phenomenon reappeared at the CERN ISR in the early 70's, where it was accompanied by an intense vacuum pressure rise. When the ISR was operated in bunched-beam mode while testing aluminum vacuum chambers, a resonant effect was observed in which the electron traversal time across the chamber was comparable to the bunch spacing [2]. This effect ("beam-induced multipacting"), being resonant in nature, is a dramatic manifestation of an electron cloud sharing the vacuum chamber with a positively-charged beam. An electron-cloud-induced instability has been observed since the mid-80's at the PSR (LANL) [3]; in this case, there is a strong transverse instability accompanied by fast beam losses when the beam current exceeds a certain threshold. The effect was observed for the first time for a positron beam in the early 90's at the Photon Factory (PF) at KEK, where the most prominent manifestation was a coupled-bunch instability that was absent when the machine was operated with an electron beam under otherwise identical conditions [4]. Since then, with the advent of ever more intense positron and hadron beams, and the development and deployment of specialized electron detectors [5-9], the effect has been observed directly or indirectly, and sometimes studied systematically, at most lepton and hadron machines when operated with sufficiently intense beams. The effect is expected in various forms and to various degrees in accelerators under design or construction. The electron-cloud effect (ECE) has been the subject of various meetings [10-15].
Two excellent reviews, covering the phenomenology, measurements, simulations and historical development, have recently been given by Frank Zimmermann [16,17]. In this article we focus on the mechanisms of electron-cloud build-up and dissipation for hadronic beams, particularly those with very long, intense bunches.
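The resonance condition mentioned above, an electron traversal time across the chamber comparable to the bunch spacing, can be checked with a back-of-the-envelope calculation. The chamber diameter and electron energy below are illustrative values, not actual ISR parameters.

```python
import math

# Beam-induced multipacting is resonant when the time an electron takes
# to cross the vacuum chamber is comparable to the bunch spacing.
M_E = 9.109e-31          # electron mass [kg]
Q_E = 1.602e-19          # elementary charge [C]

def traversal_time(chamber_diameter_m: float, energy_ev: float) -> float:
    """Non-relativistic crossing time of an electron kicked to energy_ev."""
    v = math.sqrt(2.0 * energy_ev * Q_E / M_E)   # electron speed [m/s]
    return chamber_diameter_m / v                # crossing time [s]

# Illustrative numbers: a 10 cm chamber and a 200 eV electron give a
# crossing time of roughly 12 ns, so the resonance would occur for
# bunch spacings of that order.
t = traversal_time(0.10, 200.0)
```

Since the crossing time scales as 1/sqrt(energy), a more energetic beam kick shifts the resonance to shorter bunch spacings.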

  3. Volcanic plumes fast detection: a methodological proposal for an integrated approach

    NASA Astrophysics Data System (ADS)

    Bernabeo, R. Alberto; Tositti, Laura; Brattich, Erika

    2017-04-01

    The behaviour of erupting volcanoes ranges from the quiet, steady effusion of lava to highly explosive eruptions. Volcanic eruptions may therefore present a direct threat to the safety of aircraft in flight and cause major operational difficulties at aerodromes and in airspaces located downwind of the resulting volcanic ash cloud, in particular when eruptions are of high intensity and/or prolonged. Since volcanic ash clouds and gases are not displayed on either airborne or ATC radar and are extremely difficult to identify at night, pilots must rely on reports from air traffic controllers and from other pilots to determine the location of an ash cloud or gases. As a result, there is a clear need to develop additional tools enabling timely on-board sensing of volcanic plumes for safety purposes. Large-scale eruptions may eject many cubic kilometres of glass particles and pulverized rock (volcanic ash), as well as corrosive and hazardous gases, high into the atmosphere, potentially over a wide area and on timescales ranging from hours to weeks or even months. Volcanic ash consists mostly of sharp-edged, hard glass particles and pulverized rock. It is very abrasive and, being largely composed of siliceous materials, has a melting temperature below the operating temperature of modern turbine engines at cruise thrust. A volcanic plume in fact contains a complex mixture of water vapour, sulphur dioxide (producing sulphuric acid through gas-to-particle conversion reactions catalysed by iron in cloud droplets), chlorine and other halogens, and trace elements which are highly reactive and may interact with the mineral particles to produce corrosive effects hazardous to both airframes and human health.
Remotely piloted aircraft systems (RPAS), or unmanned aerial vehicles (UAVs), are gradually becoming efficient platforms, with dedicated miniaturized sensors that can be used in scientific and commercial remote sensing applications. They are of fundamental support to the planning, running and control of territory in which public safety is or may be at risk, and to all activities that require a continuous cyclical process of observation, evaluation and interpretation. At the same time, better knowledge of the chemical properties of volcanic emissions is a must for the expansion of air transportation foreseen in the coming years, for the health hazards that a volcanic ash cloud poses around the world, and for a better understanding of the reduction already observed in GPS/GNSS satellite signals whenever a volcanic cloud covers the sky (obscuring the signal used by the navigation systems of modern aircraft), with the associated safety risks. In this paper we propose a multitasking experimental approach based on the integrated use of remote sensing, aerosol sampling and chemical speciation, together with drones and tethered balloons equipped with aerosol sensors, aimed at providing the information which has so far been collected only partially. The study will also collect information about the 3D distribution of all the aerosol properties described above, with the aim of improving the vertical resolution of remote sensing data.

  4. SHiP: a new multipurpose beam-dump experiment at the SPS

    NASA Astrophysics Data System (ADS)

    Dijkstra, H. B.

    2016-11-01

    SHiP is an experiment to look for very weakly interacting particles at a new beam-dump facility to be constructed at the CERN SPS. The SHiP Technical Proposal was submitted to the CERN SPS Committee in April 2015. The 400 GeV/c proton beam extracted from the SPS will be dumped on a heavy target, with the aim of integrating 2 × 10²⁰ protons on target in five years. A detector located downstream of the target, based on a long vacuum tank followed by a spectrometer and particle identification detectors, will allow probing a variety of models with light long-lived exotic particles with masses below a few GeV/c². The main focus will be the physics of the so-called hidden portals, i.e. the search for dark photons, light scalars and pseudo-scalars, and heavy neutral leptons (HNLs). The sensitivity to HNLs will make it possible, for the first time, to probe, in the mass range between the kaon and the charm meson masses, a coupling range in which baryogenesis and the active neutrino masses could also be explained. Integrated in SHiP is an Emulsion Cloud Chamber, already used in the OPERA experiment, which will allow the study of active neutrino cross-sections and angular distributions. In particular, SHiP can distinguish between ντ and ν̄τ, and their deep inelastic scattering cross-sections will be measured with statistics three orders of magnitude larger than currently available.

  5. Project S'COOL

    NASA Technical Reports Server (NTRS)

    Green, Carolyn J.; Chambers, Lin H.

    1998-01-01

    The Students' Cloud Observations On-Line (S'COOL) project was piloted in 1997. It was created with the idea of having students serve as one component of the validation for the Clouds and the Earth's Radiant Energy System (CERES) instrument, which was launched with the Tropical Rainfall Measuring Mission (TRMM) in November 1997. As part of NASA's Earth Science Enterprise, CERES is interested in the role clouds play in regulating our climate. Over thirty schools became involved in the initial phase of the project. The CERES instrument detects the location of clouds and identifies their physical properties. S'COOL students coordinate their ground-truth observations with the exact overpass of the satellite at their location. Their findings regarding cloud type, height, fraction and opacity, as well as surface conditions, are then reported to the NASA Langley Distributed Active Archive Center (DAAC). The data are then accessible via the Internet both to the CERES team for validation and to schools for educational application. By March 1998, ninety-three schools in nine countries had enrolled in the S'COOL project. Joining the United States participants were schools in Australia, Canada, France, Germany, Norway, Spain, Sweden, and Switzerland. The project is gradually becoming the global project envisioned by its creators. While students obtain data useful to the scientists, it is hoped that, with guidance from their instructors, they will have the opportunity and motivation to learn more about clouds and atmospheric science as well.

  6. Mitigating Mosquito Disease Vectors with Citizen Science: a Review of the GLOBE Observer Mosquito Habitat Mapper Pilot and Implications for Wide-scale Implementation

    NASA Astrophysics Data System (ADS)

    Riebeek Kohl, H.; Low, R.; Boger, R. A.; Schwerin, T. G.; Janney, D. W.

    2017-12-01

    The spread of disease vectors, including mosquitoes, is an increasingly significant global environmental issue driven by a warming climate. In 2017, the GLOBE Observer Program launched a new citizen science initiative to map mosquito habitats using the free GLOBE Observer app for smartphones and tablets. The app guides people to identify mosquito larvae and breeding sites and then, once documented, to eliminate or treat the sites to prevent further breeding. It also gives citizen scientists the option of identifying the mosquito larval species to determine whether it belongs to one of three genera that can potentially transmit Zika, dengue fever, yellow fever, chikungunya, and other diseases. These data are uploaded to an international database that is freely available to the public and the science community. GLOBE Observer piloted the initiative with educators in the United States, Brazil, and Peru, and it is now open for global participation. This presentation will discuss lessons learned in the pilot phase as well as plans to implement the initiative worldwide in partnership with science museums and science centers. GLOBE Observer is the non-student citizen science arm of the Global Learning and Observations to Benefit the Environment (GLOBE) Program, a long-standing international science and education program that provides students and citizen scientists with the opportunity to participate in data collection and the scientific process, and to contribute meaningfully to our understanding of the Earth system and global environment. GLOBE Observer data collection also includes cloud cover and cloud type, with land cover/land use added in late 2017.

  7. CERN and high energy physics, the grand picture

    ScienceCinema

    Heuer, Rolf-Dieter

    2018-05-24

    The lecture will touch on several topics, to illustrate the role of CERN in the present and future of high-energy physics: how does CERN work? What is the role of the scientific community, of bodies like Council and SPC, and of international cooperation, in the definition of CERN's scientific programme? What are the plans for the future of the LHC and of the non-LHC physics programme? What is the role of R&D and technology transfer at CERN?

  8. Dissemination of CERN's Technology Transfer: Added Value from Regional Transfer Agents

    ERIC Educational Resources Information Center

    Hofer, Franz

    2005-01-01

    Technologies developed at CERN, the European Organization for Nuclear Research, are disseminated via a network of external technology transfer officers. Each of CERN's 20 member states has appointed at least one technology transfer officer to help establish links with CERN. This network has been in place since 2001 and early experiences indicate…

  9. Instrumentation for surveying the lower part of the atmosphere in extremes conditions

    NASA Astrophysics Data System (ADS)

    Gobinddass, Marie-Line; Molinie, Jack; Richard, Sandrine; Jean-Louis, Sabrina

    To observe atmospheric phenomena such as clouds, precipitation and wind, and to understand how they form and evolve, meteorologists use instruments that measure parameters such as temperature, pressure and humidity. In the specific case of the Kourou region, where the French Space Agency is located, the environment and safeguard group works on protecting biodiversity in and around the center. Among the scientific challenges in atmospheric science, one of the main topics of this work is understanding fluctuations of the atmosphere due to natural or industrial perturbations. We have carried out several experiments with many instruments over an area of more than 1200 square kilometres. To differentiate, and try to quantify, an industrial cloud from a natural cloud or from the natural atmosphere, the use of a drone has been tested. The ratio of the cost of such an experiment to the relevance of the results that can be obtained is discussed here. It is necessary to take into account the turbulence in the atmosphere due to industrial acid clouds or hot clouds. Finally, instead of taking the risk of airborne measurements with a pilot, we have turned to a tethered balloon, for its lower cost and for security reasons. The technical experiment and a few types of results are presented here.

  10. Cloud-based NEXRAD Data Processing and Analysis for Hydrologic Applications

    NASA Astrophysics Data System (ADS)

    Seo, B. C.; Demir, I.; Keem, M.; Goska, R.; Weber, J.; Krajewski, W. F.

    2016-12-01

    The real-time and full historical archive of NEXRAD Level II data, covering the entire United States from 1991 to the present, recently became available on Amazon's S3 cloud storage. This provides a new opportunity to rebuild the Hydro-NEXRAD software system, which enabled users to access vast amounts of NEXRAD radar data in support of a wide range of research. The system processes basic radar data (Level II) and delivers radar-rainfall products based on the user's custom selection of features such as space and time domain, river basin, rainfall product space and time resolution, and rainfall estimation algorithms. The new cloud-based system can eliminate prior challenges faced by Hydro-NEXRAD in data acquisition and processing: (1) temporal and spatial limitations arising from limited data storage; (2) archive (past) data ingestion and format conversion; and (3) separate data processing flows for past and real-time Level II data. To enhance massive data processing and computational efficiency, the new system is implemented and tested for the Iowa domain. This pilot study begins by ingesting rainfall metadata and implementing Hydro-NEXRAD capabilities on the cloud using the new polarimetric features, as well as the existing algorithm modules and scripts. The authors address the reliability and feasibility of cloud computation and processing, followed by an assessment of response times from an interactive web-based system.
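As an illustration of accessing the archive described above, NEXRAD Level II volume scans on Amazon S3 are organized by date and radar site. A minimal key builder might look like the sketch below; the object-name layout follows the public `noaa-nexrad-level2` bucket convention, but the `_V06` version suffix and exact naming vary across years, so treat this as an assumption rather than a guarantee.

```python
from datetime import datetime

# Build the S3 object key for a NEXRAD Level II volume scan in the
# public archive bucket. Keys are laid out as
# YYYY/MM/DD/SITE/SITEYYYYMMDD_HHMMSS_V06; the "_V06" suffix is the
# common case in recent years but is an assumption here.
BUCKET = "noaa-nexrad-level2"

def level2_key(site: str, scan_time: datetime, version: str = "V06") -> str:
    stamp = scan_time.strftime("%Y%m%d_%H%M%S")
    prefix = scan_time.strftime("%Y/%m/%d")
    return f"{prefix}/{site}/{site}{stamp}_{version}"

# Example: a scan from the Des Moines, Iowa radar (KDMX).
key = level2_key("KDMX", datetime(2016, 8, 5, 12, 34, 56))
# -> "2016/08/05/KDMX/KDMX20160805_123456_V06"
```

Because keys are time-ordered per site, listing a day's prefix is enough to enumerate all scans for that radar, which is what makes the unified past/real-time processing flow practical.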

  11. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    NASA Astrophysics Data System (ADS)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats

    2014-06-01

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade, several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS), with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on a cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers such as HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.
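The pilot model underlying GlideinWMS, in which pilots acquire resources first and pull user jobs afterwards so that work is bound to a slot only at the last moment, can be sketched as a toy late-binding loop. This is illustrative only: real glideins start HTCondor daemons that join the overlay pool rather than pulling from a Python list.

```python
from collections import deque

# Toy late-binding scheduler: pilots are submitted to resources without
# a specific user job attached; each pilot whose slot actually becomes
# usable pulls the next job from the central queue. Pilots that land on
# a broken resource simply pull nothing, so no user job is lost.
def run_pilots(jobs, pilot_slots_ok):
    """jobs: iterable of job names; pilot_slots_ok: one bool per
    submitted pilot (True = the slot actually became usable)."""
    queue = deque(jobs)
    executed = []
    for slot_ok in pilot_slots_ok:
        if slot_ok and queue:
            executed.append(queue.popleft())   # late binding happens here
    return executed, list(queue)

# 4 jobs, 5 pilots, two of which land on dead slots:
done, remaining = run_pilots(["j1", "j2", "j3", "j4"],
                             [True, False, True, False, True])
# -> done == ["j1", "j2", "j3"], remaining == ["j4"]
```

The same loop is indifferent to whether a pilot slot came from a grid site or a cloud VM, which is exactly what makes cloud bursting a natural extension of the architecture.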

  12. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade, several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS), with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on a cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers such as HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.

  13. A Comprehensive Two-moment Warm Microphysical Bulk Scheme :

    NASA Astrophysics Data System (ADS)

    Caro, D.; Wobrock, W.; Flossmann, A.; Chaumerliac, N.

    The microphysical properties of gases, aerosol particles and hydrometeors have implications at the local scale (precipitation, pollution peaks, ...), at the regional scale (flooding, acid rain, ...) and also at the global scale (radiative forcing, ...). A multi-scale study is therefore necessary to properly understand and forecast the meteorological phenomena involving clouds. However, it cannot be carried out with a detailed microphysical model, on account of computational limitations. Microphysical bulk schemes therefore have to estimate the "large-scale" properties of clouds arising from smaller-scale processes and characteristics, so the development of such a bulk scheme is rather important for furthering our knowledge of the Earth's climate and the forecasting of intense meteorological phenomena. Here, a quasi-spectral warm microphysical scheme has been developed to predict the concentrations and mixing ratios of aerosols, cloud droplets and raindrops. It considers, explicitly and analytically, the nucleation of droplets (Abdul-Razzak et al., 2000), condensation/evaporation (Chaumerliac et al., 1987), the breakup and collision-coalescence processes with the Long (1974) kernels and the Berry and Reinhardt (1974) autoconversion parameterization, as well as aerosol and gas scavenging. First, the parameterization was evaluated in the simplest dynamical framework of an air parcel model, against the results of the detailed scavenging model DESCAM (Flossmann et al., 1985). Then it was tested, in the dynamical framework of a kinematic model (Szumowski et al., 1998) dedicated to the HaRP campaign (Hawaiian Rainband Project, 1990), against observations and against the results of the two-dimensional detailed microphysical scheme DESCAM 2-D (Flossmann et al., 1988), implemented in the Clark model (Clark and Farley, 1984).

  14. Dynamic consideration of smog chamber experiments

    NASA Astrophysics Data System (ADS)

    Chuang, Wayne K.; Donahue, Neil M.

    2017-08-01

    Recent studies of the α-pinene + ozone reaction that address particle nucleation show relatively high molar yields of highly oxidized multifunctional organic molecules with very low saturation concentrations that can form and grow new particles on their own. However, numerous smog-chamber experiments addressing secondary organic aerosol (SOA) mass yields, interpreted via equilibrium partitioning theory, suggest that the vast majority of SOA from α-pinene is semivolatile. We explore this paradox by employing a dynamic volatility basis set (VBS) model that reproduces the new-particle growth rates observed in the CLOUD experiment at CERN and then modeling SOA mass yield experiments conducted at Carnegie Mellon University (CMU). We find that the base-case simulations do overpredict observed SOA mass but by much less than an equilibrium analysis would suggest; this is because delayed condensation of vapors suppresses the apparent mass yields early in the chamber experiments. We further find that a second VBS model featuring substantial oligomerization of semivolatile monomers can match the CLOUD growth rates with substantially lower SOA mass yields; this is because the lighter monomers have a higher velocity and thus a higher condensation rate for a given mass concentration. The oligomerization simulations are a closer match to the CMU experiments than the base-case simulations, though they overpredict the observations somewhat. However, we also find that if the chemical conditions in CLOUD and the CMU chamber were identical, substantial nucleation would have occurred in the CMU experiments when in fact none occurred. This suggests that the chemical mechanisms differed in the two experiments, perhaps because the high oxidation rates in the SOA formation experiments led to rapid termination of peroxy radical chemistry.
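The equilibrium-partitioning interpretation that the VBS analysis above contrasts with can be made concrete by a small fixed-point calculation. The saturation concentrations and total masses below are arbitrary illustrative values, not those of the α-pinene system.

```python
# Equilibrium absorptive partitioning in a volatility basis set (VBS):
# the particle-phase fraction of bin i is xi_i = 1 / (1 + Cstar_i / C_OA),
# and the organic aerosol mass C_OA = sum_i C_i * xi_i must be solved
# self-consistently. All values below are illustrative, not fitted.
CSTAR = [0.1, 1.0, 10.0, 100.0]   # saturation concentrations [ug/m3]
CTOT  = [1.0, 2.0, 4.0, 8.0]      # total (gas + particle) mass [ug/m3]

def equilibrium_coa(cstar, ctot, iterations=200):
    coa = 1.0                                  # initial guess [ug/m3]
    for _ in range(iterations):                # fixed-point iteration
        coa = sum(c / (1.0 + cs / coa) for cs, c in zip(cstar, ctot))
    return coa

coa = equilibrium_coa(CSTAR, CTOT)
# Self-consistency check: coa should equal the summed condensed mass.
residual = abs(coa - sum(c / (1.0 + cs / coa)
                         for cs, c in zip(CSTAR, CTOT)))
```

The dynamic VBS model in the paper replaces this instantaneous equilibrium with explicit condensation kinetics, which is what produces the delayed mass yields early in a chamber experiment.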

  15. Measurement-model comparison of stabilized Criegee intermediate and highly oxygenated molecule production in the CLOUD chamber

    NASA Astrophysics Data System (ADS)

    Sarnela, Nina; Jokinen, Tuija; Duplissy, Jonathan; Yan, Chao; Nieminen, Tuomo; Ehn, Mikael; Schobesberger, Siegfried; Heinritzi, Martin; Ehrhart, Sebastian; Lehtipalo, Katrianne; Tröstl, Jasmin; Simon, Mario; Kürten, Andreas; Leiminger, Markus; Lawler, Michael J.; Rissanen, Matti P.; Bianchi, Federico; Praplan, Arnaud P.; Hakala, Jani; Amorim, Antonio; Gonin, Marc; Hansel, Armin; Kirkby, Jasper; Dommen, Josef; Curtius, Joachim; Smith, James N.; Petäjä, Tuukka; Worsnop, Douglas R.; Kulmala, Markku; Donahue, Neil M.; Sipilä, Mikko

    2018-02-01

    Atmospheric oxidation is an important phenomenon which produces large quantities of low-volatility compounds such as sulfuric acid and oxidized organic compounds. Such species may be involved in the nucleation of particles and enhance their subsequent growth to reach the size of cloud condensation nuclei (CCN). In this study, we investigate α-pinene, the most abundant monoterpene globally, and its oxidation products formed through ozonolysis in the Cosmics Leaving OUtdoor Droplets (CLOUD) chamber at CERN (the European Organization for Nuclear Research). By scavenging hydroxyl radicals (OH) with hydrogen (H2), we were able to investigate the formation of highly oxygenated molecules (HOMs) driven purely by ozonolysis and to study the oxidation of sulfur dioxide (SO2) driven by stabilized Criegee intermediates (sCIs). We measured the concentrations of HOMs and sulfuric acid with a chemical ionization atmospheric-pressure interface time-of-flight (CI-APi-TOF) mass spectrometer and compared the measured concentrations with simulated concentrations calculated with a kinetic model. By fitting our model to the measured sulfuric acid concentrations, we found molar yields in the range of 3.5-6.5% for HOM formation and 22-32% for the formation of stabilized Criegee intermediates. The simulated time evolution of the ozonolysis products was in good agreement with the measured concentrations, except that in some of the experiments sulfuric acid formation was faster than simulated. In those experiments the simulated and measured concentrations agreed once the concentration reached a plateau, but the plateau was reached 20-50 min later in the simulations. The results shown here are consistent with recently published yields for HOM formation from different laboratory experiments. Together with the sCI yields, these results help us to understand atmospheric oxidation processes better and make the reaction parameters more comprehensive for broader use.
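The kind of kinetic model used above, with ozonolysis producing stabilized Criegee intermediates at a fixed yield which then oxidize SO2 to sulfuric acid, can be sketched as a simple forward-Euler integration. All rate constants and concentrations below are placeholder values for illustration, not the fitted CLOUD parameters.

```python
# Minimal kinetic sketch: alpha-pinene + O3 -> yield_sci * sCI,
# sCI + SO2 -> H2SO4, plus a first-order sCI sink (e.g. to water).
# All numbers are placeholders, not the fitted CLOUD values.
k_ozon     = 8.0e-17   # alpha-pinene + O3 rate constant [cm3 s-1] (assumed)
yield_sci  = 0.25      # sCI molar yield (illustrative, cf. 22-32% in the paper)
k_sci_so2  = 4.0e-11   # sCI + SO2 rate constant [cm3 s-1] (assumed)
k_sci_loss = 3.0       # first-order sCI sink [s-1] (assumed)

ap, so2 = 1.0e10, 5.0e10   # alpha-pinene, SO2 number densities [cm-3]
o3 = 1.0e12                # O3 held constant (assumed excess) [cm-3]
sci, h2so4 = 0.0, 0.0
dt, steps = 0.1, 36000     # 0.1 s steps, 1 h of simulated time

for _ in range(steps):
    prod = k_ozon * ap * o3                  # ozonolysis rate [cm-3 s-1]
    to_h2so4 = k_sci_so2 * sci * so2 * dt    # sCI consumed by SO2
    lost = k_sci_loss * sci * dt             # sCI lost to other sinks
    sci += yield_sci * prod * dt - to_h2so4 - lost
    h2so4 += to_h2so4                        # sulfuric acid production
    ap -= prod * dt                          # precursor decay
```

With the loss terms evaluated from the old sCI value each step, the integration conserves mass exactly: every H2SO4 molecule traces back to an sCI, so h2so4 can never exceed the total sCI ever produced.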

  16. Nowcasting Aircraft Icing Conditions in Moscow Region Using Geostationary Meteorological Satellite Data

    NASA Astrophysics Data System (ADS)

    Barabanova, Olga

    2013-04-01

    Nowadays the Main Aviation Meteorological Centre in Moscow (MAMC) provides forecasts of icing conditions at Moscow Region airports using information from the surface observation network, weather radars and atmospheric sounding. Unfortunately, satellite information is not used properly in aviation meteorological offices in the Moscow Region: weather forecasters deal with satellite images of cloudiness only. The lead forecasters of MAMC realise that it is necessary to employ numerical meteorological satellite data from different channels in aviation forecasting, and especially in nowcasting. An algorithm for nowcasting aircraft in-flight icing conditions has been developed using data from the geostationary meteorological satellites Meteosat-7 and Meteosat-9. The algorithm is based on brightness temperature differences, which help to discriminate clouds with supercooled large drops, where severe icing conditions are most likely. Owing to the lack of visible-channel data, the satellite icing detection methods are less accurate at night. The method is also limited by optically thick ice clouds, where it is not possible to determine the extent to which supercooled large drops exist within the underlying clouds. However, we determined that most of the optically thick cases are associated with convection or mid-latitude cyclones, and these will nearly always have a layer where supercooled large drops exist and pose an icing threat. This product is created hourly for the Moscow airspace and marks zones with moderate or severe icing hazards. The results were compared with output of the mesoscale numerical atmospheric model COSMO-RU. Verification of the algorithm's results against aircraft pilot reports shows that this algorithm is a good instrument for operational practice in aviation meteorological offices in the Moscow Region.
The satellite-based algorithms presented here can be used in real time to diagnose areas of icing for pilots to avoid.
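A brightness-temperature-difference test of the kind described above can be sketched as a simple per-pixel threshold. The channel choice and threshold values here are invented for illustration; the operational Meteosat thresholds are not given in the abstract.

```python
# Illustrative per-pixel icing flag from two infrared brightness
# temperatures. Channels and thresholds are placeholders, NOT the
# operational Meteosat-7/-9 values, which the abstract does not give.
def icing_suspect(bt_108_k: float, bt_120_k: float) -> bool:
    """Flag pixels whose 10.8 um temperature lies in the supercooled
    range and whose 10.8-12.0 um difference suggests water cloud tops."""
    btd = bt_108_k - bt_120_k
    supercooled = 233.0 < bt_108_k < 273.0   # roughly -40..0 degC tops
    waterlike = -0.5 < btd < 1.5             # small BTD: opaque water cloud
    return supercooled and waterlike

# Example scene: one cold water-topped pixel, one warm pixel.
flags = [icing_suspect(258.0, 257.2), icing_suspect(285.0, 283.0)]
# -> [True, False]
```

Applying such a test to every pixel of an hourly geostationary scan is what yields the zone-marking product described in the abstract.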

  17. The EPOS Vision for the Open Science Cloud

    NASA Astrophysics Data System (ADS)

    Jeffery, Keith; Harrison, Matt; Cocco, Massimo

    2016-04-01

    Cloud computing offers dynamic, elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time, so cloud computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of cloud computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications so that cloud middleware can optimize their deployment and execution. From CERN, the Helix Nebula group has proposed the architecture for the European Open Science Cloud. They are in discussion with other e-infrastructure groups such as EGI (grids), EUDAT (data curation), AARC (network authentication and authorisation), and also with the EIROforum group of "international treaty" RIs (Research Infrastructures) and the ESFRI (European Strategy Forum on Research Infrastructures) RIs, including EPOS. Many of these RIs are either e-RIs (electronic RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: the ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are handled within the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services - Distributed), since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is the CES (Computational Earth Science), effectively an ICS-d specializing in high-performance computation, analytics, simulation or visualization offered by a TCS for others to use.
Discussions are already underway between EPOS and EGI, EUDAT, AARC and Helix Nebula for those offerings to be considered as ICS-d by EPOS. Provision of access to ICS-d from ICS-C involves several aspects: (a) technical: it may be more or less difficult to connect and pass from the ICS-C to the ICS-d/CES the "package" (probably a virtual machine) of data and software; (b) security/privacy: including passing personal information, e.g. related to AAAI (Authentication, Authorization, Accounting Infrastructure); (c) financial and legal: such as payment and licence conditions. Appropriate interfaces from ICS-C to ICS-d are being designed to accommodate these aspects. The Open Science Cloud is timely because it provides a framework to discuss governance and sustainability for computational resource provision, as well as an effective interpretation of a federated approach to HPC (High-Performance Computing) and HTC (High-Throughput Computing). It will be a unique opportunity to share and adopt procurement policies providing access to computational resources for RIs. The current state of discussions and the expected roadmap for the EPOS-Open Science Cloud relationship are presented.

  18. The LHC timeline: a personal recollection (1980-2012)

    NASA Astrophysics Data System (ADS)

    Maiani, Luciano; Bonolis, Luisa

    2017-12-01

    The objective of this interview is to study the history of the Large Hadron Collider in the LEP tunnel at CERN, from first ideas to the discovery of the Brout-Englert-Higgs boson, seen from the point of view of a member of CERN scientific committees, of the CERN Council and a former Director General of CERN in the years of machine construction.

  19. Atrial Fibrillation Screening in Nonmetropolitan Areas Using a Telehealth Surveillance System With an Embedded Cloud-Computing Algorithm: Prospective Pilot Study

    PubMed Central

    Chen, Ying-Hsien; Hung, Chi-Sheng; Huang, Ching-Chang; Hung, Yu-Chien

    2017-01-01

    Background: Atrial fibrillation (AF) is a common form of arrhythmia that is associated with increased risk of stroke and mortality. Detecting AF before the first complication occurs is a recognized priority. No previous studies have examined the feasibility of undertaking AF screening using a telehealth surveillance system with an embedded cloud-computing algorithm; we address this issue in this study. Objective: The objective of this study was to evaluate the feasibility of AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm. Methods: We conducted a prospective AF screening study in a nonmetropolitan area using a single-lead electrocardiogram (ECG) recorder. All ECG measurements were reviewed on the telehealth surveillance system and interpreted by the cloud-computing algorithm and a cardiologist. The process of AF screening was evaluated with a satisfaction questionnaire. Results: Between March 11, 2016 and August 31, 2016, 967 ECGs were recorded from 922 residents in nonmetropolitan areas. A total of 22 (2.4%, 22/922) residents with AF were identified by the physician’s ECG interpretation, and only 0.2% (2/967) of ECGs contained significant artifacts. The novel cloud-computing algorithm for AF detection had a sensitivity of 95.5% (95% CI 77.2%-99.9%) and specificity of 97.7% (95% CI 96.5%-98.5%). The overall satisfaction score for the process of AF screening was 92.1%. Conclusions: AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm is feasible. PMID:28951384
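The accuracy figures above follow from a simple confusion-matrix calculation. The sketch below reproduces them; the individual counts (true/false positives and negatives) are inferred here from the reported rates and totals, not taken verbatim from the paper:

```python
# Sensitivity/specificity point estimates for the AF screening result above.
# Counts are inferred assumptions: 21 of the 22 physician-confirmed AF cases
# flagged, and 879 of the 900 non-AF residents correctly cleared.
def sensitivity(tp, fn):
    """True positive rate: fraction of real AF cases flagged by the algorithm."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True negative rate: fraction of non-AF residents correctly cleared."""
    return tn / (tn + fp)

tp, fn = 21, 1      # algorithm caught 21 of 22 AF cases
tn, fp = 879, 21    # 900 non-AF residents, 21 false alarms

print(f"sensitivity = {sensitivity(tp, fn):.1%}")  # 95.5%
print(f"specificity = {specificity(tn, fp):.1%}")  # 97.7%
```

With these counts the point estimates match the abstract; the quoted confidence intervals would additionally require a binomial interval method.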

  20. Design of an off-axis visual display based on a free-form projection screen to realize stereo vision

    NASA Astrophysics Data System (ADS)

    Zhao, Yuanming; Cui, Qingfeng; Piao, Mingxu; Zhao, Lidong

    2017-10-01

    A free-form projection screen is designed for an off-axis visual display, which shows great potential in applications such as flight training by providing both accommodation and convergence cues for pilots. A method based on a point cloud is proposed for the design of the free-form surface, and the design of the point cloud is controlled by a program written in the macro language. In the visual display based on the free-form projection screen, when the error of the screen along the Z-axis is 1 mm, the error of visual distance at each field is less than 1%, and the resolution of the design over the full field is better than 1′, which meets the resolution requirement of the human eye.

  1. The Transition from Diffuse to Dense Gas in Herschel Dust Emission Maps

    NASA Astrophysics Data System (ADS)

    Goldsmith, Paul

    Dense cores in dark clouds are the sites where young stars form. These regions manifest as relatively small (<0.1 pc) pockets of cold and dense gas. If we wish to understand the star formation process, we have to understand the physical conditions in dense cores. This has been a main aim of star formation research in the past decade. Today, we do indeed possess a good knowledge of the density and velocity structure of cores, as well as their chemical evolution and physical lifetime. However, we do not understand well how dense cores form out of the diffuse gas clouds surrounding them. It is crucial that we constrain the relationship between dense cores and their environment: if we only understand dense cores, we may be able to understand how individual stars form, but we would not know how the star-forming dense cores themselves come into existence. We therefore propose to obtain data sets that reveal both dense cores and the clouds containing them in the same map. Based on these maps, we will study how dense cores form out of their natal clouds. Since cores form stars, this knowledge is crucial for the development of a complete theoretical and observational understanding of the formation of stars and their planets, as envisioned in NASA's Strategic Science Plan. Fortunately, existing archival data allow us to derive exactly the sort of maps we need for our analysis. Here, we describe a program that exclusively builds on PACS and SPIRE dust emission imaging data from the NASA-supported Herschel mission. The degree-sized wide-field Herschel maps of the nearby (<260 pc) Polaris Flare and Aquila Rift clouds are ideal for our work. They permit us to resolve dense cores (<0.1 pc), while the maps also reveal large-scale cloud structure (5 pc and larger). We will generate column density maps from these dust emission maps and then run a tree-based hierarchical multi-scale structure analysis on them.
Only this procedure permits us to exploit the full potential of the maps: we will characterize cloud structure over a vast range of spatial scales. This work has many advantages over previous studies, where information about dense cores and their environment was pieced together using a variety of methods and instruments. Now, the Herschel maps permit for the first time the characterization of both molecular clouds and their cores in one shot, in a single data set. We use these data to answer a variety of simple yet very important questions. First, we study whether dense cores have sharp boundaries. If such boundaries exist, they would indicate that dense cores have an individual identity well separated from the near-fractal cloud structure on larger spatial scales. Second, we will, in a very approximate sense, derive global density gradients for molecular clouds from radii <0.1 pc to 5 pc and larger. These "synoptic" density gradients provide a useful quantitative description of the relation between cloud material at very different spatial scales. Also, these measurements can be compared to synoptic density gradients derived in the same fashion for theoretical cloud models. Third, we study how dense cores are nested into the "clumps" forming molecular clouds, i.e., we study whether the most massive dense cores in a cloud (<0.1 pc) reside in the most massive regions identified on larger spatial scales (1 pc and larger). This will show how the properties of dense cores are influenced by their environment. Our study will derive unique constraints on cloud structure, although our small sample precludes strong statements. This pilot study thus prepares future larger efforts. Our entire project builds on data reduction and analysis methods which our team has used in the past. This guarantees swift completion of the project with predictable efficiency. We present pilot studies that demonstrate that the data and analysis methods are suited to tackling the science goals. This project is thus guaranteed to return significant results.
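The tree-based multi-scale structure analysis described above can be illustrated with a minimal multi-threshold segmentation. The sketch below uses a synthetic column-density map and `scipy.ndimage.label`; it is an illustrative stand-in under stated assumptions, not the authors' actual pipeline:

```python
import numpy as np
from scipy import ndimage

# Hedged sketch of a dendrogram-style, multi-threshold structure analysis:
# count connected structures above increasing column-density thresholds.
# The synthetic map and thresholds here are illustrative assumptions.
y, x = np.mgrid[0:64, 0:64]
# Synthetic "column density" map: a broad cloud containing two dense cores.
cloud = 0.8 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 800.0)
cores = np.exp(-((x - 24) ** 2 + (y - 30) ** 2) / 8.0) + \
        np.exp(-((x - 42) ** 2 + (y - 36) ** 2) / 8.0)
column_density = cloud + 3.0 * cores

n_structures = []
for thresh in (0.5, 1.0, 2.0):            # low -> high column density
    mask = column_density > thresh
    _, n = ndimage.label(mask)            # connected structures above threshold
    n_structures.append(n)

# One extended cloud at the lowest threshold; the two embedded cores emerge
# as separate structures at higher thresholds, showing how cores nest in clouds.
print(n_structures)  # [1, 2, 2]
```

Repeating this over many thresholds and recording which high-threshold structures sit inside which low-threshold ones yields the hierarchical tree the abstract refers to.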

  2. DIRAC universal pilots

    NASA Astrophysics Data System (ADS)

    Stagni, F.; McNab, A.; Luzzi, C.; Krzemien, W.; Consortium, DIRAC

    2017-10-01

    In the last few years, new types of computing models, such as IaaS (Infrastructure as a Service) and IaaC (Infrastructure as a Client), have gained popularity. Some new resources come as part of pledged resources, while others are opportunistic. Most, but not all, of these new infrastructures are based on virtualization techniques, and some of them present multi-processor computing slots to the users. Virtual Organizations are therefore facing heterogeneity of the available resources, and the use of interware software like DIRAC to provide a transparent, uniform interface has become essential. Transparent access to the underlying resources is realized by implementing the pilot model. DIRAC’s newest generation of generic pilots (the so-called Pilots 2.0) are the “pilots for all the skies”, and were successfully released in production more than a year ago. They use a plugin mechanism that makes them easily adaptable. Pilots 2.0 have been used for fetching and running jobs on every type of resource, be it a Worker Node (WN) behind a CREAM/ARC/HTCondor/DIRAC Computing Element, a Virtual Machine running on IaaC infrastructures like Vac or BOINC, IaaS cloud resources managed by Vcycle, the LHCb High Level Trigger farm nodes, or any type of opportunistic computing resource. Make a machine a “Pilot Machine”, and all differences between resources disappear. This contribution describes how pilots are made suitable for different resources, and the recent steps taken towards a fully unified framework, including monitoring. The cases of multi-processor computing slots, on real or virtual machines, with the whole node or a partition of it, are also discussed.
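The plugin mechanism described above can be sketched as a command pattern, in which the pilot body stays generic and per-resource behaviour lives in pluggable commands selected by configuration. The class and command names below are illustrative assumptions, not DIRAC's actual API:

```python
# Minimal sketch of a plugin-based pilot, in the spirit of the generic
# "Pilots 2.0" described above. All names here are hypothetical.
class PilotCommand:
    """Base class: each plugin implements one step of the pilot's life cycle."""
    def execute(self, context):
        raise NotImplementedError

class CheckEnvironment(PilotCommand):
    def execute(self, context):
        context["environment_ok"] = True   # e.g. probe CPU, disk, software area

class FetchJob(PilotCommand):
    def execute(self, context):
        # A real pilot would contact the central task queue for a matching job.
        context["job"] = {"id": 42, "payload": "simulate"}

class RunPayload(PilotCommand):
    def execute(self, context):
        context["status"] = "done" if context.get("job") else "no-job"

def run_pilot(commands):
    """The same pilot body runs on any resource; only the command list,
    chosen per resource type, changes."""
    context = {}
    for command in commands:
        command.execute(context)
    return context

result = run_pilot([CheckEnvironment(), FetchJob(), RunPayload()])
print(result["status"])  # done
```

Swapping, adding, or reordering commands per resource type is what lets one pilot codebase serve grid worker nodes, virtual machines, and opportunistic slots alike.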

  3. The “Common Solutions” Strategy of the Experiment Support group at CERN for the LHC Experiments

    NASA Astrophysics Data System (ADS)

    Girone, M.; Andreeva, J.; Barreiro Megino, F. H.; Campana, S.; Cinquilli, M.; Di Girolamo, A.; Dimou, M.; Giordano, D.; Karavakis, E.; Kenyon, M. J.; Kokozkiewicz, L.; Lanciotti, E.; Litmaath, M.; Magini, N.; Negri, G.; Roiser, S.; Saiz, P.; Saiz Santos, M. D.; Schovancova, J.; Sciabà, A.; Spiga, D.; Trentadue, R.; Tuckett, D.; Valassi, A.; Van der Ster, D. C.; Shiers, J. D.

    2012-12-01

    After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments’ computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management monitoring, File Transfer monitoring) and the Site Status Board. This talk focuses primarily on the strategic aspects of providing such common solutions and how this relates to the overall goals of long-term sustainability and the relationship to the various WLCG Technical Evolution Groups. The success of the service components has given us confidence in the process, and has developed the trust of the stakeholders. We are now attempting to expand the development of common solutions into the more critical workflows. The first is a feasibility study of common analysis workflow execution elements between ATLAS and CMS. We look forward to additional common development in the future.

  4. QM2017: Status and Key open Questions in Ultra-Relativistic Heavy-Ion Physics

    NASA Astrophysics Data System (ADS)

    Schukraft, Jurgen

    2017-11-01

    Almost exactly three decades ago, in the fall of 1986, the era of experimental ultra-relativistic (E/m ≫ 1) heavy-ion physics started simultaneously at the SPS at CERN and the AGS at Brookhaven with first beams of light oxygen ions at fixed-target energies of 200 GeV/A and 14.6 GeV/A, respectively. The event was announced by CERN [CERN's subatomic particle accelerators: Set up world record in energy and break new ground for physics (CERN-PR-86-11-EN) (1986) 4 p, issued on 29 September 1986. URL: http://cds.cern.ch/record/855571]

  5. Update on CERN Search based on SharePoint 2013

    NASA Astrophysics Data System (ADS)

    Alvarez, E.; Fernandez, S.; Lossent, A.; Posada, I.; Silva, B.; Wagner, A.

    2017-10-01

    CERN’s enterprise search solution, “CERN Search”, provides a central search solution for users and CERN service providers. A total of about 20 million public and protected documents from a wide range of document collections are indexed, including Indico, TWiki, Drupal, SharePoint, JACOW, E-group archives, EDMS, and CERN web pages. In spring 2015, CERN Search was migrated to a new infrastructure based on SharePoint 2013. In the context of this upgrade, the document pre-processing and indexing process was redesigned and generalised. The new data-feeding framework makes it possible to profit from new functionality and facilitates the long-term maintenance of the system.

  6. Evolution of the ATLAS PanDA Production and Distributed Analysis System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maeno, T.; De, K.; Wenaus, T.

    2012-12-13

    The PanDA (Production and Distributed Analysis) system has been developed to meet ATLAS production and analysis requirements for a data-driven workload management system capable of operating at LHC data processing scale. PanDA has performed well with high reliability and robustness during the two years of LHC data-taking, while being actively evolved to meet the rapidly changing requirements of analysis use cases. We will present an overview of system evolution, including automatic rebrokerage and reattempts for analysis jobs, adaptation to the CernVM File System, support for the multi-cloud model through which Tier-2 sites act as members of multiple clouds, pledged resource management and preferential brokerage, and monitoring improvements. We will also describe results from the analysis of two years of PanDA usage statistics, current issues, and plans for the future.

  7. Refinement of the Pion PDF implementing Drell-Yan and Deep Inelastic Scattering Experimental Data

    NASA Astrophysics Data System (ADS)

    Barry, Patrick; Sato, Nobuo; Melnitchouk, Wally; Ji, Chueng-Ryong

    2017-09-01

    We realize that an abundance of "sea" quarks and gluons (as opposed to three valence quarks) is crucial to understanding the mass and internal structure of the proton. An effective pion cloud exists around the core valence structure. In the Drell-Yan (DY) process, two hadrons collide, one donating a quark and the other an antiquark. The quark-antiquark pair annihilates, forming a virtual photon, which creates a lepton-antilepton pair. By measuring the cross-sections, we obtain information about the parton distribution functions (PDFs) of the hadrons. The PDF gives the probability of finding a parton at a momentum fraction x of the hadron, between 0 and 1. Complementary to the DY process is deep inelastic scattering (DIS). Here, a target nucleon is probed by a lepton, and we investigate the pion cloud of the nucleon. The H1 and ZEUS experiments at HERA at DESY collect DIS data by detecting a leading neutron (LN). By using nested sampling to generate sets of parameters, we present some preliminary fits of pion PDFs to DY (Fermilab-E615 and CERN-NA10) and LN (H1 and ZEUS) datasets. We aim to perform a full NLO QCD global analysis to determine pion PDFs accurately for all x. Until now, there have been no attempts to fit the pion PDF using both low- and high-x data.

  8. In silico vs. Over the Clouds: On-the-Fly Mental State Estimation of Aircraft Pilots, Using a Functional Near Infrared Spectroscopy Based Passive-BCI

    PubMed Central

    Gateau, Thibault; Ayaz, Hasan; Dehais, Frédéric

    2018-01-01

    There is growing interest in implementing tools to monitor cognitive performance in naturalistic work and everyday life settings. The emerging field of research, known as neuroergonomics, promotes the use of wearable and portable brain monitoring sensors such as functional near infrared spectroscopy (fNIRS) to investigate cortical activity in a variety of human tasks out of the laboratory. The objective of this study was to implement an on-line passive fNIRS-based brain computer interface to discriminate two levels of working memory load during highly ecological aircraft piloting tasks. Twenty-eight recruited pilots were equally split into two groups (flight simulator vs. real aircraft). In both cases, identical approaches and experimental stimuli were used (a serial memorization task, consisting of repeating series of pre-recorded air traffic control instructions, easy vs. hard). The results show pilots in the real flight condition committed more errors and had higher anterior prefrontal cortex activation than pilots in the simulator when completing cognitively demanding tasks. Nevertheless, evaluation of single-trial working memory load classification showed high accuracy (>76%) across both experimental conditions. The contributions here are two-fold. First, we demonstrate the feasibility of passively monitoring cognitive load in a realistic and complex situation (live piloting of an aircraft). In addition, the differences in performance and brain activity between the two experimental conditions underscore the need for ecologically-valid investigations. PMID:29867411

  9. In silico vs. Over the Clouds: On-the-Fly Mental State Estimation of Aircraft Pilots, Using a Functional Near Infrared Spectroscopy Based Passive-BCI.

    PubMed

    Gateau, Thibault; Ayaz, Hasan; Dehais, Frédéric

    2018-01-01

    There is growing interest in implementing tools to monitor cognitive performance in naturalistic work and everyday life settings. The emerging field of research, known as neuroergonomics, promotes the use of wearable and portable brain monitoring sensors such as functional near infrared spectroscopy (fNIRS) to investigate cortical activity in a variety of human tasks out of the laboratory. The objective of this study was to implement an on-line passive fNIRS-based brain computer interface to discriminate two levels of working memory load during highly ecological aircraft piloting tasks. Twenty-eight recruited pilots were equally split into two groups (flight simulator vs. real aircraft). In both cases, identical approaches and experimental stimuli were used (a serial memorization task, consisting of repeating series of pre-recorded air traffic control instructions, easy vs. hard). The results show pilots in the real flight condition committed more errors and had higher anterior prefrontal cortex activation than pilots in the simulator when completing cognitively demanding tasks. Nevertheless, evaluation of single-trial working memory load classification showed high accuracy (>76%) across both experimental conditions. The contributions here are two-fold. First, we demonstrate the feasibility of passively monitoring cognitive load in a realistic and complex situation (live piloting of an aircraft). In addition, the differences in performance and brain activity between the two experimental conditions underscore the need for ecologically-valid investigations.

  10. Remote Sensing of Supercooled Cloud Layers in Cold Climate Using Ground Based Integrated Sensors System and Comparison with Pilot Reports and model forecasts

    NASA Astrophysics Data System (ADS)

    Boudala, Faisal; Wu, Di; Gultepe, Ismail; Anderson, Martha; Turcotte, Marie-France

    2017-04-01

    In-flight aircraft icing is one of the major weather hazards to aviation. It occurs when an aircraft passes through a cloud layer containing supercooled drops (SD). SD in contact with the airframe freeze on the surface, which degrades the performance of the aircraft. Prediction of in-flight icing requires accurate prediction of SD sizes, liquid water content (LWC), and temperature. Current numerical weather prediction (NWP) models are not capable of accurately predicting SD sizes and the associated LWC. The aircraft icing environment is normally studied by flying research aircraft, which is quite expensive. Thus, developing a ground-based remote sensing system for detecting supercooled liquid clouds and characterizing their impact on the severity of aircraft icing is one of the important tasks for improving and validating NWP-based predictions. In this respect, Environment and Climate Change Canada (ECCC), in cooperation with the Department of National Defence (DND), installed a number of specialized ground-based remote sensing platforms and present-weather sensors at Cold Lake, Alberta, including a multi-channel microwave radiometer (MWR), a K-band Micro Rain Radar (MRR), a ceilometer, a Parsivel disdrometer and a Vaisala PWD22 present-weather sensor. In this study, a number of pilot reports confirming icing events and freezing precipitation that occurred at Cold Lake during the 2014-2016 winter periods, and the associated observation data for the same period, are examined. The icing events are also examined using aircraft icing intensity estimated with an ice accumulation model, based on a cylindrical approximation of the airfoil, together with the LWC, median volume diameter and temperature predicted by the Canadian High Resolution Regional Deterministic Prediction System (HRDPS). The results related to vertical atmospheric profiling conditions, surface observations, and the HRDPS model predictions are given. Preliminary results suggest that observations of cloud SD regions from remote sensing and present-weather sensors can be used to describe the micro- and macrophysical characteristics of icing conditions. The model-based icing intensity predictions agreed reasonably with the PIREPs and MWR observations.

  11. Mitigation of volcanic hazards to aviation: The need for real-time integration of multiple data sources (Invited)

    NASA Astrophysics Data System (ADS)

    Schneider, D. J.

    2009-12-01

    The successful mitigation of volcanic hazards to aviation requires rapid interpretation and coordination of data from multiple sources, and communication of information products to a variety of end users. This community of information providers and information users includes volcano observatories, volcanic ash advisory centers, meteorological watch offices, air traffic control centers, airline dispatch and military flight operations centers, and pilots. Each of these entities has capabilities and needs that are unique to their situations and that evolve over a range of time spans. Prior to an eruption, information about probable eruption scenarios is needed to allow for contingency planning. Once a hazardous eruption begins, the immediate questions are: where, when, how high, and how long will the eruption last? Following the initial detection of an eruption, the need for information shifts to forecasting the movement of the volcanic cloud, determining whether ground operations will be affected by ash fall, and estimating how long the drifting volcanic cloud will remain hazardous. A variety of tools have been developed and/or improved over the past several years that provide additional data sources about volcanic hazards pertinent to the aviation sector. These include seismic and pressure sensors, ground-based radar and lidar, web cameras, ash dispersion models, and more sensitive satellite sensors that are capable of better detecting volcanic ash, gases and aerosols. Along with these improved capabilities come increased challenges in rapidly assimilating the available data sources, which come from a variety of data providers. In this presentation, examples from the recent large eruptions of Okmok, Kasatochi, and Sarychev Peak volcanoes will be used to demonstrate the challenges faced by hazard response agencies.
These eruptions produced volcanic clouds that were dispersed over large regions of the Northern Hemisphere and were observed by pilots and detected by various satellite sensors for several weeks. The disruption to aviation caused by these eruptions further emphasizes the need to improve the real-time characterization of volcanic clouds (altitude, composition, particle size, and concentration) and to better understand the impacts of volcanic ash, gases and aerosols on aircraft, flight crews, and passengers.

  12. Videosonde observations of tropical precipitating clouds developed over the Sumatera Island, Indonesia

    NASA Astrophysics Data System (ADS)

    Suzuki, Kenji; Nakagawa, Katsuhiro; Kawano, Tetsuya; Mori, Shuichi; Katsumata, Masaki; Yoneyama, Kunio

    2017-04-01

    During November-December 2015, as a pilot study of the Years of the Maritime Continent (YMC), a campaign observation over the southwestern coastal land and adjacent sea of Sumatera Island, Indonesia, was carried out to examine land-ocean coupling processes in the mechanisms of coastal heavy rain. Our videosonde observations were conducted as part of this campaign for a better understanding of the microphysical features of tropical precipitating clouds developed over Sumatera Island. The videosonde is a powerful tool for measuring hydrometeors in clouds directly. It is a balloon-borne radiosonde that acquires images of precipitation particles via a CCD camera. The system has stroboscopic illumination that provides information on particle size and shape. One advantage of the videosonde is that it captures images of precipitation particles as they are in the air, since it obtains particle images without contact. Recorded precipitation particles are classified as raindrops, frozen drops (hail), graupel, ice crystals, or snowflakes on the basis of transparency and shape. Videosondes were launched from the BMKG Bengkulu weather station (3.86°S, 102.3°E). After each videosonde launch, Range Height Indicator (RHI) scans by a C-band dual-polarimetric radar installed on R/V Mirai, approximately 50 km off Sumatera Island, were performed continuously, targeting the videosonde in the precipitating cloud. Eighteen videosondes were launched into various types of tropical precipitating clouds during the Pre-YMC campaign.

  13. Foliage penetration by using 4-D point cloud data

    NASA Astrophysics Data System (ADS)

    Méndez Rodríguez, Javier; Sánchez-Reyes, Pedro J.; Cruz-Rivera, Sol M.

    2012-06-01

    Real-time awareness and rapid target detection are critical for the success of military missions. New technologies capable of detecting targets concealed in forest areas are needed in order to track and identify possible threats. Currently, LAser Detection And Ranging (LADAR) systems are capable of detecting obscured targets; however, tracking capabilities are severely limited. Now, a new LADAR-derived technology is under development to generate 4-D datasets (3-D video in point cloud format). As such, there is a new need for algorithms able to process such data in real time. We propose an algorithm capable of removing vegetation and other objects that may obfuscate concealed targets in a real 3-D environment. The algorithm is based on wavelets and can be used as a pre-processing step in a target recognition algorithm. Applications of the algorithm in a real-time 3-D system could help make pilots aware of high-risk hidden targets such as tanks and weapons, among others. We will use simulated 4-D point cloud data to demonstrate the capabilities of our algorithm.
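As a rough illustration of stripping vegetation from a point cloud, the sketch below uses a simple grid-minimum height filter: in each ground-plane cell, only points near the lowest return are kept. This is a common stand-in heuristic under stated assumptions, not the wavelet-based algorithm the abstract proposes:

```python
import numpy as np

# Simplified vegetation removal: keep, in each (cell x cell) ground cell,
# only the points within `tol` of the lowest return in that cell.
# This grid-minimum heuristic is an illustrative stand-in, not the paper's
# wavelet-based method, which also operates on 4-D (3-D video) data.
def strip_canopy(points, cell=1.0, tol=0.5):
    """points: (N, 3) array of x, y, z coordinates. Returns filtered points."""
    ij = np.floor(points[:, :2] / cell).astype(int)   # ground-cell index per point
    keep = np.zeros(len(points), dtype=bool)
    for key in {tuple(k) for k in ij}:
        in_cell = np.all(ij == key, axis=1)
        z_min = points[in_cell, 2].min()              # lowest return in the cell
        keep |= in_cell & (points[:, 2] <= z_min + tol)
    return points[keep]

# Toy scene: flat ground near z = 0 with a canopy layer near z = 8 above it.
rng = np.random.default_rng(1)
ground = np.column_stack([rng.uniform(0, 10, 200), rng.uniform(0, 10, 200),
                          rng.normal(0.0, 0.05, 200)])
canopy = np.column_stack([rng.uniform(0, 10, 200), rng.uniform(0, 10, 200),
                          rng.normal(8.0, 0.5, 200)])
cloud = np.vstack([ground, canopy])
bare = strip_canopy(cloud)
print(len(cloud), "->", len(bare))  # canopy returns above ground cells removed
```

Canopy points survive only in cells containing no ground return, which is exactly the gap a more sophisticated multi-scale (e.g. wavelet) approach aims to close.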

  14. In-Situ Microphysics from the RACORO IOP

    DOE Data Explorer

    McFarquhar, Greg

    2013-11-08

    These files were generated by Greg McFarquhar and Robert Jackson at the University of Illinois. Please contact mcfarq@atmos.uiuc.edu or rjackso2@atmos.uiuc.edu for more information or for assistance in interpreting the content of these files. We highly recommend that anyone wishing to use these files do so in a collaborative endeavor and we welcome queries and opportunities for collaboration. There are caveats associated with the use of the data which are difficult to thoroughly document and not all products for all time periods have been thoroughly examined. This is a value added data set of the best estimate of cloud microphysical parameters derived using data collected by the cloud microphysical probes installed on the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS) Twin Otter during RACORO. These files contain best estimates of liquid size distributions N(D) in terms of droplet diameter D, liquid water content LWC, extinction of liquid drops beta, effective radius of cloud drops (re), total number concentration of droplets NT, and radar reflectivity factor Z at 1 second resolution.

  15. Big Bang Day: The Making of CERN (Episode 1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2009-10-06

    A two-part history of the CERN project. Quentin Cooper explores the fifty-year history of CERN, the European particle physics laboratory in Switzerland. The institution was created to bring scientists together after WW2 .......

  16. Big Bang Day: The Making of CERN (Episode 1)

    ScienceCinema

    None

    2017-12-09

    A two-part history of the CERN project. Quentin Cooper explores the fifty-year history of CERN, the European particle physics laboratory in Switzerland. The institution was created to bring scientists together after WW2 .......

  17. Efficient transfer of weather information to the pilot in flight

    NASA Technical Reports Server (NTRS)

    Mcfarland, R. H.

    1982-01-01

    Efficient methods for providing weather information to the pilot in flight are summarized. Use of discrete communications channels in the aeronautical VHF band or subcarriers in the VOR navigation band are considered the best possibilities. Data rates can be provided such that inputs to the ground-based transmitters from 2400 baud telephone lines are easily accommodated together with additional data. The crucial weather data considered for uplinking are identified as radar reflectivity patterns relating to precipitation, spherics data, hourly sequences, nowcasts, forecasts, cloud top heights with freezing and icing conditions, the critical weather map and satellite maps. NEXRAD, the ground-based Doppler weather radar which will produce an improved weather product, also encourages use of an uplink to fully utilize its capability to improve air safety.

  18. CERN welcomes new members

    NASA Astrophysics Data System (ADS)

    2017-08-01

    Lithuania is on course to become an associate member of CERN, pending final approval by the Lithuanian parliament. Associate membership will allow representatives of the Baltic nation to take part in meetings of the CERN Council, which oversees the Geneva-based physics lab.

  19. Atrial Fibrillation Screening in Nonmetropolitan Areas Using a Telehealth Surveillance System With an Embedded Cloud-Computing Algorithm: Prospective Pilot Study.

    PubMed

    Chen, Ying-Hsien; Hung, Chi-Sheng; Huang, Ching-Chang; Hung, Yu-Chien; Hwang, Juey-Jen; Ho, Yi-Lwun

    2017-09-26

    Atrial fibrillation (AF) is a common form of arrhythmia that is associated with increased risk of stroke and mortality. Detecting AF before the first complication occurs is a recognized priority. No previous studies have examined the feasibility of undertaking AF screening using a telehealth surveillance system with an embedded cloud-computing algorithm; we address this issue in this study. The objective of this study was to evaluate the feasibility of AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm. We conducted a prospective AF screening study in a nonmetropolitan area using a single-lead electrocardiogram (ECG) recorder. All ECG measurements were reviewed on the telehealth surveillance system and interpreted by the cloud-computing algorithm and a cardiologist. The process of AF screening was evaluated with a satisfaction questionnaire. Between March 11, 2016 and August 31, 2016, 967 ECGs were recorded from 922 residents in nonmetropolitan areas. A total of 22 (2.4%, 22/922) residents with AF were identified by the physician's ECG interpretation, and only 0.2% (2/967) of ECGs contained significant artifacts. The novel cloud-computing algorithm for AF detection had a sensitivity of 95.5% (95% CI 77.2%-99.9%) and specificity of 97.7% (95% CI 96.5%-98.5%). The overall satisfaction score for the process of AF screening was 92.1%. AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm is feasible. ©Ying-Hsien Chen, Chi-Sheng Hung, Ching-Chang Huang, Yu-Chien Hung, Juey-Jen Hwang, Yi-Lwun Ho. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 26.09.2017.

  20. STS-64 launch view

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Passing through some of the trailer clouds of an overcast sky which temporarily postponed its launch, the Space Shuttle Discovery heads for its 19th Earth orbital flight. Several kilometers away, astronaut John H. Casper, Jr., who took this picture, was piloting the Shuttle Training Aircraft (STA) from which the launch and landing area weather was being monitored. Onboard Discovery were astronauts Richard N. Richards, L. Blaine Hammond, Jr., Mark C. Lee, Carl J. Meade, Susan J. Helms, and Jerry M. Linenger.

  1. STS-75 Flight Day 10

    NASA Technical Reports Server (NTRS)

    1996-01-01

On this tenth day of the STS-75 mission, the flight crew, Cmdr. Andrew Allen, Pilot Scott Horowitz, Payload Cmdr. Franklin Chang-Diaz, Payload Specialist Umberto Guidoni (Italy), and Mission Specialists Jeffrey Hoffman, Maurizio Cheli (ESA), and Claude Nicollier (ESA), are shown performing middeck and Microgravity lab experiments, including the Material pour l'Etude des Phenomenes Interessant la Solidification sur Terre et en Orbite (MEPHISTO) experiment, as well as some material burn tests. Earth views include cloud cover and horizon shots.

  2. Use of Weather Information by General Aviation Pilots. Part 2. Qualitative: Exploring Factors Involved in Weather-Related Decision Making

    DTIC Science & Technology

    2008-03-01

specific instances within some given text, speech, or behavior (Miles & Huberman, 1994). Multiple instances of a single factor then constitute a “theme...Aerospace Medicine. Miles, M.B., and Huberman, A.M. (1994). Qualitative data analysis. Thousand Oaks, CA: Sage. Milgram, S. (2004). Obedience to authority...personal minimum for GA VFR visibility ________ statute miles 12. Your normal personal minimum for GA VFR cloud ceiling ________ feet AGL For

  3. Nightfall and the Cloud: Examining the Future of Unmanned Combat Aerial Vehicles and Remotely Piloted Aircraft

    DTIC Science & Technology

    2015-10-01

likely outcomes and make decisions; however, that is a fundamentally different dynamic than a true learning process. Preprogrammed assumptions and design ...ISR, targeting, forward air control, laser designation, weapons delivery, battle damage assessment ISR, targeting acquisition, and attack...KTAS 400 KTAS Weapons Payload N/A N/A 2 Hellfire missiles 4 Hellfire missiles 14 Hellfire or 4 Hellfire and 2x GBU-12 or 2 Joint Direct Attack

  4. EFQPSK Versus CERN: A Comparative Study

    NASA Technical Reports Server (NTRS)

    Borah, Deva K.; Horan, Stephen

    2001-01-01

This report presents a comparative study on Enhanced Feher's Quadrature Phase Shift Keying (EFQPSK) and Constrained Envelope Root Nyquist (CERN) techniques. These two techniques have been developed in recent times to provide high spectral and power efficiencies in nonlinear amplifier environments. The purpose of this study is to gain insights into these techniques and to help system planners and designers with an appropriate set of guidelines for using them. The comparative study presented in this report relies on effective simulation models and procedures. Therefore, a significant part of this report is devoted to understanding the mathematical and simulation models of the techniques and their set-up procedures. In particular, mathematical models of EFQPSK and CERN, effects of the sampling rate in discrete-time signal representation, and modeling of nonlinear amplifiers and predistorters have been considered in detail. The results of this study show that both EFQPSK and CERN signals provide spectrally efficient communications compared to filtered conventional linear modulation techniques when a nonlinear power amplifier is used. However, there are important differences. The spectral efficiency of CERN signals, with a small amount of input backoff, is significantly better than that of EFQPSK signals if the nonlinear amplifier is an ideal clipper. However, to achieve such spectral efficiencies with a practical nonlinear amplifier, CERN processing requires a predistorter which effectively translates the amplifier's characteristics close to those of an ideal clipper. Thus, the spectral performance of CERN signals strongly depends on the predistorter. EFQPSK signals, on the other hand, do not need such predistorters since their spectra are almost unaffected by the nonlinear amplifier. This report discusses several receiver structures for EFQPSK signals.
It is observed that optimal receiver structures can be realized for both coded and uncoded EFQPSK signals with only a modest increase in computational complexity. When a nonlinear amplifier is used, the bit error rate (BER) performance of the CERN signals with a matched filter receiver is found to be more than one decibel (dB) worse than that of EFQPSK signals. Although channel coding is found to provide BER performance improvement for both EFQPSK and CERN signals, the performance of EFQPSK signals remains better than that of CERN. Optimal receiver structures for CERN signals with nonlinear equalization are left as possible future work. Based on the numerical results, it is concluded that, in nonlinear channels, CERN processing leads toward better bandwidth efficiency with a compromise in power efficiency. Hence, for bandwidth-efficient communication needs, CERN is a good solution provided effective adaptive predistorters can be realized. On the other hand, EFQPSK signals provide a good power-efficient solution with a compromise in bandwidth efficiency.
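The report's spectral comparisons hinge on the behavior of an ideal clipper, i.e., a nonlinearity that hard-limits the magnitude of the complex baseband envelope while preserving each sample's phase. A minimal sketch of such a model (Python; the function name and saturation level are illustrative, not from the report):

```python
def ideal_clipper(samples, a_sat=1.0):
    """Hard-limit the magnitude of complex baseband samples at a_sat,
    preserving the phase of each sample (an ideal envelope clipper)."""
    out = []
    for s in samples:
        mag = abs(s)
        if mag > a_sat:
            s = s * (a_sat / mag)   # rescale onto the saturation radius
        out.append(s)
    return out

# A sample inside the saturation radius passes unchanged; one outside
# is scaled onto the radius with its phase intact (3+4j -> ~0.6+0.8j).
print(ideal_clipper([0.5 + 0j, 3 + 4j]))
```

The amount by which the input envelope exceeds `a_sat` corresponds to the input backoff discussed above: less backoff means more samples are clipped and more spectral regrowth for signals whose envelope is not constant.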

  5. Sharing scientific discovery globally: toward a CERN virtual visit service

    NASA Astrophysics Data System (ADS)

    Goldfarb, S.; Hatzifotiadou, D.; Lapka, M.; Papanestis, A.

    2017-10-01

    The installation of virtual visit services by the LHC collaborations began shortly after the first high-energy collisions were provided by the CERN accelerator in 2010. The experiments: ATLAS [1], CMS [2], LHCb [3], and ALICE [4] have all joined in this popular and effective method to bring the excitement of scientific exploration and discovery into classrooms and other public venues around the world. Their programmes, which use a combination of video conference, webcast, and video recording to communicate with remote audiences have already reached tens of thousands of viewers, and the demand only continues to grow. Other venues, such as the CERN Control Centre, are also considering similar permanent installations. We present a summary of the development of the various systems in use around CERN today, including the technology deployed and a variety of use cases. We then lay down the arguments for the creation of a CERN-wide service that would support these programmes in a more coherent and effective manner. Potential services include a central booking system and operational management similar to what is currently provided for the common CERN video conference facilities. Certain choices in technology could be made to support programmes based on popular tools including (but not limited to) Skype™ [5], Google Hangouts [6], Facebook Live [7], and Periscope [8]. Successful implementation of the project, which relies on close partnership between the experiments, CERN IT CDA [9], and CERN IR ECO [10], has the potential to reach an even larger, global audience, more effectively than ever before.

  6. Near-field monitoring of the Eyjafjallajökull eruption cloud

    NASA Astrophysics Data System (ADS)

    Bjornsson, H.; Pedersen, G. N.; Arason, P.; Karlsdottir, S.; Vogfjord, K. S.; Thorsteinsson, H.; Palmason, B.; Sigurdsson, A.

    2010-12-01

When the ice-capped Eyjafjallajökull volcano erupted in April 2010, the Icelandic Meteorological Office (IMO) employed a range of observation systems to monitor the eruption cloud and the progress of the eruption. The main tool for monitoring the volcanic cloud was a C-band weather radar located at Keflavik international airport, about 150 km from the volcano. Radar monitoring was supported by visual observations, on-site and from a network of web cameras. Airborne observations allowed for detailed examination of the plume, and pilot reports proved to be an extremely useful aid in verifying the radar data. Furthermore, data from lightning sensors and radiosondes were used to supplement information on plume height. Satellite images, from several frequency bands and from both polar-orbiting and geostationary satellites, were used to track the orientation of the eruption cloud, and brightness temperature difference was used to estimate far-field ash dispersal. Ash-fall monitoring and meteorological observations, supplemented with atmospheric reanalysis and wind forecasts, were used to track local ash dispersal. Information from these data sources was combined with geophysical and hydrological measurements (seismic, GPS, strain and river flow gauges) made by the IMO, the Earth Institute of the University of Iceland and other institutions. The data generated by these different observation types give a consistent picture of the progression of the eruption and reveal interesting connections. For example, volcanic tremors tended to be inversely related to the eruption cloud height: increasing tremors were associated with lower plume height and reduced eruption strength. Furthermore, the occurrence of lightning seems to be explained by both a sufficiently strong plume and cold ambient air. Wind also had a clear effect on the eruption cloud height.
In general, simple scaling laws for the relationship between the emission rate of the volcano and the height of the eruption cloud do not seem to explain all the observed height variations.

  7. Learning with the ATLAS Experiment at CERN

    ERIC Educational Resources Information Center

    Barnett, R. M.; Johansson, K. E.; Kourkoumelis, C.; Long, L.; Pequenao, J.; Reimers, C.; Watkins, P.

    2012-01-01

    With the start of the LHC, the new particle collider at CERN, the ATLAS experiment is also providing high-energy particle collisions for educational purposes. Several education projects--education scenarios--have been developed and tested on students and teachers in several European countries within the Learning with ATLAS@CERN project. These…

  8. First experience with the new .cern Top Level Domain

    NASA Astrophysics Data System (ADS)

    Alvarez, E.; Malo de Molina, M.; Salwerowicz, M.; Silva De Sousa, B.; Smith, T.; Wagner, A.

    2017-10-01

In October 2015, CERN's core website was moved to a new address, http://home.cern, marking the launch of the brand-new top-level domain .cern. In combination with a formal governance and registration policy, the IT infrastructure needed to be extended to accommodate the hosting of websites in this new top-level domain. We will present the technical implementation within the framework of the CERN Web Services, which provides virtual hosting, a reverse-proxy solution, and the provisioning of SSL server certificates for secure communications.

  9. Inflight - Apollo 9 (Crew Activities)

    NASA Image and Video Library

    1969-03-06

    S69-26150 (6 March 1969) --- Television watchers on Earth saw this view of the Apollo 9 Command Module during the second live telecast from Apollo 9 early Thursday afternoon on the fourth day in space. This view is looking through the docking window of the Lunar Module. The cloud-covered Earth can be seen in the background. Inside the Lunar Module "Spider" were Astronauts James A. McDivitt, Apollo 9 commander; and Russell L. Schweickart, lunar module pilot. At this moment Apollo 9 was orbiting Earth with the Command and Service Modules docked nose-to-nose with the Lunar Module. Astronaut David R. Scott, command module pilot, remained at the controls in the Command Module "Gumdrop" while the other two astronauts checked out the Lunar Module. McDivitt and Schweickart moved into the Lunar Module from the Command Module by way of the docking tunnel.

  10. Seasonal and Vegetational Variation in Albedo Measured During CERES Ground-Validation Pilot Study

    NASA Technical Reports Server (NTRS)

    Schuster, G. L.; Whitlock, C. H.; Plant, J. V.; Wheeler, R. J.; Moats, C. D.; Larman, K. T.; Ayers, J. K.; Feldl, E. K.

    1997-01-01

    The Clouds and the Earth's Radiant Energy System (CERES) satellite is scheduled for launch in the Fall of 1997 aboard the Tropical Rainfall Measuring Mission (TRMM). A surface measurement pilot study has been initiated in a 37-km region near Richmond, VA, for comparison with the CERES surface flux retrievals. Two-minute averaged upwelling and downwelling surface fluxes over a mostly deciduous forest have been recorded daily for the past two years, and show a broadband, shortwave daily albedo increase during the summer months. Evidence is presented that indicates vegetational changes in the forest as the overriding mechanism for this change. Upwelling flux measured over the entire region by helicopter-mounted instrumentation has been processed for four solar seasons. Future plans include the installation of four more albedo surface sites over various types of vegetation throughout the region.

  11. Hangout with CERN: a direct conversation with the public

    NASA Astrophysics Data System (ADS)

    Rao, Achintya; Goldfarb, Steven; Kahle, Kate

    2016-04-01

Hangout with CERN refers to a weekly, half-hour-long, topical webcast hosted at CERN. The aim of the programme is threefold: (i) to provide a virtual tour of various locations and facilities at CERN, (ii) to discuss the latest scientific results from the laboratory, and, most importantly, (iii) to engage in conversation with the public and answer their questions. For each "episode", scientists gather around webcam-enabled computers at CERN and partner institutes/universities, connecting to one another using the Google+ social network's "Hangouts" tool. The show is structured as a conversation mediated by a host, usually a scientist, and viewers can ask the experts questions in real time through a Twitter hashtag or YouTube comments. The history of Hangout with CERN can be traced back to ICHEP 2012, where several physicists crowded in front of a laptop connected to Google+, using a "Hangout On Air" webcast to explain to the world the importance of the discovery of the Higgs-like boson, announced just two days before at the same conference. Hangout with CERN has also drawn inspiration from two existing outreach endeavours: (i) ATLAS Virtual Visits, which connected remote visitors with scientists in the ATLAS Control Room via video conference, and (ii) the Large Hangout Collider, in which CMS scientists gave underground tours via Hangouts to groups of schools and members of the public around the world. In this paper, we discuss the role of Hangout with CERN as a bi-directional outreach medium and an opportunity to train scientists in effective communication.

  12. University of Wisconsin Cirrus Remote Sensing Pilot Experiment

    NASA Technical Reports Server (NTRS)

    Ackerman, Steven A.; Eloranta, Ed W.; Grund, Chris J.; Knuteson, Robert O.; Revercomb, Henry E.; Smith, William L.; Wylie, Donald P.

    1993-01-01

    During the period of 26 October 1989 through 6 December 1989 a unique complement of measurements was made at the University of Wisconsin-Madison to study the radiative properties of cirrus clouds. Simultaneous observations were obtained from a scanning lidar, two interferometers, a high spectral resolution lidar, geostationary and polar orbiting satellites, radiosonde launches, and a whole-sky imager. This paper describes the experiment, the instruments deployed, and, as an example, the data collected during one day of the experiment.

  13. Unmanned Aerial Vehicles and Weapons of Mass Destruction: A Lethal Combination?

    DTIC Science & Technology

    1997-08-01

assessment (BDA), where they allow armed forces to avoid placing pilots at risk. They have also been used to gather nonmilitary information in... BDA purposes. In fact, the Pioneer UAV was praised as "the single most valuable intelligence collector" in the war against Iraq. They have proved to...carries half the weapon's total energy. Finally, as the explosion takes on the familiar "mushroom" shape, winds suck back into the cloud, adding to the

  14. The Real-Time Monitoring Service Platform for Land Supervision Based on Cloud Integration

    NASA Astrophysics Data System (ADS)

    Sun, J.; Mao, M.; Xiang, H.; Wang, G.; Liang, Y.

    2018-04-01

Remote sensing monitoring has become an important means for land-and-resources departments to strengthen supervision. To address the low monitoring frequency and poor data currency of current remote sensing monitoring, this paper researched and developed a cloud-integrated real-time monitoring service platform for land supervision, which increases the monitoring frequency by acquiring the full set of domestic satellite image data and accelerates remote sensing image processing by exploiting intelligent dynamic processing of multi-source images. A pilot application in the Jinan Bureau of State Land Supervision demonstrated that the real-time monitoring method for land supervision is feasible. In addition, real-time monitoring and early warning functions were implemented for illegal land use, permanent basic farmland protection, and boundary breakthroughs in urban development. The application has achieved remarkable results.

  15. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-15

The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network Constituents, Fundamental Forces and Symmetries of the Universe. The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. As its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva.

  16. CERN@school: bringing CERN into the classroom

    NASA Astrophysics Data System (ADS)

    Whyntie, T.; Cook, J.; Coupe, A.; Fickling, R. L.; Parker, B.; Shearer, N.

    2016-04-01

CERN@school brings technology from CERN into the classroom to aid with the teaching of particle physics. It also aims to inspire the next generation of physicists and engineers by giving participants the opportunity to be part of a national collaboration of students, teachers and academics, analysing data obtained from detectors based on the ground and in space to make new, curiosity-driven discoveries at school. CERN@school is based around the Timepix hybrid silicon pixel detector developed by the Medipix 2 Collaboration, which features a 300 μm thick silicon sensor bump-bonded to a Timepix readout ASIC. This defines a 256-by-256 grid of pixels with a pitch of 55 μm, the data from which can be used to visualise ionising radiation in a very accessible way. Broadly speaking, CERN@school consists of a web portal that allows access to data collected by the Langton Ultimate Cosmic ray Intensity Detector (LUCID) experiment in space and the student-operated Timepix detectors on the ground; a number of Timepix detector kits for ground-based experiments, to be made available to schools for both teaching and research purposes; and educational resources for teachers to use with LUCID data and detector kits in the classroom. By providing access to cutting-edge research equipment and raw data from ground- and space-based experiments, CERN@school hopes to provide the foundation for a programme that meets many of the aims and objectives of CERN and the project's supporting academic and industrial partners. The work presented here provides an update on the status of the programme as supported by the UK Science and Technology Facilities Council (STFC) and the Royal Commission for the Exhibition of 1851. This includes recent results from work with the GridPP Collaboration on using grid resources with schools to run GEANT4 simulations of CERN@school experiments.
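The detector geometry described above lends itself to a simple data model: each recorded hit addresses one cell of the 256-by-256 pixel grid, and the 55 μm pitch maps pixel indices to physical positions on the sensor. A minimal sketch (Python; the `(x, y, counts)` hit format is an assumption for illustration, not the actual Timepix file format):

```python
PIXELS = 256          # Timepix pixel matrix is 256 x 256
PITCH_UM = 55.0       # pixel pitch in micrometres

def hits_to_frame(hits, size=PIXELS):
    """Accumulate (x, y, counts) hit triples into a size x size frame."""
    frame = [[0] * size for _ in range(size)]
    for x, y, counts in hits:
        frame[y][x] += counts
    return frame

def pixel_to_um(x, y, pitch=PITCH_UM):
    """Convert pixel indices to a physical position on the sensor."""
    return (x * pitch, y * pitch)

frame = hits_to_frame([(10, 20, 5), (10, 20, 2), (255, 0, 1)])
print(frame[20][10])        # -> 7 (hits on the same pixel accumulate)
print(pixel_to_um(2, 3))    # -> (110.0, 165.0)
```

A frame built this way can be rendered directly as an image, which is the accessible visualisation of ionising radiation the abstract describes.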

  17. Volume-based response evaluation with consensual lesion selection: a pilot study by using cloud solutions and comparison to RECIST 1.1.

    PubMed

    Oubel, Estanislao; Bonnard, Eric; Sueoka-Aragane, Naoko; Kobayashi, Naomi; Charbonnier, Colette; Yamamichi, Junta; Mizobe, Hideaki; Kimura, Shinya

    2015-02-01

Lesion volume is considered a promising alternative to Response Evaluation Criteria in Solid Tumors (RECIST) to make tumor measurements more accurate and consistent, which would enable an earlier detection of temporal changes. In this article, we report the results of a pilot study aimed at evaluating the effects of a consensual lesion selection on volume-based response (VBR) assessments. Eleven patients with lung computed tomography scans acquired at three time points were selected from the Reference Image Database to Evaluate Response to therapy in lung cancer (RIDER) and proprietary databases. Images were analyzed according to RECIST 1.1 and VBR criteria by three readers working in different geographic locations. Cloud solutions were used to connect readers and carry out a consensus process on the selection of lesions used for computing response. Because there are no currently accepted thresholds for computing VBR, we applied a set of thresholds based on measurement variability (-35% and +55%). The benefit of this consensus was measured in terms of multiobserver agreement by using Fleiss kappa (κfleiss) and corresponding standard errors (SE). VBR after consensual selection of target lesions yielded κfleiss = 0.85 (SE = 0.091), which increased to 0.95 (SE = 0.092) when an additional consensus on new lesions was added. As a reference, the agreement when applying RECIST without consensus was κfleiss = 0.72 (SE = 0.088). These differences were found to be statistically significant according to a z-test. Agreement on the selection of lesions reduces inter-reader variability when computing VBR. Cloud solutions proved to be a feasible strategy for standardizing response evaluations, reducing variability, and increasing consistency of results in multicenter clinical trials. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
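The agreement statistic used above, Fleiss' kappa, can be computed directly from a subjects-by-categories count matrix: per-subject observed agreement is averaged and corrected for chance agreement derived from the overall category proportions. A minimal sketch (Python; the toy data are illustrative, not from the study):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for a list of per-subject category-count rows.
    Each row sums to the (constant) number of raters."""
    n_subjects = len(counts)
    n_raters = sum(counts[0])
    # Observed agreement P_i for each subject
    p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in counts]
    p_bar = sum(p_i) / n_subjects
    # Chance agreement P_e from overall category proportions
    n_categories = len(counts[0])
    totals = [sum(row[j] for row in counts) for j in range(n_categories)]
    p_j = [t / (n_subjects * n_raters) for t in totals]
    p_e = sum(p * p for p in p_j)
    return (p_bar - p_e) / (1 - p_e)

# Toy example: 3 readers classify 4 lesions into 2 response categories.
print(fleiss_kappa([[3, 0], [3, 0], [0, 3], [2, 1]]))  # ~0.625
```

With perfect agreement on every subject the statistic is 1.0; values near 0 indicate agreement no better than chance, which is why the jump from 0.72 to 0.95 reported above is a substantial gain.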

  18. Unexpectedly acidic nanoparticles formed in dimethylamine-ammonia-sulfuric-acid nucleation experiments at CLOUD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawler, Michael J.; Winkler, Paul M.; Kim, Jaeseok

New particle formation driven by acid–base chemistry was initiated in the CLOUD chamber at CERN by introducing atmospherically relevant levels of gas-phase sulfuric acid and dimethylamine (DMA). Ammonia was also present in the chamber as a gas-phase contaminant from earlier experiments. The composition of particles with volume median diameters (VMDs) as small as 10 nm was measured by the Thermal Desorption Chemical Ionization Mass Spectrometer (TDCIMS). Particulate ammonium-to-dimethylaminium ratios were higher than the gas-phase ammonia-to-DMA ratios, suggesting preferential uptake of ammonia over DMA for the collected 10–30 nm VMD particles. This behavior is not consistent with present nanoparticle physicochemical models, which predict a higher dimethylaminium fraction when NH3 and DMA are present at similar gas-phase concentrations. Despite the presence in the gas phase of at least 100 times higher base concentrations than sulfuric acid, the recently formed particles always had measured base:acid ratios lower than 1:1. The lowest base fractions were found in particles below 15 nm VMD, with a strong size-dependent composition gradient. The reasons for the very acidic composition remain uncertain, but a plausible explanation is that the particles did not reach thermodynamic equilibrium with respect to the bases due to rapid heterogeneous conversion of SO2 to sulfate. Furthermore, these results indicate that sulfuric acid does not require stabilization by ammonium or dimethylaminium as acid–base pairs in particles as small as 10 nm.

  19. Unexpectedly acidic nanoparticles formed in dimethylamine-ammonia-sulfuric-acid nucleation experiments at CLOUD

    DOE PAGES

    Lawler, Michael J.; Winkler, Paul M.; Kim, Jaeseok; ...

    2016-11-03

New particle formation driven by acid–base chemistry was initiated in the CLOUD chamber at CERN by introducing atmospherically relevant levels of gas-phase sulfuric acid and dimethylamine (DMA). Ammonia was also present in the chamber as a gas-phase contaminant from earlier experiments. The composition of particles with volume median diameters (VMDs) as small as 10 nm was measured by the Thermal Desorption Chemical Ionization Mass Spectrometer (TDCIMS). Particulate ammonium-to-dimethylaminium ratios were higher than the gas-phase ammonia-to-DMA ratios, suggesting preferential uptake of ammonia over DMA for the collected 10–30 nm VMD particles. This behavior is not consistent with present nanoparticle physicochemical models, which predict a higher dimethylaminium fraction when NH3 and DMA are present at similar gas-phase concentrations. Despite the presence in the gas phase of at least 100 times higher base concentrations than sulfuric acid, the recently formed particles always had measured base:acid ratios lower than 1:1. The lowest base fractions were found in particles below 15 nm VMD, with a strong size-dependent composition gradient. The reasons for the very acidic composition remain uncertain, but a plausible explanation is that the particles did not reach thermodynamic equilibrium with respect to the bases due to rapid heterogeneous conversion of SO2 to sulfate. Furthermore, these results indicate that sulfuric acid does not require stabilization by ammonium or dimethylaminium as acid–base pairs in particles as small as 10 nm.

  20. Chance Encounter with a Stratospheric Kerosene Rocket Plume from Russia over California

    NASA Technical Reports Server (NTRS)

    Newman, P. A.; Wilson, J. C.; Ross, M. N.; Brock, C.; Sheridan, P.; Schoeberl, M. R.; Lait, L. R.; Bui, T. P.; Loewenstein, M.

    1999-01-01

During a routine ER-2 aircraft high-altitude test flight on April 18, 1997, an unusual aerosol cloud was detected at 20 km altitude near the California coast at about 37.0 degrees N latitude. Not visually observed by the ER-2 pilot, the cloud was characterized by high concentrations of soot and sulfate aerosol in a region over 100 km in horizontal extent, indicating that the source of the plume was a large hydrocarbon-fueled vehicle, most likely a launch vehicle powered only by rocket motors burning liquid oxygen and kerosene. Two Russian Soyuz rockets could conceivably have produced the plume. The first was launched from the Baikonur Cosmodrome, Kazakhstan, on April 6; the second was launched from Plesetsk, Russia, on April 9. Air parcel trajectory calculations and long-lived tracer gas concentrations in the cloud indicate that the Baikonur rocket launch is the most probable source of the plume. The parcel trajectory calculations do not unambiguously trace the transport of the Soyuz plume from Asia to North America, illustrating serious flaws in point-to-point trajectory calculations. This chance encounter represents the only measurement of the stratospheric effects of emissions from a rocket powered exclusively with hydrocarbon fuel.

  1. News Conference: Serbia hosts teachers' seminar Resources: Teachers TV website closes for business Festival: Science takes to the stage in Denmark Research: How noise affects learning in secondary schools CERN: CERN visit inspires new teaching ideas Education: PLS aims to improve perception of science for school students Conference: Scientix conference discusses challenges in science education

    NASA Astrophysics Data System (ADS)

    2011-07-01

    Conference: Serbia hosts teachers' seminar Resources: Teachers TV website closes for business Festival: Science takes to the stage in Denmark Research: How noise affects learning in secondary schools CERN: CERN visit inspires new teaching ideas Education: PLS aims to improve perception of science for school students Conference: Scientix conference discusses challenges in science education

  2. News Particle Physics: ATLAS unveils mural at CERN Prize: Corti Trust invites essay entries Astrophysics: CERN holds cosmic-ray conference Researchers in Residence: Lord Winston returns to school Music: ATLAS scientists record physics music Conference: Champagne flows at Reims event Competition: Students triumph at physics olympiad Teaching: Physics proves popular in Japanese schools Forthcoming Events

    NASA Astrophysics Data System (ADS)

    2011-01-01

    Particle Physics: ATLAS unveils mural at CERN Prize: Corti Trust invites essay entries Astrophysics: CERN holds cosmic-ray conference Researchers in Residence: Lord Winston returns to school Music: ATLAS scientists record physics music Conference: Champagne flows at Reims event Competition: Students triumph at physics olympiad Teaching: Physics proves popular in Japanese schools Forthcoming Events

  3. Signature CERN-URSS

    ScienceCinema

    None

    2017-12-09

DG W. Jentschke welcomes the assembly and the guests for the signing of the protocol between CERN and the USSR, which is an important event. It was in 1955 that 55 Soviet visitors visited CERN for the first time. The first DG of CERN, F. Bloch, and Mr. Amaldi are also present. While W. Jentschke's speech in English is translated into Russian, Mr. Morozov's speech in Russian is translated into English.

  4. Ceph-based storage services for Run2 and beyond

    NASA Astrophysics Data System (ADS)

    van der Ster, Daniel C.; Lamanna, Massimo; Mascetti, Luca; Peters, Andreas J.; Rousseau, Hervé

    2015-12-01

In 2013, CERN IT evaluated and then deployed a petabyte-scale Ceph cluster to support OpenStack use cases in production. Now, with more than a year of smooth operations behind us, we will present our experience and tuning best practices. Beyond the cloud storage use cases, we have been exploring Ceph-based services to satisfy the growing storage requirements during and after Run2. First, we have developed a Ceph back-end for CASTOR, allowing this service to deploy thin disk server nodes which act as gateways to Ceph; this feature marries the strong data archival and cataloging features of CASTOR with the resilient and high-performance Ceph subsystem for disk. Second, we have developed RADOSFS, a lightweight storage API which builds a POSIX-like filesystem on top of the Ceph object layer. When combined with Xrootd, RADOSFS can offer a scalable object interface compatible with our HEP data processing applications. Lastly, the same object layer is being used to build a scalable and inexpensive NFS service for several user communities.

  5. Scaling Agile Infrastructure to People

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.

    2015-12-01

    When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains that were growing around this ecosystem would be a good choice, the challenge was to leverage them. The use of open-source tools brings challenges other than merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine what challenges there were in adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used in order to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.

  6. Connecting Restricted, High-Availability, or Low-Latency Resources to a Seamless Global Pool for CMS

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B.; Hufnagel, D.; Hurtado Anampa, K.; Jayatilaka, B.; Khan, F.; Larson, K.; Letts, J.; Mascheroni, M.; Mohapatra, A.; Marra Da Silva, J.; Mason, D.; Perez-Calero Yzquierdo, A.; Piperov, S.; Tiradani, A.; Verguilov, V.; CMS Collaboration

    2017-10-01

    The connection of diverse, and sometimes non-Grid-enabled, resource types to the CMS Global Pool, which is based on HTCondor and glideinWMS, has been a major goal of CMS. These resources range from a high-availability, low-latency facility at CERN for urgent calibration studies (the CAF), to a local user facility at the Fermilab LPC, allocation-based computing resources at NERSC and SDSC, opportunistic resources provided through the Open Science Grid, commercial clouds, and opportunistic cycles on the CMS High Level Trigger farm. In addition, we have provided the capability to give priority to local users of resources beyond the WLCG pledges at CMS sites. Many of the solutions employed to bring these diverse resource types into the Global Pool have common elements, while some are very specific to a particular project. This paper details some of the strategies and solutions used to access these resources through the Global Pool in a seamless manner.
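    As a rough illustration of the local-priority idea described above, the sketch below assigns local users to their home site's slots before opening the remaining slots to the wider pool. This is a deliberately simplified toy; real HTCondor/glideinWMS matchmaking works through ClassAds and negotiation policy and is far richer, and all names here are invented:

```python
# Toy two-pass matchmaker: local users get first pick of their home
# site's slots (e.g. beyond-pledge resources), then remaining jobs go
# to any slot open to remote users. Purely illustrative; not HTCondor.
def match_jobs(jobs, slots):
    assignments = {}
    free = list(slots)
    # Pass 1: local users onto slots at their home site.
    for job in jobs:
        for slot in free:
            if slot["site"] == job["home_site"]:
                assignments[job["user"]] = slot
                free.remove(slot)
                break
    # Pass 2: everyone else onto slots not reserved for locals.
    for job in jobs:
        if job["user"] in assignments:
            continue
        for slot in free:
            if not slot["local_only"]:
                assignments[job["user"]] = slot
                free.remove(slot)
                break
    return assignments

jobs = [{"user": "remote1", "home_site": "T2_US"},
        {"user": "local1", "home_site": "T2_ES"}]
slots = [{"site": "T2_ES", "local_only": True},
         {"site": "HLT", "local_only": False}]
out = match_jobs(jobs, slots)
print(out["local1"]["site"], out["remote1"]["site"])  # T2_ES HLT
```

    In HTCondor terms this two-tier priority would be expressed declaratively through slot and negotiator policy rather than explicit passes; the toy only makes the ordering visible.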

  7. Hands on CERN: A Well-Used Physics Education Project

    ERIC Educational Resources Information Center

    Johansson, K. E.

    2006-01-01

    The "Hands on CERN" education project makes it possible for students and teachers to get close to the forefront of scientific research. The project confronts the students with contemporary physics at its most fundamental level with the help of particle collisions from the DELPHI particle physics experiment at CERN. It now exists in 14 languages…

  8. Geospatial Field Methods: An Undergraduate Course Built Around Point Cloud Construction and Analysis to Promote Spatial Learning and Use of Emerging Technology in Geoscience

    NASA Astrophysics Data System (ADS)

    Bunds, M. P.

    2017-12-01

    Point clouds are a powerful data source in the geosciences, and the emergence of structure-from-motion (SfM) photogrammetric techniques has allowed them to be generated quickly and inexpensively. Consequently, applications of them as well as methods to generate, manipulate, and analyze them warrant inclusion in undergraduate curriculum. In a new course called Geospatial Field Methods at Utah Valley University, students in small groups use SfM to generate a point cloud from imagery collected with a small unmanned aerial system (sUAS) and use it as a primary data source for a research project. Before creating their point clouds, students develop needed technical skills in laboratory and class activities. The students then apply the skills to construct the point clouds, and the research projects and point cloud construction serve as a central theme for the class. Intended student outcomes for the class include: technical skills related to acquiring, processing, and analyzing geospatial data; improved ability to carry out a research project; and increased knowledge related to their specific project. To construct the point clouds, students first plan their field work by outlining the field site, identifying locations for ground control points (GCPs), and loading them onto a handheld GPS for use in the field. They also estimate sUAS flight elevation, speed, and the flight path grid spacing required to produce a point cloud with the resolution required for their project goals. In the field, the students place the GCPs using handheld GPS, and survey the GCP locations using post-processed-kinematic (PPK) or real-time-kinematic (RTK) methods. The students pilot the sUAS and operate its camera according to the parameters that they estimated in planning their field work. Data processing includes obtaining accurate locations for the PPK/RTK base station and GCPs, and SfM processing with Agisoft Photoscan. 
The resulting point clouds are rasterized into digital surface models, assessed for accuracy, and analyzed in Geographic Information System software. Student projects have included mapping and analyzing landslide morphology, fault scarps, and earthquake ground-surface rupture. Students have praised the geospatial skills they learn, while helping them stay on schedule to finish their projects remains a challenge.
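    The flight-planning arithmetic mentioned above (estimating flight elevation and flight-line spacing for a target point-cloud resolution) follows from the standard photogrammetric ground-sample-distance relation. The camera parameters in this sketch are made-up example values, not those of any particular sUAS:

```python
# Flight altitude needed for a target ground sample distance (GSD),
# from the standard relation GSD = H * sensor_w / (f * image_w),
# with H in metres, f and sensor_w in mm, image_w in pixels.
def altitude_for_gsd(gsd_m, focal_mm, sensor_w_mm, image_w_px):
    return gsd_m * focal_mm * image_w_px / sensor_w_mm

# Across-track flight-line spacing for a chosen side overlap.
def flight_line_spacing(gsd_m, image_w_px, sidelap=0.7):
    footprint_w = gsd_m * image_w_px      # ground footprint width (m)
    return footprint_w * (1.0 - sidelap)

# Example values (illustrative camera, 2 cm/px target resolution):
h = altitude_for_gsd(0.02, focal_mm=8.8, sensor_w_mm=13.2, image_w_px=5472)
s = flight_line_spacing(0.02, image_w_px=5472)
print(round(h, 1), round(s, 1))   # 73.0 32.8
```

    The same relation run in reverse gives the achievable resolution for a legally or practically capped flight altitude, which is how such trade-offs are typically negotiated in the planning stage.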

  9. Use of Remotely Piloted Aircraft System (RPAS) in the analysis of historical landslide occurred in 1885 in the Rječina River Valley, Croatia

    NASA Astrophysics Data System (ADS)

    Dugonjić Jovančević, Sanja; Peranić, Josip; Ružić, Igor; Arbanas, Željko; Kalajžić, Duje; Benac, Čedomir

    2016-04-01

    Numerous instability phenomena have been recorded in the Rječina River Valley, near the City of Rijeka, over the past 250 years. Large landslides triggered by rainfall and floods were registered on both sides of the Valley. A landslide inventory for the Valley was established based on recorded historical events and LiDAR imagery. The Rječina River is a typical karstic river, 18.7 km long, originating from the Gorski Kotar Mountains. The central part of the Valley belongs to the dominant morphostructural unit that strikes northwest-southeast along the Rječina River. Karstified limestone rock mass is visible at the top of the slopes, while flysch rock mass is present on the lower slopes and at the bottom of the Valley. Different types of movements can be distinguished in the area, such as sliding of slope deposits over the flysch bedrock, rockfalls from limestone cliffs, sliding of huge rocky blocks, and an active landslide on the north-eastern slope. The paper presents an investigation of the dormant landslide on the south-western slope of the Valley, recorded in 1870 in numerous historical descriptions. Due to intense and long-term rainfall, the landslide was reactivated in 1885, destroying and damaging houses in the eastern part of the Grohovo Village. To predict possible reactivation of the dormant landslide on the south-western side of the Valley, 2D stability back-analyses were performed on the basis of landslide features, in order to approximate the position of the sliding surface and the landslide dimensions. The landslide topography is very steep and the slope is covered by unstable debris material, making any terrestrial geodetic survey hard to perform. A consumer-grade DJI Phantom 2 Remotely Piloted Aircraft System (RPAS) was used to provide data about the present slope topography.
The landslide 3D point cloud was derived from approximately 200 photographs taken with the RPAS, using structure-from-motion (SfM) photogrammetry. Images were processed using the online Autodesk service ReCap. Ground control points (GCPs) collected with a total station were identified on the photorealistic point cloud and used for geo-referencing. CloudCompare software was used for the point cloud processing. This study compared the georeferenced landslide point cloud derived from the images with data acquired from laser scanning. The RPAS and SfM application produced a high-accuracy landslide 3D point cloud, with safe and quick data acquisition. Based on the adopted rock-mass strength parameters obtained from the back analysis, a stability analysis of the present slope was performed and the present stability of the landslide body was determined. Unfavourable conditions and possible triggering factors, such as saturation of the slope caused by heavy rain, and earthquakes, were included in the analyses, which enabled estimation of future landslide hazard and risk.

  10. Building a Cloud Infrastructure for a Virtual Environmental Observatory

    NASA Astrophysics Data System (ADS)

    El-khatib, Y.; Blair, G. S.; Gemmell, A. L.; Gurney, R. J.

    2012-12-01

    Environmental science is often fragmented: data is collected by different organizations using mismatched formats and conventions, and models are misaligned and run in isolation. Cloud computing offers considerable potential for resolving such issues by supporting data from different sources and at various scales, and by integrating models to create more sophisticated and collaborative software services. The Environmental Virtual Observatory pilot (EVOp) project, funded by the UK Natural Environment Research Council, aims to demonstrate how cloud computing principles and technologies can be harnessed to develop more effective solutions to pressing environmental issues. The EVOp infrastructure is a tailored one, constructed from resources in both private clouds (owned and managed by us) and public clouds (leased from third-party providers). All system assets are accessible via a uniform web service interface in order to enable versatile and transparent resource management, and to support fundamental infrastructure properties such as reliability and elasticity. The abstraction that this 'everything as a service' principle brings also supports mashups, i.e. combining different web services (such as models) and data resources of different origins (in situ gauging stations, warehoused data stores, external sources, etc.). We adopt the RESTful style of web services in order to draw a clear line between client and server (i.e. cloud host) and to keep the server completely stateless. This significantly improves the scalability of the infrastructure and simplifies its management. For instance, tasks such as load balancing and failure recovery are greatly simplified without the need for techniques such as advance resource reservation or shared block devices. Upon this infrastructure, we developed a web portal composed of a bespoke collection of web-based visualization tools to help bring out relationships or patterns within the data.
The portal was designed for use without any programming prerequisites by stakeholders from different backgrounds, such as scientists, policy makers, local communities, and the general public. The development of the portal was carried out using an iterative behaviour-driven approach. We developed six distinct storyboards to determine the requirements of different users. From these, we identified two storyboards to implement during the pilot phase. The first explores flooding at a local catchment scale for farmers and the public. We simulate hydrological interactions to determine where saturated land-surface areas develop. Model parameter values representing catchment characteristics can be specified either explicitly (for domain specialists) or indirectly using one of several predefined land-use scenarios (for less familiar audiences). The second storyboard investigates diffuse agricultural pollution at a national level, with regulators as users. We study the flux of nitrogen and phosphorus from land to rivers and coastal regions at various scales of drainage and reporting units. This is particularly useful for uncovering the impact of existing policy instruments, or the risk from future environmental changes, on the levels of N and P flux.

  11. The ATLAS Software Installation System v2: a highly available system to install and validate Grid and Cloud sites via Panda

    NASA Astrophysics Data System (ADS)

    De Salvo, A.; Kataoka, M.; Sanchez Pineda, A.; Smirnov, Y.

    2015-12-01

    The ATLAS Installation System v2 is the evolution of the original system, used since 2003. The original tool has been completely re-designed in terms of database backend and components, adding support for submission to multiple backends, including the original Workload Management Service (WMS) and the new PanDA modules. The database engine has been changed from plain MySQL to Galera/Percona, and the table structure has been optimized to allow a full High-Availability (HA) solution over a Wide Area Network. The servlets, running on each frontend, have also been decoupled from local settings, to allow easy scalability of the system, including the possibility of an HA system spanning multiple sites. The clients can also be run in multiple copies and in different geographical locations, and take care of sending the installation and validation jobs to the target Grid or Cloud sites. Moreover, the Installation Database is used as a source of parameters by the automatic agents running in CVMFS, in order to install the software and distribute it to the sites. The system has been in production for ATLAS since 2013, with the INFN Roma Tier 2 and the CERN Agile Infrastructure as its main HA sites. The Light Job Submission Framework for Installation (LJSFi) v2 engine interfaces directly with PanDA for job management, the ATLAS Grid Information System (AGIS) for the site parameter configurations, and CVMFS for both core components and the installation of the software itself. LJSFi2 is also able to use other plugins, and is essentially Virtual Organization (VO) agnostic, so it can be directly used and extended to cope with the requirements of any Grid- or Cloud-enabled VO. In this work we will present the architecture, performance, status and possible evolutions of the system for the LHC Run 2 and beyond.

  12. View of Skylab space station cluster in Earth orbit from CSM

    NASA Image and Video Library

    2008-08-18

    SL4-143-4706 (8 Feb. 1974) --- An overhead view of the Skylab space station cluster in Earth orbit as photographed from the Skylab 4 Command and Service Modules (CSM) during the final fly-around by the CSM before returning home. The space station is contrasted against a cloud-covered Earth. Note the solar shield, which was deployed by the second crew of Skylab, covering the area from which a micrometeoroid shield has been missing since the cluster was launched on May 14, 1973. The Orbital Workshop (OWS) solar panel on the left side was also lost on workshop launch day. Inside the Command Module (CM) when this picture was made were astronaut Gerald P. Carr, commander; scientist-astronaut Edward G. Gibson, science pilot; and astronaut William R. Pogue, pilot. The crew used a 70mm hand-held Hasselblad camera to take this photograph. Photo credit: NASA

  13. The Global Framework for Providing Information about Volcanic-Ash Hazards to International Air Navigation

    NASA Astrophysics Data System (ADS)

    Romero, R. W.; Guffanti, M.

    2009-12-01

    The International Civil Aviation Organization (ICAO) created the International Airways Volcano Watch (IAVW) in 1987 to establish a requirement for international dissemination of information about airborne ash hazards to safe air navigation. The IAVW is a set of operational protocols and guidelines that member countries agree to follow in order to implement a global, multi-faceted program to support the strategy of ash-cloud avoidance. Under the IAVW, the elements of eruption reporting, ash-cloud detecting, and forecasting expected cloud dispersion are coordinated to culminate in warnings sent to air traffic controllers, dispatchers, and pilots about the whereabouts of ash clouds. Nine worldwide Volcanic Ash Advisory Centers (VAAC) established under the IAVW have the responsibility for detecting the presence of ash in the atmosphere, primarily by looking at imagery from civilian meteorological satellites, and providing advisories about the location and movement of ash clouds to aviation meteorological offices and other aviation users. Volcano Observatories also are a vital part of the IAVW, as evidenced by the recent introduction of a universal message format for reporting the status of volcanic activity, including precursory unrest, to aviation users. Since 2003, the IAVW has been overseen by a standing group of scientific, technical, and regulatory experts that assists ICAO in the development of standards and other regulatory material related to volcanic ash. Some specific problems related to the implementation of the IAVW include: the lack of implementation of SIGMET (warning to aircraft in flight) provisions and delayed notifications of volcanic eruptions. Expected future challenges and developments involve the improvement in early notifications of volcanic eruptions, the consolidation of the issuance of SIGMETs, and the possibility of determining a “safe” concentration of volcanic ash.

  14. Fibre optique à la maison en Pays de Gex et de Bellegarde

    ScienceCinema

    None

    2017-12-09

    Public conference "Optical fibre to the home in Pays de Gex et de Bellegarde", Michel Chanel and Jean Paul Goy (SIEA), Wednesday 19 May at 11:30, Council Chamber, CERN. The Syndicat Intercommunal d'Electricité de l'Ain (SIEA) is deploying an FTTH (Fiber To The Home) optical fibre network across the Ain department to all homes. The roll-out in the pilot zone of the Pays de Gex and the Bassin Bellegardien is entering its final phase of connecting homes. The SIEA will present its activities, the state of the network's development, the implications of an optical fibre connection, and the subscription procedures. The presentation will be given in French.

  15. Fibre optique à la maison en Pays de Gex et de Bellegarde

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-05-19

    Public conference "Optical fibre to the home in Pays de Gex et de Bellegarde", Michel Chanel and Jean Paul Goy (SIEA), Wednesday 19 May at 11:30, Council Chamber, CERN. The Syndicat Intercommunal d'Electricité de l'Ain (SIEA) is deploying an FTTH (Fiber To The Home) optical fibre network across the Ain department to all homes. The roll-out in the pilot zone of the Pays de Gex and the Bassin Bellegardien is entering its final phase of connecting homes. The SIEA will present its activities, the state of the network's development, the implications of an optical fibre connection, and the subscription procedures. The presentation will be given in French.

  16. Using a balloon-borne accelerometer to improve understanding of the turbulent structure of the atmosphere for aviation.

    NASA Astrophysics Data System (ADS)

    Marlton, Graeme; Harrison, Giles; Nicoll, Keri; Williams, Paul

    2017-04-01

    This work describes the instrument development, characterisation and data analysis from 51 radiosondes specially equipped with accelerometers to measure atmospheric turbulence. Turbulence is hazardous to aircraft because it cannot be observed in advance. It is estimated that turbulence costs the airline industry millions of US dollars a year through damage to aircraft and injuries to passengers and crew. To avoid turbulence, pilots and passengers rely on clear-air turbulence forecasts, which have limited skill. One limitation in this area is the lack of quantitative, unbiased observations. The main source of turbulence observations is commercial airline pilot reports, which are subjective and biased by aircraft size and pilot experience. This work seeks to improve understanding of turbulence through a standardised method of turbulence observation applicable throughout the troposphere. A sensing package has been developed to measure the acceleration of the radiosonde as it swings in response to turbulent agitation of its carrier balloon. The accelerometer radiosonde has been compared against multiple turbulence remote-sensing methods to characterise its measurements, including calibration against Doppler-lidar eddy dissipation rate in the boundary layer. A further relationship has been found by comparison with the spectral width of a Mesosphere-Stratosphere-Troposphere (MST) radar. From the full dataset of accelerometer sonde ascents, a standard deviation of 5 m s-2 is defined as the threshold for significant turbulence. The dataset spans turbulence generated in meteorological phenomena such as jet streams, clouds and convection. The analysis revealed that 77% of observed turbulence could be explained by the aforementioned phenomena. In jet streams, turbulence generation was often caused by horizontal processes such as deformation. In convection, turbulence is found to form when CAPE >150 J kg-1.
Deeper clouds were found to be more turbulent due to the increased intensity of in-cloud processes. The accelerometer data were used to verify the skill of turbulence diagnostics, in order to assess which diagnostics are best at forecasting turbulence. Using a receiver operating characteristic curve analysis, it was found that turbulence diagnostics calculated from ECMWF high-resolution data featuring wind speed, deformation and relative vorticity advection predicted turbulence best, with area-under-curve values of 0.7, 0.66 and 0.62 respectively. This work provides a new, safe and inexpensive method to retrieve in-situ information about the turbulent structure of the atmosphere. It can inform the aviation industry by identifying turbulence-generation regions and by assessing which predictive diagnostics are the most skilful.
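    The 5 m s-2 criterion from the abstract lends itself to a short sketch: flag a window of accelerometer samples as significantly turbulent when its standard deviation exceeds the threshold. The windowing scheme and the synthetic data below are assumptions for illustration, not details from the study:

```python
# Sketch of the turbulence criterion: flag accelerometer windows whose
# standard deviation exceeds 5 m s^-2 (threshold from the abstract;
# the fixed-window scheme and synthetic data are assumptions).
from statistics import pstdev

THRESHOLD = 5.0  # m s^-2, significant-turbulence threshold

def turbulent_windows(accel, window=20):
    """Return start indices of windows with acceleration std > THRESHOLD."""
    flagged = []
    for start in range(0, len(accel) - window + 1, window):
        if pstdev(accel[start:start + window]) > THRESHOLD:
            flagged.append(start)
    return flagged

calm = [9.8 + 0.1 * ((i % 5) - 2) for i in range(20)]   # quiet air
rough = [9.8 + 8.0 * (-1) ** i for i in range(20)]      # strong agitation
print(turbulent_windows(calm + rough))                  # [20]
```

    A real pipeline would also have to separate balloon pendulum motion from turbulent agitation, which is part of what the instrument characterisation addresses.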

  17. 25th Birthday Cern- Amphi

    ScienceCinema

    None

    2017-12-09

    Ceremony for CERN's 25th anniversary, with two speakers: Prof. Weisskopf speaks on the significance and role of CERN, and Prof. Casimir(?) gives a talk on the relations between pure science, applied science, and "big science" ("light science").

  18. Simulation evaluation of a speed-guidance law for Harrier approach transitions

    NASA Technical Reports Server (NTRS)

    Merrick, Vernon K.; Moralez, Ernesto; Stortz, Michael W.; Hardy, Gordon H.; Gerdes, Ronald M.

    1991-01-01

    An exponential-deceleration speed guidance law is formulated which mimics the technique currently used by Harrier pilots to perform decelerating approaches to a hover. This guidance law was tested along with an existing two-step constant-deceleration speed guidance law, using a fixed-base piloted simulator programmed to represent a YAV-8B Harrier. Decelerating approaches to a hover at a predetermined station-keeping point were performed along a straight (-3 deg glideslope) path in headwinds up to 40 knots and turbulence up to 6 ft/sec. Visibility was fixed at one-quarter nautical mile with a 100 ft cloud ceiling. Three Harrier pilots participated in the experiment. Handling qualities with the aircraft equipped with the standard YAV-8B rate-damped attitude stability augmentation system were adequate (Level 2) using either speed guidance law. However, the exponential-deceleration speed guidance law was rated superior to the constant-deceleration speed guidance law by a Cooper-Harper handling-qualities rating of about one unit, independent of the level of wind and turbulence. Replacing the attitude control system of the YAV-8B with a high-fidelity model-following attitude flight controller increased the approach accuracy and reduced the pilot workload. With one minor exception, the handling qualities for the approach were rated satisfactory (Level 1). It is concluded that the exponential-deceleration speed guidance law is the most cost-effective.
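    An exponential-deceleration law of the kind described above can be sketched by commanding speed proportional to range-to-go, which makes both range and speed decay exponentially toward the hover point. The time constant and the numbers here are illustrative assumptions, not values from the study:

```python
import math

# Commanded speed proportional to distance-to-go (x / tau) yields an
# exponential deceleration toward the station-keeping point.
# tau = 8 s is an illustrative choice, not the paper's value.
def exponential_approach(x0, tau=8.0, dt=0.1, steps=400):
    """Return (range, speed) histories for the guidance law v = x / tau."""
    x = x0
    xs, vs = [x], [x / tau]
    for _ in range(steps):
        x -= (x / tau) * dt        # integrate x' = -x / tau
        xs.append(x)
        vs.append(x / tau)
    return xs, vs

xs, vs = exponential_approach(2000.0)   # start 2000 ft from the hover point
# After 40 s the range is close to the analytic 2000 * exp(-40/8) ft:
print(round(xs[-1], 1), round(2000.0 * math.exp(-5.0), 1))
```

    The practical appeal of such a law is that deceleration tapers off smoothly near the hover point, which is closer to how pilots naturally fly the manoeuvre than a stepped constant deceleration.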

  19. 3D-Sonification for Obstacle Avoidance in Brownout Conditions

    NASA Technical Reports Server (NTRS)

    Godfroy-Cooper, M.; Miller, J. D.; Szoboszlay, Z.; Wenzel, E. M.

    2017-01-01

    Helicopter brownout is a phenomenon that occurs when making landing approaches in dusty environments, whereby sand or dust particles become swept up in the rotor outwash. Brownout is characterized by partial or total obscuration of the terrain, which degrades the visual cues necessary for hovering and safe landing. Furthermore, the motion of the dust cloud produced during brownout can lead to the pilot experiencing motion-cue anomalies such as vection illusions. In this context, the stability and guidance control functions can be intermittently or continuously degraded, potentially leading to undetected surface hazards and obstacles as well as unnoticed drift. Safe and controlled landing in brownout can be achieved using an integrated presentation of LADAR and RADAR imagery and aircraft state symbology. However, though detected by the LADAR and displayed on the sensor image, small obstacles can be difficult to discern from the background, so that changes in obstacle elevation may go unnoticed. Moreover, the pilot workload associated with tracking the displayed symbology is often so high that the pilot cannot give sufficient attention to the LADAR/RADAR image. This paper documents a simulation evaluating the use of 3D auditory cueing for obstacle avoidance in brownout as a replacement for, or complement to, LADAR/RADAR imagery.

  20. The STS-93 crew takes part in payload familiarization of the Chandra X-ray Observatory

    NASA Technical Reports Server (NTRS)

    1999-01-01

    A TRW technician joins STS-93 Commander Eileen Collins (center) and Pilot Jeffrey S. Ashby (right) as they observe the Chandra X-ray Observatory on its work stand inside the Vertical Processing Facility. Other members of the STS-93 crew at KSC for payload familiarization are Mission Specialists Catherine G. Coleman and Michel Tognini of France, who represents the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as a shuttle mission commander. She was the first woman pilot of a Space Shuttle, on mission STS-63, and also served as pilot on mission STS-84. The fifth member of the crew is Mission Specialist Steven A. Hawley. Chandra is scheduled for launch July 9 aboard Space Shuttle Columbia, on mission STS-93. Formerly called the Advanced X-ray Astrophysics Facility, Chandra comprises three major elements: the spacecraft, the science instrument module (SIM), and the world's most powerful X-ray telescope. Chandra will allow scientists from around the world to see previously invisible black holes and high-temperature gas clouds, giving the observatory the potential to rewrite the books on the structure and evolution of our universe.

  1. Examining the Pilot and Controller Performance Data When in a Free Flight with Weather Phenomenon

    NASA Technical Reports Server (NTRS)

    Nituen, Celestine A.; Lozito, Sandra C. (Technical Monitor)

    2002-01-01

    The present study investigated the effects of weather-related factors on the performance of pilots under free flight. A weather scenario was defined by a combination of precipitation factors (light rain, moderate rain, and heavy rain or snow), visibility (1, 4, or 8 miles), wind conditions (light, medium, or heavy), and cloud ceiling (800 ft below, 1800 ft above, and 4000 ft horizontal). The performance of aircraft self-separation was evaluated in terms of detection accuracy and detection times for student and commercial (expert) pilots. Overall, the results obtained from a behavioral analysis showed that, in general, the ability to recognize intruder-aircraft conflict incidents, followed by the ability to acquire the spatial location of the intruder aircraft relative to the ownship, were judged by the participants to be the major cognitive tasks during self-separation. Further, the participants rarely used the cockpit display of traffic information (CDTI) during conflict management related to aircraft separation, but used the CDTI heavily during decision-making tasks. In all weather scenarios, there were remarkable differences between expert and student pilots in detection times. In summary, weather scenarios were observed to affect intruder-aircraft detection accuracy. There were interaction effects between weather Scenario-1 and Scenario-2 for climbing-task data generated by both expert and student pilots at high traffic density. The Scenario-3 weather condition produced poor detection accuracy as well as increased detection times, which may be attributed to low visibility. Intruder-aircraft detection times were not affected by the weather conditions during climbing and descending tasks. The decision of pilots to fly into certain weather conditions depended in part on the warning distance to the location of the weather: when pilots were warned of the weather conditions, they were more likely to fly their aircraft into them, but mostly when the warning was given far from the weather location.

  2. INTEGRATED OPERATIONAL DOSIMETRY SYSTEM AT CERN.

    PubMed

    Dumont, Gérald; Pedrosa, Fernando Baltasar Dos Santos; Carbonez, Pierre; Forkel-Wirth, Doris; Ninin, Pierre; Fuentes, Eloy Reguero; Roesler, Stefan; Vollaire, Joachim

    2017-04-01

    CERN, the European Organization for Nuclear Research, upgraded its operational dosimetry system in March 2013 to be prepared for the first Long Shutdown of CERN's facilities. The new system allows the immediate and automatic checking and recording of dosimetry data before and after interventions in radiation areas. To facilitate the analysis of the data in the context of CERN's approach to As Low As Reasonably Achievable (ALARA), the new system is interfaced with the Intervention Management Planning and Coordination Tool (IMPACT). IMPACT is a web-based application widely used across all of CERN's accelerators and their associated technical infrastructures for the planning, coordination and approval of interventions (work-permit principle). The coupling of the operational dosimetry database with the IMPACT repository allows a direct and almost immediate comparison of the actual dose with the estimations, in addition to enabling the configuration of alarm levels in the dosemeter as a function of the intervention to be performed.

  3. Introduction to CERN

    ScienceCinema

    Heuer, R.-D.

    2018-02-19

    Summer Student Lecture Programme introduction. The mission of CERN: push back the frontiers of knowledge, e.g. the secrets of the Big Bang... what was matter like within the first moments of the Universe's existence? You have to develop new technologies for accelerators and detectors (also information technology, with the Web and the Grid, and medicine, with diagnosis and therapy). There are three key technology areas at CERN: acceleration, particle detection, and large-scale computing.

  4. HIGH ENERGY PHYSICS: Bulgarians Sue CERN for Leniency.

    PubMed

    Koenig, R

    2000-10-13

    In cash-strapped Bulgaria, scientists are wondering whether a ticket for a front-row seat in high-energy physics is worth the price: Membership dues in CERN, the European particle physics lab, nearly equal the country's entire budget for competitive research grants. Faced with that grim statistic and a plea for leniency from Bulgaria's government, CERN's governing council is considering slashing the country's membership dues for the next 2 years.

  5. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    McAllister, Liam

    2018-05-14

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  6. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-22

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  7. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-06-28

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  8. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-23

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  9. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2017-12-09

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  10. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    McAllister, Liam

    2018-05-24

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  11. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    Sen, Ashoke

    2018-04-27

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  12. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-05-23

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  13. Clouds and Open Ocean near the Bahamas

    NASA Image and Video Library

    1982-07-04

    STS004-41-1206 (27 June-4 July 1982) --- Sunglint reflects off the water of the North Atlantic Ocean in an area to the east of the Bahamas sometimes called the Sargasso Sea. The area has also been referred to as the "Bermuda Triangle." Astronauts Thomas K. Mattingly II, STS-4 commander, and Henry W. Hartsfield Jr., pilot, spent seven days and one hour aboard the Earth-orbiting space shuttle Columbia and performed a variety of duties in addition to recording 70mm and 35mm imagery. Photo credit: NASA

  14. The Swarm, the Cloud, and the Importance of Getting There First: What’s at Stake in the Remote Aviation Culture Debate

    DTIC Science & Technology

    2013-08-01

    following former Air Force chiefs of staff: Gen Michael Ryan, Gen John Jumper, and Gen Norton Schwartz. See Boyne, "How the Predator Grew Teeth." 14. P... John L. McLucas has written: I believe we are entering an era when RPVs [remotely piloted vehicles] will play an increasingly important role in...technology as an amplifier of integrated human agency; and Col John Boyd observes how our definitions of cultural membership shift over time. By way of

  15. Service management at CERN with Service-Now

    NASA Astrophysics Data System (ADS)

    Toteva, Z.; Alvarez Alonso, R.; Alvarez Granda, E.; Cheimariou, M.-E.; Fedorko, I.; Hefferman, J.; Lemaitre, S.; Clavo, D. Martin; Martinez Pedreira, P.; Pera Mira, O.

    2012-12-01

    The Information Technology (IT) and General Services (GS) departments at CERN have decided to combine their extensive experience in support for IT and non-IT services towards a common goal: to bring the services closer to the end user, based on Information Technology Infrastructure Library (ITIL) best practice. The collaborative efforts have so far produced definitions for the incident and request fulfilment processes, which are based on a unique two-dimensional service catalogue that combines both the user and the support-team views of all services. After an extensive evaluation of the available industrial solutions, Service-Now was selected as the tool to implement the CERN service-management processes. The initial release of the tool provided an attractive web portal for users and successfully implemented two basic ITIL processes: incident management and request fulfilment. It also integrated with the CERN personnel databases and the LHC GRID ticketing system. Subsequent releases continued to integrate with other third-party tools, such as the facility management systems of CERN, as well as to implement new processes such as change management. Independently of those new development activities, it was decided to simplify the request fulfilment process in order to achieve easier acceptance by the CERN user community. We believe that, due to the high modularity of the Service-Now tool, the parallel design of ITIL processes (e.g. event management) and non-ITIL processes (e.g. computer centre hardware management) will be easily achieved. This presentation will describe the experience we have acquired and the techniques followed to achieve the CERN customization of the Service-Now tool.
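    The "two-dimensional" catalogue described above can be pictured as a structure indexable along both axes. A minimal sketch, assuming nothing about CERN's actual data model; the class, method and example names are illustrative only:

    ```python
    from collections import defaultdict

    class ServiceCatalogue:
        """Two-dimensional service catalogue: every entry is reachable from
        the user view (which teams back a service?) and from the support
        view (which services does a team run?)."""

        def __init__(self):
            self._by_service = defaultdict(set)  # user view: service -> teams
            self._by_team = defaultdict(set)     # support view: team -> services

        def register(self, service: str, team: str) -> None:
            """Add one (service, team) cell of the catalogue."""
            self._by_service[service].add(team)
            self._by_team[team].add(service)

        def teams_for(self, service: str) -> set:
            return set(self._by_service[service])

        def services_of(self, team: str) -> set:
            return set(self._by_team[team])
    ```

    Keeping both indexes in sync in a single `register` call is what lets an incident ticket be routed either from the service the user names or from the team that owns it.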

  16. Vidyo@CERN: A Service Update

    NASA Astrophysics Data System (ADS)

    Fernandes, J.; Baron, T.

    2015-12-01

    We will present an overview of the current real-time video service offering for the LHC; in particular, the operation of the CERN Vidyo service will be described in terms of consolidated performance and scale. The service is an increasingly critical part of the daily activity of the LHC collaborations, recently topping 50 million minutes of communication in one year, with peaks of up to 852 simultaneous connections. We will elaborate on the improvement of key front-end features, such as the integration with CERN Indico and the enhancements of the Unified Client, and also on new ones, released or in the pipeline, such as a new WebRTC client and CERN SSO/Federated SSO integration. An overview of future infrastructure improvements, such as virtualization of Vidyo routers and geo-location mechanisms for load balancing and optimum user distribution across the service infrastructure, will also be discussed. The work done by CERN to improve the monitoring of its Vidyo network will also be presented and demoed. As a last point, we will touch on the roadmap and strategy established by CERN and Vidyo, with the clear objective of optimizing the service, both the end client and the backend infrastructure, to make it truly universal and serve global science. Achieving this requires introducing a multi-tenant concept to serve different communities; this is one of the consequences of CERN's decision to offer the Vidyo service, currently operated for the LHC, to other sciences, institutions and virtual organizations beyond HEP that might express interest in it.

  17. Weather Avoidance Guidelines for NASA Global Hawk High-Altitude UAS

    NASA Technical Reports Server (NTRS)

    Cecil, Daniel J.; Zipser, Edward J.; Velden, Chris; Monette, Sarah; Heymsfield, Gerry; Braun, Scott; Newman, Paul; Black, Pete; Black, Michael; Dunion, Jason

    2014-01-01

    NASA operates two Global Hawk unmanned aircraft systems for Earth science research projects. In particular, they are used in the Hurricane and Severe Storm Sentinel (HS3) project during 2012, 2013, and 2014 to take measurements from the environment around tropical cyclones, and from directly above tropical cyclones. There is concern that strict adherence to the weather avoidance rules used in 2012 may sacrifice the ability to observe important science targets. We have proposed modifications to these weather avoidance rules that we believe will improve the ability to observe science targets without compromising aircraft safety. The previous guidelines, used in 2012, specified: Do not approach thunderstorms within 25 nm during flight at FL500 or below. When flying above FL500, do not approach reported lightning within 25 nm in areas where cloud tops are reported at FL500 or higher; aircraft should maintain at least 10,000 ft vertical separation from reported lightning if cloud tops are below FL500. No overflight of cumulus tops higher than FL500. No flight into forecast or reported icing conditions. No flight into forecast or reported moderate or severe turbulence. Based on past experience with high-altitude flights over tropical cyclones, we have recommended changing this guidance to: Do not approach thunderstorms within 25 nm during flight at FL500 or below. Aircraft should maintain at least 5,000 ft vertical separation from significant convective cloud tops except: (a) when cloud tops are above FL500, in the event of reported significant lightning activity or indicators of significant overshooting tops, do not approach within 10-25 nm, depending on pilot discretion and advice from the Mission Scientist; (b) when cloud tops are below FL500, maintain 10,000 ft separation from reported significant lightning or indicators of significant overshooting tops. No flight into forecast or reported icing conditions. No flight into forecast or reported moderate or severe turbulence. The key changes have to do with overflight of high convective cloud tops and those producing lightning. Experience shows that most tropical oceanic convection (including that in tropical cyclones) is relatively gentle even if the cloud tops are quite high, and can be safely overflown. Exceptions are convective elements producing elevated lightning flash rates (more than just the occasional flash, which would trigger avoidance under the previous rules) and significant overshooting cloud tops.

  18. Heterogeneous ice nucleation and phase transition of viscous α-pinene secondary organic aerosol

    NASA Astrophysics Data System (ADS)

    Ignatius, Karoliina; Kristensen, Thomas B.; Järvinen, Emma; Nichman, Leonid; Fuchs, Claudia; Gordon, Hamish; Herenz, Paul; Hoyle, Christopher R.; Duplissy, Jonathan; Baltensperger, Urs; Curtius, Joachim; Donahue, Neil M.; Gallagher, Martin W.; Kirkby, Jasper; Kulmala, Markku; Möhler, Ottmar; Saathoff, Harald; Schnaiter, Martin; Virtanen, Annele; Stratmann, Frank

    2016-04-01

    There are strong indications that particles containing secondary organic aerosol (SOA) exhibit amorphous solid or semi-solid phase states in the atmosphere. This may facilitate deposition ice nucleation and thus influence cirrus cloud properties. Global model simulations of monoterpene SOA particles suggest that viscous biogenic SOA are indeed present in regions where cirrus cloud formation takes place. Hence, they could make an important contribution to the global ice-nucleating particle (INP) budget. However, experimental ice nucleation studies of biogenic SOA are scarce. Here, we investigated the ice nucleation ability of viscous SOA particles at the CLOUD (Cosmics Leaving OUtdoor Droplets) experiment at CERN (Ignatius et al., 2015; Järvinen et al., 2015). In the CLOUD chamber, the SOA particles were produced from the ozone-initiated oxidation of α-pinene at temperatures in the range from -38 to -10 °C at 5-15 % relative humidity with respect to water (RHw) to ensure their formation in a highly viscous phase state, i.e. semi-solid or glassy. We found that particles formed and grown in the chamber developed an asymmetric shape through coagulation. As the RHw was increased to between 35 % at -10 °C and 80 % at -38 °C, a transition to spherical shape was observed with a new in-situ optical method. This transition confirms previous modelling of the viscosity transition conditions. The ice nucleation ability of the SOA particles was investigated with a new continuous-flow diffusion chamber, SPIN (Spectrometer for Ice Nuclei), for different SOA particle sizes. For the first time, we observed heterogeneous ice nucleation of viscous α-pinene SOA in the deposition mode for ice saturation ratios between 1.3 and 1.4, significantly below the homogeneous freezing limit. The maximum frozen fractions, found at temperatures between -36.5 and -38.3 °C, ranged from 6 to 20 % and did not depend on the particle surface area. References: Ignatius, K. et al., Heterogeneous ice nucleation of secondary organic aerosol produced from ozonolysis of α-pinene, Atmos. Chem. Phys. Discuss., 15, 35719-35752, doi:10.5194/acpd-15-35719-2015, 2015. Järvinen, E. et al., Observation of viscosity transition in α-pinene secondary organic aerosol, Atmos. Chem. Phys. Discuss., 15, 28575-28617, doi:10.5194/acpd-15-28575-2015, 2015.

  19. GenomeVIP: a cloud platform for genomic variant discovery and interpretation

    PubMed Central

    Mashl, R. Jay; Scott, Adam D.; Huang, Kuan-lin; Wyczalkowski, Matthew A.; Yoon, Christopher J.; Niu, Beifang; DeNardo, Erin; Yellapantula, Venkata D.; Handsaker, Robert E.; Chen, Ken; Koboldt, Daniel C.; Ye, Kai; Fenyö, David; Raphael, Benjamin J.; Wendl, Michael C.; Ding, Li

    2017-01-01

    Identifying genomic variants is a fundamental first step toward understanding the role of inherited and acquired variation in disease. The accelerating growth in the corpus of sequencing data that underpins such analysis is making the data-download bottleneck more evident, placing substantial burdens on the research community to keep pace. As a result, the search for alternative approaches to the traditional "download and analyze" paradigm on local computing resources has led to a rapidly growing demand for cloud-computing solutions for genomics analysis. Here, we introduce the Genome Variant Investigation Platform (GenomeVIP), an open-source framework for performing genomic variant discovery and annotation using cloud-based or local high-performance computing infrastructure. GenomeVIP orchestrates the analysis of whole-genome and exome sequence data using a set of robust and popular task-specific tools, including VarScan, GATK, Pindel, BreakDancer, Strelka, and Genome STRiP, through a web interface. GenomeVIP has been used for genomic analysis in large-data projects such as the TCGA PanCanAtlas and in other projects, such as the ICGC Pilots, CPTAC, ICGC-TCGA DREAM Challenges, and the 1000 Genomes SV Project. Here, we demonstrate GenomeVIP's ability to provide high-confidence annotated somatic, germline, and de novo variants of potential biological significance using publicly available data sets. PMID:28522612
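    The orchestration idea described above, a front end dispatching a set of task-specific callers over cloud or local HPC resources, can be modelled as a registry mapping variant classes to tools. This is an illustrative sketch only, not GenomeVIP's code: the tool grouping loosely follows the abstract's tool list, and the job layout and names are our assumption.

    ```python
    # Hypothetical registry: which caller handles which class of variant.
    CALLERS = {
        "snv_indel": ["VarScan", "GATK", "Strelka", "Pindel"],
        "structural": ["BreakDancer", "Genome STRiP"],
    }

    def build_jobs(sample: str, variant_class: str) -> list:
        """Expand one sample into per-tool job descriptions that a
        scheduler could submit to cloud or local compute."""
        return [{"sample": sample, "tool": tool, "class": variant_class}
                for tool in CALLERS[variant_class]]
    ```

    A real platform would add input locations, resource requests and a consensus/merge step over the per-tool outputs; the point here is only the fan-out from one sample to several independent callers.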

  20. Crop classification and mapping based on Sentinel missions data in cloud environment

    NASA Astrophysics Data System (ADS)

    Lavreniuk, M. S.; Kussul, N.; Shelestov, A.; Vasiliev, V.

    2017-12-01

    Availability of high-resolution satellite imagery (Sentinel-1/2/3, Landsat) over large territories opens new opportunities in agricultural monitoring. In particular, it becomes feasible to solve crop classification and crop mapping tasks at country and regional scale using time series of heterogeneous satellite imagery. But in this case we face the problem of Big Data: dealing with time series of high-resolution (10 m) multispectral imagery, we need to download huge volumes of data and then process them. The solution is to move the "processing chain" closer to the data itself, drastically shortening the time needed for data transfer. One more advantage of such an approach is the possibility to parallelize the data processing workflow and efficiently implement machine learning algorithms. This can be done with a cloud platform where the Sentinel imagery is stored. In this study, we investigate the usability and efficiency of two different cloud platforms, Amazon and Google, for crop classification and crop mapping problems. Two pilot areas were investigated: Ukraine and England. Google provides the user-friendly Google Earth Engine environment for Earth observation applications, with many data processing and machine learning tools already deployed. At the same time, with Amazon one gets much more flexibility in implementing one's own workflow. A detailed analysis of the pros and cons will be given in the presentation.
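    Once the per-pixel time series have been assembled in the cloud, the classification step itself can be sketched very simply. The nearest-profile classifier below is a self-contained illustration under our own assumptions, not the authors' method (real pipelines typically use supervised classifiers such as random forests or neural networks), and the NDVI profiles are invented example values.

    ```python
    import math

    def classify_pixel(series: list, class_profiles: dict) -> str:
        """Assign a pixel's NDVI time series to the crop class whose
        reference seasonal profile is closest in Euclidean distance.

        series: NDVI values sampled over the growing season.
        class_profiles: crop name -> reference NDVI series (same length).
        """
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(class_profiles,
                   key=lambda crop: dist(series, class_profiles[crop]))
    ```

    Running this per pixel over a cloud-hosted tile is embarrassingly parallel, which is exactly the property the abstract's "move the processing chain to the data" argument relies on.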

  1. Analysis of rainfall over northern Peru during El Nino: A PCDS application

    NASA Technical Reports Server (NTRS)

    Goldberg, R.; Tisnado, G.

    1986-01-01

    In an examination of GOES satellite data during the 1982-1983 El Nino period, the appearance of lee-wave cloud patterns was revealed. A correlation was hypothesized relating an anomalous easterly flow across the Andes with the appearance of these wave patterns and with the subsequent onset of intense rainfall. The cloud patterns are believed to be associated with the El Nino period and could be viewed as precursors to significant changes in weather patterns. The ultimate goal of the researchers is the ability to predict occurrences of rainstorms associated with the appearance of lee waves and related cloud patterns, as harbingers of destruction caused by flooding, huaycos, and other catastrophic consequences of heavy and abnormal rainfall. Rainfall data from about 70 stations in northern Peru from 1980 through 1984 were formatted to be utilized within the Pilot Climate Data System (PCDS); this time period includes the 1982-1983 El Nino. As an example of the approach, a well-pronounced lee-wave pattern was shown from a GOES satellite image of April 4, 1983. The ground-truth data were then displayed via the PCDS to graphically demonstrate the increase in intensity and areal distribution of rainfall in the northern Peruvian area over the next 4 to 5 days.

  2. Public Lecture

    ScienceCinema

    None

    2017-12-09

    An outreach activity is being organized by the Turkish community at CERN on 5 June 2010 in the CERN Main Auditorium. The activity consists of several talks taking 1.5 hours in total. The main goal of the activity is to describe CERN-based activities and experiments, as well as to stimulate the public's attention to science-related topics. We believe that wide communication of the event has certain advantages, especially for the ongoing membership process of Turkey.

  3. Prospects for observation at CERN in NA62

    NASA Astrophysics Data System (ADS)

    Hahn, F.; NA62 Collaboration; Aglieri Rinella, G.; Aliberti, R.; Ambrosino, F.; Angelucci, B.; Antonelli, A.; Anzivino, G.; Arcidiacono, R.; Azhinenko, I.; Balev, S.; Bendotti, J.; Biagioni, A.; Biino, C.; Bizzeti, A.; Blazek, T.; Blik, A.; Bloch-Devaux, B.; Bolotov, V.; Bonaiuto, V.; Bragadireanu, M.; Britton, D.; Britvich, G.; Brook, N.; Bucci, F.; Butin, F.; Capitolo, E.; Capoccia, C.; Capussela, T.; Carassiti, V.; Cartiglia, N.; Cassese, A.; Catinaccio, A.; Cecchetti, A.; Ceccucci, A.; Cenci, P.; Cerny, V.; Cerri, C.; Chikilev, O.; Ciaranfi, R.; Collazuol, G.; Cooke, P.; Cooper, P.; Corradi, G.; Cortina Gil, E.; Costantini, F.; Cotta Ramusino, A.; Coward, D.; D'Agostini, G.; Dainton, J.; Dalpiaz, P.; Danielsson, H.; Degrange, J.; De Simone, N.; Di Filippo, D.; Di Lella, L.; Dixon, N.; Doble, N.; Duk, V.; Elsha, V.; Engelfried, J.; Enik, T.; Falaleev, V.; Fantechi, R.; Federici, L.; Fiorini, M.; Fry, J.; Fucci, A.; Fulton, L.; Gallorini, S.; Gatignon, L.; Gianoli, A.; Giudici, S.; Glonti, L.; Goncalves Martins, A.; Gonnella, F.; Goudzovski, E.; Guida, R.; Gushchin, E.; Hahn, F.; Hallgren, B.; Heath, H.; Herman, F.; Hutchcroft, D.; Iacopini, E.; Jamet, O.; Jarron, P.; Kampf, K.; Kaplon, J.; Karjavin, V.; Kekelidze, V.; Kholodenko, S.; Khoriauli, G.; Khudyakov, A.; Kiryushin, Yu; Kleinknecht, K.; Kluge, A.; Koval, M.; Kozhuharov, V.; Krivda, M.; Kudenko, Y.; Kunze, J.; Lamanna, G.; Lazzeroni, C.; Leitner, R.; Lenci, R.; Lenti, M.; Leonardi, E.; Lichard, P.; Lietava, R.; Litov, L.; Lomidze, D.; Lonardo, A.; Lurkin, N.; Madigozhin, D.; Maire, G.; Makarov, A.; Mannelli, I.; Mannocchi, G.; Mapelli, A.; Marchetto, F.; Massarotti, P.; Massri, K.; Matak, P.; Mazza, G.; Menichetti, E.; Mirra, M.; Misheva, M.; Molokanova, N.; Morant, J.; Morel, M.; Moulson, M.; Movchan, S.; Munday, D.; Napolitano, M.; Newson, F.; Norton, A.; Noy, M.; Nuessle, G.; Obraztsov, V.; Padolski, S.; Page, R.; Palladino, V.; Pardons, A.; Pedreschi, E.; Pepe, M.; Perez Gomez, F.; Perrin-Terrin, 
M.; Petrov, P.; Petrucci, F.; Piandani, R.; Piccini, M.; Pietreanu, D.; Pinzino, J.; Pivanti, M.; Polenkevich, I.; Popov, I.; Potrebenikov, Yu; Protopopescu, D.; Raffaelli, F.; Raggi, M.; Riedler, P.; Romano, A.; Rubin, P.; Ruggiero, G.; Russo, V.; Ryjov, V.; Salamon, A.; Salina, G.; Samsonov, V.; Santovetti, E.; Saracino, G.; Sargeni, F.; Schifano, S.; Semenov, V.; Sergi, A.; Serra, M.; Shkarovskiy, S.; Sotnikov, A.; Sougonyaev, V.; Sozzi, M.; Spadaro, T.; Spinella, F.; Staley, R.; Statera, M.; Sutcliffe, P.; Szilasi, N.; Tagnani, D.; Valdata-Nappi, M.; Valente, P.; Vasile, M.; Vassilieva, V.; Velghe, B.; Veltri, M.; Venditti, S.; Vormstein, M.; Wahl, H.; Wanke, R.; Wertelaers, P.; Winhart, A.; Winston, R.; Wrona, B.; Yushchenko, O.; Zamkovsky, M.; Zinchenko, A.

    2015-07-01

    Rare kaon decays are excellent processes with which to probe the Standard Model and to search indirectly for new physics, complementary to the direct LHC searches. The NA62 experiment at the CERN SPS aims to collect and analyse O(10^13) kaon decays before CERN Long Shutdown 2 (in 2018). This will allow the branching ratio to be measured with 10% accuracy. The experimental apparatus was commissioned during a first run in autumn 2014.

  4. The trigger system for K0→2π0 decays of the NA48 experiment at CERN

    NASA Astrophysics Data System (ADS)

    Mikulec, I.

    1998-02-01

    A fully pipelined 40 MHz "dead-time-free" trigger system for neutral K0 decays for the NA48 experiment at CERN is described. The NA48 experiment studies CP violation using the high-intensity beam of the CERN SPS accelerator. The trigger system sums, digitises, filters and processes signals from 13 340 channels of the liquid-krypton electromagnetic calorimeter. In 1996 the calorimeter and part of the trigger electronics were installed and tested. In 1997 the system was completed and prepared for use in the first NA48 physics data-taking period. Cagliari, Cambridge, CERN, Dubna, Edinburgh, Ferrara, Firenze, Mainz, Orsay, Perugia, Pisa, Saclay, Siegen, Torino, Warszawa, Wien Collaboration.

  5. A DMA-train for precision measurement of sub-10 nm aerosol dynamics

    NASA Astrophysics Data System (ADS)

    Stolzenburg, Dominik; Steiner, Gerhard; Winkler, Paul M.

    2017-05-01

    Measurements of aerosol dynamics in the sub-10 nm size range are crucially important for quantifying the impact of new particle formation on the global budget of cloud condensation nuclei. Here we present the development and characterization of a differential mobility analyzer train (DMA-train), operating six DMAs in parallel for high-time-resolution particle-size-distribution measurements below 10 nm. The DMAs are operated at six different but fixed voltages and hence sizes, together with six state-of-the-art condensation particle counters (CPCs). Two Airmodus A10 particle size magnifiers (PSM) are used for channels below 2.5 nm while sizes above 2.5 nm are detected by TSI 3776 butanol-based or TSI 3788 water-based CPCs. We report the transfer functions and characteristics of six identical Grimm S-DMAs as well as the calibration of a butanol-based TSI model 3776 CPC, a water-based TSI model 3788 CPC and an Airmodus A10 PSM. We find cutoff diameters similar to those reported in the literature. The performance of the DMA-train is tested with a rapidly changing aerosol of a tungsten oxide particle generator during warm-up. Additionally we report a measurement of new particle formation taken during a nucleation event in the CLOUD chamber experiment at CERN. We find that the DMA-train is able to bridge the gap between currently well-established measurement techniques in the cluster-particle transition regime, providing high time resolution and accurate size information of neutral and charged particles even at atmospheric particle concentrations.
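
    The fixed-voltage operation described above maps each DMA channel to one mobility diameter. As a rough illustration (not the Grimm S-DMA's actual geometry or the authors' inversion; all dimensions and flow values below are hypothetical), the standard cylindrical-DMA transfer relation and a Stokes-Millikan inversion with slip correction can be sketched as:

    ```python
    import math

    def mobility_from_voltage(volts, q_sheath, length, r_inner, r_outer):
        # Centroid electrical mobility (m^2 V^-1 s^-1) selected by a
        # cylindrical DMA at fixed voltage, balanced sheath flow q_sheath (m^3/s).
        return q_sheath * math.log(r_outer / r_inner) / (2.0 * math.pi * length * volts)

    def cunningham(d, mean_free_path=67.3e-9):
        # Cunningham slip correction, Kim et al. (2005) parametrization.
        kn = 2.0 * mean_free_path / d
        return 1.0 + kn * (1.165 + 0.483 * math.exp(-0.997 / kn))

    def diameter_from_mobility(z, n_charges=1, viscosity=1.81e-5):
        # Invert Z = n e C_c(d) / (3 pi mu d) for d by bisection;
        # f(d) is monotonically increasing on the bracket.
        e = 1.602176634e-19
        f = lambda d: 3.0 * math.pi * viscosity * z * d - n_charges * e * cunningham(d)
        lo, hi = 0.5e-9, 1.0e-6
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            if f(lo) * f(mid) <= 0.0:
                hi = mid
            else:
                lo = mid
        return 0.5 * (lo + hi)
    ```

    With illustrative values (100 V, 0.25 L/s sheath flow, 10 cm column, 1/2 cm radii) this selects a diameter of a few nanometres, i.e. squarely in the sub-10 nm regime the DMA-train targets.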

  6. Atmospheric radiation measurement unmanned aerospace vehicle (ARM-UAV) program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolton, W.R.

    1996-11-01

    ARM-UAV is part of the multi-agency U.S. Global Change Research Program and is addressing the largest source of uncertainty in predicting climatic response: the interaction of clouds and the sun's energy in the Earth's atmosphere. An important aspect of the program is the use of unmanned aerospace vehicles (UAVs) as the primary airborne platform. The ARM-UAV Program has completed two major flight series: the first, conducted in April 1994 using an existing UAV (the General Atomics Gnat 750), consisted of eight highly successful flights at the DOE climate site in Oklahoma. The second, conducted in September/October 1995 using two piloted aircraft (Egrett and Twin Otter), featured simultaneous measurements above and below clouds and in clear sky. Additional flight series are planned to continue study of the cloudy and clear sky energy budget in the Spring and Fall of 1996 over the DOE climate site in Oklahoma. 3 refs., 4 figs., 1 tab.

  7. Health Informatics for Neonatal Intensive Care Units: An Analytical Modeling Perspective

    PubMed Central

    Mench-Bressan, Nadja; McGregor, Carolyn; Pugh, James Edward

    2015-01-01

    The effective use of data within intensive care units (ICUs) has great potential to create new cloud-based health analytics solutions for disease prevention or earlier detection of condition onset. The Artemis project aims to achieve these goals in the area of neonatal ICUs (NICU). In this paper, we propose an analytical model for the Artemis cloud project, which will be deployed at McMaster Children’s Hospital in Hamilton. We collect not only physiological data but also data from the infusion pumps attached to NICU beds. Using the proposed analytical model, we predict the amount of storage, memory, and computation power required for the system. Capacity planning and tradeoff analysis become more accurate and systematic when the proposed analytical model is applied. Numerical results are obtained using real inputs acquired from McMaster Children’s Hospital and a pilot deployment of the system at The Hospital for Sick Children (SickKids) in Toronto. PMID:27170907
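
    Capacity estimates of the kind such an analytical model produces start from simple stream arithmetic. A back-of-envelope sketch (the function name and all numbers are illustrative, not the paper's actual model):

    ```python
    def projected_storage_gb(beds, streams_per_bed, sample_hz, bytes_per_sample, days):
        # Raw storage demand of continuously acquired physiological streams:
        # samples accumulate at beds x streams x rate, 86400 seconds per day.
        samples = beds * streams_per_bed * sample_hz * 86400 * days
        return samples * bytes_per_sample / 1e9

    # e.g. a hypothetical 30-bed NICU, 4 waveforms per bed at 250 Hz,
    # 2 bytes per sample, over one year:
    yearly = projected_storage_gb(30, 4, 250, 2, 365)
    ```

    The estimate scales linearly in every factor, which is why adding infusion-pump streams on top of the physiological data directly raises the predicted storage and compute requirements.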

  8. CERN and 60 years of science for peace

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heuer, Rolf-Dieter, E-mail: Rolf.Heuer@cern.ch

    2015-02-24

    This paper presents CERN as it celebrates the 60th anniversary of its founding. The presentation first discusses the mission of CERN and its role as an inter-governmental Organization. The paper also reviews aspects of the particle physics research programme, looking at both current and future accelerator-based facilities at the high-energy and intensity frontiers. Finally, the paper considers issues beyond fundamental research, such as capacity-building and the interface between Art and Science.

  9. Meeting Jentschke

    ScienceCinema

    None

    2018-05-18

    After an introduction about the latest research and news at CERN, the DG W. Jentschke speaks about the future management of CERN with two new general managers, who will be in charge for the next 5 years: Dr. J.B. Adams, who will focus on the administration of CERN as well as the construction of buildings and equipment, and Dr. L. Van Hove, who will be responsible for research activities. The DG speaks about expected changes, shared services, different divisions and their leaders, etc.

  10. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    Sen, Ashoke

    2017-12-18

    Part 7. The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde Local organizers: A. Uranga, J. Walcher

  11. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    ScienceCinema

    None

    2018-02-09

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde Local organizers: A. Uranga, J. Walcher

  12. CERN Winter School on Supergravity, Strings, and Gauge Theory 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-01-22

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde Local organizers: A. Uranga, J. Walcher

  13. NA61/SHINE facility at the CERN SPS: beams and detector system

    NASA Astrophysics Data System (ADS)

    Abgrall, N.; Andreeva, O.; Aduszkiewicz, A.; Ali, Y.; Anticic, T.; Antoniou, N.; Baatar, B.; Bay, F.; Blondel, A.; Blumer, J.; Bogomilov, M.; Bogusz, M.; Bravar, A.; Brzychczyk, J.; Bunyatov, S. A.; Christakoglou, P.; Cirkovic, M.; Czopowicz, T.; Davis, N.; Debieux, S.; Dembinski, H.; Diakonos, F.; Di Luise, S.; Dominik, W.; Drozhzhova, T.; Dumarchez, J.; Dynowski, K.; Engel, R.; Efthymiopoulos, I.; Ereditato, A.; Fabich, A.; Feofilov, G. A.; Fodor, Z.; Fulop, A.; Gaździcki, M.; Golubeva, M.; Grebieszkow, K.; Grzeszczuk, A.; Guber, F.; Haesler, A.; Hasegawa, T.; Hierholzer, M.; Idczak, R.; Igolkin, S.; Ivashkin, A.; Jokovic, D.; Kadija, K.; Kapoyannis, A.; Kaptur, E.; Kielczewska, D.; Kirejczyk, M.; Kisiel, J.; Kiss, T.; Kleinfelder, S.; Kobayashi, T.; Kolesnikov, V. I.; Kolev, D.; Kondratiev, V. P.; Korzenev, A.; Koversarski, P.; Kowalski, S.; Krasnoperov, A.; Kurepin, A.; Larsen, D.; Laszlo, A.; Lyubushkin, V. V.; Maćkowiak-Pawłowska, M.; Majka, Z.; Maksiak, B.; Malakhov, A. I.; Maletic, D.; Manglunki, D.; Manic, D.; Marchionni, A.; Marcinek, A.; Marin, V.; Marton, K.; Mathes, H.-J.; Matulewicz, T.; Matveev, V.; Melkumov, G. L.; Messina, M.; Mrówczyński, St.; Murphy, S.; Nakadaira, T.; Nirkko, M.; Nishikawa, K.; Palczewski, T.; Palla, G.; Panagiotou, A. D.; Paul, T.; Peryt, W.; Petukhov, O.; Pistillo, C.; Płaneta, R.; Pluta, J.; Popov, B. A.; Posiadala, M.; Puławski, S.; Puzovic, J.; Rauch, W.; Ravonel, M.; Redij, A.; Renfordt, R.; Richter-Was, E.; Robert, A.; Röhrich, D.; Rondio, E.; Rossi, B.; Roth, M.; Rubbia, A.; Rustamov, A.; Rybczyński, M.; Sadovsky, A.; Sakashita, K.; Savic, M.; Schmidt, K.; Sekiguchi, T.; Seyboth, P.; Sgalaberna, D.; Shibata, M.; Sipos, R.; Skrzypczak, E.; Słodkowski, M.; Sosin, Z.; Staszel, P.; Stefanek, G.; Stepaniak, J.; Stroebele, H.; Susa, T.; Szuba, M.; Tada, M.; Tereshchenko, V.; Tolyhi, T.; Tsenov, R.; Turko, L.; Ulrich, R.; Unger, M.; Vassiliou, M.; Veberic, D.; Vechernin, V. 
V.; Vesztergombi, G.; Vinogradov, L.; Wilczek, A.; Włodarczyk, Z.; Wojtaszek-Szwarz, A.; Wyszyński, O.; Zambelli, L.; Zipper, W.

    2014-06-01

    NA61/SHINE (SPS Heavy Ion and Neutrino Experiment) is a multi-purpose experimental facility to study hadron production in hadron-proton, hadron-nucleus and nucleus-nucleus collisions at the CERN Super Proton Synchrotron. It recorded the first physics data with hadron beams in 2009 and with ion beams (secondary 7Be beams) in 2011. NA61/SHINE has greatly profited from the long development of the CERN proton and ion sources and the accelerator chain as well as the H2 beamline of the CERN North Area. The latter has recently been modified to also serve as a fragment separator as needed to produce the Be beams for NA61/SHINE. Numerous components of the NA61/SHINE set-up were inherited from its predecessors, in particular, the last one, the NA49 experiment. Important new detectors and upgrades of the legacy equipment were introduced by the NA61/SHINE Collaboration. This paper describes the state of the NA61/SHINE facility — the beams and the detector system — before the CERN Long Shutdown I, which started in March 2013.

  15. Atmospheric Retrieval Analysis of the Directly Imaged Exoplanet HR 8799b

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Min; Heng, Kevin; Irwin, Patrick G. J.

    2013-12-01

    Directly imaged exoplanets are unexplored laboratories for the application of the spectral and temperature retrieval method, where the chemistry and composition of their atmospheres are inferred from inverse modeling of the available data. As a pilot study, we focus on the extrasolar gas giant HR 8799b, for which more than 50 data points are available. We upgrade our non-linear optimal estimation retrieval method to include a phenomenological model of clouds that requires the cloud optical depth and monodisperse particle size to be specified. Previous studies have focused on forward models with assumed values of the exoplanetary properties; there is no consensus on the best-fit values of the radius, mass, surface gravity, and effective temperature of HR 8799b. We show that cloud-free models produce reasonable fits to the data if the atmosphere is of super-solar metallicity and non-solar elemental abundances. Intermediate cloudy models with moderate values of the cloud optical depth and micron-sized particles provide an equally reasonable fit to the data and require a lower mean molecular weight. We report our best-fit values for the radius, mass, surface gravity, and effective temperature of HR 8799b. The mean molecular weight is about 3.8, while the carbon-to-oxygen ratio is about unity due to the prevalence of carbon monoxide. Our study emphasizes that robust claims about the nature of an exoplanetary atmosphere should be based on analyses involving both photometry and spectroscopy, rather than the few photometric data points typically reported for hot Jupiters.

  16. Use of Cloud Computing to Calibrate a Highly Parameterized Model

    NASA Astrophysics Data System (ADS)

    Hayley, K. H.; Schumacher, J.; MacMillan, G.; Boutin, L.

    2012-12-01

    We present a case study using cloud computing to facilitate the calibration of a complex and highly parameterized model of regional groundwater flow. The calibration dataset consisted of many (~1500) measurements or estimates of static hydraulic head, a high-resolution time series of groundwater extraction and disposal rates at 42 locations and pressure monitoring at 147 locations with a total of more than one million raw measurements collected over a ten-year pumping history, and base flow estimates at 5 surface water monitoring locations. This modeling project was undertaken to assess the sustainability of groundwater withdrawal and disposal plans for in situ heavy oil extraction in Northeast Alberta, Canada. The geological interpretations used for model construction were based on more than 5,000 wireline logs collected throughout the 30,865 km² regional study area (RSA), and resulted in a model with 28 slices and 28 hydrostratigraphic units (average model thickness of 700 m, with aquifers ranging from a depth of 50 to 500 m below ground surface). The finite element FEFLOW model constructed on this geological interpretation had 331,408 nodes and required 265 time steps to simulate the ten-year transient calibration period. This numerical model of groundwater flow required 3 hours to run on a server with two 2.8 GHz processors and 16 GB of RAM. Calibration was completed using PEST. Horizontal and vertical hydraulic conductivity as well as specific storage for each unit were independent parameters. For the recharge and the horizontal hydraulic conductivity in the three aquifers with the most transient groundwater use, a pilot point parameterization was adopted. A 7 × 7 grid of pilot points was defined over the RSA, defining a spatially variable horizontal hydraulic conductivity or recharge field. A 7 × 7 grid of multiplier pilot points that perturbed the more regional field was then superimposed over the 3,600 km² local study area (LSA). 
    The pilot point multipliers were implemented so a higher resolution of spatial variability could be obtained where there was a higher density of observation data. Five geologic boundaries were modeled with a specified flux boundary condition, and the transfer rate was used as an adjustable parameter for each of these boundaries. This parameterization resulted in 448 parameters for calibration. In the project planning stage it was estimated that the calibration might require as much as 15,000 hours (1.7 years) of computing. In an effort to complete the calibration in a timely manner, the inversion was parallelized and implemented on as many as 250 computing nodes located on Amazon's EC2 servers. The results of the calibration provided a better fit to the data than previous efforts with homogeneous parameters, and the highly parameterized approach facilitated subspace Monte Carlo analysis for predictive uncertainty. This scale of cloud computing is relatively new for the hydrogeology community, and at the time it was believed to be the first implementation of a FEFLOW model at this scale. While the experience presented several challenges, the implementation was successful and provided valuable lessons for future efforts.
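
    The parallelization pattern behind such a cloud-based inversion can be sketched independently of PEST or FEFLOW: each Jacobian fill requires one perturbed forward run per adjustable parameter, and those runs are mutually independent, so they scale out to as many nodes as are available. A minimal sketch (run_forward_model is a hypothetical stand-in for the 3-hour FEFLOW run; nothing here is the project's actual code):

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def run_forward_model(params):
        # Stand-in for one expensive forward run: returns a scalar misfit (phi).
        return sum((p - 1.0) ** 2 for p in params)

    def parallel_jacobian_runs(base, delta=0.01, max_workers=8):
        # One perturbed run per adjustable parameter (448 in the project),
        # farmed out to a pool of workers, mirroring how a parallelized
        # inversion fills the Jacobian across cloud nodes.
        jobs = []
        for i in range(len(base)):
            pert = list(base)
            pert[i] += delta
            jobs.append(pert)
        with ThreadPoolExecutor(max_workers=max_workers) as ex:
            perturbed = list(ex.map(run_forward_model, jobs))
        phi0 = run_forward_model(base)
        # Forward-difference sensitivity of the misfit to each parameter.
        return [(phi - phi0) / delta for phi in perturbed]
    ```

    Because each worker only needs the perturbed parameter vector and returns a scalar, the scheme maps cleanly onto a fleet of rented compute nodes such as EC2 instances, with wall-clock time dropping roughly in proportion to the number of nodes.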

  17. CERN Collider, France-Switzerland

    NASA Image and Video Library

    2013-08-23

    This image, acquired by NASA's Terra spacecraft, shows the CERN Large Hadron Collider, the world's largest and highest-energy particle accelerator, lying beneath the French-Swiss border northwest of Geneva (yellow circle).

  18. CERN: A European laboratory for a global project

    NASA Astrophysics Data System (ADS)

    Voss, Rüdiger

    2015-06-01

    In the most important paradigm shift in its membership rules in 60 years, CERN in 2010 introduced a policy of “Geographical Enlargement”, which for the first time opened the door to membership of non-European States in the Organization. This short article briefly reviews the history of CERN's membership rules, discusses the rationale behind the new policy and its relationship with the emerging global roadmap of particle physics, and gives a short overview of the status of the enlargement process.

  19. PARTICLE PHYSICS: CERN Gives Higgs Hunters Extra Month to Collect Data.

    PubMed

    Morton, O

    2000-09-22

    After 11 years of banging electrons and positrons together at higher energies than any other machine in the world, CERN, the European laboratory for particle physics, had decided to shut down the Large Electron-Positron collider (LEP) and install a new machine, the Large Hadron Collider (LHC), in its 27-kilometer tunnel. In 2005, the LHC will start bashing protons together at even higher energies. But tantalizing hints of a long-sought fundamental particle have forced CERN managers to grant LEP a month's reprieve.

  20. NASA Experiment on Tropospheric-Stratospheric Water Vapor Transport in the Intertropical Convergence Zone

    NASA Technical Reports Server (NTRS)

    Page, William A.

    1982-01-01

    The following six papers report preliminary results obtained from a field experiment designed to study the role of tropical cumulo-nimbus clouds in the transfer of water vapor from the troposphere to the stratosphere over the region of Panama. The measurements were made utilizing special NOAA enhanced IR satellite images, radiosonde-ozonesondes and a NASA U-2 aircraft carrying nine experiments. The experiments were provided by a group of NASA, NOAA, industry, and university scientists. Measurements included atmospheric humidity, air and cloud top temperatures, atmospheric tracer constituents, cloud particle characteristics and cloud morphology. The aircraft made a total of eleven flights from August 30 through September 18, 1980, from Howard Air Force Base, Panama; the pilots obtained horizontal and vertical profiles in and near convectively active regions and flew around and over cumulo-nimbus towers and through the extended anvils in the stratosphere. Cumulo-nimbus clouds in the tropics appear to play an important role in upward water vapor transport and may represent the principal source influencing the stratospheric water vapor budget. The clouds provide strong vertical circulation in the troposphere, mixing surface air and its trace materials (water vapor, CFMs, sulfur compounds, etc.) quickly up to the tropopause. It is usually assumed that large scale mean motions or eddy scale motions transport the trace materials through the tropopause and into the stratosphere where they are further dispersed and react with other stratospheric constituents. The important step between the troposphere and stratosphere for water vapor appears to depend upon the processes occurring at or near the tropopause at the tops of the cumulo-nimbus towers. Several processes have been suggested: (1) The highest towers penetrate the tropopause and carry water in the form of small ice particles directly into the stratosphere. 
    (2) Water vapor from the tops of the cumulo-nimbus clouds is transported somehow through the tropopause, the vapor pressure being controlled by the temperature at the tops of the clouds; the dryness of the stratosphere could be explained if most of the transport occurs in connection with very high clouds in regions with a very high and cold tropopause. (3) Cumulo-nimbus anvils act as terrestrial-radiation shields, allowing the ice particle temperatures near cloud tops to cool radiatively below the supersaturation point; this cooling would cause vapor deposition on the ice particles, which then settle out and thus act as water scavengers. The experiment was designed to collect information on these detailed physical processes near and above the tropopause in order to assess their importance and the role they play in controlling stratospheric water vapor.
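
    The second mechanism above hinges on how steeply saturation vapor pressure over ice falls with cloud-top temperature. A standard Magnus-type fit (one common parametrization with approximate coefficients, not a formula from this experiment) makes the point numerically:

    ```python
    import math

    def esat_ice_pa(t_celsius):
        # Saturation vapor pressure over ice in Pa, Magnus-type fit
        # (Alduchov & Eskridge-style coefficients; approximate).
        return 611.2 * math.exp(22.46 * t_celsius / (272.62 + t_celsius))
    ```

    Between a cloud top at -50 °C and a tropical tropopause near -80 °C the saturation pressure drops by roughly a factor of 70, which is why transport through a very high, very cold tropopause leaves the stratosphere dry.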

  1. Réunion publique HR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-04-30

    Dear Colleagues, I should like to remind you that a public meeting organised by the HR Department will be held today: Friday 30 April 2010 at 9:30 am in the Main Auditorium (coffee from 9:00 am). During this meeting, general information will be given about: the CERN Admin e-guide, which is a new guide to the Organization's administrative procedures, drawn up to facilitate the retrieval of practical information and to offer a user-friendly format; the CERN Health Insurance System (presentation by Philippe Charpentier, President of the CHIS Board); and the Pension Fund (presentation by Theodore Economou, Administrator of the CERN Pension Fund). A simultaneous transmission of this meeting will be broadcast in the BE Auditorium at Prévessin and will also be available at the following address: http://webcast.cern.ch. I look forward to your participation! Best regards, Anne-Sylvie Catherin, Head, Human Resources Department

  2. Réunion publique HR

    ScienceCinema

    None

    2017-12-09

    Dear Colleagues, I should like to remind you that a public meeting organised by the HR Department will be held today: Friday 30 April 2010 at 9:30 am in the Main Auditorium (coffee from 9:00 am). During this meeting, general information will be given about: the CERN Admin e-guide, which is a new guide to the Organization's administrative procedures, drawn up to facilitate the retrieval of practical information and to offer a user-friendly format; the CERN Health Insurance System (presentation by Philippe Charpentier, President of the CHIS Board); and the Pension Fund (presentation by Theodore Economou, Administrator of the CERN Pension Fund). A simultaneous transmission of this meeting will be broadcast in the BE Auditorium at Prévessin and will also be available at the following address: http://webcast.cern.ch. I look forward to your participation! Best regards, Anne-Sylvie Catherin, Head, Human Resources Department

  3. New Stereo Vision Digital Camera System for Simultaneous Measurement of Cloud Base Height and Atmospheric Visibility

    NASA Astrophysics Data System (ADS)

    Janeiro, F. M.; Carretas, F.; Palma, N.; Ramos, P. M.; Wagner, F.

    2013-12-01

    Clouds play an important role in many aspects of everyday life. They affect both the local weather and the global climate, and are an important parameter in climate change studies. Cloud parameters are also important for weather prediction models, which make use of actual measurements. It is thus important to have low-cost instrumentation that can be deployed in the field to measure those parameters. Such instruments should also be automated and robust, since they may be deployed in remote places and be subject to adverse weather conditions. Besides their importance in environmental systems, clouds are also an essential component of airplane safety when visual flight rules (VFR) are enforced, such as at most small aerodromes where it is not economically viable to install instruments for assisted flying. Under VFR there are strict limits on the height of the cloud base, cloud cover and atmospheric visibility that ensure the safety of pilots and planes. Although instruments to measure those parameters are available on the market, their relatively high cost makes them unavailable at many local aerodromes. In this work we present a new prototype, recently developed and deployed at a local aerodrome as a proof of concept. It is composed of two digital cameras that capture photographs of the sky and allow the measurement of cloud height from the parallax effect. The new development consists of a new geometry which allows the simultaneous measurement of cloud base height, wind speed at cloud base height and atmospheric visibility, which was not previously possible with only two cameras. The new orientation of the cameras comes at the cost of a more complex geometry for measuring the cloud base height. The atmospheric visibility is calculated from the Lambert-Beer law after measuring the contrast between a set of dark objects and the background sky. 
    The prototype includes the latest hardware developments that allow its cost to remain low even with its increased functionality. New control software was also developed to ensure that the two cameras are triggered simultaneously. This is a major requirement that affects the final uncertainty of the measurements due to the constant movement of the clouds in the sky. Since accurate orientation of the cameras can be a very demanding task in field deployments, an automated calibration procedure has been developed that removes the need for an accurate alignment. It consists of photographing the stars, which do not exhibit parallax due to the long distances involved, and deducing the inherent misalignments of the two cameras. The known misalignments are then used to correct the cloud photos. These developments will be described in detail, along with an uncertainty analysis of the measurement setup. Measurements of cloud base height and atmospheric visibility will be presented and compared with measurements from other in-situ instruments. This work was supported by FCT project PTDC/CTE-ATM/115833/2009 and Program COMPETE FCOMP-01-0124-FEDER-014508
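
    The two measurement principles in this abstract reduce to compact formulas: stereo parallax for cloud base height, and Lambert-Beer contrast attenuation with the conventional Koschmieder 2% threshold for visibility. A hedged sketch of both (not the authors' processing chain; the ideal pinhole geometry below ignores the more complex camera orientation the paper describes):

    ```python
    import math

    def cloud_base_height(baseline_m, focal_px, disparity_px):
        # Two near-vertical cameras a horizontal baseline apart: a cloud
        # feature shifts by disparity_px between the two frames, giving
        # H ~= B * f / d (f and d both in pixels).
        return baseline_m * focal_px / disparity_px

    def visibility_koschmieder(apparent_contrast, intrinsic_contrast,
                               distance_m, threshold=0.02):
        # Lambert-Beer contrast attenuation C = C0 * exp(-sigma * x) yields
        # the extinction coefficient sigma; Koschmieder's threshold then
        # converts it to visibility V = -ln(0.02) / sigma ~= 3.912 / sigma.
        sigma = -math.log(apparent_contrast / intrinsic_contrast) / distance_m
        return -math.log(threshold) / sigma
    ```

    For example, a 100 m baseline, a 1500-pixel focal length and a 100-pixel disparity place the cloud base at 1500 m, and a dark object at 1 km whose contrast has halved implies a visibility of roughly 5.6 km.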

  4. SHiP: a new facility with a dedicated detector for studying tau neutrino properties

    NASA Astrophysics Data System (ADS)

    Komatsu, M.; SHiP Collaboration

    2017-06-01

    SHiP (Search for Hidden Particles) is a new general-purpose fixed-target facility at the CERN SPS accelerator, with the aim of searching for New Physics that couples only weakly to standard particles, by looking for long-lived beyond-Standard-Model particles with masses below a few GeV/c². The SHiP facility is a high-intensity beam dump: the 400 GeV proton beam extracted from the SPS will be dumped on a heavy target, with the aim of integrating 2×10^20 protons on target in 5 years. A dedicated detector, based on an OPERA-like ECC (Emulsion Cloud Chamber), will provide tau and anti-tau neutrino detection capability to study ντ and ν̄τ cross-sections with statistics a few hundred times larger than the DONUT experiment. Moreover, the structure functions F4 and F5, which are accessible only in tau neutrino interactions, can be measured for the first time. SHiP is thus a unique opportunity to study tau and anti-tau neutrino properties.

  5. Watching Grass - a Pilot Study on the Suitability of Photogrammetric Techniques for Quantifying Change in Aboveground Biomass in Grassland Experiments

    NASA Astrophysics Data System (ADS)

    Kröhnert, M.; Anderson, R.; Bumberger, J.; Dietrich, P.; Harpole, W. S.; Maas, H.-G.

    2018-05-01

    Grassland ecology experiments in remote locations requiring quantitative analysis of the biomass in defined plots are becoming increasingly widespread, but are still limited by manual sampling methodologies. To provide a cost-effective automated solution for biomass determination, several photogrammetric techniques are examined for generating 3D point cloud representations of plots, as a basis for estimating aboveground biomass, a key ecosystem variable used in many experiments. Methods investigated include Structure-from-Motion (SfM) techniques for camera pose estimation with subsequent dense matching, as well as the use of a time-of-flight (ToF) 3D camera, a laser light-sheet triangulation system and a coded-light projection system. In this context, plants at small (herbage) and medium scales are observed. In the first pilot study presented here, the best results are obtained by applying dense matching after SfM, which is ideal for integration into distributed experiment networks.
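One simple way to turn such a point cloud into a biomass proxy is to grid the points and integrate canopy height over the plot; the result can then be regressed against clipped-and-weighed reference samples. A minimal sketch (the cell size and the lowest-point ground estimate are illustrative assumptions, not the authors' exact pipeline):

```python
import numpy as np

def canopy_volume_proxy(points, cell=0.05):
    """Grid a 3D point cloud (N x 3, metres) into square cells and
    sum the per-cell maximum height above the lowest point, giving
    a canopy-volume proxy that can be regressed against harvested
    biomass. Hypothetical simplification of an SfM-based workflow."""
    pts = np.asarray(points, dtype=float)
    z0 = pts[:, 2].min()                       # crude ground level
    ix = np.floor(pts[:, 0] / cell).astype(int)
    iy = np.floor(pts[:, 1] / cell).astype(int)
    heights = {}                               # (i, j) -> max height
    for i, j, z in zip(ix, iy, pts[:, 2] - z0):
        if z > heights.get((i, j), 0.0):
            heights[(i, j)] = z
    return sum(heights.values()) * cell * cell  # m^3
```

A real pipeline would use a robust ground model (e.g. a fitted plane) rather than the single lowest point, but the volume-integration idea is the same.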

  6. ASTRONAUT WHITE, EDWARD H. II - GEMINI-IV - EXTRAVEHICULAR ACTIVITY (EVA) - CREW TRAINING - MSC

    NASA Image and Video Library

    2009-06-03

    S65-30431 (3 June 1965) --- Astronaut Edward H. White II, pilot of the Gemini IV four-day Earth-orbital mission, floats in the zero gravity of space outside the Gemini IV spacecraft. Behind him is the brilliant blue Earth and its white cloud cover. White wears a specially designed spacesuit; the visor of the helmet is gold plated to protect him against the unfiltered rays of the sun. He also wears an emergency oxygen pack. He is secured to the spacecraft by a 25-foot umbilical line and a 23-foot tether line, both wrapped in gold tape to form one cord. In his left hand is a Hand-Held Self-Maneuvering Unit (HHSMU) with which he controls his movements in space. Astronaut James A. McDivitt, command pilot of the mission, remained inside the spacecraft. Photo credit: NASA EDITOR'S NOTE: Astronaut White died in the Apollo/Saturn 204 fire at Cape Kennedy on Jan. 27, 1967.

  7. Commissioning of a CERN Production and Analysis Facility Based on xrootd

    NASA Astrophysics Data System (ADS)

    Campana, Simone; van der Ster, Daniel C.; Di Girolamo, Alessandro; Peters, Andreas J.; Duellmann, Dirk; Coelho Dos Santos, Miguel; Iven, Jan; Bell, Tim

    2011-12-01

    The CERN facility hosts the Tier-0 of the four LHC experiments, but as part of WLCG it also offers a platform for production activities and user analysis. The CERN CASTOR storage technology has been extensively tested and utilized for LHC data recording and export to external sites according to the experiments' computing models. On the other hand, to accommodate Grid data-processing activities and, more importantly, chaotic user analysis, it was realized that additional functionality was needed, including a different throttling mechanism for file access. This paper describes the xrootd-based CERN production and analysis facility for the ATLAS experiment, and in particular the experiment use case and data access scenario, the xrootd redirector setup on top of the CASTOR storage system, the commissioning of the system, and real-life experience with data processing and data analysis.

  8. CERN alerter—RSS based system for information broadcast to all CERN offices

    NASA Astrophysics Data System (ADS)

    Otto, R.

    2008-07-01

    Nearly every large organization uses a tool to broadcast messages and information across the internal campus (messages like alerts announcing interruption in services or just information about upcoming events). These tools typically allow administrators (operators) to send 'targeted' messages which are sent only to specific groups of users or computers, e.g. only those located in a specified building or connected to a particular computing service. CERN has a long history of such tools: CERN VMS's SPM MESSAGE command, Zephyr [2] and, most recently, the NICE Alerter based on the NNTP protocol. The NICE Alerter, used on all Windows-based computers, had to be phased out as a consequence of phasing out NNTP at CERN. The new solution to broadcast information messages on the CERN campus continues to provide the service based on cross-platform technologies, hence minimizing custom developments and relying on commercial software as much as possible. The new system, called CERN Alerter, is based on RSS (Really Simple Syndication) [9] for the transport protocol and uses Microsoft SharePoint as the backend for the database and posting interface. The Windows-based client relies on Internet Explorer 7.0 with custom code to trigger the window pop-ups and the notifications for new events. Linux and Mac OS X clients can rely on any RSS reader to subscribe to targeted notifications. The paper covers the architecture and implementation aspects of the new system.
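The client side of such an RSS-based broadcast system boils down to polling a feed and surfacing items the user has not yet seen. A minimal stdlib sketch with a made-up example feed (the actual CERN Alerter schema, SharePoint backend and targeting logic are not reproduced here):

```python
import xml.etree.ElementTree as ET

# Hypothetical feed content, shaped like a standard RSS 2.0 channel.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>CERN Alerter (example)</title>
  <item><title>Network maintenance in building 513</title>
        <pubDate>Mon, 07 Jul 2008 09:00:00 GMT</pubDate></item>
  <item><title>Power cut announcement</title>
        <pubDate>Tue, 08 Jul 2008 10:00:00 GMT</pubDate></item>
</channel></rss>"""

def new_items(feed_xml, seen_titles):
    """Return item titles not yet shown to the user -- the core of
    any polling RSS client such as the Alerter described above."""
    root = ET.fromstring(feed_xml)
    titles = [item.findtext("title") for item in root.iter("item")]
    return [t for t in titles if t not in seen_titles]
```

A desktop client would poll the targeted feed URL periodically and pop up a notification for each title returned by `new_items`.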

  9. OBITUARY: Maurice Jacob (1933-2007)

    NASA Astrophysics Data System (ADS)

    Quercigh, Emanuele; Šándor, Ladislav

    2008-04-01

    Maurice Jacob passed away on 2 May 2007. With his death, we have lost one of the founding fathers of the ultra-relativistic heavy-ion programme. His interest in high-energy nuclear physics started in 1981, when alpha-alpha collisions could first be studied in the CERN ISR. An enthusiastic supporter of ion beam experiments at CERN, Maurice was at the origin of the 1982 Quark Matter meeting in Bielefeld [1], which brought together more than 100 participants from both sides of the Atlantic, demonstrating an enthusiastic constituency for such research. There were twice as many the following year at Brookhaven. Finally, in the mid-eighties, a heavy-ion programme was approved both at CERN and at Brookhaven, involving as many nuclear as particle physicists. It was the start of a fruitful interdisciplinary collaboration which is nowadays continuing both at RHIC and at the LHC. Maurice actively followed the development of this field, reporting at a number of conferences and meetings (Les Arcs, Bielefeld, Beijing, Brookhaven, Lenox, Singapore, Taormina,...). This activity culminated in 2000, when Maurice, together with Ulrich Heinz, summarized the main results of the CERN SPS heavy-ion experiments and the evidence obtained for a new state of matter [2]. Maurice was a brilliant theoretical physicist. His many contributions have been summarized in a recent article in the CERN Courier by two leading CERN theorists, John Ellis and Andre Martin [3]. The following is an excerpt from their article: `He began his research career at Saclay and, while still a PhD student, he continued brilliantly during a stay at Brookhaven. It was there in 1959 that Maurice, together with Giancarlo Wick, developed the helicity amplitude formalism that is the basis of many modern theoretical calculations. Maurice obtained his PhD in 1961 and, after a stay at Caltech, returned to Saclay. 
A second American foray was to SLAC, where he and Sam Berman made the crucial observation that the point-like structures (partons) seen in deep-inelastic scattering implied the existence of high-transverse-momentum processes in proton-proton collisions, as the ISR at CERN subsequently discovered. In 1967 Maurice joined CERN, where he remained, apart from influential visits to Yale, Fermilab and elsewhere, until his retirement in 1998. He became one of the most respected international experts on the phenomenology of strong interactions, including diffraction, scaling, high-transverse-momentum processes and the formation of quark-gluon plasma. In particular, he pioneered the studies of inclusive hadron-production processes, including scaling and its violations. Also, working with Ron Horgan, he made detailed predictions for the production of jets at CERN's proton-antiproton collider. The UA2 and UA1 experiments subsequently discovered these. He was also interested in electron-positron colliders, making pioneering calculations, together with Tai Wu, of radiation in high-energy collisions. Maurice was one of the scientific pillars of CERN, working closely with experimental colleagues in predicting and interpreting results from successive CERN colliders. He was indefatigable in organizing regular meetings on ISR physics, bringing together theorists and experimentalists to debate the meaning of new results and propose new measurements. He was one of the strongest advocates of Carlo Rubbia's proposal for a proton-antiproton collider at CERN, and was influential in preparing and advertising its physics. In 1978 he organized the Les Houches workshop that brought the LEP project to the attention of the wider European particle physics community. He also organized the ECFA workshop at Lausanne in 1984 that made the first exploration of the possible physics of the LHC. It is a tragedy that Maurice has not lived to enjoy data from the LHC.' 
References [1] Jacob M and Satz H (eds) 1982 Proc. Workshop on Quark Matter Formation and Heavy Ion Collisions (Bielefeld, 10-14 May 1982) (Singapore: World Scientific Publishing) [2] Heinz U W and Jacob M 2000 Evidence for a new state of matter: an assessment of the results from the CERN lead beam program Preprint nucl-th/0002042 [3] Ellis J and Martin A 2007 CERN Courier 47 issue 6

  10. Air force Thunderbirds

    NASA Image and Video Library

    2007-02-01

    Silhouetted against the cloud-strewn sky over NASA's Kennedy Space Center, a U.S. Air Force Thunderbird F-16D aircraft displays its prowess. The pilot is Maj. Tad Clark, who, after landing at the Shuttle Landing Facility, announced that Kennedy Space Center Visitor Complex will host the inaugural World Space Expo from Nov. 3 to 11, featuring an aerial salute by the Thunderbirds on its opening weekend. The Expo will create one of the largest displays of space artifacts, hardware and personalities ever assembled in one location with the objective to inspire, educate and engage the public by highlighting the achievements and benefits of space exploration.

  11. Final Project Report - ARM CLASIC CIRPAS Twin Otter Aerosol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John A. Ogren

    2010-04-05

    The NOAA/ESRL/GMD aerosol group made three types of contributions related to airborne measurements of aerosol light scattering and absorption for the Cloud and Land Surface Interaction Campaign (CLASIC) in June 2007 on the Twin Otter research airplane operated by the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS). GMD scientists served as the instrument mentor for the integrating nephelometer and particle soot absorption photometer (PSAP) on the Twin Otter during CLASIC, and were responsible for (1) instrument checks/comparisons; (2) instrument troubleshooting/repair; and (3) data quality control (QC) and submittal to the archive.

  12. Preliminary Flight Deck Observations During Flight in High Ice Water Content Conditions

    NASA Technical Reports Server (NTRS)

    Ratvasky, Thomas; Duchanoy, Dominque; Bourdinot, Jean-Francois; Harrah, Steven; Strapp, Walter; Schwarzenboeck, Alfons; Dezitter, Fabien; Grandin, Alice

    2015-01-01

    In 2006, Mason et al. identified common observations that occurred in engine power-loss events attributed to flight in high concentrations of ice crystals. Observations included light to moderate turbulence, precipitation on the windscreen (often reported as rain), aircraft total-temperature anomalies, lack of significant airframe icing, and no flight radar echoes at the location and altitude of the engine event. Since 2006, Mason et al. and others have collected information from pilots who experienced engine power-loss events via interviews and questionnaires to substantiate earlier observations and support event analyses. In 2011, Mason and Grzych reported that vertical acceleration data showed increases in turbulence prior to engine events, although the turbulence was usually light to moderate and not unique to high ice water content (HIWC) clouds. Mason concluded that the observation of rain on the windscreen was due to high concentrations of ice crystals melting on the windscreen and coalescing into drops. Mason also reported that these pilot observations of rain on the windscreen varied. Many pilots indicated no rain was observed, while others observed moderate rain with unique impact sounds. Mason concluded that the variation in the reports may be due to variation in the ice concentration, particle size, and temperature.

  13. CERN openlab: Engaging industry for innovation in the LHC Run 3-4 R&D programme

    NASA Astrophysics Data System (ADS)

    Girone, M.; Purcell, A.; Di Meglio, A.; Rademakers, F.; Gunne, K.; Pachou, M.; Pavlou, S.

    2017-10-01

    LHC Run 3 and Run 4 represent an unprecedented challenge for HEP computing in terms of both data volume and complexity. New approaches are needed for how data is collected and filtered, processed, moved, stored and analysed if these challenges are to be met within a realistic budget. To develop innovative techniques, we are fostering relationships with industry leaders. CERN openlab is a unique resource for public-private partnership between CERN and leading Information and Communication Technology (ICT) companies. Its mission is to accelerate the development of cutting-edge solutions to be used by the worldwide HEP community. In 2015, CERN openlab started its phase V with a strong focus on tackling the upcoming LHC challenges. Several R&D programs are ongoing in the areas of data acquisition, networks and connectivity, data storage architectures, computing provisioning, computing platforms, code optimisation and data analytics. This paper gives an overview of the various innovative technologies that are currently being explored by CERN openlab V and discusses the long-term strategies pursued by the LHC communities, with the help of industry, to close the technological gap in processing and storage needs expected in Run 3 and Run 4.

  14. Measurements and FLUKA simulations of bismuth and aluminium activation at the CERN Shielding Benchmark Facility (CSBF)

    NASA Astrophysics Data System (ADS)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.

    2018-03-01

    The CERN High Energy AcceleRator Mixed field facility (CHARM) is located in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, 5 × 10¹¹ protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7 × 10¹⁰ p/s, which then impacts the CHARM target. The shielding of the CHARM facility also includes the CERN Shielding Benchmark Facility (CSBF), situated laterally above the target. This facility consists of 80 cm of cast iron and 360 cm of concrete, with barite concrete in some places. Activation samples of bismuth and aluminium were placed in the CSBF and in the CHARM access corridor in July 2015. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields for these samples. The results estimated by FLUKA Monte Carlo simulations are compared to activation measurements of these samples. The comparison between FLUKA simulations and the measured values from γ-spectrometry gives an agreement better than a factor of 2.
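Comparing a simulated production yield with a later γ-spectrometry measurement requires the standard activation bookkeeping of build-up during irradiation and decay during cooling. A generic sketch of that conversion (textbook activation physics, not FLUKA-specific code; the numbers below are illustrative):

```python
import math

def activity_bq(production_rate, half_life_s, t_irr_s, t_cool_s):
    """Convert a constant production rate R (atoms/s, e.g. a
    FLUKA-estimated yield) into the activity seen after an
    irradiation of t_irr and a cooling time of t_cool:
    A = R * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_cool)."""
    lam = math.log(2) / half_life_s
    buildup = 1.0 - math.exp(-lam * t_irr_s)   # saturation factor
    decay = math.exp(-lam * t_cool_s)          # cooling decay
    return production_rate * buildup * decay

# Irradiating for one half-life then cooling for one half-life
# leaves R * 0.5 * 0.5 = R/4 of the saturation activity.
```

For a pulsed beam like CHARM's, R would be the time-averaged production rate over the irradiation period.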

  15. Contributions to the Characterization and Mitigation of Rotorcraft Brownout

    NASA Astrophysics Data System (ADS)

    Tritschler, John Kirwin

    Rotorcraft brownout, the condition in which the flow field of a rotorcraft mobilizes sediment from the ground to generate a cloud that obscures the pilot's field of view, continues to be a significant hazard to civil and military rotorcraft operations. This dissertation presents methodologies for: (i) the systematic mitigation of rotorcraft brownout through operational and design strategies and (ii) the quantitative characterization of the visual degradation caused by a brownout cloud. In Part I of the dissertation, brownout mitigation strategies are developed through simulation-based brownout studies that are mathematically formulated within a numerical optimization framework. Two optimization studies are presented. The first study involves the determination of approach-to-landing maneuvers that result in reduced brownout severity. The second study presents a potential methodology for the design of helicopter rotors with improved brownout characteristics. The results of both studies indicate that the fundamental mechanisms underlying brownout mitigation are aerodynamic in nature, and the evolution of a ground vortex ahead of the rotor disk is seen to be a key element in the development of a brownout cloud. In Part II of the dissertation, brownout cloud characterizations are based upon the Modulation Transfer Function (MTF), a metric commonly used in the optics community for the characterization of imaging systems. The use of the MTF in experimentation is examined first, and the application of MTF calculation and interpretation methods to actual flight test data is described. The potential for predicting the MTF from numerical simulations is examined second, and an initial methodology is presented for the prediction of the MTF of a brownout cloud. 
Results from the experimental and analytical studies rigorously quantify the intuitively known facts that the visual degradation caused by brownout is a space- and time-dependent phenomenon, and that high-spatial-frequency features, i.e., fine-grained detail, are obscured before low-spatial-frequency features, i.e., large objects. As such, the MTF is a metric that is amenable to Handling Qualities (HQ) analyses.
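The MTF characterization in Part II can be illustrated with a minimal sketch: the MTF is the normalized magnitude of the Fourier transform of a measured line-spread function (LSF), so a broader LSF (heavier obscuration) suppresses high spatial frequencies first. This is a schematic of the metric itself, not the dissertation's flight-test processing chain:

```python
import numpy as np

def mtf_from_lsf(lsf):
    """Modulation Transfer Function as the normalised magnitude of
    the Fourier transform of a line-spread function; MTF(0) = 1 by
    construction."""
    spectrum = np.abs(np.fft.rfft(lsf))
    return spectrum / spectrum[0]

# An impulse-like LSF (sharp imaging) passes all spatial frequencies;
# a flat, spread-out LSF (dense brownout cloud) loses the high ones.
```

Comparing MTF curves at different times and viewing directions is what makes the space- and time-dependence of the obscuration quantifiable.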

  16. Memorial W.Gentner

    ScienceCinema

    None

    2018-05-25

    The DG H. Schopper gives an introduction for the commemoration and ceremony of the life and work of Professor Wolfgang Gentner. W. Gentner, a German physicist, born in 1906 in Frankfurt and died in September 1980 in Heidelberg, was director of CERN from 1955 to 1960, president of the Scientific Policy Committee from 1968 to 1971 and president of the Council of CERN from 1972 to 1974. He was one of the founders of CERN, and four people who knew him well pay tribute to him, among them one of his students, as well as J.B. Adams and O. Sheffard.

  18. OPERA - First Beam Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakamura, M.

    2008-02-21

    OPERA is a long-baseline neutrino oscillation experiment designed to detect tau-neutrino appearance and to prove that the origin of the atmospheric muon-neutrino deficit observed by Kamiokande is neutrino oscillation. A hybrid emulsion detector, whose weight is about 1.3 kton, has been installed in the Gran Sasso laboratory. A new muon-neutrino beam line, CNGS, has been constructed at CERN to send neutrinos to Gran Sasso, 730 km away from CERN. In 2006, the first neutrinos were sent from CERN to LNGS and were successfully detected by the OPERA detector as planned.

  19. Mic Flocks in the Cloud: Harnessing Mobile Ubiquitous Sensor Networks

    NASA Astrophysics Data System (ADS)

    Garces, M. A.; Christe, A.

    2015-12-01

    Smartphones provide a commercial, off-the-shelf solution to capture, store, analyze, and distribute infrasound using on-board or external microphones (mics) as well as on-board barometers. Free iOS infrasound apps can be readily downloaded from the Apple App Store, and Android versions are in progress. Infrasound propagates for great distances, has low sample rates, and provides a tractable pilot study scenario for open distributed sensor networks at regional and global scales using one of the most ubiquitous sensors on Earth - microphones. Data collection is no longer limited to selected vendors at exclusive prices: anybody on Earth can record and stream infrasound, and the diversity of recording systems and environments is rapidly expanding. Global deployment may be fast and easy (www.redvox.io), but comes with the cost of increasing data volume, velocity, variety, and complexity. Flocking - the collective motion of mobile agents - is a natural human response to threats or events of interest. Anticipating, modeling and harnessing flocking sensor topologies will be necessary for adaptive array and network processing. The increasing data quantity and complexity will exceed the processing capacity of human analysts and most research servers. We anticipate practical real-time applications will require the on-demand adaptive scalability and resources of the Cloud. Cloud architectures for such heterogeneous sensor networks will consider eventual integration into the Global Earth Observation System of Systems (GEOSS).
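The claim that infrasound's low sample rate keeps even a global smartphone network tractable is easy to check with back-of-envelope arithmetic (the sample rate and sample size below are illustrative assumptions, not RedVox specifications):

```python
def daily_volume_gb(devices, sample_rate_hz=80.0, bytes_per_sample=4):
    """Back-of-envelope data volume for a distributed mic network:
    devices * rate * sample size * seconds per day, in GB.
    Parameters are illustrative, not vendor specifications."""
    return devices * sample_rate_hz * bytes_per_sample * 86400 / 1e9

# 10,000 phones at 80 Hz with 4-byte samples produce roughly
# 276 GB/day -- large, but well within cloud-scale processing.
```

Scaling the fleet by another factor of 100 is where the on-demand elasticity of cloud architectures, rather than a fixed research server, becomes essential.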

  20. Looking at Earth from Space: Teacher's Guide with Activities for Earth and Space Science

    NASA Technical Reports Server (NTRS)

    Steele, Colleen (Editor); Steele, Colleen; Ryan, William F.

    1995-01-01

    The Maryland Pilot Earth Science and Technology Education Network (MAPS-NET) project was sponsored by the National Aeronautics and Space Administration (NASA) to enrich teacher preparation and classroom learning in the area of Earth system science. This publication includes a teacher's guide that replicates material taught during a graduate-level course of the project and activities developed by the teachers. The publication was developed to provide teachers with a comprehensive approach to using satellite imagery to enhance science education. The teacher's guide is divided into topical chapters and enables teachers to expand their knowledge of the atmosphere, common weather patterns, and remote sensing. Topics include: weather systems and satellite imagery including mid-latitude weather systems; wave motion and the general circulation; cyclonic disturbances and baroclinic instability; clouds; additional common weather patterns; satellite images and the internet; environmental satellites; orbits; and ground station set-up. Activities are listed by suggested grade level and include the following topics: using weather symbols; forecasting the weather; cloud families and identification; classification of cloud types through infrared Automatic Picture Transmission (APT) imagery; comparison of visible and infrared imagery; cold fronts; to ski or not to ski (imagery as a decision making tool), infrared and visible satellite images; thunderstorms; looping satellite images; hurricanes; intertropical convergence zone; and using weather satellite images to enhance a study of the Chesapeake Bay. A list of resources is also included.

  1. Investigating Gravity Waves in Polar Mesospheric Clouds Using Tomographic Reconstructions of AIM Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Hart, V. P.; Taylor, M. J.; Doyle, T. E.; Zhao, Y.; Pautet, P.-D.; Carruth, B. L.; Rusch, D. W.; Russell, J. M.

    2018-01-01

    This research presents the first application of tomographic techniques for investigating gravity wave structures in polar mesospheric clouds (PMCs) imaged by the Cloud Imaging and Particle Size instrument on the NASA AIM satellite. Albedo data comprising consecutive PMC scenes were used to tomographically reconstruct a 3-D layer using the Partially Constrained Algebraic Reconstruction Technique algorithm and a previously developed "fanning" technique. For this pilot study, a large region (760 × 148 km) of the PMC layer (altitude 83 km) was sampled with a 2 km horizontal resolution, and an intensity weighted centroid technique was developed to create novel 2-D surface maps, characterizing the individual gravity waves as well as their altitude variability. Spectral analysis of seven selected wave events observed during the Northern Hemisphere 2007 PMC season exhibited dominant horizontal wavelengths of 60-90 km, consistent with previous studies. These tomographic analyses have enabled a broad range of new investigations. For example, a clear spatial anticorrelation was observed between the PMC albedo and wave-induced altitude changes, with higher-albedo structures aligning well with wave troughs, while low-intensity regions aligned with wave crests. This result appears to be consistent with current theories of PMC development in the mesopause region. This new tomographic imaging technique also provides valuable wave amplitude information enabling further mesospheric gravity wave investigations, including quantitative analysis of their hemispheric and interannual characteristics and variations.
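The intensity-weighted centroid technique mentioned above collapses each reconstructed vertical albedo profile to a single layer altitude, producing the 2-D surface maps of wave-induced altitude variability. A minimal sketch with illustrative shapes and numbers:

```python
import numpy as np

def weighted_centroid_altitude(albedo, altitudes):
    """Collapse reconstructed vertical albedo profiles to one layer
    altitude per horizontal sample via an intensity-weighted
    centroid: z_c = sum(a*z) / sum(a). Array shapes and values are
    illustrative, not the CIPS/PCART reconstruction itself."""
    a = np.asarray(albedo, dtype=float)      # (n_samples, n_altitudes)
    z = np.asarray(altitudes, dtype=float)   # (n_altitudes,)
    return (a * z).sum(axis=1) / a.sum(axis=1)

# A profile peaked near 83 km yields a centroid near 83 km; wave
# crests and troughs show up as departures from that mean altitude.
```

Mapping these centroid altitudes over the horizontal grid is what reveals the anticorrelation between albedo and wave-induced altitude changes.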

  2. Membership Finland

    ScienceCinema

    None

    2018-05-18

    The DG C. Rubbia and the vice-president of the CERN Council give a warm welcome to Finland, the 15th member of CERN since January 1, 1991, in the presence of the Secretary-General and the ambassador.

  3. Visit CD

    ScienceCinema

    None

    2017-12-09

    The DG H. Schopper welcomes the ambassadors of the member states and the representatives of countries with which CERN maintains close relations, and gives a presentation on the activities at CERN.

  4. Terbium Radionuclides for Theranostics Applications: A Focus On MEDICIS-PROMED

    NASA Astrophysics Data System (ADS)

    Cavaier, R. Formento; Haddad, F.; Sounalet, T.; Stora, T.; Zahi, I.

    A new facility, named CERN-MEDICIS, is under construction at CERN to produce radionuclides for medical applications. In parallel, MEDICIS-PROMED, a Marie Sklodowska-Curie innovative training network of the European Commission's Horizon 2020 programme, is being coordinated by CERN to train young scientists in the production and use of innovative radionuclides and to develop a network of experts within Europe. One programme within MEDICIS-PROMED is to determine the feasibility of producing innovative radioisotopes for theranostics using a commercial middle-sized high-current cyclotron and the mass separation technology developed at CERN-MEDICIS. This will allow the production of high-specific-activity radioisotopes not achievable with the common post-processing by chemical separation. Radioisotopes of scandium, copper, arsenic and terbium have been identified. Preliminary studies of activation yield and irradiation-parameter optimization for the production of Tb-149 will be described.

  5. LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN

    NASA Astrophysics Data System (ADS)

    Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor

    2017-12-01

    The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and since 2011 has been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.

  6. Cryogenic Control System Migration and Developments towards the UNICOS CERN Standard at INFN

    NASA Astrophysics Data System (ADS)

    Modanese, Paolo; Calore, Andrea; Contran, Tiziano; Friso, Alessandro; Pengo, Marco; Canella, Stefania; Burioli, Sergio; Gallese, Benedetto; Inglese, Vitaliano; Pezzetti, Marco; Pengo, Ruggero

    The cryogenic control systems at the Laboratori Nazionali di Legnaro (LNL) are undergoing an important and radical modernization, allowing all the plants' control and supervision systems to be renewed in a homogeneous way towards the CERN UNICOS standard. Before the UNICOS migration project started, there were as many as 7 different types of PLC and 7 different types of SCADA, each requiring its own programming language. Under these conditions, even a simple modification or integration in a program or in the supervision layer required the intervention of a system-integrator company specialized in that specific control system. Furthermore, it implied that the operators had to be trained on the different types of control systems. The CERN UNICOS framework, developed for the LHC [1], was chosen for its reliability and is planned to run and be maintained for decades to come. The complete migration is part of an agreement between CERN and INFN.

  7. Measurements of Isoprene and its Oxidation Products during the CLOUD9 Experiment

    NASA Astrophysics Data System (ADS)

    Bernhammer, Anne-Kathrin; Breitenlechner, Martin; Coburn, Sean; Volkamer, Rainer; Hansel, Armin

    2015-04-01

    Isoprene (C5H8), produced and emitted by the biosphere, is by far the dominant biogenic volatile organic compound (BVOC) in the atmosphere. Its complex reaction pathways with OH radicals, O3 and NO3 lead to compounds with lower volatilities and increasing water solubility. The high hydrophilicity allows for easy partitioning between the gas and liquid phases, making these compounds good candidates for aqueous-phase droplet chemistry that may contribute to particle growth (Ervens et al., 2008). The CLOUD experiment (Cosmics Leaving Outdoor Droplets) at CERN allows the evolution of particles originating from precursor gases, in our case isoprene, to be studied in an ultraclean and very well controlled environmental chamber. Gas-phase concentrations of isoprene and its first reaction products were measured in real time with a Proton-Transfer-Reaction Time-of-Flight Mass Spectrometer (PTR-ToF-MS; Graus et al., 2010) and Cavity Enhanced Differential Optical Absorption Spectroscopy (CE-DOAS; Thalman and Volkamer, 2010). The PTR-ToF-MS was calibrated using gas standards with known VOC concentrations and was operated with H3O+ and NO+ as primary ions, continuously switching between both operating modes throughout the experiments. The use of different primary ions allows the discrimination of isomeric compounds such as the main high-NOx oxidation products methyl vinyl ketone (MVK) and methacrolein (MACR). The experiment was conducted at high isoprene concentrations and a constant level of O3. The highly water-soluble gas-phase oxidation products from the reaction of isoprene with O3 and OH radicals (from isoprene ozonolysis) were investigated and compared for two temperatures (+10 °C and -10 °C) and different NOx concentrations during cloud formation experiments. Here we present first results of isoprene oxidation products observed with PTR-ToF-MS and CE-DOAS. References: Ervens et al. (2008), Geophys. Res. Lett., 35, L02816; Graus et al. (2010), J. Am. Soc. Mass Spectrom., 21, 1037-1044; Thalman and Volkamer (2010), Atmos. Meas. Tech., 3(6), 2681-2721.
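
    The calibration against gas standards mentioned above amounts to converting primary-ion-normalized count rates into mixing ratios. A minimal sketch with made-up count rates and a made-up sensitivity (real values are instrument- and compound-specific):

```python
def normalized_cps(signal_cps: float, primary_cps: float, norm: float = 1e6) -> float:
    """Normalize the analyte ion signal to a fixed primary-ion count rate
    (ncps), removing drifts in the H3O+ (or NO+) primary ion signal."""
    return signal_cps * norm / primary_cps

def mixing_ratio_ppb(signal_cps: float, primary_cps: float,
                     sensitivity_ncps_per_ppb: float) -> float:
    """Convert a normalized count rate to a volume mixing ratio using a
    sensitivity derived from a gas standard of known concentration."""
    return normalized_cps(signal_cps, primary_cps) / sensitivity_ncps_per_ppb

# Illustrative numbers only: 500 cps of protonated isoprene (m/z 69),
# 1e7 cps of H3O+, and a calibration sensitivity of 10 ncps/ppb.
isoprene_ppb = mixing_ratio_ppb(500.0, 1.0e7, 10.0)  # -> 5.0 ppb
```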

  8. PREFACE: Lectures from the CERN Winter School on Strings, Supergravity and Gauge Theories, CERN, 9-13 February 2009

    NASA Astrophysics Data System (ADS)

    Uranga, A. M.

    2009-11-01

    This special section is devoted to the proceedings of the conference `Winter School on Strings, Supergravity and Gauge Theories', which took place at CERN, the European Organization for Nuclear Research, in Geneva, Switzerland, 9-13 February 2009. This event is part of a yearly series of scientific schools, which represents a well-established tradition. Previous events have been held at SISSA, in Trieste, Italy, in February 2005 and at CERN in January 2006, January 2007 and January 2008, and were funded by the European Mobility Research and Training Network `Constituents, Fundamental Forces and Symmetries of the Universe'. The next event will take place again at CERN, in January 2010. The school was primarily meant for young doctoral students and postdoctoral researchers working in the area of string theory. It consisted of several general lectures of four hours each, whose notes are published in this special section, and six working group discussion sessions, focused on specific topics of the network research program. It was well attended by over 200 participants. The topics of the lectures were chosen to provide an introduction to some of the areas of recent progress, and to the open problems, in string theory. One of the most active areas in string theory in recent years has been the AdS/CFT or gauge/gravity correspondence, which proposes the complete equivalence of string theory on (asymptotically) anti-de Sitter spacetimes with certain quantum (gauge) field theories. The duality has recently been applied to understanding the hydrodynamical properties of a hot plasma in gauge theories (like the quark-gluon plasma created in heavy ion collisions at the RHIC experiment at Brookhaven, and soon at the LHC at CERN) in terms of a dual gravitational AdS theory in the presence of a black hole. These developments were reviewed in the lecture notes by M Rangamani. 
In addition, the AdS/CFT duality has been proposed as a tool to study interesting physical properties in other physical systems described by quantum field theory, for instance in the context of a condensed matter system. The lectures by S Hartnoll provided an introduction to this recent development with an emphasis on the dual holographic description of superconductivity. Finally, ideas inspired by the AdS/CFT correspondence are yielding deep insights into fundamental questions of quantum gravity, like the entropy of black holes and its interpretation in terms of microstates. The lectures by S Mathur reviewed the black hole entropy and information paradox, and the proposal for its resolution in terms of `fuzzball' microstates. Further sets of lectures, not included in this special section, by F Zwirner and V Mukhanov, covered phenomenological aspects of high energy physics beyond the Standard Model and of cosmology. The coming experimental data in these two fields are expected to foster new developments in connecting string theory to the real world. The conference was financially supported by CERN and partially by the Arnold Sommerfeld Center for Theoretical Physics of the Ludwig Maximilians University of Munich. It is a great pleasure for us to warmly thank the Theory Unit of CERN for its very kind hospitality and for the high quality of the assistance and the infrastructures that it has provided. A M Uranga CERN, Switzerland Guest Editor

  9. Gravity Waves and Mesospheric Clouds in the Summer Middle Atmosphere: A Comparison of Lidar Measurements and Ray Modeling of Gravity Waves Over Sondrestrom, Greenland

    NASA Technical Reports Server (NTRS)

    Gerrard, Andrew J.; Kane, Timothy J.; Eckermann, Stephen D.; Thayer, Jeffrey P.

    2004-01-01

    We conducted gravity wave ray-tracing experiments within an atmospheric region centered near the ARCLITE lidar system at Sondrestrom, Greenland (67N, 310 deg E), in efforts to understand lidar observations of both upper stratospheric gravity wave activity and mesospheric clouds during August 1996 and the summer of 2001. The ray model was used to trace gravity waves through realistic three-dimensional daily-varying background atmospheres in the region, based on forecasts and analyses in the troposphere and stratosphere and climatologies higher up. Reverse ray tracing based on upper stratospheric lidar observations at Sondrestrom was also used to try to objectively identify wave source regions in the troposphere. A source spectrum specified by reverse ray tracing experiments in early August 1996 (when atmospheric flow patterns produced enhanced transmission of waves into the upper stratosphere) yielded model results throughout the remainder of August 1996 that agreed best with the lidar observations. The model also simulated increased vertical group propagation of waves between 40 km and 80 km due to intensifying mean easterlies, which allowed many of the gravity waves observed at 40 km over Sondrestrom to propagate quasi-vertically from 40-80 km and then interact with any mesospheric clouds at 80 km near Sondrestrom, supporting earlier experimentally-inferred correlations between upper stratospheric gravity wave activity and mesospheric cloud backscatter from Sondrestrom lidar observations. A pilot experiment of real-time runs with the model in 2001 using weather forecast data as a low-level background produced less agreement with lidar observations. We believe this is due to limitations in our specified tropospheric source spectrum, the use of climatological winds and temperatures in the upper stratosphere and mesosphere, and missing lidar data from important time periods.
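
    Ray models of this kind propagate each wave packet using the internal gravity-wave dispersion relation; a minimal non-rotating, Boussinesq sketch (the actual model includes rotation and density scale-height terms) is:

```latex
\hat\omega = \omega - k\,\bar{u}, \qquad
\hat\omega^{2} = \frac{N^{2} k_h^{2}}{k_h^{2} + m^{2}}, \qquad
c_{gz} = \frac{\partial \hat\omega}{\partial m}
       = -\,\frac{N\, k_h\, m}{\left(k_h^{2} + m^{2}\right)^{3/2}},
```

    with intrinsic frequency ω̂, horizontal wavenumber k_h, vertical wavenumber m, buoyancy frequency N and background wind ū. A change in ū alters ω̂, and with it m and the vertical group velocity c_gz, which is how the intensifying mean easterlies between 40 km and 80 km can steepen the group propagation of the observed waves.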

  10. CERES and the S'COOL Project

    NASA Technical Reports Server (NTRS)

    Chambers, Lin H.; Young, David F.; Barkstrom, Bruce R.; Wielicki, Bruce A.

    1997-01-01

    The first Clouds and the Earth's Radiant Energy System (CERES) instrument will be launched on the Tropical Rainfall Measuring Mission (TRMM) spacecraft from a Japanese launch site in November 1997. This instrument is a follow-on to the Earth Radiation Budget Experiment (ERBE) begun in the 1980's. The instrument will measure the radiation budget - incoming and outgoing radiant energy - of the Earth. It will establish a baseline and look for climatic trends. The major feature of interest is clouds, which play a very strong role in regulating our climate. CERES will identify clear and cloudy regions and determine cloud physical and microphysical properties using imager data from a companion instrument. Validation efforts for the remote sensing algorithms will be intensive. As one component of the validation, the S'COOL (Students' Cloud Observations On-Line) project will involve school children from around the globe in making ground truth measurements at the time of a CERES overpass. Their observations will be collected at the NASA Langley Distributed Active Archive Center (DAAC) and made available over the Internet for educational purposes as well as for use by the CERES Science Team in validation efforts. Pilot testing of the S'COOL project began in January 1997 with two local schools in Southeastern Virginia and one remote site in Montana. This experience is helping guide the development of the S'COOL project. National testing is planned for April 1997, international testing for July 1997, and global testing for October 1997. In 1998, when the CERES instrument is operational, a global observer network should be in place providing useful information to the scientists and learning opportunities to the students.

  11. Numerical simulation of radiation fog in complex terrain

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Musson-Genon, L.; Carissimo, B.; Dupont, E.

    2009-09-01

    Interest in micro-scale modeling of the atmosphere is growing for environmental applications related, for example, to energy production, transport and urban development. In stable layers turbulence is weak and pollutant dispersion is low, which can lead to strong pollution events. This can be further complicated by the presence of clouds or fog, and is specifically difficult in urban or industrial areas due to the presence of buildings. In this context, radiation fog formation and dissipation over complex terrain were investigated with a state-of-the-art model. This study is divided into two phases. The first phase is a pilot stage, which consists of employing a database from the ParisFog campaign, which took place in the south of Paris during winter 2006-07, to assess the ability of the cloud model to reproduce the detailed structure of radiation fog. The second phase uses the validated model to study the influence of complex terrain on fog evolution. Special attention is given to detailed and complete simulations; the validation technique used is to compare the results simulated with the 3D cloud model of the computational fluid dynamics software Code_Saturne against the best in situ data collected during the ParisFog campaign. Several dynamical and microphysical parameterizations and simulation conditions are described. The resulting 3D cloud model runs at a horizontal resolution of 30 m and at a vertical resolution comparable to that of the 1D model. First results look very promising and reproduce the spatial distribution of fog. The analysis of the behavior of the different parameterized physical processes suggests that the subtle balance between the various processes is achieved.

  12. Offering Global Collaboration Services beyond CERN and HEP

    NASA Astrophysics Data System (ADS)

    Fernandes, J.; Ferreira, P.; Baron, T.

    2015-12-01

    The CERN IT department has built over the years a performant and integrated ecosystem of collaboration tools, from videoconference and webcast services to event management software. These services have been designed and evolved in very close collaboration with the various communities surrounding the laboratory and have been massively adopted by CERN users. To cope with this very heavy usage, global infrastructures have been deployed which take full advantage of CERN's international and global nature. While these services and tools are instrumental in enabling the worldwide collaboration which generates major HEP breakthroughs, they would certainly also benefit other sectors of science in which globalization has already taken place. Some of these services are driven by commercial software (Vidyo or Wowza, for example); others have been developed internally and have already been made available to the world as open source software, in line with CERN's spirit and mission. Indico, for example, is now installed in 100+ institutes worldwide. But providing the software is often not enough, and institutes, collaborations and project teams do not always possess the expertise or the human or material resources needed to set up and maintain such services. Regional and national institutions have to answer needs which are increasingly global and which often exceed their operational capabilities or organizational mandate, and so they are looking at existing worldwide service offers such as CERN's. We believe that the accumulated experience obtained through the operation of a large-scale worldwide collaboration service, combined with CERN's global network and its recently deployed Agile Infrastructure, would allow the Organization to set up and operate collaborative services, such as Indico and Vidyo, at a much larger scale and on behalf of worldwide research and education institutions, and thus answer these pressing demands while optimizing resources at a global level. 
Such services would be built over a robust and massively scalable Indico server to which the concept of communities would be added, and which would then serve as a hub for accessing other collaboration services such as Vidyo, on the same simple and successful model currently in place for CERN users. This talk will describe this vision, its benefits and the steps that have already been taken to make it come to life.

  13. Recent data for p̄p at the Tevatron and odderon description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fazal-e-Aleem; Ali, M.; Rashid, H.

    1991-06-01

    The experimental data for pp and p̄p at √s = 53 GeV show a difference between the differential cross sections in the dip region. This prompted the need for a crossing-odd amplitude even at this energy. Further support for this idea was provided by the order-of-magnitude rise of the measured pp differential cross section in the dip region as one goes from ISR to CERN collider energies. In order to overcome the difficulty of explaining these phenomena, Gauron et al. used the idea of an odderon in addition to the pomeron and explained the then-available data for pp and p̄p. A dynamical origin for the idea of an odderon was later provided by Islam. He has pointed out that, if the nucleon consists of a core of valence quarks surrounded by a cloud of quark-antiquark pairs, then in elastic scattering an odderon amplitude occurs when the cores interact by exchanging a J = 1, C = −1, uū + dd̄ state and the cloud undergoes maximal diffraction scattering. The model has recently been modified by these authors so as to fit the very recent p̄p data at 546 GeV and make predictions at 1.8 TeV. The same idea was also used by Barnbard et al. to explain the pp and p̄p data; Jankovszky et al. have also fitted the data for p(p̄)p by employing the odderon in conjunction with the dipole pomeron. In this paper the authors compare the results of these models with the most recent measurements at the Tevatron and also with those of other models.
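
    The need for a crossing-odd amplitude can be stated compactly. Decomposing the elastic amplitude into crossing-even and crossing-odd parts (standard notation, not the parametrization of any particular model cited above):

```latex
F_{pp}(s,t) = F_{+}(s,t) + F_{-}(s,t), \qquad
F_{\bar{p}p}(s,t) = F_{+}(s,t) - F_{-}(s,t), \qquad
\frac{d\sigma}{dt} \propto \lvert F(s,t)\rvert^{2},
```

    so any difference between the pp and p̄p differential cross sections, such as the one seen in the dip region at √s = 53 GeV, requires a nonzero C = −1 component F₋; the odderon is such a component that survives at high energy.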

  14. Role of the ATLAS Grid Information System (AGIS) in Distributed Data Analysis and Simulation

    NASA Astrophysics Data System (ADS)

    Anisenkov, A. V.

    2018-03-01

    In modern high-energy physics experiments, particular attention is paid to the global integration of information and computing resources into a unified system for efficient storage and processing of experimental data. Annually, the ATLAS experiment performed at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) produces tens of petabytes of raw data from the recording electronics and several petabytes of data from the simulation system. For processing and storage of such super-large volumes of data, the computing model of the ATLAS experiment is based on a heterogeneous, geographically distributed computing environment, which includes the Worldwide LHC Computing Grid (WLCG) infrastructure and is able to meet the requirements of the experiment for processing huge data sets and to provide a high degree of their accessibility (hundreds of petabytes). The paper considers the ATLAS Grid Information System (AGIS) used by the ATLAS collaboration to describe the topology and resources of the computing infrastructure, to configure and connect the high-level software systems of computer centers, and to describe and store all possible parameters, control, configuration, and other auxiliary information required for the effective operation of the ATLAS distributed computing applications and services. The role of the AGIS system in unifying the description of the computing resources provided by grid sites, supercomputer centers, and cloud computing platforms into a consistent information model for the ATLAS experiment is outlined. This approach has allowed the collaboration to extend the computing capabilities of the WLCG project and to integrate supercomputers and cloud computing platforms into the software components of the Production and Distributed Analysis workload management system (PanDA) of ATLAS.
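
    The kind of unified description such an information system maintains, sites exposing processing queues that a workload manager can dispatch to, can be illustrated with a minimal mock-up. The field names and entries below are illustrative only and do not reflect the actual AGIS schema:

```python
# Minimal sketch of an AGIS-style information model: each site entry
# describes processing queues that a workload management system such as
# PanDA could dispatch to. All field names and values are illustrative.
sites = [
    {"name": "CERN-PROD", "cloud": "CERN",
     "queues": [{"name": "CERN-PROD_UCORE", "corecount": 8, "state": "ACTIVE"}]},
    {"name": "BNL-ATLAS", "cloud": "US",
     "queues": [{"name": "BNL_PROD", "corecount": 1, "state": "OFFLINE"}]},
]

def active_queues(sites):
    """Return the names of queues currently available for dispatch."""
    return [q["name"] for s in sites for q in s["queues"]
            if q["state"] == "ACTIVE"]

print(active_queues(sites))  # prints ['CERN-PROD_UCORE']
```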

  15. Aircraft Weather Mitigation for the Next Generation Air Transportation System

    NASA Technical Reports Server (NTRS)

    Stough, H. Paul, III

    2007-01-01

    Atmospheric effects on aviation are described by Mahapatra (1999) as including (1) atmospheric phenomena involving air motion - wind shear and turbulence; (2) hydrometeorological phenomena - rain, snow and hail; (3) aircraft icing; (4) low visibility; and (5) atmospheric electrical phenomena. Aircraft weather mitigation includes aircraft systems (e.g. airframe, propulsion, avionics, controls) that can be enacted (by a pilot, automation or hybrid systems) to suppress and/or prepare for the effects of encountered or unavoidable weather, or to facilitate a crew's operational decision-making process relative to weather. Aircraft weather mitigation can be thought of as a continuum (Figure 1) with the need to avoid all adverse weather at one extreme and the ability to safely operate in all weather conditions at the other extreme. Realistic aircraft capabilities fall somewhere between these two extremes. The capabilities of small general aviation aircraft would be expected to fall closer to the "Avoid All Adverse Weather" point, and the capabilities of large commercial jet transports would fall closer to the "Operate in All Weather Conditions" point. The ability to safely operate in adverse weather conditions depends upon the pilot's capabilities (training, total experience and recent experience), the airspace in which the operation is taking place (terrain, navigational aids, traffic separation), the capabilities of the airport (approach guidance, runway and taxiway lighting, availability of air traffic control), as well as the capabilities of the airplane. The level of mitigation may vary depending upon the type of adverse weather. For example, a small general aviation airplane may be equipped to operate "in the clouds" without outside visual references, but not be equipped to prevent airframe ice that could be accreted in those clouds.

  16. Tracing Star Formation in the Outskirts of the Milky Way

    NASA Astrophysics Data System (ADS)

    Casetti, Dana

    Discovery of the presence of young stars in the Leading Arm of the Magellanic Stream and in the periphery of the Large Magellanic Cloud (Casetti-Dinescu et al. 2014, Moni Bidin et al. 2017) poses a fundamental question as to how star formation can occur in intergalactic space within an environment of very low gas density. Recent models indicate that the hydrodynamical interaction with the gaseous component of the Milky Way may be of significant importance in shaping the Leading Arm of the Magellanic Stream; however, models are still poorly constrained due to a lack of observational data. The existence of such stars is crucial as it informs on both star formation and the Clouds' interaction with one another and with the Milky Way. Moreover, stars, as opposed to gas, provide secure distances to constrain the interactions. In the discovery of these young stars, the GALEX UV mission played the key role in selecting potential candidates. Together with infrared photometry from 2MASS and optical V from ground-based data, our team developed a method to select such candidates that were then followed up with spectroscopy (Casetti-Dinescu et al. 2012). This pilot study demonstrated that, with large sky coverage, our team could explore significant portions of the Magellanic Stream, whereas previously only regions adjacent to the Clouds had been studied. Still, the pilot study was limited to the southern sky (Dec. ≤ -20°). Here, we propose to recreate a young-star candidate list using two completed NASA space missions: the recently updated GALEX (DR6plus7) and the infrared WISE missions. Together with optical photometry from Gaia DR1 (and/or PanSTARRS), we will increase the sample of candidate OB-type stars by exploring a volume of space over four times that of our previous, pilot study. 
The area coverage for the proposed new study will be the entire sky; previous spatial gaps in earlier versions of GALEX are now filled in, and the depth of the study will increase by 0.3 to 0.5 magnitudes due to use of AllWISE. By covering the entire sky, we will be able to explore the presence (or lack thereof) of such stars diametrically opposite to the LA, where it is inferred the Magellanic Stream is crossing the Galactic plane a second time, if the Clouds have had two pericenter passages about the Galaxy. Alternatively, we may find entirely new structure at the edge of the Galactic disk, related to interactions with other yet-unknown Milky-Way satellites, or due to ejection mechanisms from OB associations in the disk. Star-forming regions as informed from OB-type stars have been studied in our Galaxy and in external galaxies, in well-known gas-rich regions. The novelty of our study is that it is designed to find such stars in unexpected regions by exploring the entire sky. It is noted that within the time frame of this proposal, Gaia data release 2 will become available; therefore, with these candidates having already been identified, we will be able to further investigate their distances and kinematics. Our list of candidates will be made publicly available for follow-up spectroscopic studies.
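
    Photometric preselection of hot-star candidates of the kind described above can be sketched as a simple colour cut: OB stars are UV-bright relative to V and blue in the infrared. The thresholds and field names below are illustrative placeholders, not the actual cuts used by the team:

```python
def select_ob_candidates(catalog):
    """Keep sources that are UV-bright relative to V and blue in the infrared.
    The cuts (< 0.5 mag in NUV-V, < 0.2 mag in J-K) are placeholders only."""
    return [s["id"] for s in catalog
            if (s["nuv"] - s["v"]) < 0.5 and (s["j"] - s["k"]) < 0.2]

# Toy catalogue: one hot, UV-bright star and one cool red giant.
stars = [
    {"id": "hot1", "nuv": 14.8, "v": 14.5, "j": 14.2, "k": 14.1},
    {"id": "red1", "nuv": 19.0, "v": 13.0, "j": 10.5, "k": 9.6},
]
candidates = select_ob_candidates(stars)  # -> ["hot1"]
```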

  17. The ATLAS Experiment at the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    ATLAS Collaboration; Aad, G.; Abat, E.; Abdallah, J.; Abdelalim, A. A.; Abdesselam, A.; Abdinov, O.; Abi, B. A.; Abolins, M.; Abramowicz, H.; Acerbi, E.; Acharya, B. S.; Achenbach, R.; Ackers, M.; Adams, D. L.; Adamyan, F.; Addy, T. N.; Aderholz, M.; Adorisio, C.; Adragna, P.; Aharrouche, M.; Ahlen, S. P.; Ahles, F.; Ahmad, A.; Ahmed, H.; Aielli, G.; Åkesson, P. F.; Åkesson, T. P. A.; Akimov, A. V.; Alam, S. M.; Albert, J.; Albrand, S.; Aleksa, M.; Aleksandrov, I. N.; Aleppo, M.; Alessandria, F.; Alexa, C.; Alexander, G.; Alexopoulos, T.; Alimonti, G.; Aliyev, M.; Allport, P. P.; Allwood-Spiers, S. E.; Aloisio, A.; Alonso, J.; Alves, R.; Alviggi, M. G.; Amako, K.; Amaral, P.; Amaral, S. P.; Ambrosini, G.; Ambrosio, G.; Amelung, C.; Ammosov, V. V.; Amorim, A.; Amram, N.; Anastopoulos, C.; Anderson, B.; Anderson, K. J.; Anderssen, E. C.; Andreazza, A.; Andrei, V.; Andricek, L.; Andrieux, M.-L.; Anduaga, X. S.; Anghinolfi, F.; Antonaki, A.; Antonelli, M.; Antonelli, S.; Apsimon, R.; Arabidze, G.; Aracena, I.; Arai, Y.; Arce, A. T. H.; Archambault, J. P.; Arguin, J.-F.; Arik, E.; Arik, M.; Arms, K. E.; Armstrong, S. R.; Arnaud, M.; Arnault, C.; Artamonov, A.; Asai, S.; Ask, S.; Åsman, B.; Asner, D.; Asquith, L.; Assamagan, K.; Astbury, A.; Athar, B.; Atkinson, T.; Aubert, B.; Auerbach, B.; Auge, E.; Augsten, K.; Aulchenko, V. M.; Austin, N.; Avolio, G.; Avramidou, R.; Axen, A.; Ay, C.; Azuelos, G.; Baccaglioni, G.; Bacci, C.; Bachacou, H.; Bachas, K.; Bachy, G.; Badescu, E.; Bagnaia, P.; Bailey, D. C.; Baines, J. T.; Baker, O. K.; Ballester, F.; Baltasar Dos Santos Pedrosa, F.; Banas, E.; Banfi, D.; Bangert, A.; Bansal, V.; Baranov, S. P.; Baranov, S.; Barashkou, A.; Barberio, E. L.; Barberis, D.; Barbier, G.; Barclay, P.; Bardin, D. Y.; Bargassa, P.; Barillari, T.; Barisonzi, M.; Barnett, B. M.; Barnett, R. M.; Baron, S.; Baroncelli, A.; Barone, M.; Barr, A. 
J.; Barreiro, F.; Barreiro Guimarães da Costa, J.; Barrillon, P.; Barriuso Poy, A.; Barros, N.; Bartheld, V.; Bartko, H.; Bartoldus, R.; Basiladze, S.; Bastos, J.; Batchelor, L. E.; Bates, R. L.; Batley, J. R.; Batraneanu, S.; Battistin, M.; Battistoni, G.; Batusov, V.; Bauer, F.; Bauss, B.; Baynham, D. E.; Bazalova, M.; Bazan, A.; Beauchemin, P. H.; Beaugiraud, B.; Beccherle, R. B.; Beck, G. A.; Beck, H. P.; Becks, K. H.; Bedajanek, I.; Beddall, A. J.; Beddall, A.; Bednár, P.; Bednyakov, V. A.; Bee, C.; Behar Harpaz, S.; Belanger, G. A. N.; Belanger-Champagne, C.; Belhorma, B.; Bell, P. J.; Bell, W. H.; Bella, G.; Bellachia, F.; Bellagamba, L.; Bellina, F.; Bellomo, G.; Bellomo, M.; Beltramello, O.; Belymam, A.; Ben Ami, S.; Ben Moshe, M.; Benary, O.; Benchekroun, D.; Benchouk, C.; Bendel, M.; Benedict, B. H.; Benekos, N.; Benes, J.; Benhammou, Y.; Benincasa, G. P.; Benjamin, D. P.; Bensinger, J. R.; Benslama, K.; Bentvelsen, S.; Beretta, M.; Berge, D.; Bergeaas, E.; Berger, N.; Berghaus, F.; Berglund, S.; Bergsma, F.; Beringer, J.; Bernabéu, J.; Bernardet, K.; Berriaud, C.; Berry, T.; Bertelsen, H.; Bertin, A.; Bertinelli, F.; Bertolucci, S.; Besson, N.; Beteille, A.; Bethke, S.; Bialas, W.; Bianchi, R. M.; Bianco, M.; Biebel, O.; Bieri, M.; Biglietti, M.; Bilokon, H.; Binder, M.; Binet, S.; Bingefors, N.; Bingul, A.; Bini, C.; Biscarat, C.; Bischof, R.; Bischofberger, M.; Bitadze, A.; Bizzell, J. P.; Black, K. M.; Blair, R. E.; Blaising, J. J.; Blanch, O.; Blanchot, G.; Blocker, C.; Blocki, J.; Blondel, A.; Blum, W.; Blumenschein, U.; Boaretto, C.; Bobbink, G. J.; Bocci, A.; Bocian, D.; Bock, R.; Boehm, M.; Boek, J.; Bogaerts, J. A.; Bogouch, A.; Bohm, C.; Bohm, J.; Boisvert, V.; Bold, T.; Boldea, V.; Bondarenko, V. G.; Bonino, R.; Bonis, J.; Bonivento, W.; Bonneau, P.; Boonekamp, M.; Boorman, G.; Boosten, M.; Booth, C. N.; Booth, P. S. L.; Booth, P.; Booth, J. R. 
A.; Borer, K.; Borisov, A.; Borjanovic, I.; Bos, K.; Boscherini, D.; Bosi, F.; Bosman, M.; Bosteels, M.; Botchev, B.; Boterenbrood, H.; Botterill, D.; Boudreau, J.; Bouhova-Thacker, E. V.; Boulahouache, C.; Bourdarios, C.; Boutemeur, M.; Bouzakis, K.; Boyd, G. R.; Boyd, J.; Boyer, B. H.; Boyko, I. R.; Bozhko, N. I.; Braccini, S.; Braem, A.; Branchini, P.; Brandenburg, G. W.; Brandt, A.; Brandt, O.; Bratzler, U.; Braun, H. M.; Bravo, S.; Brawn, I. P.; Brelier, B.; Bremer, J.; Brenner, R.; Bressler, S.; Breton, D.; Brett, N. D.; Breugnon, P.; Bright-Thomas, P. G.; Brochu, F. M.; Brock, I.; Brock, R.; Brodbeck, T. J.; Brodet, E.; Broggi, F.; Broklova, Z.; Bromberg, C.; Brooijmans, G.; Brouwer, G.; Broz, J.; Brubaker, E.; Bruckman de Renstrom, P. A.; Bruncko, D.; Bruni, A.; Bruni, G.; Bruschi, M.; Buanes, T.; Buchanan, N. J.; Buchholz, P.; Budagov, I. A.; Büscher, V.; Bugge, L.; Buira-Clark, D.; Buis, E. J.; Bujor, F.; Buran, T.; Burckhart, H.; Burckhart-Chromek, D.; Burdin, S.; Burns, R.; Busato, E.; Buskop, J. J. F.; Buszello, K. P.; Butin, F.; Butler, J. M.; Buttar, C. M.; Butterworth, J.; Butterworth, J. M.; Byatt, T.; Cabrera Urbán, S.; Cabruja Casas, E.; Caccia, M.; Caforio, D.; Cakir, O.; Calafiura, P.; Calderini, G.; Calderón Terol, D.; Callahan, J.; Caloba, L. P.; Caloi, R.; Calvet, D.; Camard, A.; Camarena, F.; Camarri, P.; Cambiaghi, M.; Cameron, D.; Cammin, J.; Campabadal Segura, F.; Campana, S.; Canale, V.; Cantero, J.; Capeans Garrido, M. D. M.; Caprini, I.; Caprini, M.; Caprio, M.; Caracinha, D.; Caramarcu, C.; Carcagno, Y.; Cardarelli, R.; Cardeira, C.; Cardiel Sas, L.; Cardini, A.; Carli, T.; Carlino, G.; Carminati, L.; Caron, B.; Caron, S.; Carpentieri, C.; Carr, F. S.; Carter, A. A.; Carter, J. R.; Carvalho, J.; Casadei, D.; Casado, M. P.; Cascella, M.; Caso, C.; Castelo, J.; Castillo Gimenez, V.; Castro, N.; Castrovillari, F.; Cataldi, G.; Cataneo, F.; Catinaccio, A.; Catmore, J. 
R.; Cattai, A.; Caughron, S.; Cauz, D.; Cavallari, A.; Cavalleri, P.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Ceradini, F.; Cerna, C.; Cernoch, C.; Cerqueira, A. S.; Cerri, A.; Cerutti, F.; Cervetto, M.; Cetin, S. A.; Cevenini, F.; Chalifour, M.; Chamizo llatas, M.; Chan, A.; Chapman, J. W.; Charlton, D. G.; Charron, S.; Chekulaev, S. V.; Chelkov, G. A.; Chen, H.; Chen, L.; Chen, T.; Chen, X.; Cheng, S.; Cheng, T. L.; Cheplakov, A.; Chepurnov, V. F.; Cherkaoui El Moursli, R.; Chesneanu, D.; Cheu, E.; Chevalier, L.; Chevalley, J. L.; Chevallier, F.; Chiarella, V.; Chiefari, G.; Chikovani, L.; Chilingarov, A.; Chiodini, G.; Chouridou, S.; Chren, D.; Christiansen, T.; Christidi, I. A.; Christov, A.; Chu, M. L.; Chudoba, J.; Chuguev, A. G.; Ciapetti, G.; Cicalini, E.; Ciftci, A. K.; Cindro, V.; Ciobotaru, M. D.; Ciocio, A.; Cirilli, M.; Citterio, M.; Ciubancan, M.; Civera, J. V.; Clark, A.; Cleland, W.; Clemens, J. C.; Clement, B. C.; Clément, C.; Clements, D.; Clifft, R. W.; Cobal, M.; Coccaro, A.; Cochran, J.; Coco, R.; Coe, P.; Coelli, S.; Cogneras, E.; Cojocaru, C. D.; Colas, J.; Colijn, A. P.; Collard, C.; Collins-Tooth, C.; Collot, J.; Coluccia, R.; Comune, G.; Conde Muiño, P.; Coniavitis, E.; Consonni, M.; Constantinescu, S.; Conta, C.; Conventi, F. A.; Cook, J.; Cooke, M.; Cooper-Smith, N. J.; Cornelissen, T.; Corradi, M.; Correard, S.; Corso-Radu, A.; Coss, J.; Costa, G.; Costa, M. J.; Costanzo, D.; Costin, T.; Coura Torres, R.; Courneyea, L.; Couyoumtzelis, C.; Cowan, G.; Cox, B. E.; Cox, J.; Cragg, D. A.; Cranmer, K.; Cranshaw, J.; Cristinziani, M.; Crosetti, G.; Cuenca Almenar, C.; Cuneo, S.; Cunha, A.; Curatolo, M.; Curtis, C. J.; Cwetanski, P.; Czyczula, Z.; D'Auria, S.; D'Onofrio, M.; Da Rocha Gesualdi Mello, A.; Da Silva, P. V. M.; Da Silva, R.; Dabrowski, W.; Dael, A.; Dahlhoff, A.; Dai, T.; Dallapiccola, C.; Dallison, S. J.; Dalmau, J.; Daly, C. H.; Dam, M.; Damazio, D.; Dameri, M.; Danielsen, K. M.; Danielsson, H. 
O.; Dankers, R.; Dannheim, D.; Darbo, G.; Dargent, P.; Daum, C.; Dauvergne, J. P.; David, M.; Davidek, T.; Davidson, N.; Davidson, R.; Dawson, I.; Dawson, J. W.; Daya, R. K.; De, K.; de Asmundis, R.; de Boer, R.; DeCastro, S.; DeGroot, N.; de Jong, P.; de La Broise, X.; DeLa Cruz-Burelo, E.; DeLa Taille, C.; DeLotto, B.; DeOliveira Branco, M.; DePedis, D.; de Saintignon, P.; DeSalvo, A.; DeSanctis, U.; DeSanto, A.; DeVivie DeRegie, J. B.; DeZorzi, G.; Dean, S.; Dedes, G.; Dedovich, D. V.; Defay, P. O.; Degele, R.; Dehchar, M.; Deile, M.; DelPapa, C.; DelPeso, J.; DelPrete, T.; Delagnes, E.; Delebecque, P.; Dell'Acqua, A.; Della Pietra, M.; della Volpe, D.; Delmastro, M.; Delpierre, P.; Delruelle, N.; Delsart, P. A.; Deluca Silberberg, C.; Demers, S.; Demichev, M.; Demierre, P.; Demirköz, B.; Deng, W.; Denisov, S. P.; Dennis, C.; Densham, C. J.; Dentan, M.; Derkaoui, J. E.; Derue, F.; Dervan, P.; Desch, K. K.; Dewhurst, A.; Di Ciaccio, A.; Di Ciaccio, L.; Di Domenico, A.; Di Girolamo, A.; Di Girolamo, B.; Di Luise, S.; Di Mattia, A.; Di Simone, A.; Diaz Gomez, M. M.; Diehl, E. B.; Dietl, H.; Dietrich, J.; Dietsche, W.; Diglio, S.; Dima, M.; Dindar, K.; Dinkespiler, B.; Dionisi, C.; Dipanjan, R.; Dita, P.; Dita, S.; Dittus, F.; Dixon, S. D.; Djama, F.; Djilkibaev, R.; Djobava, T.; do Vale, M. A. B.; Dobbs, M.; Dobinson, R.; Dobos, D.; Dobson, E.; Dobson, M.; Dodd, J.; Dogan, O. B.; Doherty, T.; Doi, Y.; Dolejsi, J.; Dolenc, I.; Dolezal, Z.; Dolgoshein, B. A.; Domingo, E.; Donega, M.; Dopke, J.; Dorfan, D. E.; Dorholt, O.; Doria, A.; Dos Anjos, A.; Dosil, M.; Dotti, A.; Dova, M. T.; Dowell, J. D.; Doyle, A. T.; Drake, G.; Drakoulakos, D.; Drasal, Z.; Drees, J.; Dressnandt, N.; Drevermann, H.; Driouichi, C.; Dris, M.; Drohan, J. G.; Dubbert, J.; Dubbs, T.; Duchovni, E.; Duckeck, G.; Dudarev, A.; Dührssen, M.; Dür, H.; Duerdoth, I. 
P.; Duffin, S.; Duflot, L.; Dufour, M.-A.; Dumont Dayot, N.; Duran Yildiz, H.; Durand, D.; Dushkin, A.; Duxfield, R.; Dwuznik, M.; Dydak, F.; Dzahini, D.; Díez Cornell, S.; Düren, M.; Ebenstein, W. L.; Eckert, S.; Eckweiler, S.; Eerola, P.; Efthymiopoulos, I.; Egede, U.; Egorov, K.; Ehrenfeld, W.; Eifert, T.; Eigen, G.; Einsweiler, K.; Eisenhandler, E.; Ekelof, T.; Eklund, L. M.; El Kacimi, M.; Ellert, M.; Elles, S.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Ely, R.; Emeliyanov, D.; Engelmann, R.; Engström, M.; Ennes, P.; Epp, B.; Eppig, A.; Epshteyn, V. S.; Ereditato, A.; Eremin, V.; Eriksson, D.; Ermoline, I.; Ernwein, J.; Errede, D.; Errede, S.; Escalier, M.; Escobar, C.; Espinal Curull, X.; Esposito, B.; Esteves, F.; Etienne, F.; Etienvre, A. I.; Etzion, E.; Evans, H.; Evdokimov, V. N.; Evtoukhovitch, P.; Eyring, A.; Fabbri, L.; Fabjan, C. W.; Fabre, C.; Faccioli, P.; Facius, K.; Fadeyev, V.; Fakhrutdinov, R. M.; Falciano, S.; Falleau, I.; Falou, A. C.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farrell, J.; Farthouat, P.; Fasching, D.; Fassi, F.; Fassnacht, P.; Fassouliotis, D.; Fawzi, F.; Fayard, L.; Fayette, F.; Febbraro, R.; Fedin, O. L.; Fedorko, I.; Feld, L.; Feldman, G.; Feligioni, L.; Feng, C.; Feng, E. J.; Fent, J.; Fenyuk, A. B.; Ferencei, J.; Ferguson, D.; Ferland, J.; Fernando, W.; Ferrag, S.; Ferrari, A.; Ferrari, P.; Ferrari, R.; Ferrer, A.; Ferrer, M. L.; Ferrere, D.; Ferretti, C.; Ferro, F.; Fiascaris, M.; Fichet, S.; Fiedler, F.; Filimonov, V.; Filipčič, A.; Filippas, A.; Filthaut, F.; Fincke-Keeler, M.; Finocchiaro, G.; Fiorini, L.; Firan, A.; Fischer, P.; Fisher, M. J.; Fisher, S. M.; Flaminio, V.; Flammer, J.; Flechl, M.; Fleck, I.; Flegel, W.; Fleischmann, P.; Fleischmann, S.; Fleta Corral, C. M.; Fleuret, F.; Flick, T.; Flix, J.; Flores Castillo, L. R.; Flowerdew, M. J.; Föhlisch, F.; Fokitis, M.; Fonseca Martin, T. M.; Fopma, J.; Forbush, D. A.; Formica, A.; Foster, J. M.; Fournier, D.; Foussat, A.; Fowler, A. 
J.; Fox, H.; Francavilla, P.; Francis, D.; Franz, S.; Fraser, J. T.; Fraternali, M.; Fratianni, S.; Freestone, J.; French, R. S.; Fritsch, K.; Froidevaux, D.; Frost, J. A.; Fukunaga, C.; Fulachier, J.; Fullana Torregrosa, E.; Fuster, J.; Gabaldon, C.; Gadomski, S.; Gagliardi, G.; Gagnon, P.; Gallas, E. J.; Gallas, M. V.; Gallop, B. J.; Gan, K. K.; Gannaway, F. C.; Gao, Y. S.; Gapienko, V. A.; Gaponenko, A.; Garciá, C.; Garcia-Sciveres, M.; Garcìa Navarro, J. E.; Garde, V.; Gardner, R. W.; Garelli, N.; Garitaonandia, H.; Garonne, V. G.; Garvey, J.; Gatti, C.; Gaudio, G.; Gaumer, O.; Gautard, V.; Gauzzi, P.; Gavrilenko, I. L.; Gay, C.; Gayde, J.-C.; Gazis, E. N.; Gazo, E.; Gee, C. N. P.; Geich-Gimbel, C.; Gellerstedt, K.; Gemme, C.; Genest, M. H.; Gentile, S.; George, M. A.; George, S.; Gerlach, P.; Gernizky, Y.; Geweniger, C.; Ghazlane, H.; Ghete, V. M.; Ghez, P.; Ghodbane, N.; Giacobbe, B.; Giagu, S.; Giakoumopoulou, V.; Giangiobbe, V.; Gianotti, F.; Gibbard, B.; Gibson, A.; Gibson, M. D.; Gibson, S. M.; Gieraltowski, G. F.; Gil Botella, I.; Gilbert, L. M.; Gilchriese, M.; Gildemeister, O.; Gilewsky, V.; Gillman, A. R.; Gingrich, D. M.; Ginzburg, J.; Giokaris, N.; Giordani, M. P.; Girard, C. G.; Giraud, P. F.; Girtler, P.; Giugni, D.; Giusti, P.; Gjelsten, B. K.; Glasman, C.; Glazov, A.; Glitza, K. W.; Glonti, G. L.; Gnanvo, K. G.; Godlewski, J.; Göpfert, T.; Gössling, C.; Göttfert, T.; Goldfarb, S.; Goldin, D.; Goldschmidt, N.; Golling, T.; Gollub, N. P.; Golonka, P. J.; Golovnia, S. N.; Gomes, A.; Gomes, J.; Gonçalo, R.; Gongadze, A.; Gonidec, A.; Gonzalez, S.; González de la Hoz, S.; González Millán, V.; Gonzalez Silva, M. L.; Gonzalez-Pineiro, B.; González-Sevilla, S.; Goodrick, M. J.; Goodson, J. J.; Goossens, L.; Gorbounov, P. A.; Gordeev, A.; Gordon, H.; Gorelov, I.; Gorfine, G.; Gorini, B.; Gorini, E.; Gorišek, A.; Gornicki, E.; Gorokhov, S. A.; Gorski, B. T.; Goryachev, S. V.; Goryachev, V. N.; Gosselink, M.; Gostkin, M. 
I.; Gouanère, M.; Gough Eschrich, I.; Goujdami, D.; Goulette, M.; Gousakov, I.; Gouveia, J.; Gowdy, S.; Goy, C.; Grabowska-Bold, I.; Grabski, V.; Grafström, P.; Grah, C.; Grahn, K.-J.; Grancagnolo, F.; Grancagnolo, S.; Grassmann, H.; Gratchev, V.; Gray, H. M.; Graziani, E.; Green, B.; Greenall, A.; Greenfield, D.; Greenwood, D.; Gregor, I. M.; Grewal, A.; Griesmayer, E.; Grigalashvili, N.; Grigson, C.; Grillo, A. A.; Grimaldi, F.; Grimm, K.; Gris, P. L. Y.; Grishkevich, Y.; Groenstege, H.; Groer, L. S.; Grognuz, J.; Groh, M.; Gross, E.; Grosse-Knetter, J.; Grothe, M. E. M.; Grudzinski, J.; Gruse, C.; Gruwe, M.; Grybel, K.; Grybos, P.; Gschwendtner, E. M.; Guarino, V. J.; Guicheney, C. J.; Guilhem, G.; Guillemin, T.; Gunther, J.; Guo, B.; Gupta, A.; Gurriana, L.; Gushchin, V. N.; Gutierrez, P.; Guy, L.; Guyot, C.; Gwenlan, C.; Gwilliam, C. B.; Haas, A.; Haas, S.; Haber, C.; Haboubi, G.; Hackenburg, R.; Hadash, E.; Hadavand, H. K.; Haeberli, C.; Härtel, R.; Haggerty, R.; Hahn, F.; Haider, S.; Hajduk, Z.; Hakimi, M.; Hakobyan, H.; Hakobyan, H.; Haller, J.; Hallewell, G. D.; Hallgren, B.; Hamacher, K.; Hamilton, A.; Han, H.; Han, L.; Hanagaki, K.; Hance, M.; Hanke, P.; Hansen, C. J.; Hansen, F. H.; Hansen, J. R.; Hansen, J. B.; Hansen, J. D.; Hansen, P. H.; Hansl-Kozanecka, T.; Hanson, G.; Hansson, P.; Hara, K.; Harder, S.; Harel, A.; Harenberg, T.; Harper, R.; Hart, J. C.; Hart, R. G. G.; Hartjes, F.; Hartman, N.; Haruyama, T.; Harvey, A.; Hasegawa, Y.; Hashemi, K.; Hassani, S.; Hatch, M.; Hatley, R. W.; Haubold, T. G.; Hauff, D.; Haug, F.; Haug, S.; Hauschild, M.; Hauser, R.; Hauviller, C.; Havranek, M.; Hawes, B. M.; Hawkings, R. J.; Hawkins, D.; Hayler, T.; Hayward, H. S.; Haywood, S. J.; Hazen, E.; He, M.; He, Y. P.; Head, S. J.; Hedberg, V.; Heelan, L.; Heinemann, F. E. W.; Heldmann, M.; Hellman, S.; Helsens, C.; Henderson, R. C. W.; Hendriks, P. J.; Henriques Correia, A. 
M.; Henrot-Versille, S.; Henry-Couannier, F.; Henß, T.; Herten, G.; Hertenberger, R.; Hervas, L.; Hess, M.; Hessey, N. P.; Hicheur, A.; Hidvegi, A.; Higón-Rodriguez, E.; Hill, D.; Hill, J.; Hill, J. C.; Hill, N.; Hillier, S. J.; Hinchliffe, I.; Hindson, D.; Hinkelbein, C.; Hodges, T. A.; Hodgkinson, M. C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M. R.; Hoffman, J.; Hoffmann, A. E.; Hoffmann, D.; Hoffmann, H. F.; Holder, M.; Hollins, T. I.; Hollyman, G.; Holmes, A.; Holmgren, S. O.; Holt, R.; Holtom, E.; Holy, T.; Homer, R. J.; Homma, Y.; Homola, P.; Honerbach, W.; Honma, A.; Hooton, I.; Horazdovsky, T.; Horn, C.; Horvat, S.; Hostachy, J.-Y.; Hott, T.; Hou, S.; Houlden, M. A.; Hoummada, A.; Hover, J.; Howell, D. F.; Hrivnac, J.; Hruska, I.; Hryn'ova, T.; Huang, G. S.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Huffman, B. T.; Hughes, E.; Hughes, G.; Hughes-Jones, R. E.; Hulsbergen, W.; Hurst, P.; Hurwitz, M.; Huse, T.; Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Ibbotson, M.; Ibragimov, I.; Ichimiya, R.; Iconomidou-Fayard, L.; Idarraga, J.; Idzik, M.; Iengo, P.; Iglesias Escudero, M. C.; Igonkina, O.; Ikegami, Y.; Ikeno, M.; Ilchenko, Y.; Ilyushenka, Y.; Imbault, D.; Imbert, P.; Imhaeuser, M.; Imori, M.; Ince, T.; Inigo-Golfin, J.; Inoue, K.; Ioannou, P.; Iodice, M.; Ionescu, G.; Ishii, K.; Ishino, M.; Ishizawa, Y.; Ishmukhametov, R.; Issever, C.; Ito, H.; Ivashin, A. V.; Iwanski, W.; Iwasaki, H.; Izen, J. M.; Izzo, V.; Jackson, J.; Jackson, J. N.; Jaekel, M.; Jagielski, S.; Jahoda, M.; Jain, V.; Jakobs, K.; Jakubek, J.; Jansen, E.; Jansweijer, P. P. M.; Jared, R. C.; Jarlskog, G.; Jarp, S.; Jarron, P.; Jelen, K.; Jen-La Plante, I.; Jenni, P.; Jeremie, A.; Jez, P.; Jézéquel, S.; Jiang, Y.; Jin, G.; Jin, S.; Jinnouchi, O.; Joffe, D.; Johansen, L. G.; Johansen, M.; Johansson, K. E.; Johansson, P.; Johns, K. A.; Jon-And, K.; Jones, M.; Jones, R.; Jones, R. W. L.; Jones, T. W.; Jones, T. J.; Jones, A.; Jonsson, O.; Joo, K. 
K.; Joos, D.; Joos, M.; Joram, C.; Jorgensen, S.; Joseph, J.; Jovanovic, P.; Junnarkar, S. S.; Juranek, V.; Jussel, P.; Kabachenko, V. V.; Kabana, S.; Kaci, M.; Kaczmarska, A.; Kado, M.; Kagan, H.; Kagawa, S.; Kaiser, S.; Kajomovitz, E.; Kakurin, S.; Kalinovskaya, L. V.; Kama, S.; Kambara, H.; Kanaya, N.; Kandasamy, A.; Kandasamy, S.; Kaneda, M.; Kantserov, V. A.; Kanzaki, J.; Kaplan, B.; Kapliy, A.; Kaplon, J.; Karagounis, M.; Karagoz Unel, M.; Karr, K.; Karst, P.; Kartvelishvili, V.; Karyukhin, A. N.; Kashif, L.; Kasmi, A.; Kass, R. D.; Kastanas, A.; Kataoka, M.; Kataoka, Y.; Katsoufis, E.; Katunin, S.; Kawagoe, K.; Kawai, M.; Kawamoto, T.; Kayumov, F.; Kazanin, V. A.; Kazarinov, M. Y.; Kazarov, A.; Kazi, S. I.; Keates, J. R.; Keeler, R.; Keener, P. T.; Kehoe, R.; Keil, M.; Kekelidze, G. D.; Kelly, M.; Kennedy, J.; Kenyon, M.; Kepka, O.; Kerschen, N.; Kerševan, B. P.; Kersten, S.; Ketterer, C.; Khakzad, M.; Khalilzade, F.; Khandanyan, H.; Khanov, A.; Kharchenko, D.; Khodinov, A.; Kholodenko, A. G.; Khomich, A.; Khomutnikov, V. P.; Khoriauli, G.; Khovanskiy, N.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kieft, G.; Kierstead, J. A.; Kilvington, G.; Kim, H.; Kim, H.; Kim, S. H.; Kind, P.; King, B. T.; Kirk, J.; Kirsch, G. P.; Kirsch, L. E.; Kiryunin, A. E.; Kisielewska, D.; Kisielewski, B.; Kittelmann, T.; Kiver, A. M.; Kiyamura, H.; Kladiva, E.; Klaiber-Lodewigs, J.; Kleinknecht, K.; Klier, A.; Klimentov, A.; Kline, C. R.; Klingenberg, R.; Klinkby, E. B.; Klioutchnikova, T.; Klok, P. F.; Klous, S.; Kluge, E.-E.; Kluit, P.; Klute, M.; Kluth, S.; Knecht, N. K.; Kneringer, E.; Knezo, E.; Knobloch, J.; Ko, B. R.; Kobayashi, T.; Kobel, M.; Kodys, P.; König, A. C.; König, S.; Köpke, L.; Koetsveld, F.; Koffas, T.; Koffeman, E.; Kohout, Z.; Kohriki, T.; Kokott, T.; Kolachev, G. M.; Kolanoski, H.; Kolesnikov, V.; Koletsou, I.; Kollefrath, M.; Kolos, S.; Kolya, S. D.; Komar, A. A.; Komaragiri, J. R.; Kondo, T.; Kondo, Y.; Kondratyeva, N. V.; Kono, T.; Kononov, A. 
I.; Konoplich, R.; Konovalov, S. P.; Konstantinidis, N.; Kootz, A.; Koperny, S.; Kopikov, S. V.; Korcyl, K.; Kordas, K.; Koreshev, V.; Korn, A.; Korolkov, I.; Korotkov, V. A.; Korsmo, H.; Kortner, O.; Kostrikov, M. E.; Kostyukhin, V. V.; Kotamäki, M. J.; Kotchetkov, D.; Kotov, S.; Kotov, V. M.; Kotov, K. Y.; Kourkoumelis, C.; Koutsman, A.; Kovalenko, S.; Kowalewski, R.; Kowalski, H.; Kowalski, T. Z.; Kozanecki, W.; Kozhin, A. S.; Kral, V.; Kramarenko, V.; Kramberger, G.; Kramer, A.; Krasel, O.; Krasny, M. W.; Krasznahorkay, A.; Krepouri, A.; Krieger, P.; Krivkova, P.; Krobath, G.; Kroha, H.; Krstic, J.; Kruchonak, U.; Krüger, H.; Kruger, K.; Krumshteyn, Z. V.; Kubik, P.; Kubischta, W.; Kubota, T.; Kudin, L. G.; Kudlaty, J.; Kugel, A.; Kuhl, T.; Kuhn, D.; Kukhtin, V.; Kulchitsky, Y.; Kundu, N.; Kupco, A.; Kupper, M.; Kurashige, H.; Kurchaninov, L. L.; Kurochkin, Y. A.; Kus, V.; Kuykendall, W.; Kuzhir, P.; Kuznetsova, E. K.; Kvasnicka, O.; Kwee, R.; La Marra, D.; La Rosa, M.; La Rotonda, L.; Labarga, L.; Labbe, J. A.; Lacasta, C.; Lacava, F.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Lamanna, E.; Lambacher, M.; Lambert, F.; Lampl, W.; Lancon, E.; Landgraf, U.; Landon, M. P. J.; Landsman, H.; Langstaff, R. R.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Lanza, A.; Lapin, V. V.; Laplace, S.; Laporte, J. F.; Lara, V.; Lari, T.; Larionov, A. V.; Lasseur, C.; Lau, W.; Laurelli, P.; Lavorato, A.; Lavrijsen, W.; Lazarev, A. B.; LeBihan, A.-C.; LeDortz, O.; LeManer, C.; LeVine, M.; Leahu, L.; Leahu, M.; Lebel, C.; Lechowski, M.; LeCompte, T.; Ledroit-Guillon, F.; Lee, H.; Lee, J. S. H.; Lee, S. C.; Lefebvre, M.; Lefevre, R. P.; Legendre, M.; Leger, A.; LeGeyt, B. C.; Leggett, C.; Lehmacher, M.; Lehmann Miotto, G.; Lehto, M.; Leitner, R.; Lelas, D.; Lellouch, D.; Leltchouk, M.; Lendermann, V.; Leney, K. J. C.; Lenz, T.; Lenzen, G.; Lepidis, J.; Leroy, C.; Lessard, J.-R.; Lesser, J.; Lester, C. 
G.; Letheren, M.; Fook Cheong, A. Leung; Levêque, J.; Levin, D.; Levinson, L. J.; Levitski, M. S.; Lewandowska, M.; Leyton, M.; Li, J.; Li, W.; Liabline, M.; Liang, Z.; Liang, Z.; Liberti, B.; Lichard, P.; Liebig, W.; Lifshitz, R.; Liko, D.; Lim, H.; Limper, M.; Lin, S. C.; Lindahl, A.; Linde, F.; Lindquist, L.; Lindsay, S. W.; Linhart, V.; Lintern, A. J.; Liolios, A.; Lipniacka, A.; Liss, T. M.; Lissauer, A.; List, J.; Litke, A. M.; Liu, S.; Liu, T.; Liu, Y.; Livan, M.; Lleres, A.; Llosá Llácer, G.; Lloyd, S. L.; Lobkowicz, F.; Loch, P.; Lockman, W. S.; Loddenkoetter, T.; Loebinger, F. K.; Loginov, A.; Loh, C. W.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Loken, J.; Lokwitz, S.; Long, M. C.; Lopes, L.; Lopez Mateos, D.; Losty, M. J.; Lou, X.; Loureiro, K. F.; Lovas, L.; Love, J.; Lowe, A.; Lozano Fantoba, M.; Lu, F.; Lu, J.; Lu, L.; Lubatti, H. J.; Lucas, S.; Luci, C.; Lucotte, A.; Ludwig, A.; Ludwig, I.; Ludwig, J.; Luehring, F.; Lüke, D.; Luijckx, G.; Luisa, L.; Lumb, D.; Luminari, L.; Lund, E.; Lund-Jensen, B.; Lundberg, B.; Lundquist, J.; Lupi, A.; Lupu, N.; Lutz, G.; Lynn, D.; Lynn, J.; Lys, J.; Lysan, V.; Lytken, E.; López-Amengual, J. M.; Ma, H.; Ma, L. L.; Maaß en, M.; Maccarrone, G.; Mace, G. G. R.; Macina, D.; Mackeprang, R.; Macpherson, A.; MacQueen, D.; Macwaters, C.; Madaras, R. J.; Mader, W. F.; Maenner, R.; Maeno, T.; Mättig, P.; Mättig, S.; Magrath, C. A.; Mahalalel, Y.; Mahboubi, K.; Mahout, G.; Maidantchik, C.; Maio, A.; Mair, G. M.; Mair, K.; Makida, Y.; Makowiecki, D.; Malecki, P.; Maleev, V. P.; Malek, F.; Malon, D.; Maltezos, S.; Malychev, V.; Malyukov, S.; Mambelli, M.; Mameghani, R.; Mamuzic, J.; Manabe, A.; Manara, A.; Manca, G.; Mandelli, L.; Mandić, I.; Mandl, M.; Maneira, J.; Maneira, M.; Mangeard, P. S.; Mangin-Brinet, M.; Manjavidze, I. D.; Mann, W. A.; Manolopoulos, S.; Manousakis-Katsikakis, A.; Mansoulie, B.; Manz, A.; Mapelli, A.; Mapelli, L.; March, L.; Marchand, J. F.; Marchesotti, M.; Marcisovsky, M.; Marin, A.; Marques, C. 
N.; Marroquim, F.; Marshall, R.; Marshall, Z.; Martens, F. K.; Garcia, S. Marti i.; Martin, A. J.; Martin, B.; Martin, B.; Martin, F. F.; Martin, J. P.; Martin, Ph; Martinez, G.; Martínez Lacambra, C.; Martinez Outschoorn, V.; Martini, A.; Martins, J.; Maruyama, T.; Marzano, F.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. L.; Maß, M.; Massa, I.; Massaro, G.; Massol, N.; Mathes, M.; Matheson, J.; Matricon, P.; Matsumoto, H.; Matsunaga, H.; Maugain, J. M.; Maxfield, S. J.; May, E. N.; Mayer, J. K.; Mayri, C.; Mazini, R.; Mazzanti, M.; Mazzanti, P.; Mazzoni, E.; Mazzucato, F.; McKee, S. P.; McCarthy, R. L.; McCormick, C.; McCubbin, N. A.; McDonald, J.; McFarlane, K. W.; McGarvie, S.; McGlone, H.; McLaren, R. A.; McMahon, S. J.; McMahon, T. R.; McMahon, T. J.; McPherson, R. A.; Mechtel, M.; Meder-Marouelli, D.; Medinnis, M.; Meera-Lebbai, R.; Meessen, C.; Mehdiyev, R.; Mehta, A.; Meier, K.; Meinhard, H.; Meinhardt, J.; Meirosu, C.; Meisel, F.; Melamed-Katz, A.; Mellado Garcia, B. R.; Mendes Jorge, P.; Mendez, P.; Menke, S.; Menot, C.; Meoni, E.; Merkl, D.; Merola, L.; Meroni, C.; Merritt, F. S.; Messmer, I.; Metcalfe, J.; Meuser, S.; Meyer, J.-P.; Meyer, T. C.; Meyer, W. T.; Mialkovski, V.; Michelotto, M.; Micu, L.; Middleton, R.; Miele, P.; Migliaccio, A.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikestikova, M.; Mikulec, B.; Mikuž, M.; Miller, D. W.; Miller, R. J.; Miller, W.; Milosavljevic, M.; Milstead, D. A.; Mima, S.; Minaenko, A. A.; Minano, M.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Mir, L. M.; Mirabelli, G.; Miralles Verge, L.; Misawa, S.; Miscetti, S.; Misiejuk, A.; Mitra, A.; Mitrofanov, G. Y.; Mitsou, V. A.; Miyagawa, P. S.; Miyazaki, Y.; Mjörnmark, J. U.; Mkrtchyan, S.; Mladenov, D.; Moa, T.; Moch, M.; Mochizuki, A.; Mockett, P.; Modesto, P.; Moed, S.; Mönig, K.; Möser, N.; Mohn, B.; Mohr, W.; Mohrdieck-Möck, S.; Moisseev, A. M.; Moles Valls, R. 
M.; Molina-Perez, J.; Moll, A.; Moloney, G.; Mommsen, R.; Moneta, L.; Monnier, E.; Montarou, G.; Montesano, S.; Monticelli, F.; Moore, R. W.; Moore, T. B.; Moorhead, G. F.; Moraes, A.; Morel, J.; Moreno, A.; Moreno, D.; Morettini, P.; Morgan, D.; Morii, M.; Morin, J.; Morley, A. K.; Mornacchi, G.; Morone, M.-C.; Morozov, S. V.; Morris, E. J.; Morris, J.; Morrissey, M. C.; Moser, H. G.; Mosidze, M.; Moszczynski, A.; Mouraviev, S. V.; Mouthuy, T.; Moye, T. H.; Moyse, E. J. W.; Mueller, J.; Müller, M.; Muijs, A.; Muller, T. R.; Munar, A.; Munday, D. J.; Murakami, K.; Murillo Garcia, R.; Murray, W. J.; Myagkov, A. G.; Myska, M.; Nagai, K.; Nagai, Y.; Nagano, K.; Nagasaka, Y.; Nairz, A. M.; Naito, D.; Nakamura, K.; Nakamura, Y.; Nakano, I.; Nanava, G.; Napier, A.; Nassiakou, M.; Nasteva, I.; Nation, N. R.; Naumann, T.; Nauyock, F.; Nderitu, S. K.; Neal, H. A.; Nebot, E.; Nechaeva, P.; Neganov, A.; Negri, A.; Negroni, S.; Nelson, C.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Nesterov, S. Y.; Neukermans, L.; Nevski, P.; Newcomer, F. M.; Nichols, A.; Nicholson, C.; Nicholson, R.; Nickerson, R. B.; Nicolaidou, R.; Nicoletti, G.; Nicquevert, B.; Niculescu, M.; Nielsen, J.; Niinikoski, T.; Niinimaki, M. J.; Nikitin, N.; Nikolaev, K.; Nikolic-Audit, I.; Nikolopoulos, K.; Nilsen, H.; Nilsson, B. S.; Nilsson, P.; Nisati, A.; Nisius, R.; Nodulman, L. J.; Nomachi, M.; Nomoto, H.; Noppe, J.-M.; Nordberg, M.; Norniella Francisco, O.; Norton, P. R.; Novakova, J.; Nowak, M.; Nozaki, M.; Nunes, R.; Nunes Hanninger, G.; Nunnemann, T.; Nyman, T.; O'Connor, P.; O'Neale, S. W.; O'Neil, D. C.; O'Neill, M.; O'Shea, V.; Oakham, F. G.; Oberlack, H.; Obermaier, M.; Oberson, P.; Ochi, A.; Ockenfels, W.; Odaka, S.; Odenthal, I.; Odino, G. A.; Ogren, H.; Oh, S. H.; Ohshima, T.; Ohshita, H.; Okawa, H.; Olcese, M.; Olchevski, A. G.; Oliver, C.; Oliver, J.; Olivo Gomez, M.; Olszewski, A.; Olszowska, J.; Omachi, C.; Onea, A.; Onofre, A.; Oram, C. J.; Ordonez, G.; Oreglia, M. 
J.; Orellana, F.; Oren, Y.; Orestano, D.; Orlov, I. O.; Orr, R. S.; Orsini, F.; Osborne, L. S.; Osculati, B.; Osuna, C.; Otec, R.; Othegraven, R.; Ottewell, B.; Ould-Saada, F.; Ouraou, A.; Ouyang, Q.; Øye, O. K.; Ozcan, V. E.; Ozone, K.; Ozturk, N.; Pacheco Pages, A.; Padhi, S.; Padilla Aranda, C.; Paganis, E.; Paige, F.; Pailler, P. M.; Pajchel, K.; Palestini, S.; Palla, J.; Pallin, D.; Palmer, M. J.; Pan, Y. B.; Panikashvili, N.; Panin, V. N.; Panitkin, S.; Pantea, D.; Panuskova, M.; Paolone, V.; Paoloni, A.; Papadopoulos, I.; Papadopoulou, T.; Park, I.; Park, W.; Parker, M. A.; Parker, S.; Parkman, C.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pasqualucci, E.; Passardi, G.; Passeri, A.; Passmore, M. S.; Pastore, F.; Pastore, Fr; Pataraia, S.; Pate, D.; Pater, J. R.; Patricelli, S.; Pauly, T.; Pauna, E.; Peak, L. S.; Peeters, S. J. M.; Peez, M.; Pei, E.; Peleganchuk, S. V.; Pellegrini, G.; Pengo, R.; Pequenao, J.; Perantoni, M.; Perazzo, A.; Pereira, A.; Perepelkin, E.; Perera, V. J. O.; Perez Codina, E.; Perez Reale, V.; Peric, I.; Perini, L.; Pernegger, H.; Perrin, E.; Perrino, R.; Perrodo, P.; Perrot, G.; Perus, P.; Peshekhonov, V. D.; Petereit, E.; Petersen, J.; Petersen, T. C.; Petit, P. J. F.; Petridou, C.; Petrolo, E.; Petrucci, F.; Petti, R.; Pezzetti, M.; Pfeifer, B.; Phan, A.; Phillips, A. W.; Phillips, P. W.; Piacquadio, G.; Piccinini, M.; Pickford, A.; Piegaia, R.; Pier, S.; Pilcher, J. E.; Pilkington, A. D.; Pimenta Dos Santos, M. A.; Pina, J.; Pinfold, J. L.; Ping, J.; Pinhão, J.; Pinto, B.; Pirotte, O.; Placakyte, R.; Placci, A.; Plamondon, M.; Plano, W. G.; Pleier, M.-A.; Pleskach, A. V.; Podkladkin, S.; Podlyski, F.; Poffenberger, P.; Poggioli, L.; Pohl, M.; Polak, I.; Polesello, G.; Policicchio, A.; Polini, A.; Polychronakos, V.; Pomarede, D. M.; Pommès, K.; Ponsot, P.; Pontecorvo, L.; Pope, B. G.; Popescu, R.; Popovic, D. S.; Poppleton, A.; Popule, J.; Portell Bueso, X.; Posch, C.; Pospelov, G. 
E.; Pospichal, P.; Pospisil, S.; Postranecky, M.; Potrap, I. N.; Potter, C. J.; Poulard, G.; Pousada, A.; Poveda, J.; Prabhu, R.; Pralavorio, P.; Prasad, S.; Prast, J.; Prat, S.; Prata, M.; Pravahan, R.; Preda, T.; Pretzl, K.; Pribyl, L.; Price, D.; Price, L. E.; Price, M. J.; Prichard, P. M.; Prieur, D.; Primavera, M.; Primor, D.; Prokofiev, K.; Prosso, E.; Proudfoot, J.; Przysiezniak, H.; Puigdengoles, C.; Purdham, J.; Purohit, M.; Puzo, P.; Pylaev, A. N.; Pylypchenko, Y.; Qi, M.; Qian, J.; Qian, W.; Qian, Z.; Qing, D.; Quadt, A.; Quarrie, D. R.; Quayle, W. B.; Rabbers, J. J.; Radeka, V.; Rafi, J. M.; Ragusa, F.; Rahimi, A. M.; Rahm, D.; Raine, C.; Raith, B.; Rajagopalan, S.; Rajek, S.; Rammer, H.; Ramstedt, M.; Rangod, S.; Ratoff, P. N.; Raufer, T.; Rauscher, F.; Rauter, E.; Raymond, M.; Reads, A. L.; Rebuzzi, D.; Redlinger, G. R.; Reeves, K.; Rehak, M.; Reichold, A.; Reinherz-Aronis, E.; Reisinger, I.; Reljic, D.; Rembser, C.; Ren, Z.; Renaudin-Crepe, S. R. C.; Renkel, P.; Rensch, B.; Rescia, S.; Rescigno, M.; Resconi, S.; Resende, B.; Rewiersma, P.; Rey, J.; Rey-Campagnolle, M.; Rezaie, E.; Reznicek, P.; Richards, R. A.; Richer, J.-P.; Richter, R. H.; Richter, R.; Richter-Was, E.; Ridel, M.; Riegler, W.; Rieke, S.; Rijpstra, M.; Rijssenbeek, M.; Rimoldi, A.; Rios, R. R.; Riu Dachs, I.; Rivline, M.; Rivoltella, G.; Rizatdinova, F.; Robertson, S. H.; Robichaud-Veronneau, A.; Robins, S.; Robinson, D.; Robson, A.; Rochford, J. H.; Roda, C.; Rodier, S.; Roe, S.; Røhne, O.; Rohrbach, F.; Roldán, J.; Rolli, S.; Romance, J. B.; Romaniouk, A.; Romanov, V. M.; Romeo, G.; Roos, L.; Ros, E.; Rosati, S.; Rosenbaum, F.; Rosenbaum, G. A.; Rosenberg, E. I.; Rosselet, L.; Rossi, L. P.; Rossi, L.; Rotaru, M.; Rothberg, J.; Rottländer, I.; Rousseau, D.; Rozanov, A.; Rozen, Y.; Ruber, R.; Ruckert, B.; Rudolph, G.; Rühr, F.; Ruggieri, F.; Ruggiero, G.; Ruiz, H.; Ruiz-Martinez, A.; Rulikowska-Zarebska, E.; Rumiantsev, V.; Rumyantsev, L.; Runge, K.; Runolfsson, O.; Rusakovich, N. 
A.; Rust, D. R.; Rutherfoord, J. P.; Ruwiedel, C.; Ryabov, Y. F.; Ryadovikov, V.; Ryan, P.; Rybkine, G.; da Costa, J. Sá; Saavedra, A. F.; Saboumazrag, S.; F-W Sadrozinski, H.; Sadykov, R.; Sakamoto, H.; Sala, P.; Salamon, A.; Saleem, M.; Salihagic, D.; Salt, J.; Saltó Bauza, O.; Salvachúa Ferrando, B. M.; Salvatore, D.; Salzburger, A.; Sampsonidis, D.; Samset, B. H.; Sánchez Sánchez, C. A.; Sanchis Lozano, M. A.; Sanchis Peris, E.; Sandaker, H.; Sander, H. G.; Sandhoff, M.; Sandvoss, S.; Sankey, D. P. C.; Sanny, B.; Sansone, S.; Sansoni, A.; Santamarina Rios, C.; Santander, J.; Santi, L.; Santoni, C.; Santonico, R.; Santos, J.; Sapinski, M.; Saraiva, J. G.; Sarri, F.; Sasaki, O.; Sasaki, T.; Sasao, N.; Satsounkevitch, I.; Sauvage, D.; Sauvage, G.; Savard, P.; Savine, A. Y.; Savinov, V.; Savoy-Navarro, A.; Savva, P.; Saxon, D. H.; Says, L. P.; Sbarra, C.; Sbrissa, E.; Sbrizzi, A.; Scannicchio, D. A.; Schaarschmidt, J.; Schacht, P.; Schäfer, U.; Schaffer, A. C.; Schaile, D.; Schaller, M.; Schamov, A. G.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Scherzer, M. I.; Schiavi, C.; Schick, H.; Schieck, J.; Schieferdecker, P.; Schioppa, M.; Schlager, G.; Schlenker, S.; Schlereth, J. L.; Schmid, P.; Schmidt, M. P.; Schmitt, C.; Schmitt, K.; Schmitz, M.; Schmücker, H.; Schoerner, T.; Scholte, R. C.; Schott, M.; Schouten, D.; Schram, M.; Schricker, A.; Schroff, D.; Schuh, S.; Schuijlenburg, H. W.; Schuler, G.; Schultes, J.; Schultz-Coulon, H.-C.; Schumacher, J.; Schumacher, M.; Schune, Ph; Schwartzman, A.; Schweiger, D.; Schwemling, Ph; Schwick, C.; Schwienhorst, R.; Schwierz, R.; Schwindling, J.; Scott, W. G.; Secker, H.; Sedykh, E.; Seguin-Moreau, N.; Segura, E.; Seidel, S. C.; Seiden, A.; Seixas, J. M.; Sekhniaidze, G.; Seliverstov, D. M.; Selldén, B.; Seman, M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Seuster, R.; Severini, H.; Sevior, M. E.; Sexton, K. A.; Sfyrla, A.; Shah, T. P.; Shan, L.; Shank, J. T.; Shapiro, M.; Shatalov, P. 
B.; Shaver, L.; Shaw, C.; Shears, T. G.; Sherwood, P.; Shibata, A.; Shield, P.; Shilov, S.; Shimojima, M.; Shin, T.; Shiyakova, M.; Shmeleva, A.; Shoa, M.; Shochet, M. J.; Shupe, M. A.; Sicho, P.; Sidoti, A.; Siebel, A.; Siebel, M.; Siegrist, J.; Sijacki, D.; Silva, J.; Silverstein, S. B.; Simak, V.; Simic, Lj; Simion, S.; Simmons, B.; Simonyan, M.; Sinervo, P.; Sipica, V.; Siragusa, G.; Sisakyan, A. N.; Sivoklokov, S.; Sjölin, J.; Skubic, P.; Skvorodnev, N.; Slattery, P.; Slavicek, T.; Sliwa, K.; Sloan, T. J.; Sloper, J.; Smakhtin, V.; Small, A.; Smirnov, S. Yu; Smirnov, Y.; Smirnova, L.; Smirnova, O.; Smith, N. A.; Smith, B. C.; Smith, D. S.; Smith, J.; Smith, K. M.; Smith, B.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snow, S. W.; Snow, J.; Snuverink, J.; Snyder, S.; Soares, M.; Soares, S.; Sobie, R.; Sodomka, J.; Söderberg, M.; Soffer, A.; Solans, C. A.; Solar, M.; Sole, D.; Solfaroli Camillocci, E.; Solodkov, A. A.; Solov'yanov, O. V.; Soloviev, I.; Soluk, R.; Sondericker, J.; Sopko, V.; Sopko, B.; Sorbi, M.; Soret Medel, J.; Sosebee, M.; Sosnovtsev, V. V.; Sospedra Suay, L.; Soukharev, A.; Soukup, J.; Spagnolo, S.; Spano, F.; Speckmayer, P.; Spegel, M.; Spencer, E.; Spighi, R.; Spigo, G.; Spila, F.; Spiriti, E.; Spiwoks, R.; Spogli, L.; Spousta, M.; Sprachmann, G.; Spurlock, B.; St. Denis, R. D.; Stahl, T.; Staley, R. J.; Stamen, R.; Stancu, S. N.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stapnes, S.; Starchenko, E. A.; Staroba, P.; Stastny, J.; Staude, A.; Stavina, P.; Stavrianakou, M.; Stavropoulos, G.; Stefanidis, E.; Steffens, J. L.; Stekl, I.; Stelzer, H. J.; Stenzel, H.; Stewart, G.; Stewart, T. D.; Stiller, W.; Stockmanns, T.; Stodulski, M.; Stonjek, S.; Stradling, A.; Straessner, A.; Strandberg, J.; Strandlie, A.; Strauss, M.; Strickland, V.; Striegel, D.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Strong, J. A.; Stroynowski, R.; Stugu, B.; Stumer, I.; Su, D.; Subramania, S.; Suchkov, S. I.; Sugaya, Y.; Sugimoto, T.; Suk, M.; Sulin, V. 
V.; Sultanov, S.; Sun, Z.; Sundal, B.; Sushkov, S.; Susinno, G.; Sutcliffe, P.; Sutton, M. R.; Sviridov, Yu M.; Sykora, I.; Szczygiel, R. R.; Szeless, B.; Szymocha, T.; Sánchez, J.; Ta, D.; Taboada Gameiro, S.; Tadel, M.; Tafirout, R.; Taga, A.; Takai, H.; Takashima, R.; Takeda, H.; Takeshita, T.; Talby, M.; Talyshev, A.; Tamsett, M. C.; Tanaka, J.; Tanaka, K.; Tanaka, R.; Tanaka, S.; Tanaka, S.; Tanaka, Y.; Tappern, G. P.; Tapprogge, S.; Tarem, S.; Tarrade, F.; Tarrant, J.; Tartarelli, G.; Tas, P.; Tasevsky, M.; Tayalati, Y.; Taylor, F. E.; Taylor, G.; Taylor, G. N.; Taylor, R. P.; Tcherniatine, V.; Tegenfeldt, F.; Teixeira-Dias, P.; Ten Kate, H.; Teng, P. K.; Ter-Antonyan, R.; Terada, S.; Terron, J.; Terwort, M.; Teuscher, R. J.; Tevlin, C. M.; Thadome, J.; Thion, J.; Thioye, M.; Thomas, A.; Thomas, J. P.; Thomas, T. L.; Thomas, E.; Thompson, R. J.; Thompson, A. S.; Thun, R. P.; Tic, T.; Tikhomirov, V. O.; Tikhonov, Y. A.; Timm, S.; Timmermans, C. J. W. P.; Tipton, P.; Tique Aires Viegas, F. J.; Tisserant, S.; Titov, M.; Tobias, J.; Tocut, V. M.; Toczek, B.; Todorova-Nova, S.; Tojo, J.; Tokár, S.; Tokushuku, K.; Tomasek, L.; Tomasek, M.; Tomasz, F.; Tomoto, M.; Tompkins, D.; Tompkins, L.; Toms, K.; Tonazzo, A.; Tong, G.; Tonoyan, A.; Topfel, C.; Topilin, N. D.; Torrence, E.; Torres Pais, J. G.; Toth, J.; Touchard, F.; Tovey, D. R.; Tovey, S. N.; Towndrow, E. F.; Trefzger, T.; Treichel, M.; Treis, J.; Tremblet, L.; Tribanek, W.; Tricoli, A.; Trigger, I. M.; Trilling, G.; Trincaz-Duvoid, S.; Tripiana, M. F.; Trischuk, W.; Trka, Z.; Trocmé, B.; Troncon, C.; C-L Tseng, J.; Tsiafis, I.; Tsiareshka, P. V.; Tsipolitis, G.; Tskhadadze, E. G.; Tsukerman, I. I.; Tsulaia, V.; Tsuno, S.; Turala, M.; Turk Cakir, I.; Turlay, E.; Tuts, P. M.; Twomey, M. S.; Tyndel, M.; Typaldos, D.; Tyrvainen, H.; Tzamarioudaki, E.; Tzanakos, G.; Ueda, I.; Uhrmacher, M.; Ukegawa, F.; Ullán Comes, M.; Unal, G.; Underwood, D. 
G.; Undrus, A.; Unel, G.; Unno, Y.; Urkovsky, E.; Usai, G.; Usov, Y.; Vacavant, L.; Vacek, V.; Vachon, B.; Vahsen, S.; Valderanis, C.; Valenta, J.; Valente, P.; Valero, A.; Valkar, S.; Valls Ferrer, J. A.; Van der Bij, H.; van der Graaf, H.; van der Kraaij, E.; Van Eijk, B.; van Eldik, N.; van Gemmeren, P.; van Kesteren, Z.; van Vulpen, I.; Van Berg, R.; Vandelli, W.; Vandoni, G.; Vaniachine, A.; Vannucci, F.; Varanda, M.; Varela Rodriguez, F.; Vari, R.; Varnes, E. W.; Varouchas, D.; Vartapetian, A.; Varvell, K. E.; Vassilakopoulos, V. I.; Vassilieva, L.; Vataga, E.; Vaz, L.; Vazeille, F.; Vedrine, P.; Vegni, G.; Veillet, J. J.; Vellidis, C.; Veloso, F.; Veness, R.; Veneziano, S.; Ventura, A.; Ventura, S.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vertogardov, L.; Vetterli, M. C.; Vichou, I.; Vickey, T.; Viehhauser, G. H. A.; Vigeolas, E.; Villa, M.; Villani, E. G.; Villate, J.; Villella, I.; Vilucchi, E.; Vincent, P.; Vincke, H.; Vincter, M. G.; Vinogradov, V. B.; Virchaux, M.; Viret, S.; Virzi, J.; Vitale, A.; Vivarelli, I.; Vives, R.; Vives Vaques, F.; Vlachos, S.; Vogt, H.; Vokac, P.; Vollmer, C. F.; Volpi, M.; Volpini, G.; von Boehn-Buchholz, R.; von der Schmitt, H.; von Toerne, E.; Vorobel, V.; Vorobiev, A. P.; Vorozhtsov, A. S.; Vorozhtsov, S. B.; Vos, M.; Voss, K. C.; Voss, R.; Vossebeld, J. H.; Vovenko, A. S.; Vranjes, N.; Vrba, V.; Vreeswijk, M.; Anh, T. Vu; Vuaridel, B.; Vudragovic, M.; Vuillemin, V.; Vuillermet, R.; Wänanen, A.; Wahlen, H.; Walbersloh, J.; Walker, R.; Walkowiak, W.; Wall, R.; Wallny, R. S.; Walsh, S.; Wang, C.; Wang, J. C.; Wappler, F.; Warburton, A.; Ward, C. P.; Warner, G. P.; Warren, M.; Warsinsky, M.; Wastie, R.; Watkins, P. M.; Watson, A. T.; Watts, G.; Waugh, A. T.; Waugh, B. M.; Weaverdyck, C.; Webel, M.; Weber, G.; Weber, J.; Weber, M.; Weber, P.; Weidberg, A. R.; Weilhammer, P. M.; Weingarten, J.; Weiser, C.; Wellenstein, H.; Wellisch, H. P.; Wells, P. 
S.; Wemans, A.; Wen, M.; Wenaus, T.; Wendler, S.; Wengler, T.; Wenig, S.; Wermes, N.; Werneke, P.; Werner, P.; Werthenbach, U.; Wheeler-Ellis, S. J.; Whitaker, S. P.; White, A.; White, M. J.; White, S.; Whittington, D.; Wicek, F.; Wicke, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiesmann, M.; Wiesmann, M.; Wijnen, T.; Wildauer, A.; Wilhelm, I.; Wilkens, H. G.; Williams, H. H.; Willis, W.; Willocq, S.; Wilmut, I.; Wilson, J. A.; Wilson, A.; Wingerter-Seez, I.; Winton, L.; Witzeling, W.; Wlodek, T.; Woehrling, E.; Wolter, M. W.; Wolters, H.; Wosiek, B.; Wotschack, J.; Woudstra, M. J.; Wright, C.; Wu, S. L.; Wu, X.; Wuestenfeld, J.; Wunstorf, R.; Xella-Hansen, S.; Xiang, A.; Xie, S.; Xie, Y.; Xu, G.; Xu, N.; Yamamoto, A.; Yamamoto, S.; Yamaoka, H.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, J. C.; Yang, S.; Yang, U. K.; Yang, Y.; Yang, Z.; Yao, W.-M.; Yao, Y.; Yarradoddi, K.; Yasu, Y.; Ye, J.; Yilmaz, M.; Yoosoofmiya, R.; Yorita, K.; Yoshida, H.; Yoshida, R.; Young, C.; Youssef, S. P.; Yu, D.; Yu, J.; Yu, M.; Yu, X.; Yuan, J.; Yurkewicz, A.; Zaets, V. G.; Zaidan, R.; Zaitsev, A. M.; Zajac, J.; Zajacova, Z.; Zalite, A. Yu; Zalite, Yo K.; Zanello, L.; Zarzhitsky, P.; Zaytsev, A.; Zdrazil, M.; Zeitnitz, C.; Zeller, M.; Zema, P. F.; Zendler, C.; Zenin, A. V.; Zenis, T.; Zenonos, Z.; Zenz, S.; Zerwas, D.; Zhang, H.; Zhang, J.; Zheng, W.; Zhang, X.; Zhao, L.; Zhao, T.; Zhao, X.; Zhao, Z.; Zhelezko, A.; Zhemchugov, A.; Zheng, S.; Zhichao, L.; Zhou, B.; Zhou, N.; Zhou, S.; Zhou, Y.; Zhu, C. G.; Zhu, H. Z.; Zhuang, X. A.; Zhuravlov, V.; Zilka, B.; Zimin, N. I.; Zimmermann, S.; Ziolkowski, M.; Zitoun, R.; Zivkovic, L.; Zmouchko, V. V.; Zobernig, G.; Zoccoli, A.; Zoeller, M. M.; Zolnierowski, Y.; Zsenei, A.; zur Nedden, M.; Zychacek, V.

    2008-08-01

    The ATLAS detector as installed in its experimental cavern at point 1 at CERN is described in this paper. A brief overview of the expected performance of the detector when the Large Hadron Collider begins operation is also presented.

  18. CERN Irradiation Facilities

    PubMed

    Pozzi, Fabio; Garcia Alia, Ruben; Brugger, Markus; Carbonez, Pierre; Danzeca, Salvatore; Gkotse, Blerina; Richard Jaekel, Martin; Ravotti, Federico; Silari, Marco; Tali, Maris

    2017-09-28

    CERN provides unique irradiation facilities for applications in dosimetry, metrology, the intercomparison of radiation protection devices, the benchmarking of Monte Carlo codes, and radiation-damage studies of electronics.

  19. Towards a 21st century telephone exchange at CERN

    NASA Astrophysics Data System (ADS)

    Valentín, F.; Hesnaux, A.; Sierra, R.; Chapron, F.

    2015-12-01

    The advent of mobile telephony and Voice over IP (VoIP) has significantly impacted the traditional telephone exchange industry, to such an extent that private branch exchanges are likely to disappear completely in the near future. For large organisations such as CERN, it is important to smooth this transition by implementing new multimedia platforms that protect past investments while providing the flexibility needed to securely interconnect emerging VoIP solutions and forthcoming developments such as Voice over LTE (VoLTE). We present the results of ongoing studies and tests at CERN of the latest technologies in this area.
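    As a concrete illustration of the kind of VoIP signalling such a platform must interconnect, the sketch below builds a minimal SIP REGISTER request in the RFC 3261 style. All user names, domains, addresses and tags are invented for illustration; the abstract does not describe CERN's actual platform or configuration.

    ```python
    # Hypothetical sketch: the REGISTER request a VoIP endpoint sends when
    # attaching to an IP telephony platform (RFC 3261 style). All names,
    # addresses and tag values here are made up for illustration.

    def sip_register(user, domain, contact_ip, contact_port=5060):
        """Build a minimal SIP REGISTER request as CRLF-delimited text."""
        return "\r\n".join([
            f"REGISTER sip:{domain} SIP/2.0",
            f"Via: SIP/2.0/UDP {contact_ip}:{contact_port};branch=z9hG4bK776asdhds",
            "Max-Forwards: 70",
            f"From: <sip:{user}@{domain}>;tag=49583",
            f"To: <sip:{user}@{domain}>",
            "Call-ID: 843817637684230@998sdasdh09",
            "CSeq: 1826 REGISTER",
            f"Contact: <sip:{user}@{contact_ip}:{contact_port}>",
            "Expires: 3600",       # requested registration lifetime, seconds
            "Content-Length: 0",
            "", "",                # blank line terminates the header section
        ])

    msg = sip_register("alice", "example.org", "192.0.2.10")
    print(msg.splitlines()[0])  # REGISTER sip:example.org SIP/2.0
    ```

    A gateway between a legacy exchange and a VoIP platform has to generate, parse and route messages of exactly this shape, which is why the abstract stresses secure interconnection of the two worlds.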

  20. Ageing Studies on the First Resistive-MicroMeGaS Quadruplet at GIF++: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Alvarez Gonzalez, B.; Bianco, M.; Farina, E.; Iengo, P.; Kuger, F.; Lin, T.; Longo, L.; Sekhniaidze, G.; Sidiropoulou, O.; Schott, M.; Valderanis, C.; Wotschack, J.

    2018-02-01

    A resistive-MicroMeGaS quadruplet built at CERN has been installed at the new CERN Gamma Irradiation Facility (GIF++) with the aim of carrying out a long-term ageing study. Two smaller resistive bulk-MicroMeGaS detectors produced at the CERN PCB workshop have also been installed at GIF++ to provide a comparison of their ageing behavior with that of the quadruplet. We give an overview of the ongoing tests at GIF++ in terms of particle rate, integrated charge and spatial resolution of the MicroMeGaS detectors.
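    The integrated charge quoted in ageing studies is simply the accumulated anode current over time, Q = ∫ I dt. The sketch below derives it from periodic current readings by trapezoidal integration; the sampling scheme and numbers are invented for illustration and are not taken from the GIF++ setup.

    ```python
    # Hypothetical sketch: accumulate charge from periodic anode-current
    # samples, as one might do when tracking detector ageing.
    # Trapezoidal integration of current over time gives charge: Q = ∫ I dt.

    def integrated_charge(times_s, currents_a):
        """Integrate current samples (amperes) over time (seconds) -> coulombs."""
        q = 0.0
        for k in range(1, len(times_s)):
            # Area of one trapezoid between consecutive samples.
            q += 0.5 * (currents_a[k - 1] + currents_a[k]) * (times_s[k] - times_s[k - 1])
        return q

    # Example: a constant 1 µA drawn for one hour accumulates ~3.6 mC.
    times = [0.0, 1800.0, 3600.0]      # seconds
    currents = [1e-6, 1e-6, 1e-6]      # amperes
    print(integrated_charge(times, currents))  # ~0.0036 C
    ```

    In practice the figure is usually normalised to the irradiated area (C/cm²) so that accumulated dose can be compared across detectors of different sizes.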

  1. Media Training

    ScienceCinema

    None

    2017-12-09

    With the LHC starting up soon, the world's media are again turning their attention to CERN. We're all likely to be called upon to explain what is happening at CERN to media, friends and neighbours. The seminar will be given by BBC television news journalists Liz Pike and Nadia Marchant, and will deal with the kind of questions we're likely to be confronted with through the restart period. The training is open for everybody. Make sure you arrive early enough to get a seat - there are only 200 seats in the Globe. The session will also be webcast: http://webcast.cern.ch/

  2. The significance of Cern

    ScienceCinema

    None

    2017-12-09

    Prof. V. Weisskopf, Director-General of CERN from 1961 to 1965, was born in Vienna, studied at Göttingen and has had a particularly rich academic career. He worked in Berlin and Copenhagen, left for the United States to take part in the Manhattan project, and was a professor at MIT until 1960. Back in Europe, he became Director-General of CERN and gave the Organization the impetus for which it is known.

  3. HIGH ENERGY PHYSICS: CERN Link Breathes Life Into Russian Physics.

    PubMed

    Stone, R

    2000-10-13

    Without fanfare, 600 Russian scientists here at CERN, the European particle physics laboratory, are playing key roles in building the Large Hadron Collider (LHC), a machine that will explore fundamental questions such as why particles have mass, as well as search for exotic new particles whose existence would confirm supersymmetry, a popular theory that aims to unify the four forces of nature. In fact, even though Russia is not one of CERN's 20 member states, most top high-energy physicists in Russia are working on the LHC. Some say their work could prove the salvation of high-energy physics back home.

  4. Experience with procuring, deploying and maintaining hardware at remote co-location centre

    NASA Astrophysics Data System (ADS)

    Bärring, O.; Bonfillou, E.; Clement, B.; Coelho Dos Santos, M.; Dore, V.; Gentit, A.; Grossir, A.; Salter, W.; Valsan, L.; Xafi, A.

    2014-05-01

    In May 2012, CERN signed a contract with the Wigner Data Centre in Budapest for an extension of CERN's central computing facility beyond the boundaries set by the electrical power and cooling available for computing on the main site. The centre is operated as a remote co-location site providing rack space, electrical power and cooling for server, storage and networking equipment acquired by CERN. The contract includes a 'remote-hands' service for the physical handling of hardware (rack mounting, cabling, pushing power buttons, ...) and maintenance repairs (swapping disks, memory modules, ...); however, only CERN personnel have network and console access to the equipment for system administration. This report gives an insight into the adaptations of hardware architecture, procurement and delivery procedures undertaken to enable remote physical handling of the hardware. We also describe the tools and procedures developed for automating the registration, burn-in testing, acceptance and maintenance of the equipment, as well as an independent but important change to IT asset management (ITAM) developed in parallel as part of the CERN IT Agile Infrastructure project. Finally, we report on experience from the first large delivery of 400 servers and 80 SAS JBOD expansion units (24 drive bays each) to Wigner in March 2013. Changes were made to the abstract file on 13/06/2014 to correct errors; the pdf file was unchanged.
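
    The automated burn-in and acceptance workflow described above can be pictured with a small sketch. The metric names, thresholds and hostnames below are illustrative assumptions, not CERN's actual tooling: each delivered server reports burn-in metrics, and machines outside the acceptance thresholds are flagged for the 'remote-hands' maintenance queue.

```python
# Hypothetical acceptance check run after burn-in testing of a delivery.
# Metric names and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class BurnInResult:
    hostname: str
    cpu_errors: int        # machine-check errors seen during CPU stress test
    disk_reallocated: int  # reallocated sectors reported after disk test
    mem_errors: int        # ECC errors seen during memory test

def accept(result: BurnInResult, max_disk_reallocated: int = 0) -> bool:
    """Return True if the server passes burn-in acceptance."""
    return (result.cpu_errors == 0
            and result.mem_errors == 0
            and result.disk_reallocated <= max_disk_reallocated)

results = [
    BurnInResult("wigner-001", cpu_errors=0, disk_reallocated=0, mem_errors=0),
    BurnInResult("wigner-002", cpu_errors=0, disk_reallocated=3, mem_errors=0),
]
# Hosts failing acceptance would trigger a 'remote-hands' repair ticket.
to_repair = [r.hostname for r in results if not accept(r)]
print(to_repair)  # → ['wigner-002']
```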

  5. Building an organic block storage service at CERN with Ceph

    NASA Astrophysics Data System (ADS)

    van der Ster, Daniel; Wiebalck, Arne

    2014-06-01

    Emerging storage requirements, such as the need for block storage for both OpenStack VMs and file services like AFS and NFS, have motivated the development of a generic backend storage service for CERN IT. The goals for such a service include (a) vendor neutrality, (b) horizontal scalability with commodity hardware, (c) fault tolerance at the disk, host, and network levels, and (d) support for geo-replication. Ceph is an attractive option due to its native block device layer RBD which is built upon its scalable, reliable, and performant object storage system, RADOS. It can be considered an "organic" storage solution because of its ability to balance and heal itself while living on an ever-changing set of heterogeneous disk servers. This work will present the outcome of a petabyte-scale test deployment of Ceph by CERN IT. We will first present the architecture and configuration of our cluster, including a summary of best practices learned from the community and discovered internally. Next the results of various functionality and performance tests will be shown: the cluster has been used as a backend block storage system for AFS and NFS servers as well as a large OpenStack cluster at CERN. Finally, we will discuss the next steps and future possibilities for Ceph at CERN.
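
    As a rough illustration of how a provisioning script might drive the RBD block-device layer described above, the helpers below build standard `rbd` CLI commands. The pool and image names are hypothetical, and a production service would more likely use the python-rbd bindings with proper error handling rather than shelling out.

```python
# Minimal sketch of driving Ceph RBD from Python via the `rbd` CLI.
# Pool/image names are illustrative; sizes are in MB (the CLI default).

def rbd_create_cmd(pool: str, image: str, size_mb: int) -> list:
    """Build the `rbd create` command for a new block-device image."""
    return ["rbd", "create", f"{pool}/{image}", "--size", str(size_mb)]

def rbd_map_cmd(pool: str, image: str) -> list:
    """Build the `rbd map` command that exposes the image as /dev/rbd*."""
    return ["rbd", "map", f"{pool}/{image}"]

# On a host with Ceph configured, one might then run, e.g.:
#   subprocess.run(rbd_create_cmd("volumes", "afs-scratch", 10240), check=True)
print(rbd_create_cmd("volumes", "afs-scratch", 10240))
```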

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. Like its predecessors, this school is meant primarily for the training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union; it is funded by the CERN Theory Division and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher.

  14. PILOT: optical performance and end-to-end characterisation

    NASA Astrophysics Data System (ADS)

    Longval, Y.; Misawa, R.; Ade, P.; André, Y.; de Bernardis, P.; Bousquet, F.; Bouzit, M.; Buttice, V.; Charra, M.; Crane, B.; Dubois, J. P.; Engel, C.; Griffin, M.; Hargrave, P.; Leriche, B.; Maestre, S.; Marty, C.; Marty, W.; Masi, S.; Mot, B.; Narbonne, J.; Pajot, F.; Pisano, G.; Ponthieu, N.; Ristorcelli, I.; Rodriguez, L.; Roudil, G.; Simonella, O.; Salatino, M.; Savini, G.; Tucker, C.; Bernard, J.-P.

    2017-11-01

    PILOT (Polarized Instrument for the Long-wavelength Observations of the Tenuous ISM) is a balloon-borne astronomy experiment dedicated to studying the polarization of dust emission from the diffuse ISM in our Galaxy [1]. The observations of PILOT have two major scientific objectives. First, they will allow us to constrain the large-scale geometry of the magnetic field in our Galaxy and to study in detail the alignment properties of dust grains with respect to the magnetic field. In this domain, the measurements of PILOT will complement those of the Planck satellite at longer wavelengths. In particular, they will bring information at a better angular resolution, which is critical in crowded regions such as the Galactic plane. They will allow us to better understand how the magnetic field shapes the ISM material on large scales in molecular clouds, and the role it plays in the gravitational collapse leading to star formation. Second, the PILOT observations will allow us to measure for the first time the polarized dust emission towards the most diffuse regions of the sky, where the measurements are most easily interpreted in terms of the physics of dust. In this particular domain, PILOT will play a role for future CMB missions similar to that played by the Archeops experiment for Planck. The results of PILOT will provide the knowledge about the magnetic properties of dust grains and about the structure of the magnetic field in the diffuse ISM that is necessary for precise foreground subtraction in future polarized CMB measurements. The PILOT measurements, combined with those of Planck at longer wavelengths, will therefore allow us to further constrain dust models. The outcome of such studies will likely influence the instrumental and technical choices of future space missions dedicated to CMB polarization.
    The PILOT instrument allows observations in two photometric channels, at wavelengths of 240 μm and 550 μm, with an angular resolution of a few arcminutes. It makes use of the large-format bolometer arrays developed for the PACS instrument on board the Herschel satellite. With 1024 detectors per photometric channel and photometric bands optimized for the measurement of dust emission, PILOT is likely to become the most sensitive experiment for this type of measurement. The experiment takes advantage of the large gain in sensitivity allowed by the use of large-format, filled bolometer arrays at frequencies more favourable to the detection of dust emission. This paper presents the optical design and characterization of the instrument and its performance. We begin with a presentation of the instrument and the optical system, then summarise the main optical tests performed. In Section III, we present preliminary end-to-end test results.

  15. STS-73 Flight Day 15

    NASA Technical Reports Server (NTRS)

    1995-01-01

    On this fifteenth day of the sixteen-day STS-73 mission, the crew (Cmdr. Kenneth Bowersox; Pilot Kent Rominger; Payload Specialists Albert Sacco and Fred Leslie; and Mission Specialists Kathryn Thornton, Catherine 'Cady' Coleman, and Michael Lopez-Alegria) is shown hosting an in-orbit interview with newspaper reporters from Johnson Space Center, Kennedy Space Center, and Marshall Space Flight Center via satellite hookup. The astronauts were asked questions regarding the status of the United States Microgravity Lab-2 (USML-2) experiments, their personal goals for the mission, their future in the space program, and living in space in general. Earth views included cloud cover and a tropical storm.

  16. CERN goes iconic

    NASA Astrophysics Data System (ADS)

    2017-06-01

    There are more than 1800 emoji that can be sent and received in text messages and e-mails. Now, the CERN particle-physics lab near Geneva has got in on the act and released its own collection of 35 images that can be used by anyone with an Apple device.

  17. Neutrino Factory Plans at CERN

    NASA Astrophysics Data System (ADS)

    Riche, J. A.

    2002-10-01

    The considerable interest raised by the discovery of neutrino oscillations, together with recent progress in studies of muon colliders, has triggered interest in a neutrino factory at CERN. This paper explains the reference scenario, indicates other possible choices and outlines the R&D that is foreseen.

  18. Wi-Fi Service enhancement at CERN

    NASA Astrophysics Data System (ADS)

    Ducret, V.; Sosnowski, A.; Gonzalez Caballero, B.; Barrand, Q.

    2017-10-01

    Since the early 2000s, the number of mobile devices connected to CERN's internal network has increased from just a handful to well over 10,000. Wireless access is no longer simply “nice to have” or just for conference and meeting rooms; support for mobility is expected by most, if not all, of the CERN community. In this context, a full renewal of the CERN Wi-Fi network has been launched to deliver a state-of-the-art campus-wide Wi-Fi infrastructure. We aim to deliver, in more than 200 office buildings with a total surface area of over 400,000 m² and including many high-priority and high-occupation zones, an end-user experience comparable, for most applications, to a wired connection, with seamless mobility support. We describe here the studies and tests performed at CERN to ensure that the solution we are deploying can meet these goals while delivering a single, simple, flexible and open management platform.

  19. Thermostructural characterization and structural elastic property optimization of novel high luminosity LHC collimation materials at CERN

    NASA Astrophysics Data System (ADS)

    Borg, M.; Bertarelli, A.; Carra, F.; Gradassi, P.; Guardia-Valenzuela, J.; Guinchard, M.; Izquierdo, G. Arnau; Mollicone, P.; Sacristan-de-Frutos, O.; Sammut, N.

    2018-03-01

    The CERN Large Hadron Collider is currently being upgraded, through the High Luminosity upgrade, to operate at a stored beam energy of 680 MJ. LHC performance depends on the functionality of the beam collimation systems, which are essential for safe beam cleaning and machine protection. A dedicated beam experiment at the CERN High Radiation to Materials facility was created under the HRMT-23 experimental campaign. This experiment investigates the behavior of three collimation jaws with novel composite absorbers made of copper diamond, molybdenum carbide graphite, and carbon fiber carbon, under accident scenarios involving direct beam impact on the material. Material characterization is imperative for the design, execution, and analysis of such experiments. This paper presents new data and analysis of the thermostructural characteristics of some of the absorber materials commissioned within CERN facilities. The characterized elastic properties are then optimized through the development and implementation of a mixed numerical-experimental optimization technique.

  20. Highlights from the CERN/ESO/NordForsk ''Gender in Physics Day''

    NASA Astrophysics Data System (ADS)

    Primas, F.; Guinot, G.; Strandberg, L.

    2017-03-01

    In their role as observers on the EU Gender Equality Network in the European Research Area (GENERA) project, funded under the Horizon 2020 framework, CERN, ESO and NordForsk joined forces and organised a Gender in Physics Day at the CERN Globe of Science and Innovation. The one-day conference aimed to examine innovative activities promoting gender equality, and to discuss gender-oriented policies and best practice in the European Research Area (with special emphasis on intergovernmental organisations), as well as the importance of building solid networks. The event was very well attended and was declared a success. The main highlights of the meeting are reported.

  1. Dissemination of data measured at the CERN n_TOF facility

    NASA Astrophysics Data System (ADS)

    Dupont, E.; Otuka, N.; Cabellos, O.; Aberle, O.; Aerts, G.; Altstadt, S.; Alvarez, H.; Alvarez-Velarde, F.; Andriamonje, S.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Badurek, G.; Balibrea, J.; Barbagallo, M.; Barros, S.; Baumann, P.; Bécares, V.; Bečvář, F.; Beinrucker, C.; Belloni, F.; Berthier, B.; Berthoumieux, E.; Billowes, J.; Boccone, V.; Bosnar, D.; Brown, A.; Brugger, M.; Caamaño, M.; Calviani, M.; Calviño, F.; Cano-Ott, D.; Capote, R.; Cardella, R.; Carrapiço, C.; Casanovas, A.; Castelluccio, D. M.; Cennini, P.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Chin, M.; Colonna, N.; Cortés, G.; Cortés-Giraldo, M. A.; Cosentino, L.; Couture, A.; Cox, J.; Damone, L. A.; David, S.; Deo, K.; Diakaki, M.; Dillmann, I.; Domingo-Pardo, C.; Dressler, R.; Dridi, W.; Duran, I.; Eleftheriadis, C.; Embid-Segura, M.; Fernández-Domínguez, B.; Ferrant, L.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Fraval, K.; Frost, R. J. W.; Fujii, K.; Furman, W.; Ganesan, S.; Garcia, A. R.; Gawlik, A.; Gheorghe, I.; Gilardoni, S.; Giubrone, G.; Glodariu, T.; Göbel, K.; Gomez-Hornillos, M. B.; Goncalves, I. F.; Gonzalez-Romero, E.; Goverdovski, A.; Gramegna, F.; Griesmayer, E.; Guerrero, C.; Gunsing, F.; Gurusamy, P.; Haight, R.; Harada, H.; Heftrich, T.; Heil, M.; Heinitz, S.; Hernández-Prieto, A.; Heyse, J.; Igashira, M.; Isaev, S.; Jenkins, D. G.; Jericha, E.; Kadi, Y.; Kaeppeler, F.; Kalamara, A.; Karadimos, D.; Karamanis, D.; Katabuchi, T.; Kavrigin, P.; Kerveno, M.; Ketlerov, V.; Khryachkov, V.; Kimura, A.; Kivel, N.; Kokkoris, M.; Konovalov, V.; Krtička, M.; Kroll, J.; Kurtulgil, D.; Lampoudis, C.; Langer, C.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Naour, C. Le; Lerendegui-Marco, J.; Leong, L. S.; Licata, M.; Meo, S. Lo; Lonsdale, S. J.; Losito, R.; Lozano, M.; Macina, D.; Manousos, A.; Marganiec, J.; Martinez, T.; Marrone, S.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. 
M.; Mingrone, F.; Mirea, M.; Mondelaers, W.; Montesano, S.; Moreau, C.; Mosconi, M.; Musumarra, A.; Negret, A.; Nolte, R.; O'Brien, S.; Oprea, A.; Palomo-Pinto, F. R.; Pancin, J.; Paradela, C.; Patronis, N.; Pavlik, A.; Pavlopoulos, P.; Perkowski, J.; Perrot, L.; Pigni, M. T.; Plag, R.; Plompen, A.; Plukis, L.; Poch, A.; Porras, I.; Praena, J.; Pretel, C.; Quesada, J. M.; Radeck, D.; Rajeev, K.; Rauscher, T.; Reifarth, R.; Riego, A.; Robles, M.; Roman, F.; Rout, P. C.; Rudolf, G.; Rubbia, C.; Rullhusen, P.; Ryan, J. A.; Sabaté-Gilarte, M.; Salgado, J.; Santos, C.; Sarchiapone, L.; Sarmento, R.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Sedyshev, P.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Stephan, C.; Suryanarayana, S. V.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tarrío, D.; Tassan-Got, L.; Tavora, L.; Terlizzi, R.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Versaci, R.; Vermeulen, M. J.; Villamarin, D.; Vicente, M. C.; Vlachoudis, V.; Vlastou, R.; Voss, F.; Wallner, A.; Walter, S.; Ware, T.; Warren, S.; Weigand, M.; Weiß, C.; Wolf, C.; Wiesher, M.; Wisshak, K.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    The n_TOF neutron time-of-flight facility at CERN is used for high quality nuclear data measurements from thermal energy up to hundreds of MeV. In line with the CERN open data policy, the n_TOF Collaboration takes actions to preserve its unique data, facilitate access to them in standardised format, and allow their re-use by a wide community in the fields of nuclear physics, nuclear astrophysics and various nuclear technologies. The present contribution briefly describes the n_TOF outcomes, as well as the status of dissemination and preservation of n_TOF final data in the international EXFOR library.

  2. How to create successful Open Hardware projects — About White Rabbits and open fields

    NASA Astrophysics Data System (ADS)

    van der Bij, E.; Arruat, M.; Cattin, M.; Daniluk, G.; Gonzalez Cobas, J. D.; Gousiou, E.; Lewis, J.; Lipinski, M. M.; Serrano, J.; Stana, T.; Voumard, N.; Wlostowski, T.

    2013-12-01

    CERN's accelerator control group has embraced "Open Hardware" (OH) to facilitate peer review, avoid vendor lock-in and make support tasks scalable. A web-based tool for easing collaborative work was set up and the CERN OH Licence was created. New ADC, TDC, fine-delay and carrier cards based on VITA and PCI-SIG standards were designed, and drivers for Linux were written. Industry was often paid for developments, while quality and documentation were controlled by CERN. An innovative timing network was also developed under the OH paradigm. Industry now sells and supports these designs, which are finding their way into new fields.

  3. Medical Applications at CERN and the ENLIGHT Network

    PubMed Central

    Dosanjh, Manjit; Cirilli, Manuela; Myers, Steve; Navin, Sparsh

    2016-01-01

    State-of-the-art techniques derived from particle accelerators, detectors, and physics computing are routinely used in clinical practice and medical research centers: from imaging technologies to dedicated accelerators for cancer therapy and nuclear medicine, simulations, and data analytics. Principles of particle physics themselves are the foundation of a cutting edge radiotherapy technique for cancer treatment: hadron therapy. This article is an overview of the involvement of CERN, the European Organization for Nuclear Research, in medical applications, with specific focus on hadron therapy. It also presents the history, achievements, and future scientific goals of the European Network for Light Ion Hadron Therapy, whose co-ordination office is at CERN. PMID:26835422

  5. Preparation of a primary argon beam for the CERN fixed target physics.

    PubMed

    Küchler, D; O'Neil, M; Scrivens, R; Thomae, R

    2014-02-01

    The fixed-target experiment NA61 in the North Area of the Super Proton Synchrotron is studying phase transitions in strongly interacting matter. Until now it has used the primary beams available from the CERN accelerator complex (protons and lead ions) or fragmented beams created from the primary lead-ion beam. To explore a wider range of energies and densities, a request was made to provide primary argon and xenon beams. This paper describes the results of the setup and a 10-week test run of the Ar(11+) beam from the 14.5 GHz ECR ion source and the linear accelerator (Linac3) at CERN.

  6. Pilot-scale production of biodiesel from waste fats and oils using tetramethylammonium hydroxide.

    PubMed

    Šánek, Lubomír; Pecha, Jiří; Kolomazník, Karel; Bařinová, Michaela

    2016-02-01

    Annually, a great amount of waste fats and oils not suitable for human consumption, or which cannot be further treated, are produced around the world. A potential way of utilizing this low-cost feedstock is its conversion into biodiesel. The majority of biodiesel production processes today are based on the utilization of inorganic alkali catalysts. However, it has been proved that an organic base, tetramethylammonium hydroxide (TMAH), can be used as a very efficient transesterification catalyst. Furthermore, it can be employed for the esterification of free fatty acids, reducing even high free fatty acid contents to the required level in just one step. The work presented herein is focused on biodiesel production from waste frying oils and animal fats using tetramethylammonium hydroxide at the pilot-plant level. The results showed that the process performance in the pilot unit, using methanol and TMAH as a catalyst, is comparable to the laboratory procedure, even when the biodiesel is produced from waste vegetable oils or animal fats with high free fatty acid content. The reaction conditions were: 1.5% w/w TMAH, a reaction temperature of 65°C, a feedstock-to-methanol molar ratio of 1:6, and a reaction time of 120 min. The conversion of triglycerides to FAME was approximately 98%. The cloud point of the biodiesel obtained from waste animal fat was also determined. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Development/Testing of a Monitoring System Assisting MCI Patients: The European Project INLIFE.

    PubMed

    Kaimakamis, Evangelos; Karavidopoulou, Vaia; Kilintzis, Vassilios; Stefanopoulos, Leandros; Papageorgiou, Valentini

    2017-01-01

    INLIFE is a project co-funded by the European Union aiming to prolong the independent living of elderly people with cognitive impairment, based on open, seamless ICT services that support communication and daily activities and provide health services and professional care to the elderly. The main innovation stems from ICT solutions offering 19 different services adapted to the specific characteristics of elderly people with mild cognitive impairment, early and later stages of dementia, or cognitive impairment with co-morbid conditions, as well as their formal and informal caregivers. All services have different focus areas and are incorporated into a unified system based on a cloud architecture, implemented with patients in 6 European countries, including Greece. More than 1200 patients, caregivers and healthcare providers participate in the pilot testing of the project. The primary parameter for assessing the effectiveness of the interventions is their impact on the quality of life of the elderly patients and their caregivers, contributing to prolonging the independent living of those affected. A special digital platform has been developed at the Greek pilot site to adapt and monitor all the implemented applications. This includes a medical decision support system that receives biosignals from patients, and interaction interfaces in which all participants are involved. Recruitment and patient participation have already started at the pilot site of Thessaloniki for the services that are to be tested in Greece.

  8. Modeling and Analysis of the Water Cycle: Seasonal and Event Variability at the Walnut River Research Watershed

    NASA Astrophysics Data System (ADS)

    Miller, M. A.; Miller, N. L.; Sale, M. J.; Springer, E. P.; Wesely, M. L.; Bashford, K. E.; Conrad, M. E.; Costigan, K. R.; Kemball-Cook, S.; King, A. W.; Klazura, G. E.; Lesht, B. M.; Machavaram, M. V.; Sultan, M.; Song, J.; Washington-Allen, R.

    2001-12-01

    A multi-laboratory Department of Energy (DOE) team (Argonne National Laboratory, Brookhaven National Laboratory, Los Alamos National Laboratory, Lawrence Berkeley National Laboratory, Oak Ridge National Laboratory) has begun an investigation of hydrometeorological processes at the Whitewater subbasin of the Walnut River Watershed in Kansas. The Whitewater sub-basin is viewed as a DOE long-term hydrologic research watershed and resides within the well-instrumented Atmospheric Radiation Measurement/Cloud Radiation Atmosphere Testbed (ARM/CART) and the proposed Arkansas-Red River regional hydrologic testbed. The focus of this study is the development and evaluation of coupled regional to watershed scale models that simulate atmospheric, land surface, and hydrologic processes as systems with linkages and feedback mechanisms. This pilot is the precursor to the proposed DOE Water Cycle Dynamics Prediction Program. An important new element is the introduction of water isotope budget equations into mesoscale and hydrologic modeling. Two overarching hypotheses are part of this pilot study: (1) Can the predictability of the regional water balance be improved using high-resolution model simulations that are constrained and validated using new water isotope and hydrospheric water measurements? (2) Can water isotopic tracers be used to segregate different pathways through the water cycle and predict a change in regional climate patterns? Initial results of the pilot will be presented along with a description and copies of the proposed DOE Water Cycle Dynamics Prediction Program.

  9. The keys to CERN conference rooms - Managing local collaboration facilities in large organisations

    NASA Astrophysics Data System (ADS)

    Baron, T.; Domaracky, M.; Duran, G.; Fernandes, J.; Ferreira, P.; Gonzalez Lopez, J. B.; Jouberjean, F.; Lavrut, L.; Tarocco, N.

    2014-06-01

    For a long time HEP has been ahead of the curve in its usage of remote collaboration tools, such as videoconferencing and webcast, while the local CERN collaboration facilities lagged behind the expected quality standards for various reasons. This changed in 2012, when the CERN IT department created an integrated conference room service which provides guidance and installation services for new rooms (whether equipped for videoconferencing or not), as well as maintenance and local support. Now managing nearly half of the 246 meeting rooms available on the CERN sites, this service has been built to cope with the management of all CERN rooms with limited human resources. This has been made possible by the intensive use of professional software to manage and monitor all the room equipment, maintenance and activity. This paper focuses on presenting these packages, either off-the-shelf commercial products (asset and maintenance management tools, remote audio-visual equipment monitoring systems, local automation devices, new-generation touch-screen interfaces for interacting with the room) where available, or locally developed integration and operational layers (a generic audio-visual control and monitoring framework), and how they help overcome the challenges presented by such a service. The aim is to minimise local human interventions while preserving the highest service quality and placing the end user back at the centre of this collaboration platform.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The CERN Winter School on Supergravity, Strings, and Gauge Theory is the analytic continuation of the yearly training school of the former EC-RTN string network "Constituents, Fundamental Forces and Symmetries of the Universe". The 2010 edition of the school is supported and organized by the CERN Theory Division, and will take place from Monday January 25 to Friday January 29, at CERN. As its predecessors, this school is meant primarily for training of doctoral students and young postdoctoral researchers in recent developments in theoretical high-energy physics and string theory. The programme of the school will consist of five series of pedagogical lectures, complemented by tutorial discussion sessions in the afternoons. Previous schools in this series were organized in 2005 at SISSA in Trieste, and in 2006, 2007, 2008, and 2009 at CERN, Geneva. Other similar schools have been organized in the past by the former related RTN network "The Quantum Structure of Spacetime and the Geometric Nature of Fundamental Interactions". This edition of the school is not funded by the European Union. The school is funded by the CERN Theory Division, and the Arnold Sommerfeld Center at Ludwig-Maximilians University of Munich. Scientific committee: M. Gaberdiel, D. Luest, A. Sevrin, J. Simon, K. Stelle, S. Theisen, A. Uranga, A. Van Proeyen, E. Verlinde. Local organizers: A. Uranga, J. Walcher. This video is Part 11 in the series.

  11. A possible biomedical facility at the European Organization for Nuclear Research (CERN).

    PubMed

    Dosanjh, M; Jones, B; Myers, S

    2013-05-01

    A well-attended meeting, called "Brainstorming discussion for a possible biomedical facility at CERN", was held by the European Organization for Nuclear Research (CERN) at the European Laboratory for Particle Physics on 25 June 2012. This was concerned with adapting an existing, but little used, 78-m circumference CERN synchrotron to deliver a wide range of ion species, preferably from protons to at least neon ions, with beam specifications that match existing clinical facilities. The potential extensive research portfolio discussed included beam ballistics in humanoid phantoms, advanced dosimetry, remote imaging techniques and technical developments in beam delivery, including gantry design. In addition, a modern laboratory for biomedical characterisation of these beams would allow important radiobiological studies, such as relative biological effectiveness, in a dedicated facility with standardisation of experimental conditions and biological end points. A control photon and electron beam would be required nearby for relative biological effectiveness comparisons. Research beam time availability would far exceed that at other facilities throughout the world. This would allow more rapid progress in several biomedical areas, such as in charged hadron therapy of cancer, radioisotope production and radioprotection. The ethos of CERN, in terms of open access, peer-reviewed projects and governance has been so successful for High Energy Physics that application of the same to biomedicine would attract high-quality research, with possible contributions from Europe and beyond, along with potential new funding streams.

  12. View of the Earth seen by the Apollo 17 crew traveling toward the moon

    NASA Image and Video Library

    1972-12-07

    AS17-148-22727 (7 Dec. 1972) --- This view of Earth was seen by the Apollo 17 crew as they traveled toward the moon on their NASA lunar landing mission. This outstanding trans-lunar coast photograph extends from the Mediterranean Sea area to the Antarctica south polar ice cap. This is the first time the Apollo trajectory made it possible to photograph the south polar ice cap. Note the heavy cloud cover in the Southern Hemisphere. Almost the entire coastline of Africa is clearly visible. The Arabian Peninsula can be seen at the northeastern edge of Africa. The large island off the coast of Africa is the Malagasy Republic. The Asian mainland is on the horizon toward the northeast. The Apollo 17 crew consisted of astronauts Eugene A. Cernan, mission commander; Ronald E. Evans, command module pilot; and Harrison H. Schmitt, lunar module pilot. While astronauts Cernan and Schmitt descended in the Lunar Module (LM) to explore the moon, astronaut Evans remained with the Command and Service Modules (CSM) in lunar orbit.

  13. Three Dimensional Modeling Analysis of the Transpacific Transport of Aerosols During PACDEX

    NASA Astrophysics Data System (ADS)

    Carmichael, G. R.; Adhikary, B.; Hatch, C.; Kulkarni, S.; Moen, J.; Mena, M.

    2007-12-01

    Mineral dust and aerosols emitted from Asia are known to traverse long distances across the Pacific Ocean and can reach North America within a few days. A pilot field study, the PACific Dust Experiment (PACDEX), was carried out in April and May of 2007, during the peak East Asian dust emission season. The NSF/NCAR-HIAPER (High Performance Instrumented Airborne Platform for Environmental Research) platform allowed for sampling the evolution of mineral aerosol/pollution plumes and their physical and chemical characteristics as they traverse the Pacific Ocean and interact with the Pacific cloud systems en route to North America in both the upper and lower troposphere. A comprehensive 3-dimensional regional-scale model developed at The University of Iowa (Sulfur Transport dEposition Model, STEM) has been used for the analysis of aerosol interactions to help define key measurement strategies during the mission and to help interpret observations from the HIAPER platform. In this study we will present model aerosol distribution inter-comparison with cloud fields and aircraft observations. Model analysis provides further insight into cloud/pollution/dust interactions as East Asian emissions transit the Pacific Ocean en route to North America. Trajectory analysis and emission markers are used to help understand the air mass history and aerosol aging processes of the aerosols sampled by the HIAPER platform. Estimates of the fluxes of aerosol dust, BC and sulfate due to transpacific transport will also be presented.

  14. Mapping Remote and Multidisciplinary Learning Barriers: Lessons from "Challenge-Based Innovation" at CERN

    ERIC Educational Resources Information Center

    Jensen, Matilde Bisballe; Utriainen, Tuuli Maria; Steinert, Martin

    2018-01-01

    This paper presents the experienced difficulties of students participating in the multidisciplinary, remote collaborating engineering design course challenge-based innovation at CERN. This is with the aim to identify learning barriers and improve future learning experiences. We statistically analyse the rated differences between distinct design…

  15. DG's New Year's presentation

    ScienceCinema

    Heuer, R.-D.

    2018-05-22

    CERN general staff meeting. Looking back at key messages: Highest priority: LHC physics in 2009; Increase diversity of the scientific program; Prepare for future projects; Establish open and direct communication; Prepare CERN towards a global laboratory; Increase consolidation efforts; Financial situation--tight; Knowledge and technology transfer--proactive; Contract policy and internal mobility--lessons learned.

  16. Contextualized Magnetism in Secondary School: Learning from the LHC (CERN)

    ERIC Educational Resources Information Center

    Cid, Ramon

    2005-01-01

    Physics teachers in secondary schools usually mention the world's largest particle physics laboratory--CERN (European Organization for Nuclear Research)--only because of the enormous size of the accelerators and detectors used there, the number of scientists involved in their activities and also the necessary international scientific…

  17. WorldWide Web: Hypertext from CERN.

    ERIC Educational Resources Information Center

    Nickerson, Gord

    1992-01-01

    Discussion of software tools for accessing information on the Internet focuses on the WorldWideWeb (WWW) system, which was developed at the European Particle Physics Laboratory (CERN) in Switzerland to build a worldwide network of hypertext links using available networking technology. Its potential for use with multimedia documents is also…

  18. The CERES S'COOL Project: Development and Operational Phases

    NASA Technical Reports Server (NTRS)

    Chambers, Lin H.; Young, David F.; Racel, Anne M.

    1998-01-01

    As part of NASA's Mission to Planet Earth, the first Clouds and the Earth's Radiant Energy System (CERES) instrument will be launched on the Tropical Rainfall Measuring Mission (TRMM) spacecraft from the Tanegashima launch site in Japan in November 1997. The instrument will measure the radiation budget (incoming and outgoing radiant energy) of the Earth. The major feature of interest is clouds, which play a very strong role in regulating our climate. CERES will identify clear and cloudy regions and determine cloud physical and microphysical properties using imager data from a companion instrument. Validation efforts for the remote sensing algorithms will be intensive. As one component of the validation, the S'COOL (Students' Cloud Observations On-Line) project will involve school children around the globe in making ground truth measurements at the time of a CERES overpass. They will report cloud type, height, fraction, and opacity, as well as the local surface conditions. Their observations will be collected at the NASA Langley Distributed Active Archive Center (DAAC) and made available over the Internet for educational purposes as well as for use by the CERES Science Team in validation efforts. Pilot testing of the S'COOL project began in January 1997 with two local schools in Southeastern Virginia and one remote site in Montana. National testing in April 1997 involved 8 schools (grades 3 to high school) across the United States. Global testing will be carried out in October 1997. Details of the S'COOL project, which is mainly Internet-based, are being developed in each of these phases according to feedback received from participants. In 1998, when the CERES instrument is operational, a global observer network should be in place providing useful information to the scientists and learning opportunities to the students.
Broad participation in the S'COOL project is planned, both to obtain data from a wide range of geographic areas, and to involve as many students as possible in learning about clouds and atmospheric science. This paper reports on the development phase of the S'COOL project, including the reaction of the teachers and students who have been involved. It describes the operational state of the S'COOL network, and identifies opportunities for additional participants.

  19. LEMON - LHC Era Monitoring for Large-Scale Infrastructures

    NASA Astrophysics Data System (ADS)

    Marian, Babik; Ivan, Fedorko; Nicholas, Hook; Hector, Lansdale Thomas; Daniel, Lenkes; Miroslav, Siket; Denis, Waldron

    2011-12-01

    At the present time, computer centres are facing a massive rise in virtualization and cloud computing, as these solutions bring advantages to service providers and consolidate computer centre resources. However, as a result the monitoring complexity is increasing. Computer centre management requires not only monitoring servers, network equipment and associated software, but also collecting additional environment and facilities data (e.g. temperature, power consumption, cooling efficiency, etc.) to maintain a good overview of the infrastructure performance. The LHC Era Monitoring (Lemon) system addresses these requirements for a very large-scale infrastructure. The Lemon agent, which collects data on every client and forwards the samples to the central measurement repository, provides a flexible interface that allows rapid development of new sensors. The system can also report on behalf of remote devices such as switches and power supplies. Online and historical data can be visualized via a web-based interface or retrieved via command-line tools. The Lemon Alarm System component can be used for notifying the operator about error situations. In this article, an overview of Lemon monitoring is provided together with a description of the CERN LEMON production instance. No direct comparison is made with other monitoring tools.

  20. NextGEOSS project: A user-driven approach to build an Earth Observations Data Hub

    NASA Astrophysics Data System (ADS)

    Percivall, G.; Voidrot, M. F.; Bye, B. L.; De Lathouwer, B.; Catarino, N.; Concalves, P.; Kraft, C.; Grosso, N.; Meyer-Arnek, J.; Mueller, A.; Goor, E.

    2017-12-01

    Several initiatives and projects contribute to supporting the Group on Earth Observations' (GEO) global priorities, including support to the UN 2030 Agenda for Sustainable Development, the Paris Agreement on climate change, and the Sendai Framework for Disaster Risk Reduction. Running until 2020, the NextGEOSS project evolves the European vision of user-driven GEOSS data exploitation for innovation and business, relying on three main pillars: engaging communities of practice, delivering technological advancements, and advocating the use of GEOSS. These 3 pillars support the creation and deployment of Earth-observation-based innovative research activities and commercial services. In this presentation we will emphasise how the NextGEOSS project uses a pilot-driven approach to ramp up and consolidate the system in a pragmatic way, integrating the complexity of the existing global ecosystem, leveraging previous investments, adding new cloud technologies and resources, and engaging the diverse communities to address all types of Sustainable Development Goals (SDGs). A set of 10 initial pilots has been defined by the project partners to address the main challenges and to contribute as soon as possible to SDGs associated with Food Sustainability, Biodiversity, Space and Security, Cold Regions, Air Pollution, Disaster Risk Reduction, Territorial Planning, and Energy. In 2018 and 2019 the project team will work on two new series of Architecture Implementation Pilots (AIP-10 and AIP-11), open worldwide, to increase the discoverability, accessibility and usability of data, with a strong user-centric approach for innovative GEOSS-powered applications for multiple societal areas. All initiatives with an interest in and need of Earth observations (data, processes, models, ...) are welcome to participate in these pilot initiatives. NextGEOSS is an H2020 Research and Development project funded by the European Community under grant agreement 730329.

  1. Preparation of a primary argon beam for the CERN fixed target physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Küchler, D., E-mail: detlef.kuchler@cern.ch; O’Neil, M.; Scrivens, R.

    2014-02-15

    The fixed target experiment NA61 in the North Area of the Super Proton Synchrotron is studying phase transitions in strongly interacting matter. Until now it has used the primary beams available from the CERN accelerator complex (protons and lead ions) or fragmented beams created from the primary lead ion beam. To explore a wider range of energies and densities, a request was made to provide primary argon and xenon beams. This paper describes the results of the setting up and 10-week test run of the Ar(11+) beam from the 14.5 GHz ECR ion source and the linear accelerator (Linac3) at CERN.

  2. Deployment and Operational Experiences with CernVM-FS at the GridKa Tier-1 Center

    NASA Astrophysics Data System (ADS)

    Alef, Manfred; Jäger, Axel; Petzold, Andreas; Verstege, Bernhard

    2012-12-01

    In 2012 the GridKa Tier-1 computing center hosts 130 kHS06 of computing resources and 14 PB of disk and 17 PB of tape space. These resources are shared between the four LHC VOs and a number of national and international VOs from high energy physics and other sciences. CernVM-FS has been deployed at GridKa to supplement the existing NFS-based system for accessing VO software on the worker nodes. It provides a solution tailored to the requirements of the LHC VOs. We will focus on the first operational experiences and the monitoring of CernVM-FS on the worker nodes and the squid caches.

  3. Open Media Training Session

    ScienceCinema

    None

    2017-12-09

    Have you ever wondered how the media work and why some topics make it into the news and others don't? Would you like to know how to (and how not to) give an interview to a journalist? With the LHC preparing for first collisions at high energies, the world's media are again turning their attention to CERN. We're all likely to be called upon to explain what is happening at CERN to media, friends and neighbours. The seminar will be given by BBC television news journalists Liz Pike and Nadia Marchant, and will deal with the kind of questions we're likely to be confronted with through the restart period. Follow the webcast: http://webcast.cern.ch/

  4. CERN - Six Decades of Science, Innovation, Cooperation, and Inspiration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quigg, Chris

    The European Laboratory for Particle Physics, which straddles the Swiss-French border northwest of Geneva, celebrates its sixtieth birthday in 2014. CERN is the preeminent particle-physics institution in the world, currently emphasizing the study of collisions of protons and heavy nuclei at very high energies and the exploration of physics on the electroweak scale (energies where electromagnetism and the weak nuclear force merge). With brilliant accomplishments in research, innovation, and education, and a sustained history of cooperation among people from different countries and cultures, CERN ranks as one of the signal achievements of the postwar European Project. For physicists the world over, the laboratory is a source of pride and inspiration.

  5. Remote Sensing of In-Flight Icing Conditions: Operational, Meteorological, and Technological Considerations

    NASA Technical Reports Server (NTRS)

    Ryerson, Charles C.

    2000-01-01

    Remote-sensing systems that map aircraft icing conditions in the flight path from airports or aircraft would allow icing to be avoided and exited. Icing remote-sensing system development requires consideration of the operational environment, the meteorological environment, and the technology available. Operationally, pilots need unambiguous cockpit icing displays for risk management decision-making. Human factors, aircraft integration, integration of remotely sensed icing information into the weather system infrastructures, and avoid-and-exit issues need resolution. Cost, maintenance, power, weight, and space concern manufacturers, operators, and regulators. An icing remote-sensing system detects cloud and precipitation liquid water, drop size, and temperature. An algorithm is needed to convert these conditions into icing potential estimates for cockpit display. Specification development requires that magnitudes of cloud microphysical conditions and their spatial and temporal variability be understood at multiple scales. The core of an icing remote-sensing system is the technology that senses icing microphysical conditions. Radar and microwave radiometers penetrate clouds and can estimate liquid water and drop size. Retrieval development is needed; differential attenuation and neural network assessment of multiple-band radar returns are most promising to date. Airport-based radar or radiometers are the most viable near-term technologies. A radiometer that profiles cloud liquid water, and experimental techniques to use radiometers horizontally, are promising. The most critical operational research needs are to assess cockpit and aircraft system integration, develop avoid-and-exit protocols, assess human factors, and integrate remote-sensing information into weather and air traffic control infrastructures. 
Improved spatial characterization of cloud and precipitation liquid-water content, drop-size spectra, and temperature are needed, as well as an algorithm to convert sensed conditions into a measure of icing potential. Technology development also requires refinement of inversion techniques. These goals can be accomplished with collaboration among federal agencies including NASA, the FAA, the National Center for Atmospheric Research, NOAA, and the Department of Defense. This report reviews operational, meteorological, and technological considerations in developing the capability to remotely map in-flight icing conditions from the ground and from the air.

  6. More "Hands-On" Particle Physics: Learning with ATLAS at CERN

    ERIC Educational Resources Information Center

    Long, Lynne

    2011-01-01

    This article introduces teachers and students to a new portal of resources called Learning with ATLAS at CERN (http://learningwithatlas-portal.eu/), which has been developed by a European consortium of academic researchers and schools' liaison and outreach providers from countries across Europe. It includes the use of some of the mind-boggling…

  7. History of Cern

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2007-12-20

    Ceremony on the occasion of the publication of the first volume of the book on the history of CERN, with several people present who played an important role in this European organisation, whose success is owed to the spirit of the founding members, a spirit which is and will remain essential.

  8. Big data analytics as a service infrastructure: challenges, desired properties and solutions

    NASA Astrophysics Data System (ADS)

    Martín-Márquez, Manuel

    2015-12-01

    CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has limited the necessary collaboration and, more importantly, cross-domain data analytics. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient daily accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs of CERN's research and engineering community; (3) deliver real-time and batch data analytics and information-discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.

  9. Software Aspects of IEEE Floating-Point Computations for Numerical Applications in High Energy Physics

    ScienceCinema

    Arnold, Jeffrey

    2018-05-14

    Floating-point computations are at the heart of much of the computing done in high energy physics. The correctness, speed and accuracy of these computations are of paramount importance. The lack of any of these characteristics can mean the difference between new, exciting physics and an embarrassing correction. This talk will examine practical aspects of IEEE 754-2008 floating-point arithmetic as encountered in HEP applications. After describing the basic features of IEEE floating-point arithmetic, the presentation will cover: common hardware implementations (SSE, x87); techniques for improving the accuracy of summation, multiplication and data interchange; compiler options for gcc and icc affecting floating-point operations; and hazards to be avoided. About the speaker: Jeffrey M Arnold is a Senior Software Engineer in the Intel Compiler and Languages group at Intel Corporation. He has been part of the Digital->Compaq->Intel compiler organization for nearly 20 years; part of that time, he worked on both low- and high-level math libraries. Prior to that, he was in the VMS Engineering organization at Digital Equipment Corporation. In the late 1980s, Jeff spent 2½ years at CERN as part of the CERN/Digital Joint Project. In 2008, he returned to CERN to spend 10 weeks working with CERN/openlab. Since that time, he has returned to CERN multiple times to teach at openlab workshops and consult with various LHC experiments. Jeff received his Ph.D. in physics from Case Western Reserve University.

  10. TOWARDS A NOVEL MODULAR ARCHITECTURE FOR CERN RADIATION MONITORING.

    PubMed

    Boukabache, Hamza; Pangallo, Michel; Ducos, Gael; Cardines, Nicola; Bellotta, Antonio; Toner, Ciarán; Perrin, Daniel; Forkel-Wirth, Doris

    2017-04-01

    The European Organization for Nuclear Research (CERN) has the legal obligation to protect the public and the people working on its premises from any unjustified exposure to ionising radiation. In this context, radiation monitoring is one of the main concerns of the Radiation Protection Group. After 30 y of reliable service, the ARea CONtroller (ARCON) system is approaching the end of its lifecycle, which raises the need for new, more efficient radiation monitors with a high level of modularity to ensure better maintainability. Based on these two main principles, new detectors are currently being developed that will be capable of measuring very low dose rates down to 50 nSv h-1, whilst being able to measure radiation over an extensive range of 8 decades without any auto scaling. To achieve this performance, CERN Radiation MOnitoring Electronics (CROME), the new generation of CERN radiation monitors, is based on a versatile architecture that includes new read-out electronics developed by the Instrumentation and Logistics section of the CERN Radiation Protection Group, as well as a reconfigurable system on chip capable of performing complex processing calculations. Besides continuously measuring the ambient dose rate, the system generates radiation alarms, provides interlock signals, drives alarm display units through a fieldbus and provides long-term, permanent and reliable data logging. The measurement tests performed during the first phase of the development show very promising results that pave the way to the second phase: the certification. © The Author 2016. Published by Oxford University Press.
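
    As a quick sanity check of the quoted figures: assuming the 8 decades of dynamic range are counted upward from the 50 nSv/h floor (the abstract does not state the endpoints), the implied measurement ceiling works out as follows.

```python
# Implied measurement ceiling, under the assumption that the 8 decades
# start at the 50 nSv/h floor (the endpoints are not stated in the abstract).
floor_nsv_h = 50.0                          # minimum measurable dose rate, nSv/h
decades = 8                                 # dynamic range without auto scaling
ceiling_nsv_h = floor_nsv_h * 10**decades   # 5e9 nSv/h
ceiling_sv_h = ceiling_nsv_h / 1e9          # convert nSv/h -> Sv/h: 5 Sv/h
```

    A span from tens of nSv/h up to a few Sv/h would cover both background monitoring and accident-level dose rates in a single, non-autoscaling measurement chain.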

  11. TOWARDS A NOVEL MODULAR ARCHITECTURE FOR CERN RADIATION MONITORING

    PubMed Central

    Boukabache, Hamza; Pangallo, Michel; Ducos, Gael; Cardines, Nicola; Bellotta, Antonio; Toner, Ciarán; Perrin, Daniel; Forkel-Wirth, Doris

    2017-01-01

    Abstract The European Organization for Nuclear Research (CERN) has the legal obligation to protect the public and the people working on its premises from any unjustified exposure to ionising radiation. In this context, radiation monitoring is one of the main concerns of the Radiation Protection Group. After 30 y of reliable service, the ARea CONtroller (ARCON) system is approaching the end of its lifecycle, which raises the need for new, more efficient radiation monitors with a high level of modularity to ensure better maintainability. Based on these two main principles, new detectors are currently being developed that will be capable of measuring very low dose rates down to 50 nSv h−1, whilst being able to measure radiation over an extensive range of 8 decades without any auto scaling. To achieve this performance, CERN Radiation MOnitoring Electronics (CROME), the new generation of CERN radiation monitors, is based on a versatile architecture that includes new read-out electronics developed by the Instrumentation and Logistics section of the CERN Radiation Protection Group, as well as a reconfigurable system on chip capable of performing complex processing calculations. Besides continuously measuring the ambient dose rate, the system generates radiation alarms, provides interlock signals, drives alarm display units through a fieldbus and provides long-term, permanent and reliable data logging. The measurement tests performed during the first phase of the development show very promising results that pave the way to the second phase: the certification. PMID:27909154

  12. Got Questions About the Higgs Boson? Ask a Scientist

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hinchliffe, Ian

    Ask a scientist about the Higgs boson. There's a lot of buzz this week over new data from CERN's Large Hadron Collider (LHC) and the final data from Fermilab's Tevatron about the Higgs boson. It raises questions about what scientists have found and what still remains to be found -- and what it all means. Berkeley Lab's Ian Hinchliffe invites you to send in questions about the Higgs. He'll answer a few of your questions in a follow-up video later this week. Hinchliffe is a theoretical physicist who heads Berkeley Lab's sizable contingent with the ATLAS experiment at CERN. • Post your questions in the comment box • E-mail your questions to askascientist@lbl.gov • Tweet to @BerkeleyLab • Or post on our facebook page: facebook/berkeleylab Update on July 5: Ian responds to several of your questions in this video: http://youtu.be/1BkpD1IS62g. Update on 7/04: Here's CERN's press release from earlier today on the latest preliminary results in the search for the long sought Higgs particle: http://press.web.cern.ch/press/PressReleases/Releases2012/PR17.12E.htm. And here's a Q&A on what the news tells us: http://cdsweb.cern.ch/journal/CERNBulletin/2012/28/News%20Articles/1459460?ln=en. CERN will present the new LHC data at a seminar July 4th at 9:00 in the morning Geneva time (3:00 in the morning Eastern Daylight Time, midnight on the Pacific Coast), where the ATLAS collaboration and their rivals in the CMS experiment will announce their results. Tevatron results were announced by Fermilab on Monday morning. For more background on the LHC's search for the Higgs boson, visit http://newscenter.lbl.gov/feature-stories/2012/06/28/higgs-2012/.

  13. Got Questions About the Higgs Boson? Ask a Scientist

    ScienceCinema

    Hinchliffe, Ian

    2017-12-12

    Ask a scientist about the Higgs boson. There's a lot of buzz this week over new data from CERN's Large Hadron Collider (LHC) and the final data from Fermilab's Tevatron about the Higgs boson. It raises questions about what scientists have found and what still remains to be found -- and what it all means. Berkeley Lab's Ian Hinchliffe invites you to send in questions about the Higgs. He'll answer a few of your questions in a follow-up video later this week. Hinchliffe is a theoretical physicist who heads Berkeley Lab's sizable contingent with the ATLAS experiment at CERN. • Post your questions in the comment box • E-mail your questions to askascientist@lbl.gov • Tweet to @BerkeleyLab • Or post on our facebook page: facebook/berkeleylab Update on July 5: Ian responds to several of your questions in this video: http://youtu.be/1BkpD1IS62g. Update on 7/04: Here's CERN's press release from earlier today on the latest preliminary results in the search for the long sought Higgs particle: http://press.web.cern.ch/press/PressReleases/Releases2012/PR17.12E.htm. And here's a Q&A on what the news tells us: http://cdsweb.cern.ch/journal/CERNBulletin/2012/28/News%20Articles/1459460?ln=en. CERN will present the new LHC data at a seminar July 4th at 9:00 in the morning Geneva time (3:00 in the morning Eastern Daylight Time, midnight on the Pacific Coast), where the ATLAS collaboration and their rivals in the CMS experiment will announce their results. Tevatron results were announced by Fermilab on Monday morning. For more background on the LHC's search for the Higgs boson, visit http://newscenter.lbl.gov/feature-stories/2012/06/28/higgs-2012/.

  14. A pilot biomedical engineering course in rapid prototyping for mobile health.

    PubMed

    Stokes, Todd H; Venugopalan, Janani; Hubbard, Elena N; Wang, May D

    2013-01-01

    Rapid prototyping of medically assistive mobile devices promises to fuel innovation and provides opportunity for hands-on engineering training in biomedical engineering curricula. This paper presents the design and outcomes of a course offered during a 16-week semester in Fall 2011 with 11 students enrolled. The syllabus covered a mobile health design process from end-to-end, including storyboarding, non-functional prototypes, integrated circuit programming, 3D modeling, 3D printing, cloud computing database programming, and developing patient engagement through animated videos describing the benefits of a new device. Most technologies presented in this class are open source and thus provide unlimited "hackability". They are also cost-effective and easily transferable to other departments.

  15. STS-69 Flight Day 3 Highlights

    NASA Technical Reports Server (NTRS)

    1995-01-01

    On the third day of the STS-69 mission, the flight crew (Cmdr. Dave Walker, Pilot Ken Cockrell, and Mission Specialists Jim Voss, Mike Gernhardt, and Jim Newman) test the Orbital Maneuvering System and prepare for the retrieval of the SPARTAN satellite with a checkout procedure of the space shuttle's robot arm. Physiological and chemical experiments on fluid dynamics are conducted as part of the Sea Lab project. Urine and blood samples from the crew are collected and studied under microgravity conditions, and a slime mold experiment is conducted to determine the properties of motion, growth, and chemistry in zero gravity conditions. Earth views include cloud cover, a hurricane, and a close-up of its eye.

  16. SPARTAN-201-3 spacecraft prior to being re-captured

    NASA Image and Video Library

    1995-09-10

    STS069-703-00H (10 September 1995) --- Prior to being re-captured by Space Shuttle Endeavour’s Remote Manipulator System (RMS), the Shuttle Pointed Autonomous Research Tool for Astronomy (SPARTAN-201) spacecraft was recorded on film, backdropped against the darkness of space over a heavily cloud-covered Earth. Endeavour, with a five-member crew, launched on September 7, 1995, from the Kennedy Space Center (KSC) and ended its mission there on September 18, 1995, with a successful landing on Runway 33. The multifaceted mission carried a crew of astronauts David M. Walker, mission commander; Kenneth D. Cockrell, pilot; and James S. Voss (payload commander), James H. Newman and Michael L. Gernhardt, all mission specialists.

  17. Of people, particles and prejudice

    NASA Astrophysics Data System (ADS)

    Jackson, Penny; Greene, Anne; Mears, Matt; Spacecadet1; Green, Christian; Hunt, Devin J.; Berglyd Olsen, Veronica K.; Ilya, Komarov; Pierpont, Elaine; Gillman, Matthew

    2016-05-01

    In reply to Louise Mayor's feature article “Where people and particles collide”, about the experiences of researchers at CERN who are lesbian, gay, bisexual or transgender (LGBT), efforts to make LGBT CERN an officially recognized club, and incidents where posters advertising the club have been torn down or defaced (March pp31-36, http://ow.ly/YVP2Z).

  18. The Secret Chambers in the Chephren Pyramid

    ERIC Educational Resources Information Center

    Gutowski, Bartosz; Józwiak, Witold; Joos, Markus; Kempa, Janusz; Komorowska, Kamila; Krakowski, Kamil; Pijus, Ewa; Szymczak, Kamil; Trojanowska, Malgorzata

    2018-01-01

    In 2016, we (seven high school students from a school in Plock, Poland) participated in the CERN Beamline for Schools competition. Together with our team coach, Mr. Janusz Kempa, we submitted a proposal to CERN that was selected as one of two winning proposals that year. This paper describes our experiment from the early days of brainstorming to…

  19. Lead Ions and Coulomb's Law at the LHC (CERN)

    ERIC Educational Resources Information Center

    Cid-Vidal, Xabier; Cid, Ramon

    2018-01-01

    Although for most of the time the Large Hadron Collider (LHC) at CERN collides protons, for around one month every year lead ions are collided, to expand the diversity of the LHC research programme. Furthermore, in an effort not originally foreseen, proton-lead collisions are also taking place, with results of high interest to the physics…

  20. From strangeness enhancement to quark-gluon plasma discovery

    NASA Astrophysics Data System (ADS)

    Koch, Peter; Müller, Berndt; Rafelski, Johann

    2017-11-01

    This is a short survey of signatures and characteristics of the quark-gluon plasma in the light of experimental results that have been obtained over the past three decades. In particular, we present an in-depth discussion of the strangeness observable, including a chronology of the experimental effort to detect QGP at CERN-SPS, BNL-RHIC, and CERN-LHC.

  1. Ceremony 25th birthday CERN

    ScienceCinema

    None

    2018-05-18

    Celebration of CERN's 25th birthday with a speech by L. Van Hove and J.B. Adams, musical interludes by Ms. Mey and her colleagues (starting with Beethoven). The general managers then proceed with the presentation of souvenirs to members of the personnel who have 25 years of service in the organization. A gesture of recognition is also given to Zwerner.

  2. Committees

    NASA Astrophysics Data System (ADS)

    2004-10-01

    Fritz Caspers (CERN, Switzerland), Michel Chanel (CERN, Switzerland), Håkan Danared (MSL, Sweden), Bernhard Franzke (GSI, Germany), Manfred Grieser (MPI für Kernphysik, Germany), Dieter Habs (LMU München, Germany), Jeffrey Hangst (University of Aarhus, Denmark), Takeshi Katayama (RIKEN/Univ. Tokyo, Japan), H.-Jürgen Kluge (GSI, Germany), Shyh-Yuan Lee (Indiana University, USA), Rudolf Maier (FZ Jülich, Germany), John Marriner (FNAL, USA), Igor Meshkov (JINR, Russia), Dieter Möhl (CERN, Switzerland), Vasily Parkhomchuk (BINP, Russia), Robert Pollock (Indiana University), Dieter Prasuhn (FZ Jülich, Germany), Dag Reistad (TSL, Sweden), John Schiffer (ANL, USA), Andrew Sessler (LBNL, USA), Alexander Skrinsky (BINP, Russia), Markus Steck (GSI, Germany), Jie Wei (BNL, USA), Andreas Wolf (MPI für Kernphysik, Germany), Hongwei Zhao (IMP, People's Rep. of China).

  3. Across Europe to CERN: Taking students on the ultimate physics experience

    NASA Astrophysics Data System (ADS)

    Wheeler, Sam

    2018-05-01

    In 2013, I was an Einstein Fellow with the U.S. Department of Energy and I was asked by a colleague, working in a senator's office, if I would join him in a meeting with a physicist to "translate" the science into something more understandable. That meeting turned out to be a wonderful opportunity I would never have otherwise had. During the meeting I met Michael Tuts, a physicist who was working on project ATLAS at CERN. Afterwards, I walked with him out of the Senate office building to Union Station and, in parting, he gave me his card and told me that if I were in Geneva that he could help me get a tour of CERN and the LHC.

  4. User and group storage management at the CMS CERN T2 centre

    NASA Astrophysics Data System (ADS)

    Cerminara, G.; Franzoni, G.; Pfeiffer, A.

    2015-12-01

    A wide range of detector commissioning, calibration and data analysis tasks is carried out by CMS using dedicated storage resources available at the CMS CERN Tier-2 centre. Relying on the functionalities of the EOS disk-only storage technology, the optimal exploitation of the CMS user/group resources has required the introduction of policies for data access management, data protection, cleanup campaigns based on access pattern, and long term tape archival. The resource management has been organised around the definition of working groups and the delegation of each group's composition to an identified responsible person. In this paper we illustrate the user/group storage management, and the development and operational experience at the CMS CERN Tier-2 centre in the 2012-2015 period.
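
    The access-pattern-based cleanup the abstract refers to can be sketched as a simple selection policy. Everything below (the record layout, the 180-day threshold, the largest-first ordering) is a hypothetical illustration, not the actual CMS/EOS policy.

```python
# Hypothetical cleanup-candidate selection by last-access age.
# Record layout and threshold are illustrative assumptions.
from datetime import datetime, timedelta

THRESHOLD = timedelta(days=180)  # illustrative staleness cutoff

def cleanup_candidates(files, now):
    """Return paths of files not accessed within THRESHOLD, largest first."""
    stale = [f for f in files if now - f["last_access"] > THRESHOLD]
    return [f["path"] for f in sorted(stale, key=lambda f: -f["size"])]

now = datetime(2015, 6, 1)
files = [
    {"path": "/store/user/a.root", "size": 40, "last_access": datetime(2014, 1, 1)},
    {"path": "/store/user/b.root", "size": 90, "last_access": datetime(2014, 11, 1)},
    {"path": "/store/user/c.root", "size": 10, "last_access": datetime(2015, 5, 1)},
]
# b.root and a.root are stale; c.root was accessed recently.
```

    Ordering candidates largest-first is one plausible way to reclaim the most space from the fewest deletions; a real campaign would also respect archival and protection policies.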

  5. [CERN-MEDICIS (Medical Isotopes Collected from ISOLDE): a new facility].

    PubMed

    Viertl, David; Buchegger, Franz; Prior, John O; Forni, Michel; Morel, Philippe; Ratib, Osman; Bühler, Léo H; Stora, Thierry

    2015-06-17

    CERN-MEDICIS is a facility dedicated to research and development in life science and medical applications. The research platform was inaugurated in October 2014 and will produce an increasing range of innovative isotopes using the proton beam of ISOLDE for fundamental studies in cancer research, for new imaging and therapy protocols in cell and animal models and for preclinical trials, possibly extended to specific early phase clinical studies (phase 0) up to phase I trials. CERN, the University Hospital of Geneva (HUG), the University Hospital of Lausanne (CHUV) and the Swiss Institute for Experimental Cancer Research (ISREC) at the Swiss Federal Institutes of Technology (EPFL), which currently support the project, will benefit from the initial production, which will then be extended to other centers.

  6. Clouds and Water Vapor in the Climate System: Remotely Piloted Aircraft and Satellites

    NASA Technical Reports Server (NTRS)

    Anderson, James G.

    1999-01-01

    The objective of this work was to attack unanswered questions that lie at the intersection of radiation, dynamics, chemistry and climate. Considerable emphasis was placed on scientific collaboration and the innovative development of instruments required to address these scientific issues. The specific questions addressed include: Water vapor distribution in the Tropical Troposphere: An understanding of the mechanisms that dictate the distribution of water vapor in the middle-upper troposphere; Atmospheric Radiation: In the spectral region between 200 and 600/cm that encompasses the water vapor rotational and continuum structure, where most of the radiative cooling of the upper troposphere occurs, there is a critical need to test radiative transfer calculations using accurate, spectrally resolved radiance observations of the cold atmosphere obtained simultaneously with in situ species concentrations; Thin Cirrus: Cirrus clouds play a central role in the energy and water budgets of the tropical tropopause region; Stratosphere-Troposphere Exchange: Assessment of our ability to predict the behavior of the atmosphere to changes in the boundary conditions defined by thermal, chemical or biological variables; Correlative Science with Satellite Observations: Linking this research to the developing series of EOS observations is critical for scientific progress.

  7. Clouds as calibration targets for AVHRR reflected-solar channels - Results from a two-year study at NOAA/NESDIS

    NASA Technical Reports Server (NTRS)

    Abel, Peter

    1991-01-01

    NOAA-11 Advanced Very High Resolution Radiometer (AVHRR) and associated ground-based data have been collected at NOAA/NESDIS, on a daily basis and for 600 days, using five stations within the continental United States in the NOAA solar radiation (SOLRAD) monitoring network. The data have been filtered to include only uniformly overcast conditions and analyzed along the lines described by Paris and Justus (1988). Results from this first long-term pilot operational application of the method are presented. The method is potentially useful for establishing yearly-averaged trends in the radiometric gain of the AVHRR channels. The relatively small data base examined here suggests a precision in the 600 day mean gain of 5 percent or worse, with a significant part of this uncertainty being driven by poor knowledge of the bidirectional reflectance properties of clouds. The results suggest that the method in its present formulation has insufficient precision to be used as a primary method for the measurement of in-orbit gains of reflected-solar radiometers aboard polar orbiting satellites. Intrinsic limitations to the precision and time resolution of the method are discussed, and suggestions are offered for improving the precision of future results.

  8. New particle formation around the globe: From laboratory experiments to the Everest Base Camp (Arne Richter Award for Outstanding ECSs Lecture)

    NASA Astrophysics Data System (ADS)

    Bianchi, Federico

    2017-04-01

    Atmospheric aerosols affect the climate directly by absorbing or scattering incoming radiation and also indirectly by acting as cloud condensation nuclei (CCN), thereby changing the cloud albedo. A major fraction of these CCN comes from gas-to-particle conversion (nucleation). During the last decade, several nucleation studies based on field observations have been published, but most of them concern the planetary boundary layer (PBL); little information is therefore available about the free troposphere. The aim of this lecture is to elucidate the latest findings about which species contribute to new particle formation (NPF) in the free troposphere. In recent years, we used a number of state-of-the-art instruments, first at the Swiss high alpine research station Jungfraujoch (3580 m asl) and then at the Himalayan Nepal Climate Observatory Pyramid (NCO-P) site on the southern slope of the Himalayas, not far from Everest base camp (5079 m asl). Previous studies have already shown that NPF takes place frequently at both of these locations; however, no chemical information on the vapours was retrieved. At the Nepal Climate Observatory Pyramid, we deployed an atmospheric pressure interface time-of-flight mass spectrometer (APi-TOF), a particle size magnifier (PSM) and a neutral cluster and air ion spectrometer (NAIS). The APi-TOF measured the chemical composition of either the positive or negative ions during many of the nucleation events, and when equipped with a chemical ionization source (CI-APi-TOF) it provided information on the chemical composition of the neutral species. In this medal lecture, in addition to presenting the results of these two studies, I will compare them with other locations such as the boreal forest (Hyytiälä) and polluted areas such as Shanghai, as well as with laboratory experiments (i.e. the CLOUD experiment at CERN). I will present a detailed analysis of the particle evolution during nucleation and the chemical composition of the small clusters measured with these advanced mass spectrometers. I will also show that these processes are potentially very interesting for understanding aerosol conditions in the pre-industrial era, for which information is really scarce. At the end of the lecture I will give some insight regarding future projects above the Amazon as well as above the Alps.

  9. Stable Carbon Isotopes in Tree Rings: Revisiting the Paleocloud Proxy.

    NASA Astrophysics Data System (ADS)

    Gagen, M.; Zorita, E.; Dorado Liñán, I.; Loader, N.; McCarroll, D.; Robertson, I.; Young, G.

    2017-12-01

    The long term relationship between cloud cover and temperature is one of the most important climate feedbacks determining the value of climate sensitivity. Climate models still reveal a large spread in the simulated changes in cloud cover under future warming scenarios, and clarity might be aided by a picture of the past variability of cloudiness. Stable carbon isotope ratios from tree ring records have been successfully piloted as a palaeocloud proxy in geographical areas traditionally producing strong dendroclimatological reconstructions (high northern latitudes in the Northern Hemisphere), with some notable successes elsewhere too. An expansion of tree-ring based palaeocloud reconstructions might help to estimate past variations of cloud cover in periods colder or warmer than the 20th century, providing a way to test models on this specific aspect. Calibration with measured instrumental sunshine and cloud data reveals stable carbon isotope ratios from tree rings as an indicator of incoming short wave solar radiation (SWR) in non-moisture-stressed sites, but the statistical identification of the SWR signal is hampered by its interannual co-variability with air temperature during the growing season. Here we present a spatio-temporal statistical analysis of a multivariate stable carbon isotope tree ring data set over Europe to assess its usefulness for reconstructing past solar radiation changes. The tree ring records show stronger interannual covariation with SWR than with air temperature. The resulting spatial patterns of interannual co-variability are strongly linked to atmospheric circulation in a physically consistent manner. However, the multidecadal variations in the proxy records show a less physically coherent picture. We explore whether atmospheric corrections applied to the proxy series contribute to differences in the multidecadal signal and investigate whether multidecadal variations in soil moisture perturb the SWR signal. Preliminary results of strategies to bypass these problems are explored.

  10. A pilot Virtual Observatory (pVO) for integrated catchment science - Demonstration of national scale modelling of hydrology and biogeochemistry (Invited)

    NASA Astrophysics Data System (ADS)

    Freer, J. E.; Bloomfield, J. P.; Johnes, P. J.; MacLeod, C.; Reaney, S.

    2010-12-01

    There are many challenges in developing effective and integrated catchment management solutions for hydrology and water quality issues. Such solutions should ideally build on current scientific evidence to inform policy makers and regulators and additionally allow stakeholders to take ownership of local and/or national issues, in effect bringing together ‘communities of practice’. A strategy being piloted in the UK as the Pilot Virtual Observatory (pVO), funded by NERC, is to demonstrate the use of cyber-infrastructure and cloud computing resources to investigate better methods of linking data and models and to demonstrate scenario analysis for research, policy and operational needs. The research will provide new ways the scientific and stakeholder communities come together to exploit current environmental information, knowledge and experience in an open framework. This poster presents the project scope and methodologies for the pVO work dealing with national modelling of hydrology and macro-nutrient biogeochemistry. We evaluate the strategies needed to robustly benchmark our current predictive capability of these resources through ensemble modelling. We explore the use of catchment similarity concepts to understand if national monitoring programs can inform us about the behaviour of catchments. We discuss the challenges to applying these strategies in an open access and integrated framework and finally we consider the future for such virtual observatory platforms for improving the way we iteratively improve our understanding of catchment science.

  11. A Comprehensive Infrastructure for Big Data in Cancer Research: Accelerating Cancer Research and Precision Medicine

    PubMed Central

    Hinkson, Izumi V.; Davidsen, Tanja M.; Klemm, Juli D.; Chandramouliswaran, Ishwar; Kerlavage, Anthony R.; Kibbe, Warren A.

    2017-01-01

    Advancements in next-generation sequencing and other -omics technologies are accelerating the detailed molecular characterization of individual patient tumors, and driving the evolution of precision medicine. Cancer is no longer considered a single disease, but rather, a diverse array of diseases wherein each patient has a unique collection of germline variants and somatic mutations. Molecular profiling of patient-derived samples has led to a data explosion that could help us understand the contributions of environment and germline to risk, therapeutic response, and outcome. To maximize the value of these data, an interdisciplinary approach is paramount. The National Cancer Institute (NCI) has initiated multiple projects to characterize tumor samples using multi-omic approaches. These projects harness the expertise of clinicians, biologists, computer scientists, and software engineers to investigate cancer biology and therapeutic response in multidisciplinary teams. Petabytes of cancer genomic, transcriptomic, epigenomic, proteomic, and imaging data have been generated by these projects. To address the data analysis challenges associated with these large datasets, the NCI has sponsored the development of the Genomic Data Commons (GDC) and three Cloud Resources. The GDC ensures data and metadata quality, ingests and harmonizes genomic data, and securely redistributes the data. During its pilot phase, the Cloud Resources tested multiple cloud-based approaches for enhancing data access, collaboration, computational scalability, resource democratization, and reproducibility. These NCI-led efforts are continuously being refined to better support open data practices and precision oncology, and to serve as building blocks of the NCI Cancer Research Data Commons. PMID:28983483

  12. Asymmetric B-factory note

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calderon, M.

    Three main issues gave purpose to our visit to CERN, ESRF and DESY: to assess the current thinking at CERN on whether Eta, the gas desorption coefficient, would continue to decrease with continued beam cleaning; to determine if the time between NEG reconditioning could be extended; and to acquire knowledge of the basic fabrication processes and techniques for producing copper beam vacuum chambers.

  13. The Proton Synchrotron (PS): At the Core of the CERN Accelerators

    NASA Astrophysics Data System (ADS)

    Cundy, Donald; Gilardoni, Simone

    The following sections are included: * Introduction * Extraction: Getting the Beam to Leave the Accelerator * Acceleration and Bunch Gymnastics * Boosting PS Beam Intensity * Capacitive Energy Storage Replaces Flywheel * Taking the Neutrinos by the Horns * OMEGA: Towards the Electronic Bubble Chamber * ISOLDE: Targeting a New Era in Nuclear Physics * The CERN n_TOF Facility: Catching Neutrons on the Fly * References

  14. Taking Energy to the Physics Classroom from the Large Hadron Collider at CERN

    ERIC Educational Resources Information Center

    Cid, Xabier; Cid, Ramon

    2009-01-01

    In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…

  15. The Higgs Boson: Is the End in Sight?

    ERIC Educational Resources Information Center

    Lincoln, Don

    2012-01-01

    This summer, perhaps while you were lounging around the pool in the blistering heat, the blogosphere was buzzing about data taken at the Large Hadron Collider at CERN. The buzz reached a crescendo in the first week of July when both Fermilab and CERN announced the results of their searches for the Higgs boson. Hard data confronted a theory nearly…

  16. COSMO 09

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute Particle Cosmology which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Link to last editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin) List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line.

  17. Lecture archiving on a larger scale at the University of Michigan and CERN

    NASA Astrophysics Data System (ADS)

    Herr, Jeremy; Lougheed, Robert; Neal, Homer A.

    2010-04-01

    The ATLAS Collaboratory Project at the University of Michigan has been a leader in the area of collaborative tools since 1999. Its activities include the development of standards, software and hardware tools for lecture archiving, and making recommendations for videoconferencing and remote teaching facilities. Starting in 2006 our group became involved in classroom recordings, and in early 2008 we spawned CARMA, a University-wide recording service. This service uses a new portable recording system that we developed. Capture, archiving and dissemination of rich multimedia content from lectures, tutorials and classes are increasingly widespread activities among universities and research institutes. A growing array of related commercial and open-source technologies is becoming available, with several new products introduced in the last couple of years. As the result of a new close partnership between U-M and CERN IT, a market survey of these products was conducted, and a summary of the results is presented here. It is informing an ambitious effort in 2009 to equip many CERN rooms with automated lecture archiving systems, on a much larger scale than before. This new technology is being integrated with CERN's existing webcast, CDS, and Indico applications.

  18. COSMO 09

    ScienceCinema

    None

    2018-02-13

    This year's edition of the annual Cosmo International Conference on Particle Physics and Cosmology -- Cosmo09 -- will be hosted by the CERN Theory Group from Monday September 7 till Friday September 11, 2009. The conference will take place at CERN, Geneva (Switzerland). The Cosmo series is one of the major venues of interaction between cosmologists and particle physicists. In the exciting LHC era, the Conference will be devoted to the modern interfaces between Fundamental and Phenomenological Particle Physics and Physical Cosmology and Astronomy. The Conference will be followed by the CERN TH Institute Particle Cosmology which will take place from Monday September 14 till Friday September 18, 2009. The CERN-TH Institutes are visitor programs intended to bring together scientists with similar interests and to promote scientific collaborations. If you wish to participate, please register on the Institute web page. Link to last editions: COSMO 07 (U. of Sussex), COSMO 08 (U. of Wisconsin) List of plenary speakers: Gianfranco Bertone, Pierre Binetruy, Francois Bouchet, Juerg Diemand, Jonathan Feng, Gregory Gabadadze, Francis Halzen, Steen Hannestad, Will Kinney, Johannes Knapp, Hiranya Peiris, Will Percival, Syksy Rasanen, Alexandre Refregier, Pierre Salati, Roman Scoccimarro, Michael Schubnell, Christian Spiering, Neil Spooner, Andrew Tolley, Matteo Viel. The plenary program is available on-line.

  19. HPC in a HEP lab: lessons learned from setting up cost-effective HPC clusters

    NASA Astrophysics Data System (ADS)

    Husejko, Michal; Agtzidis, Ioannis; Baehler, Pierre; Dul, Tadeusz; Evans, John; Himyr, Nils; Meinhard, Helge

    2015-12-01

    In this paper we present our findings gathered during the evaluation and testing of Windows Server High-Performance Computing (Windows HPC) in view of potentially using it as a production HPC system for engineering applications. The Windows HPC package, an extension of Microsoft's Windows Server product, provides all essential interfaces, utilities and management functionality for creating, operating and monitoring a Windows-based HPC cluster infrastructure. The evaluation and test phase was focused on verifying the functionalities of Windows HPC, its performance, support of commercial tools and the integration with the users' work environment. We describe constraints imposed by the way the CERN Data Centre is operated, licensing for engineering tools, and the scalability and behaviour of the HPC engineering applications used at CERN. We will present an initial set of requirements, which were created based on the above constraints and requests from the CERN engineering user community. We will explain how we have configured Windows HPC clusters to provide the job scheduling functionalities required to support the CERN engineering user community: quality of service, user- and project-based priorities, and fair access to limited resources. Finally, we will present several performance tests we carried out to verify Windows HPC performance and scalability.

  20. DataBase on Demand

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, R.; Gomez, D.; Coterillo Coz, I.; Wojcik, D.

    2012-12-01

    At CERN a number of key database applications are running on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the actual centralised Oracle-based database services. The Database on Demand (DBoD) empowers the user to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, e.g. presently the open community version of MySQL and a single-instance Oracle database server. This article describes the technology approach taken to face this challenge, the service level agreement (SLA) that the project provides, and possible evolution scenarios.

  1. Development of a climatological data base to help forecast cloud cover conditions for shuttle landings at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Atchison, M. Kevin

    1993-01-01

    The Space Shuttle is an extremely weather-sensitive vehicle with very restrictive constraints for both launches and landings. The most important difference between Shuttle and normal aircraft landings is that the Shuttle has no go-around capability once it begins its descent into the earth's atmosphere. The de-orbit burn decision is generally made approximately 90 minutes before landing, requiring a forecast with little room for error. Because of the Shuttle's rapid re-entry to earth, the pilot must be able to see all runway and visual navigation aids from high altitude to land the Shuttle. In addition, the heat-resistant tiles which are used to protect the Shuttle during its re-entry into the earth's atmosphere are extremely sensitive to any type of precipitation. Extensive damage to these tiles could occur if the Shuttle passes through any cloud that contains precipitation-size particles. To help guard against changing weather conditions or any type of weather problems that might occur prior to landing, flight rules have been developed as guidelines for all landings. Although the rules vary depending on the location of the landing (Kennedy Space Center or Edwards AFB), length of mission, and weight of vehicle, most of the rules can be condensed into 4 major groupings. These are: (1) Cloud ceilings should not be less than 3048 m (10,000 feet); (2) Visibility should not be less than 13 km (7 nm); (3) Cross-winds should be no greater than 5-8 m/s (10-15 knots); and (4) No showers or thunderstorms at or within 56 km (30 nm) of the Shuttle Landing Facility. This study consisted of developing a climatological database of Shuttle Landing Facility (SLF) surface observations and performing an analysis of observed conditions one and two hours subsequent to given conditions at the SLF to help analyze the 0.2 cloud cover rule. Particular emphasis was placed on Shuttle landing weather violations and the amounts of cloud cover below 3048 m (10,000 ft.).
This analysis has helped to determine the best and worst times to land the Shuttle at KSC. In addition, nomograms have been developed to help forecasters make cloud cover forecasts for End of Mission (EOM) and Return to Launch Site (RTLS) at KSC. Results of categorizing this data by month, season, time of day, and surface and upper-air wind direction are presented.
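    The four condensed flight rules above reduce to a simple conjunction of thresholds. A minimal, hypothetical checker (the function name and the choice of the conservative ends of the stated ranges, e.g. the 5 m/s crosswind limit, are assumptions for illustration only):

```python
def landing_weather_go(ceiling_m, visibility_km, crosswind_ms, storm_distance_km):
    """Check the four condensed Shuttle landing flight rules from the study above.

    Conservative ends of the stated ranges are used (5 m/s crosswind);
    the name and interface are illustrative, not from the original work.
    """
    rules = {
        "ceiling >= 3048 m": ceiling_m >= 3048,
        "visibility >= 13 km": visibility_km >= 13,
        "crosswind <= 5 m/s": crosswind_ms <= 5,
        "no storms within 56 km": storm_distance_km > 56,
    }
    violations = [name for name, ok in rules.items() if not ok]
    return len(violations) == 0, violations
```

    For example, a 2500 m ceiling with otherwise good weather yields a no-go with the single violation "ceiling >= 3048 m".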

  2. 65th birthday Jack Steinberger

    ScienceCinema

    None

    2017-12-09

    Laudatio for Jack Steinberger, born 25 May 1921, on the occasion of his 65th birthday and his official retirement, in recognition of his invaluable collaboration at CERN. His principal activity will nevertheless continue as before: his research at CERN. Several speakers take the floor (e.g. E. Picasso) to congratulate him and pay him tribute.

  3. History of Cern

    ScienceCinema

    None

    2017-12-09

    Ceremony on the occasion of the publication of the first volume of the book on the history of CERN, attended by several people who played an important role in this European organization, whose success is owed to the spirit of its founding members, a spirit that is and will remain essential.

  4. Investigating the Inverse Square Law with the Timepix Hybrid Silicon Pixel Detector: A CERN [at] School Demonstration Experiment

    ERIC Educational Resources Information Center

    Whyntie, T.; Parker, B.

    2013-01-01

    The Timepix hybrid silicon pixel detector has been used to investigate the inverse square law of radiation from a point source as a demonstration of the CERN [at] school detector kit capabilities. The experiment described uses a Timepix detector to detect the gamma rays emitted by an [superscript 241]Am radioactive source at a number of different…
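    A demonstration of this kind checks that the measured intensity falls off as 1/r². With (distance, count) pairs, a convenient verification is a straight-line fit in log-log space, where I = C/r² becomes log I = log C − 2 log r, so the fitted slope should be near −2. The sketch below uses synthetic, idealized data (all values are placeholders, not Timepix measurements):

```python
import numpy as np

# Synthetic (distance, count) pairs following an ideal inverse square law.
r = np.array([5.0, 10.0, 15.0, 20.0, 25.0])   # source distance, cm
counts = 4.0e4 / r**2                         # ideal 1/r^2 counts

# Fit log(counts) vs log(r); the slope estimates the power-law exponent.
slope, intercept = np.polyfit(np.log(r), np.log(counts), 1)
print(round(slope, 2))   # -> -2.0 for perfect inverse square data
```

    With real detector data the fitted slope deviates somewhat from −2 because of background, the finite extent of the source, and detector acceptance.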

  5. Towards Near Real-time Convective Rainfall Observations over Kenya

    NASA Astrophysics Data System (ADS)

    Hoedjes, Joost; Said, Mohammed; Becht, Robert; Kifugo, Shem; Kooiman, André; Limo, Agnes; Maathuis, Ben; Moore, Ian; Mumo, Mark; Nduhiu Mathenge, Joseph; Su, Bob; Wright, Iain

    2013-04-01

    The existing meteorological infrastructure in Kenya is poorly suited for the countrywide real-time monitoring of precipitation. Rainfall radar is not available, and the existing network of rain gauges is sparse and challenging to maintain. This severely restricts Kenya's capacity to warn for, and respond to, weather-related emergencies. Furthermore, the lack of accurate rainfall observations severely limits Kenya's climate change adaptation capabilities. Over the past decade, the mobile telephone network in Kenya has expanded rapidly. This network makes extensive use of terrestrial microwave (MW) links, whose received signal level (RSL) data can be used to calculate rainfall intensities. We present a novel method for the near-real-time observation of convective rainfall over Kenya, based on the combined use of MW RSL data and Meteosat Second Generation (MSG) satellite data. In this study, the variable-density rainfall information derived from several MW links is scaled up using MSG data to provide full rainfall information coverage for the region surrounding the links. Combining MSG data and MW-link-derived rainfall data for several adjacent MW links makes it possible to distinguish between wet and dry pixels. This allows the disaggregation of the MW-link-derived rainfall intensities. With the distinction between wet and dry pixels made, and the MW-derived rainfall intensities disaggregated, these data can then be used to develop instantaneous empirical relationships linking rainfall intensities to cloud physical properties. These relationships are then used to calculate rainfall intensities for the MSG scene. Since both the MSG and the MW data are available at the same temporal resolution, unique empirical coefficients can be determined for each interval. This approach ensures that changes in convective conditions from one interval to the next are taken into account.
Initial results from a pilot study, which took place from November 2012 until January 2013, are presented. The work has been carried out in close cooperation with mobile telephone operator Safaricom, using RSL data from 15 microwave links in rain prone areas in Western Kenya (out of a total of 3000 MW links operated by Safaricom in Kenya). The data supplied by Safaricom consist of the mean, minimum and maximum RSL for each MW link over a 15 minute interval. For this pilot study, use has been made of the MSG Cloud Top Temperature data product from the Royal Dutch Meteorological Institute's MSG Cloud Physical Properties database (http://msgcpp.knmi.nl/).
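    The retrieval step described above is conventionally modelled with a power law k = a·R^b relating specific attenuation k (dB/km) to rain rate R (mm/h); inverting it gives R = (k/a)^(1/b). A minimal sketch, with hypothetical coefficients (a and b in reality depend on link frequency and polarization, cf. ITU-R P.838; the defaults below are placeholders):

```python
def rain_rate_mm_h(rsl_dry_dbm, rsl_wet_dbm, path_km, a=0.12, b=1.1):
    """Invert path-averaged rain attenuation on a microwave link into a
    mean rain rate along the path. Coefficients a, b are placeholders."""
    attenuation_db = rsl_dry_dbm - rsl_wet_dbm   # extra loss due to rain, dB
    if attenuation_db <= 0 or path_km <= 0:
        return 0.0                               # dry link (or bad input)
    k = attenuation_db / path_km                 # specific attenuation, dB/km
    return (k / a) ** (1.0 / b)                  # R = (k/a)^(1/b)
```

    In practice a dry-baseline estimate and a wet-antenna correction are needed before this inversion, and the 15-minute min/max RSL quantization adds further uncertainty.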

  6. Radiation protection challenges in the management of radioactive waste from high-energy accelerators.

    PubMed

    Ulrici, Luisa; Algoet, Yvon; Bruno, Luca; Magistris, Matteo

    2015-04-01

    The European Laboratory for Particle Physics (CERN) has operated high-energy accelerators for fundamental physics research for nearly 60 y. The side-product of this activity is the radioactive waste, which is mainly generated as a result of preventive and corrective maintenance, upgrading activities and the dismantling of experiments or accelerator facilities. Prior to treatment and disposal, it is common practice to temporarily store radioactive waste on CERN's premises and it is a legal requirement that these storage facilities are safe and secure. Waste treatment typically includes sorting, segregation, volume and size reduction and packaging, which will depend on the type of component, its chemical composition, residual activity and possible surface contamination. At CERN, these activities are performed in a dedicated waste treatment centre under the supervision of the Radiation Protection Group. This paper gives an overview of the radiation protection challenges in the conception of a temporary storage and treatment centre for radioactive waste in an accelerator facility, based on the experience gained at CERN. The CERN approach consists of the classification of waste items into 'families' with similar radiological and physical-chemical properties. This classification allows the use of specific, family-dependent techniques for radiological characterisation and treatment, which are simultaneously efficient and compliant with best practices in radiation protection. The storage was planned on the basis of radiological and other possible hazards such as toxicity, pollution and fire load. Examples are given of technical choices for the treatment and radiological characterisation of selected waste families, which could be of interest to other accelerator facilities.

  7. STS-69 Flight Day 9 Video File

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The song 'He's A Tramp', from the Walt Disney cartoon movie 'Lady and the Tramp', awakened the astronauts, Cmdr. Dave Walker, Pilot Ken Cockrell, and Mission Specialists Jim Voss, Jim Newman, and Mike Gernhardt, on the ninth day of the STS-69 mission. The Wake Shield Facility (WSF) was again unberthed from the shuttle cargo bay and, using the shuttle's robot arm, held over the side of the shuttle for five hours, where it collected data on the electrical field build-up around the spacecraft as part of the Charging Hazards and Wake Studies Experiment (CHAWS). Voss and Gernhardt rehearsed their Extravehicular Activity (EVA) spacewalk, which was planned for the next day. Earth views included cloud cover, a hurricane, and its eye.

  8. STS-69 flight day 9 highlights

    NASA Astrophysics Data System (ADS)

    1995-09-01

    The song 'He's A Tramp', from the Walt Disney cartoon movie 'Lady and the Tramp', awakened the astronauts, Cmdr. Dave Walker, Pilot Ken Cockrell, and Mission Specialists Jim Voss, Jim Newman, and Mike Gernhardt, on the ninth day of the STS-69 mission. The Wake Shield Facility (WSF) was again unberthed from the shuttle cargo bay and, using the shuttle's robot arm, held over the side of the shuttle for five hours, where it collected data on the electrical field build-up around the spacecraft as part of the Charging Hazards and Wake Studies Experiment (CHAWS). Voss and Gernhardt rehearsed their Extravehicular Activity (EVA) spacewalk, which was planned for the next day. Earth views included cloud cover, a hurricane, and its eye.

  9. DTO 1118 - Survey of the Mir Space Station

    NASA Image and Video Library

    1998-01-29

    STS089-714-066 (22-31 Jan. 1998) --- A series of 70mm still shots was recorded of Russia's Mir Space Station from the Earth-orbiting space shuttle Endeavour following undocking of the two spacecraft. A large blanket of white clouds covers thousands of square miles in this oblique panorama. Onboard the Mir at this point were cosmonaut Anatoly Y. Solovyev, commander; Pavel V. Vinogradov, flight engineer; and Andrew S. W. Thomas, cosmonaut guest researcher. Onboard Endeavour were Terrence W. (Terry) Wilcutt, commander; Joe F. Edwards Jr., pilot; Bonnie J. Dunbar, payload commander; mission specialists David A. Wolf (former cosmonaut guest researcher), Michael P. Anderson, James F. Reilly, and Salizhan S. Sharipov representing the Russian Space Agency (RSA). Photo credit: NASA

  10. DFW (Dallas-Ft. Worth) microburst on August 2, 1985

    NASA Technical Reports Server (NTRS)

    Fujita, T. T.

    1986-01-01

    The features of the microburst of August 2, 1985, related to the Delta 191 accident during the approach to Runway 17L of the Dallas-Ft. Worth Airport are described. Both radar and satellite data, along with ground-based measurements, were used in determining the characteristics of the parent cloud which spawned the most complicated microburst winds ever analyzed by the author. The detailed reconstruction of the airflow and the aircraft's maneuvers was made possible by a series of computer analyses of the Digital Flight Data Recorder (DFDR) readout. Both measured and computed values are presented in color diagrams that can be readily evaluated by meteorologists, pilots, structural engineers, and others interested in preventing microburst-related accidents in future years.

  11. Recent results and performance of the multi-gap resistive plate chambers network for the EEE Project

    NASA Astrophysics Data System (ADS)

    Abbrescia, M.; Avanzini, C.; Baldini, L.; Baldini Ferroli, R.; Batignani, G.; Bencivenni, G.; Bossini, E.; Chiavassa, A.; Cicalò, C.; Cifarelli, L.; Coccia, E.; Corvaglia, A.; De Gruttola, D.; De Pasquale, S.; Di Giovanni, A.; D`Incecco, M.; Dreucci, M.; Fabbri, F. L.; Fattibene, E.; Ferraro, A.; Frolov, V.; Galeotti, P.; Garbini, M.; Gemme, G.; Gnesi, I.; Grazzi, S.; Gustavino, C.; Hatzifotiadou, D.; La Rocca, P.; Licciulli, F.; Maggiora, A.; Maragoto Rodriguez, O.; Maron, G.; Martelli, B.; Mazziotta, M. N.; Miozzi, S.; Nania, R.; Noferini, F.; Nozzoli, F.; Panareo, M.; Panetta, M. P.; Paoletti, R.; Park, W.; Perasso, L.; Pilo, F.; Piragino, G.; Riggi, F.; Righini, G. C.; Sartorelli, G.; Scapparone, E.; Schioppa, M.; Scribano, A.; Selvi, M.; Serci, S.; Siddi, E.; Squarcia, S.; Stori, L.; Taiuti, M.; Terreni, G.; Visnyei, O. B.; Vistoli, M. C.; Votano, L.; Williams, M. C. S.; Zani, S.; Zichichi, A.; Zuyeusky, R.

    2016-11-01

    The Extreme Energy Events (EEE) Project is devoted to the study of Extensive Atmospheric Showers through a network of muon telescopes installed in High Schools, with the further aim of introducing young students to particle and astroparticle physics. Each telescope is a tracking detector composed of three Multi-gap Resistive Plate Chambers (MRPC) with an active area of 1.60 × 0.80 m2. Their characteristics are similar to those of the chambers built for the Time Of Flight array of the ALICE Experiment at the LHC. The EEE Project started with a few pilot towns, where the telescopes have been taking data since 2008, and it has been constantly extended, reaching at present more than 50 MRPC telescopes. They are spread across Italy, with two additional stations at CERN, covering an area of around 3 × 10^5 km2, with a total surface area for all the MRPCs of 190 m2. A comprehensive description of the MRPC network is reported here: the efficiency and the time and spatial resolution measured using cosmic rays hitting the telescopes. The most recent results on the detector and physics performance from a series of coordinated data acquisition periods are also presented.

  12. Air liquide 1.8 K refrigeration units for CERN LHC project

    NASA Astrophysics Data System (ADS)

    Hilbert, Benoît; Gistau-Baguer, Guy M.; Caillaud, Aurélie

    2002-05-01

    The Large Hadron Collider (LHC) will be CERN's next research instrument for high energy physics. This 27 km long circular accelerator will make intensive use of superconducting magnets, operated below 2.0 K. It will thus require high-capacity refrigeration below 2.0 K [1, 2]. Coupled to a refrigerator providing 18 kW equivalent at 4.5 K [3], these systems will be able to absorb a cryogenic power of 2.4 kW at 1.8 K in nominal conditions. Air Liquide has designed one Cold Compressor System (CCS) pre-series unit for CERN, preceding three more (among eight in total located around the machine). These systems, making use of cryogenic centrifugal compressors in a series arrangement coupled to room-temperature screw compressors, are presented. Key component characteristics are given.
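    The thermodynamic cost of refrigeration at 1.8 K rather than 4.5 K follows from the ideal (Carnot) specific work w = (T_warm − T_cold)/T_cold per watt of heat extracted; a back-of-the-envelope sketch (the 300 K ambient temperature is an assumption):

```python
def carnot_work_per_watt(t_cold_k, t_warm_k=300.0):
    """Ideal (Carnot) input work, in W per W of heat extracted at t_cold_k."""
    return (t_warm_k - t_cold_k) / t_cold_k

# Roughly 166 W of ideal work per watt extracted at 1.8 K,
# versus about 66 W per watt at 4.5 K.
print(round(carnot_work_per_watt(1.8)))   # -> 166
print(round(carnot_work_per_watt(4.5)))   # -> 66
```

    Real plants achieve only a fraction of the Carnot efficiency, which is why high-capacity 1.8 K refrigeration relies on the cold-compressor trains described above.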

  13. Upgrade of the cryogenic CERN RF test facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pirotte, O.; Benda, V.; Brunner, O.

    2014-01-29

    With the large number of superconducting radiofrequency (RF) cryomodules to be tested for the former LEP and the present LHC accelerator, an RF test facility was erected in the early 1990s in the largest cryogenic test facility at CERN, located at Point 18. This facility consisted of four vertical test stands for single cavities and originally one, then two, horizontal test benches for RF cryomodules operating at 4.5 K in saturated helium. CERN is presently working on the upgrade of its accelerator infrastructure, which requires new superconducting cavities operating below 2 K in saturated superfluid helium. Consequently, the RF test facility has been renewed in order to allow efficient cavity and cryomodule tests in superfluid helium and to improve its thermal performance. The new RF test facility is described and its performance is presented.

  14. Wolfgang Kummer at CERN

    NASA Astrophysics Data System (ADS)

    Schopper, Herwig

    Wolfgang Kummer was not only a great theorist but also a man with a noble spirit and extensive education, based on a fascinating long-term Austrian cultural tradition. As an experimentalist I am not sufficiently knowledgeable to evaluate his contributions to theoretical physics - this will certainly be done by more competent scientists. Nevertheless I admired him for not only being attached to fundamental and abstract problems like quantum field theory, quantum gravity or black holes, but for his interest in down-to-earth questions like electron-proton scattering or the toponium mass. I got to know Wolfgang Kummer very well and came to appreciate his human qualities during his long attachment to CERN, in particular when he served as president of the CERN Council, the highest decision-making authority of this international research centre, from 1985 to 1987, a period that fell within my term as Director-General…

  15. Database on Demand: insight how to build your own DBaaS

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, Ruben; Coterillo Coz, Ignacio

    2015-12-01

    At CERN, a number of key database applications are running on user-managed MySQL, PostgreSQL and Oracle database services. The Database on Demand (DBoD) project was born out of an idea to provide the CERN user community with an environment to develop and run database services as a complement to the central Oracle-based database service. The Database on Demand empowers the user to perform certain actions that had traditionally been done by database administrators, providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines; presently, engines from three major RDBMS (relational database management system) vendors are offered. In this article we show the current status of the service after almost three years of operation, give some insight into our redesigned software engineering, and outline its near-future evolution.

  16. Application of the Medipix2 technology to space radiation dosimetry and hadron therapy beam monitoring

    NASA Astrophysics Data System (ADS)

    Pinsky, Lawrence; Stoffle, Nicholas; Jakubek, Jan; Pospisil, Stanislav; Leroy, Claude; Gutierrez, Andrea; Kitamura, Hisashi; Yasuda, Nakahiro; Uchihori, Yulio

    2011-02-01

    The Medipix2 Collaboration, based at CERN, has developed the TimePix version of the Medipix pixel readout chip, which has the ability to provide either an ADC or TDC capability separately in each of its 256×256 pixels. When coupled to a Si detector layer, the device is an excellent candidate for application as an active dosimeter for use in space radiation environments. In order to facilitate such a development, data have been taken with heavy ions at the HIMAC facility in Chiba, Japan. In particular, the problem of determining the resolution of such a detector system with respect to heavy ions of differing charges and energies, but with similar dE/dx values, has been explored for several ions. The ultimate problem is to parse the information in the pixel "footprint" images resulting from the drift of the charge cloud produced in the detector layer. In addition, with the use of converter materials, the detector can be used as a neutron detector, and it has been used both as a charged particle and neutron detector to evaluate the detailed properties of the radiation fields produced by hadron therapy beams. New versions of the basic chip design are under development.

  17. Impact of large beam-induced heat loads on the transient operation of the beam screens and the cryogenic plants of the Future Circular Collider (FCC)

    NASA Astrophysics Data System (ADS)

    Correia Rodrigues, H.; Tavian, L.

    2017-12-01

    The Future Circular Collider (FCC) under study at CERN will produce 50-TeV high-energy proton beams. The high-energy particle beams are bent by 16-T superconducting dipole magnets operating at 1.9 K and distributed over a circumference of 80 km. The circulating beams induce 5 MW of dynamic heat loads by several processes such as synchrotron radiation, resistive dissipation of beam image currents and electron clouds. These beam-induced heat loads will be intercepted by beam screens operating between 40 and 60 K and will, during beam injection, energy ramp-up and beam dumping, induce transients on the distributed beam-screen cooling loops, the sector cryogenic plants and the dedicated circulators. Based on the current baseline parameters, numerical simulations of the fluid flow in the cryogenic distribution system during a beam operation cycle were performed. The effects of the thermal inertia of the headers on the helium flow temperature at the cryogenic plant inlet, as well as the temperature gradient experienced by the beam screen, have been assessed. Additionally, this work enabled a thorough exergetic analysis of different cryogenic plant configurations and laid the groundwork for establishing the design specifications of cold and warm circulators.

  18. The beam test of muon detector parameters for the SHiP experiment at CERN

    NASA Astrophysics Data System (ADS)

    Likhacheva, V. L.; Kudenko, Yu. G.; Mefodiev, A. V.; Mineev, O. V.; Khotyantsev, A. N.

    2018-01-01

    Scintillation detectors based on extruded plastics have been tested in a 10 GeV/c beam at CERN. The scintillation signal readout was provided by Kuraray Y11 optical wavelength-shifting fibers and Hamamatsu MPPC micropixel avalanche photodiodes. The light yield was scanned along and across the detectors. The time resolution was determined by fitting the rise of the digitized MPPC pulse, among other methods.

  19. Determining the structure of Higgs couplings at the CERN Large Hadron Collider.

    PubMed

    Plehn, Tilman; Rainwater, David; Zeppenfeld, Dieter

    2002-02-04

    Higgs boson production via weak boson fusion at the CERN Large Hadron Collider has the capability to determine the dominant CP nature of a Higgs boson, via the tensor structure of its coupling to weak bosons. This information is contained in the azimuthal angle distribution of the two outgoing forward tagging jets. The technique is independent of both the Higgs boson mass and the observed decay channel.

  20. CERN data services for LHC computing

    NASA Astrophysics Data System (ADS)

    Espinal, X.; Bocchi, E.; Chan, B.; Fiorot, A.; Iven, J.; Lo Presti, G.; Lopez, J.; Gonzalez, H.; Lamanna, M.; Mascetti, L.; Moscicki, J.; Pace, A.; Peters, A.; Ponce, S.; Rousseau, H.; van der Ster, D.

    2017-10-01

    Dependability, resilience, adaptability and efficiency: growing requirements call for tailored storage services and novel solutions. Unprecedented volumes of data coming from the broad number of experiments at CERN need to be quickly available in a highly scalable way for large-scale processing and data distribution, while in parallel they are routed to tape for long-term archival. These activities are critical for the success of HEP experiments. Nowadays we operate at high incoming throughput (14 GB/s during the 2015 LHC Pb-Pb run and 11 PB in July 2016) and with concurrent complex production workloads. In parallel our systems provide the platform for the continuous user- and experiment-driven workloads for large-scale data analysis, including end-user access and sharing. The storage services at CERN cover the needs of our community: EOS and CASTOR as large-scale storage; CERNBox for end-user access and sharing; Ceph as data back-end for the CERN OpenStack infrastructure, NFS services and S3 functionality; AFS for legacy distributed-file-system services. In this paper we will summarise the experience in supporting LHC experiments and the transition of our infrastructure from static monolithic systems to flexible components providing a more coherent environment with pluggable protocols, tuneable QoS, sharing capabilities and fine-grained ACL management, while continuing to guarantee dependable and robust services.

  1. Commissioning results of CERN HIE-ISOLDE and INFN ALPI cryogenic control systems

    NASA Astrophysics Data System (ADS)

    Inglese, V.; Pezzetti, M.; Calore, A.; Modanese, P.; Pengo, R.

    2017-02-01

    The cryogenic systems of both accelerators, namely HIE ISOLDE (High Intensity and Energy Isotope Separator On Line DEvice) at CERN and ALPI (Acceleratore Lineare Per Ioni) at LNL, have been refurbished. HIE ISOLDE is a major upgrade of the existing ISOLDE facilities, which required the construction of a superconducting linear accelerator consisting of six cryomodules, each containing five superconducting RF cavities and superconducting solenoids. The ALPI linear accelerator, similar to HIE ISOLDE, is located at Legnaro National Laboratories (LNL) and became operational in the early 1990s. It is composed of 74 superconducting RF cavities, assembled inside 22 cryostats. The new control systems are equipped with PLCs developed on the CERN UNICOS framework, which include Schneider and Siemens PLCs and various fieldbuses (Profibus DP and PA, WorldFIP). The control systems were developed in synergy between CERN and LNL in order to build, effectively and with an optimized use of resources, control systems that enhance ease of operation, maintainability, and long-term availability. This paper describes (i) the cryogenic systems, with special focus on the design of the control system hardware and software, (ii) the strategy adopted in order to achieve a synergic approach, and (iii) the commissioning results after the cool-down of the cryomodules to 4.5 K.

  2. A multi-sensor study of the impact of ground-based glaciogenic seeding on orographic clouds and precipitation

    NASA Astrophysics Data System (ADS)

    Pokharel, Binod

This dissertation examines reflectivity data from three different radar systems, as well as airborne and ground-based in situ particle imaging data, to study the impact of ground-based glaciogenic seeding on orographic clouds and precipitation formed over the mountains in southern Wyoming. The data for this study come from the AgI Seeding Cloud Impact Investigation (ASCII) field campaign conducted over the Sierra Madre mountains in 2012 (ASCII-12) and over the Medicine Bow mountains in 2013 (ASCII-13) in the context of the Wyoming Weather Modification Pilot Project (WWMPP). The campaigns were supported by a network of ground-based instruments, including a microwave radiometer, two profiling Ka-band Micro Rain Radars (MRRs), a Doppler on Wheels (DOW), rawinsondes, a Cloud Particle Imager, and a Parsivel disdrometer. The University of Wyoming King Air with profiling Wyoming Cloud Radar (WCR) conducted nine successful flights in ASCII-12, and eight flights in ASCII-13. WCR profiles from these flights are combined with those from seven other flights, which followed the same geographically-fixed pattern in 2008-09 (pre-ASCII) over the Medicine Bow range. All sampled storms were relatively shallow, with low-level air forced over the target mountain, and cold enough to support ice initiation by silver iodide (AgI) nuclei in cloud. Three detailed case studies are conducted, each with different atmospheric conditions and different cloud and snow growth properties: one case (21 Feb 2012) is stratiform, with strong winds and cloud droplets too small to enable snow growth by accretion (riming). A second case (13 Feb 2012) contains shallow convective cells. Clouds in the third case study (22 Feb 2012) are stratiform but contain numerous large droplets (mode ~35 μm in diameter), large enough for ice particle growth by riming. 
These cases and all others, each with a treated period following an untreated period, show that a clear seeding signature is not immediately apparent in individual WCR reflectivity transects downwind of the silver iodide (AgI) generators, and that the natural trends in the precipitation over short timescales can easily overwhelm any seeding-induced change. Therefore the ASCII experimental design included a control region, upwind of the AgI generators. The three case studies generally show an increase in surface snow particle concentration in the target region during the seeding period. Frequency-by-altitude displays of all WCR reflectivity data collected during the flights show slightly higher reflectivity values during seeding near the ground, at least when compared to the control region, in all three cases. This also applies to the two other radar systems (MRR and DOW), both with their own sampling strategy and target/control regions. An examination of all ASCII cases combined (the "composite" analysis) also shows a positive trend in low-level reflectivity relative to the control region, both in convective and in stratiform cases. Also, convective cells sampled at flight level downwind of the AgI generators contain a higher concentration of small ice crystals during seeding. A word of caution is warranted: both the magnitude and the sign of the change in the target region, compared to that in the control region, varies from case to case in the composite, and amongst the three radar systems (WCR, DOW and MRR). We speculate that this variation is only partly driven by different responses of orographic clouds to glaciogenic seeding, related to factors such as cloud base and cloud top temperature, cloud liquid water content, and snow growth mechanism. 
Instead, most of this variation probably relates to non-homogeneous natural trends across the mountain range, and/or to sample unrepresentativeness, especially for the (relatively small) control region; in other words, to the sampling methods. The impact of natural variability and sampling aliasing can only be overcome by a storm sample size much larger than that collected in ASCII. As such, the ASCII sample size is not adequate either to quantify the magnitude of the seeding impact on snowfall, or to identify the conditions most suitable for ground-based seeding. This study is an exploration of cloud microphysical evidence linking AgI cloud seeding to snowfall. It is not a statistical study. The preponderance of evidence from different radars and ground-based and airborne particle probes deployed in ASCII, in three case studies and in the composite analysis, points to the ability of ground-based glaciogenic seeding to increase the snowfall rate in orographic clouds.

  3. Classroom virtual lab experiments as teaching tools for explaining how we understand planetary processes

    NASA Astrophysics Data System (ADS)

    Hill, C. N.; Schools, H.; Research Team Members

    2012-12-01

This presentation will report on a classroom pilot study in which we teamed with school teachers in four middle school classes to develop and deploy course modules that connect the real world to virtual forms of laboratory experiments. The broad goal is to help students realize that seemingly complex Earth system processes can be connected to basic properties of the planet, and that this can be illustrated through idealized experiments. Specifically, the presentation will describe virtual modules, based on on-demand cloud computing technologies, that allow students to test the notion that pole-to-equator gradients in radiative forcing, together with rotation, can explain characteristic patterns of flow in the atmosphere. The module developed aligns with new Massachusetts science standard requirements regarding the understanding of weather and climate processes. These new standards emphasize an appreciation of differential solar heating and a qualitative understanding of the significance of rotation. In our preliminary classroom pilot studies we employed pre- and post-evaluation tests to establish that the modules had increased student knowledge of phenomenology and terms. We will describe the results of these tests as well as results from anecdotal measures of student response. This pilot study suggests that one way to make Earth science concepts more tractable to a wider audience is through virtual experiments that distill phenomena down, but still retain enough detail that students can see the connection to the real world. Modern computer technology and developments in research models appear to provide an opportunity for more work in this area. We will describe some follow-up possibilities that we envisage.

  4. Live Storybook Outcomes of Pilot Multidisciplinary Elementary Earth Science Collaborative Project

    NASA Astrophysics Data System (ADS)

    Soeffing, C.; Pierson, R.

    2017-12-01

Anchoring phenomena leading to student-led investigations are key to applying the NGSS standards in the classroom. This project employs the GLOBE elementary storybook, Discoveries at Willow Creek, as an inspiration and operational framework for a collaborative pilot project engaging 4th grade students in asking questions, collecting relevant data, and using analytical tools to document and understand natural phenomena. The Institute of Global Environmental Strategies (IGES), a GLOBE Partner; the Outdoor Campus, an informal educational outdoor learning facility managed by South Dakota Game, Fish and Parks; the University of Sioux Falls; and All City Elementary, Sioux Falls, are collaborating partners in this project. The Discoveries at Willow Creek storyline introduces young students to the scientific process, and models how they can apply science and engineering practices (SEPs) to discover and understand the Earth system in which they live. One innovation associated with this project is the formal engagement of elementary students in a global citizen science program (for all ages), GLOBE Observer, and engaging them in data collection using GLOBE Observer's Cloud and Mosquito Habitat Mapper apps. As modeled by the fictional students from Willow Creek, the 4th grade students will identify their three study sites at the Outdoor Campus, keep a journal, and record observations. The students will repeat their investigations at the Outdoor Campus to document and track change over time. Students will be introduced to "big data" in a manageable way, as they see their observations populate GLOBE's map-based data visualization. 
Our research design recognizes the comfort and familiarity of literacy activities in the elementary classroom for students and teachers alike, and postulates that connecting a science education project to an engaging storybook text will contribute to successful implementation and measurable learning outcomes. We will report on the Fall 2017 pilot metrics of success, along with a discussion of multi-partner collaborations, project scale-up and sustainability.

  5. A smartphone application for psoriasis segmentation and classification (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Vasefi, Fartash; MacKinnon, Nicholas B.; Horita, Timothy; Shi, Kevin; Khan Munia, Tamanna Tabassum; Tavakolian, Kouhyar; Alhashim, Minhal; Fazel-Rezai, Reza

    2017-02-01

Psoriasis is a chronic skin disease affecting approximately 125 million people worldwide. Currently, dermatologists monitor changes in psoriasis by clinical evaluation or by measuring psoriasis severity scores over time, which leads to subjective management of this condition. The goal of this paper is to develop a reliable assessment system to quantitatively assess changes in the erythema and the intensity of scaling of psoriatic lesions. A smartphone-deployable mobile application is presented that uses the smartphone camera and cloud-based image processing to analyze the physiological characteristics of psoriasis lesions and identify the type and stage of the scaling and erythema. The application aims to automatically evaluate the Psoriasis Area Severity Index (PASI) by measuring the severity and extent of psoriasis. The mobile application performs the following core functions: 1) it captures text information from user input to create a profile in a HIPAA-compliant database; 2) it captures an image of the skin with psoriasis, together with image-related information entered by the user; 3) it color-corrects the image for the environmental lighting conditions using a calibration procedure based on capturing a Macbeth ColorChecker image; 4) the color-corrected image is transmitted to a cloud-based engine for image processing. In the cloud, the algorithm first removes the non-skin background to ensure that psoriasis segmentation is applied only to skin regions. Then, the psoriasis segmentation algorithm estimates the erythema and scaling boundary regions of the lesion. We analyzed 10 psoriasis images captured by cellphone, determined the PASI score for each subject during our pilot study, and correlated it with changes in the severity scores given by dermatologists. The success of this work enables a smartphone application for long-term psoriasis severity assessment.
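The PASI score that the application aims to evaluate has a standard closed form: for each of four body regions, the erythema, induration and desquamation severities (each scored 0-4) are summed, then multiplied by an area score (0-6) and a fixed region weight. A minimal sketch of that arithmetic (the function name and example values are illustrative, not taken from the paper):

```python
# Illustrative PASI (Psoriasis Area Severity Index) computation.
# Standard region weights: head 0.1, upper limbs 0.2, trunk 0.3, lower limbs 0.4.
REGION_WEIGHTS = {"head": 0.1, "upper": 0.2, "trunk": 0.3, "lower": 0.4}

def pasi(scores):
    """scores maps region -> (erythema, induration, desquamation, area).
    Severity scores are 0-4, the area score 0-6; PASI ranges 0-72."""
    total = 0.0
    for region, (e, i, d, a) in scores.items():
        total += REGION_WEIGHTS[region] * (e + i + d) * a
    return total

# Hypothetical grading for one subject
example = {"head": (2, 1, 2, 3), "upper": (3, 2, 2, 4),
           "trunk": (2, 2, 1, 2), "lower": (3, 3, 2, 5)}
print(round(pasi(example), 1))
```

In the paper's pipeline, the severity and area inputs would come from the cloud-based segmentation rather than manual grading.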

  6. Volcanic Ash and Aviation - the 2014 Eruptions of Kelut and Sangeang Api, Indonesia

    NASA Astrophysics Data System (ADS)

    Tupper, A. C.; Jansons, E.

    2014-12-01

Two significant eruptions in Indonesia during the first part of 2014 have highlighted the continuing challenges of safe air traffic management around volcanic ash clouds. The stratospheric eruption of Kelut (also known as Kelud) in Java late on 13 February 2014 resulted in widespread aviation disruption over Indonesia and at least one serious volcanic ash encounter by an international airline. An upper-tropospheric eruption of Sangeang Api in the Lesser Sunda Islands on 30 May 2014 did not result in any known aircraft encounters, but did result in many delays and flight cancellations between Indonesia and Australia. In both cases, the eruption and resultant ash clouds were relatively well observed, albeit subject to the usual issues in characterising such clouds. For example, as tropical eruptions frequently reach 15 km amsl and above due to the height of the tropical tropopause, it is often very difficult to provide an accurate estimation of conditions at the cruising levels of aircraft, at 10-11 km (or lower for shorter domestic routes). More critically, the challenge of linking operational results from two scientific professions (volcanology and meteorology) with real-time aviation users remains strongly evident. Situational awareness of domestic and international airlines, ground-based monitoring and communications prior to and during the eruption, receiving and sharing pilot reports of volcanic ash, and appropriate flight responses all remain inadequate even in relatively fine conditions, with an unacceptable ongoing risk of serious aviation encounters should improvements not be made. Despite the extensive efforts of the International Civil Aviation Organization, World Meteorological Organization, and all partners in the International Airways Volcano Watch, and despite the acceleration of work on the issue since 2010, volcanic ash management remains sub-optimal.

  7. Icing detection from geostationary satellite data using machine learning approaches

    NASA Astrophysics Data System (ADS)

    Lee, J.; Ha, S.; Sim, S.; Im, J.

    2015-12-01

Icing can cause significant structural damage to aircraft during flight, resulting in various aviation accidents. Icing studies have typically been performed using two approaches: one is a numerical model-based approach and the other is a remote sensing-based approach. The model-based approach diagnoses aircraft icing using numerical atmospheric parameters such as temperature, relative humidity, and vertical thermodynamic structure. This approach tends to over-estimate icing according to the literature. The remote sensing-based approach typically uses meteorological satellite/ground sensor data such as Geostationary Operational Environmental Satellite (GOES) and dual-polarization radar data. This approach detects icing areas by applying thresholds to parameters such as liquid water path and cloud optical thickness derived from remote sensing data. In this study, we propose an aircraft icing detection approach which optimizes thresholds for L1B bands and/or Cloud Optical Thickness (COT) from the Communication, Ocean and Meteorological Satellite-Meteorological Imager (COMS MI) and the newly launched Himawari-8 Advanced Himawari Imager (AHI) over East Asia. The proposed approach uses machine learning algorithms, including decision trees (DT) and random forest (RF), to optimize thresholds on L1B data and/or COT. Pilot Reports (PIREPs) from South Korea and Japan were used as icing reference data. Results show that RF produced a lower false alarm rate (1.5%) and a higher overall accuracy (98.8%) than DT (8.5% and 75.3%, respectively). The RF-based approach was also compared with the existing COMS MI and GOES-R icing mask algorithms. The agreements of the proposed approach with the existing two algorithms were 89.2% and 45.5%, respectively. The lower agreement with the GOES-R algorithm was possibly due to the high uncertainty of the cloud phase product from COMS MI.
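The threshold-optimization idea at the heart of the DT/RF approach, choosing the cut on a parameter such as COT that best separates icing from non-icing reports, can be sketched as a brute-force search over candidate thresholds, which is essentially how a single decision-tree split point is chosen. The data and the "true" split below are synthetic stand-ins, not COMS MI or Himawari-8 values:

```python
# Sketch: brute-force threshold optimization on one feature,
# mimicking a decision-tree split. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
cot = rng.uniform(0.0, 50.0, 5000)      # synthetic cloud optical thickness
icing = (cot > 12.0).astype(int)        # synthetic "PIREP" icing labels

candidates = np.linspace(0.0, 50.0, 501)
accuracies = [np.mean((cot > t).astype(int) == icing) for t in candidates]
best = candidates[int(np.argmax(accuracies))]
print(best)   # recovers a threshold near the planted split of 12.0
```

A decision tree repeats this search recursively over features and subsets, and a random forest averages many such trees trained on resampled data, which is why RF is typically more robust to noise in the reference labels.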

  8. A cost-effective laser scanning method for mapping stream channel geometry and roughness

    NASA Astrophysics Data System (ADS)

    Lam, Norris; Nathanson, Marcus; Lundgren, Niclas; Rehnström, Robin; Lyon, Steve

    2015-04-01

In this pilot project, we combine an Arduino Uno and a SICK LMS111 outdoor laser ranging camera to acquire high-resolution topographic area scans of a stream channel. The microprocessor and imaging system were installed in a custom gondola and suspended from a wire cable system. To demonstrate the system's capabilities for capturing stream channel topography, a small stream (< 2 m wide) in the Krycklan Catchment Study was temporarily diverted and scanned. Area scans along the stream channel resulted in a point spacing of 4 mm and a point cloud density of 5600 points/m² for the 5 m by 2 m area. A grain size distribution of the streambed material was extracted from the point cloud using a moving-window, local-maxima search algorithm. The median, 84th and 90th percentiles (common metrics to describe channel roughness) of this distribution were found to be within the range of measured values, while the largest modelled element was approximately 35% smaller than its measured counterpart. The laser scanning system captured grain sizes between 30 mm and 255 mm (coarse gravel/pebbles and boulders based on the Wentworth (1922) scale). This demonstrates that our system was capable of resolving both large-scale geometry (e.g. bed slope and stream channel width) and small-scale channel roughness elements (e.g. coarse gravel/pebbles and boulders) for the study area. We further show that the point cloud resolution is suitable for estimating ecohydraulic parameters such as Manning's n and hydraulic radius. Although more work is needed to fine-tune our system's design, these preliminary results are encouraging, specifically for those with a limited operational budget.
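The moving-window, local-maxima step used to pick out individual grain tops from the point cloud can be sketched on a gridded bed-elevation raster. The grid, window size and grain heights below are illustrative, not the study's parameters:

```python
# Sketch of a moving-window local-maxima search on a gridded bed-elevation
# raster, the kind of step used to pick grain tops from a point cloud.
import numpy as np

def local_maxima(z, half_window):
    """Return (row, col) indices of cells that are the strict maximum
    within their (2*half_window+1)^2 neighbourhood."""
    rows, cols = z.shape
    peaks = []
    for r in range(rows):
        for c in range(cols):
            r0, r1 = max(0, r - half_window), min(rows, r + half_window + 1)
            c0, c1 = max(0, c - half_window), min(cols, c + half_window + 1)
            patch = z[r0:r1, c0:c1]
            # strict maximum: the peak value must occur exactly once
            if z[r, c] == patch.max() and np.sum(patch == patch.max()) == 1:
                peaks.append((r, c))
    return peaks

# Two synthetic "grains" on an otherwise flat bed (heights in metres)
z = np.zeros((20, 20))
z[5, 5], z[14, 12] = 0.10, 0.25
print(local_maxima(z, 3))
```

From peak positions like these, grain sizes could then be estimated from the lateral extent of the high around each peak, yielding the distribution whose percentiles are quoted above.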

  9. Challenges in the Development of a Self-Calibrating Network of Ceilometers.

    NASA Astrophysics Data System (ADS)

    Hervo, Maxime; Wagner, Frank; Mattis, Ina; Baars, Holger; Haefele, Alexander

    2015-04-01

There are more than 700 Automatic Lidars and Ceilometers (ALCs) currently operating in Europe. Modern ceilometers can do more than simply measure the cloud base height: they can also measure aerosol layers such as volcanic ash, Saharan dust or aerosols within the planetary boundary layer. In the frame of E-PROFILE, which is part of EUMETNET, a European network of automatic lidars and ceilometers will be set up to exploit this new capability. To monitor the evolution of aerosol layers over a large spatial scale, the measurements need to be consistent from one site to another. Currently, most of the instruments do not provide calibrated measurements, only relative ones. It is therefore necessary to calibrate the instruments in order to develop a consistent product for instruments from various networks and to combine them in a European network such as E-PROFILE. As it is not possible to use an external reference (such as a sun photometer or a Raman lidar) to calibrate all the ALCs in the E-PROFILE network, a self-calibration algorithm is needed. Two calibration methods suited for automated use in a network have been identified: the Rayleigh and the liquid-cloud calibration methods. In the Rayleigh method, the backscatter signal from molecules (the Rayleigh signal) is measured and used to calculate the lidar constant (Wiegner et al. 2012). At the wavelengths used by most ceilometers this signal is weak and can easily be measured only during cloud-free nights. However, with the new algorithm implemented in the frame of the TOPROF COST Action, the Rayleigh calibration was successfully performed on a CHM15k for more than 50% of the nights from October 2013 to September 2014. This method was validated against two reference instruments, the collocated EARLINET PollyXT lidar and the CALIPSO space-borne lidar. The lidar constant was on average within 5.5% of the lidar constant determined by the EARLINET lidar, which confirms the validity of the self-calibration method. For 3 CALIPSO overpasses the agreement was on average 20.0%; this is less accurate due to the large uncertainties of CALIPSO data close to the surface. In contrast to the Rayleigh method, the cloud calibration method uses the complete attenuation of the transmitted beam by a liquid water cloud to calculate the lidar constant (O'Connor 2004). The main challenge is the selection of accurately measured water clouds: these clouds should not contain any ice crystals, and the detector should not saturate. The first problem is especially important during winter, the second for low clouds. Furthermore, the overlap function should be known accurately, especially when the water cloud is located at a distance where the overlap between the laser beam and the telescope field of view is still incomplete. In the E-PROFILE pilot network, the Rayleigh calibration is already performed automatically. This demonstration network makes available, in real time, calibrated ALC measurements from 8 instruments of 4 different types in 6 countries. In collaboration with TOPROF and 20 national weather services, E-PROFILE will provide, in 2017, near-real-time ALC measurements over most of Europe.
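The Rayleigh method rests on the single-scattering lidar equation: in aerosol-free air the range-corrected signal is P(z)·z² = C·β_mol(z)·T²_mol(z), so the lidar constant C follows from a molecular reference profile. A synthetic numpy sketch of that inversion (all profiles and the constant are made up; an operational retrieval follows Wiegner et al. 2012):

```python
# Sketch of Rayleigh calibration: recover the lidar constant C from a
# simulated aerosol-free signal and a molecular reference profile.
import numpy as np

z = np.linspace(100.0, 10000.0, 200)              # range [m]
beta_mol = 1.5e-6 * np.exp(-z / 8000.0)           # molecular backscatter [1/(m sr)]
alpha_mol = beta_mol * (8.0 * np.pi / 3.0)        # Rayleigh extinction [1/m]
trans2 = np.exp(-2.0 * np.cumsum(alpha_mol) * (z[1] - z[0]))  # two-way transmission

C_true = 2.0e11                                   # the "unknown" lidar constant
signal = C_true * beta_mol * trans2 / z**2        # simulated raw signal

# Calibrate against the molecular profile in a clean layer (here above 6 km)
sel = z > 6000.0
C_est = np.mean(signal[sel] * z[sel]**2 / (beta_mol[sel] * trans2[sel]))
print(C_est)
```

In practice the hard part is exactly what the abstract describes: finding genuinely aerosol-free, cloud-free night-time profiles in which the weak molecular return is usable.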

  10. Tracking Clouds on Venus using Venus Express Data

    NASA Astrophysics Data System (ADS)

    Pertzborn, Rosalyn; Limaye, Sanjay; Markiewicz, Wojciech; Jasmin, Tommy; Udgaonkar, Nishant

    2014-05-01

In the US, a growing emphasis has been placed on the development of inclusive and authentic educational experiences which promote active participation by the K-12 learning community, as well as the general public, in NASA's earth and space science research activities. In the face of growing national and international budgetary constraints, which present major challenges across all scientific research organizations around the world, the need for scientific communities to dramatically improve strategies for effective public engagement, demonstrating the relevance of earth and space science research contributions to the citizenry, has become paramount. This presentation will provide an introduction to the online Venus Express Cloud tracking applet, an overview of feedback from educational users based on classroom/pilot implementation efforts, as well as the concept's potential viability for promoting expanded public participation in the analysis of data in future planetary exploration and research activities, nationally and internationally. Acknowledgements: We wish to acknowledge the contributions of Mr. Nishant Udgaonkar, a summer intern with the S.N. Bose Scholars Program, sponsored by the Science and Engineering Board, Department of Science and Technology, Government of India, the Indo-U.S. Science and Technology Forum, and the University of Wisconsin-Madison. We also wish to acknowledge the Space Science and Engineering Center as well as NASA for supporting this project.

  11. About Separation of Hadron and Electromagnetic Cascades in the Pamela Calorimeter

    NASA Astrophysics Data System (ADS)

    Stozhkov, Yuri I.; Basili, A.; Bencardino, R.; Casolino, M.; de Pascale, M. P.; Furano, G.; Menicucci, A.; Minori, M.; Morselli, A.; Picozza, P.; Sparvoli, R.; Wischnewski, R.; Bakaldin, A.; Galper, A. M.; Koldashov, S. V.; Korotkov, M. G.; Mikhailov, V. V.; Voronov, S. A.; Yurkin, Y. T.; Adriani, O.; Bonechi, L.; Bongi, M.; Papini, P.; Ricciarini, S. B.; Spillantini, P.; Straulino, S.; Taccetti, F.; Vannuccini, E.; Castellini, G.; Boezio, M.; Bonvicini, M.; Mocchiutti, E.; Schiavon, P.; Vacchi, A.; Zampa, G.; Zampa, N.; Carlson, P.; Lund, J.; Lundquist, J.; Orsi, S.; Pearce, M.; Barbarino, G. C.; Campana, D.; Osteria, G.; Rossi, G.; Russo, S.; Boscherini, M.; Mennh, W.; Simonh, M.; Bongiorno, L.; Ricci, M.; Ambriola, M.; Bellotti, R.; Cafagna, F.; Circella, M.; de Marzo, C.; Giglietto, N.; Mirizzi, N.; Romita, M.; Spinelli, P.; Bogomolov, E.; Krutkov, S.; Vasiljev, G.; Bazilevskaya, G. A.; Kvashnin, A. N.; Logachev, V. I.; Makhmutov, V. S.; Maksumov, O. S.; Stozhkov, Yu. I.; Mitchell, J. W.; Streitmatter, R. E.; Stochaj, S. J.

Results of the calibration of the PAMELA instrument at CERN facilities are discussed. In September 2003, the calibration of the Neutron Detector together with the Calorimeter was performed with CERN beams of electrons and protons with energies of 20-180 GeV. The addition of the Neutron Detector increases the rejection factor for separating hadrons from electrons by about a factor of ten. The calibration results are in agreement with calculations.

  12. DAMPE prototype and its beam test results at CERN

    NASA Astrophysics Data System (ADS)

    Wu, Jian; Hu, Yiming; Chang, Jin

The first Chinese high-energy cosmic particle detector, DAMPE, aims to detect electrons and gamma rays in the energy range between 5 GeV and 10 TeV in space. A prototype of this detector was built and tested using both cosmic muons and test beams at CERN. The results demonstrate good energy and spatial resolution as well as strong separation power between electrons and protons. The detector structure is illustrated as well.

  13. Measurement of the inclusive jet cross section at the CERN pp collider

    NASA Astrophysics Data System (ADS)

    Arnison, G.; Albrow, M. G.; Allkofer, O. C.; Astbury, A.; Aubert, B.; Bacci, C.; Batley, J. R.; Bauer, G.; Bettini, A.; Bézaguet, A.; Bock, R. K.; Bos, K.; Buckley, E.; Bunn, J.; Busetto, G.; Catz, P.; Cennini, P.; Centro, S.; Ceradini, F.; Ciapetti, G.; Cittolin, S.; Clarke, D.; Cline, D.; Cochet, C.; Colas, J.; Colas, P.; Corden, M.; Cox, G.; Dallman, D.; Dau, D.; Debeer, M.; Debrion, J. P.; Degiorgi, M.; della Negra, M.; Demoulin, M.; Denby, B.; Denegri, D.; Diciaccio, A.; Dobrzynski, L.; Dorenbosch, J.; Dowell, J. D.; Duchovni, E.; Edgecock, R.; Eggert, K.; Eisenhandler, E.; Ellis, N.; Erhard, P.; Faissner, H.; Fince Keeler, M.; Flynn, P.; Fontaine, G.; Frey, R.; Frühwirth, R.; Garvey, J.; Gee, D.; Geer, S.; Ghesquière, C.; Ghez, P.; Ghio, F.; Giacomelli, P.; Gibson, W. R.; Giraud-Héraud, Y.; Givernaud, A.; Gonidec, A.; Goodman, M.; Grassmann, H.; Grayer, G.; Guryn, W.; Hansl-Kozanecka, T.; Haynes, W.; Haywood, S. J.; Hoffmann, H.; Holthuizen, D. J.; Homer, R. J.; Honma, A.; Jank, W.; Jimack, M.; Jorat, G.; Kalmus, P. I. P.; Karimäri, V.; Keeler, R.; Kenyon, I.; Kernan, A.; Kienzle, W.; Kinnunen, R.; Kozanecki, W.; Kroll, J.; Kryn, D.; Kyberd, P.; Lacava, F.; Laugier, J. P.; Lees, J. P.; Leuchs, R.; Levegrun, S.; Lévêque, A.; Levi, M.; Linglin, D.; Locci, E.; Long, K.; Markiewicz, T.; Markytan, M.; Martin, T.; Maurin, F.; McMahon, T.; Mendiburu, J.-P.; Meneguzzo, A.; Meyer, O.; Meyer, T.; Minard, M.-N.; Mohammadi, M.; Morgan, K.; Moricca, M.; Moser, H.; Mours, B.; Muller, Th.; Nandi, A.; Naumann, L.; Norton, A.; Paoluzi, L.; Pascoli, D.; Pauss, F.; Perault, C.; Piano Mortari, G.; Pietarinen, E.; Pigot, C.; Pimiä, M.; Pitman, D.; Placci, A.; Porte, J.-P.; Radermacher, E.; Ransdell, J.; Redelberger, T.; Reithler, H.; Revol, J. P.; Richman, J.; Rijssenbeek, M.; Rohlf, J.; Rossi, P.; Roberts, C.; Ruhm, W.; Rubbia, C.; Sajot, G.; Salvini, G.; Sass, J.; Sadoulet, B.; Samyn, D.; Savoy-Navarro, A.; Schinzel, D.; Schwartz, A.; Scott, W.; Shah, T. P.; Sheer, I.; Siotis, I.; Smith, D.; Sobie, R.; Sphicas, P.; Strauss, J.; Streets, J.; Stubenrauch, C.; Summers, D.; Sumorok, K.; Szonczo, F.; Tao, C.; Ten Have, I.; Thompson, G.; Tscheslog, E.; Tuominiemi, J.; van Eijk, B.; Verecchia, P.; Vialle, J. P.; Virdee, T. S.; von der Schmitt, H.; von Schlippe, W.; Vrana, J.; Vuillemin, V.; Wahl, H. D.; Watkins, P.; Wilke, R.; Wilson, J.; Wingerter, I.; Wimpenny, S. J.; Wulz, C.-E.; Wyatt, T.; Yvert, M.; Zacharov, I.; Zaganidis, N.; Zanello, L.; Zotto, P.

    1986-05-01

The inclusive jet cross section has been measured in the UA1 experiment at the CERN pp Collider at centre-of-mass energies √s = 546 GeV and √s = 630 GeV. The cross sections are found to be consistent with QCD predictions. The observed change in the cross section with the centre-of-mass energy √s is accounted for in terms of xT scaling.
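For reference, the xT-scaling comparison invoked here uses the standard dimensionless variable and power-law ansatz (a textbook form, not quoted from the paper):

```latex
x_T = \frac{2 p_T}{\sqrt{s}}, \qquad
E \, \frac{\mathrm{d}^3\sigma}{\mathrm{d}p^3} = \frac{F(x_T)}{p_T^{\,n}}
```

At fixed xT, the measured change in the cross section between √s = 546 GeV and 630 GeV then constrains the effective exponent n.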

  14. Highlights from High Energy Neutrino Experiments at CERN

    NASA Astrophysics Data System (ADS)

    Schlatter, W.-D.

    2015-07-01

Experiments with high-energy neutrino beams at CERN provided early quantitative tests of the Standard Model. This article describes results from studies of the quark structure of the nucleon and of the weak current, together with the precise measurement of the weak mixing angle. These results established a new benchmark for tests of the electroweak model. In addition, the measurements of the nucleon structure functions in deep inelastic neutrino scattering allowed the first quantitative tests of QCD.

  15. PARTICLE PHYSICS: CERN Collider Glimpses Supersymmetry--Maybe.

    PubMed

    Seife, C

    2000-07-14

    Last week, particle physicists at the CERN laboratory in Switzerland announced that by smashing together matter and antimatter in four experiments, they detected an unexpected effect in the sprays of particles that ensued. The anomaly is subtle, and physicists caution that it might still be a statistical fluke. If confirmed, however, it could mark the long-sought discovery of a whole zoo of new particles--and the end of a long-standing model of particle physics.

  16. An alternative model to distribute VO software to WLCG sites based on CernVM-FS: a prototype at PIC Tier1

    NASA Astrophysics Data System (ADS)

    Lanciotti, E.; Merino, G.; Bria, A.; Blomer, J.

    2011-12-01

In a distributed computing model such as WLCG, experiment-specific application software has to be efficiently distributed to every site of the Grid. Application software is currently installed in a shared area of the site, visible to all Worker Nodes (WNs) through some protocol (NFS, AFS or other). The software is installed at the site by jobs which run on a privileged node of the computing farm where the shared area is mounted in write mode. This model presents several drawbacks which cause a non-negligible rate of job failure. An alternative model for software distribution based on the CERN Virtual Machine File System (CernVM-FS) has been tried at PIC, the Spanish Tier1 site of WLCG. The test bed used and the results are presented in this paper.
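For context, a CernVM-FS client needs only a small local configuration to mount a repository over HTTP with a locally cached, read-only view, which is what removes the shared-area installation jobs described above. A minimal sketch (the repository name, proxy URL and cache size are site-specific examples, and exact paths can vary by distribution):

```shell
# /etc/cvmfs/default.local -- minimal CernVM-FS client configuration sketch
CVMFS_REPOSITORIES=atlas.cern.ch                  # repositories to mount
CVMFS_HTTP_PROXY="http://squid.example.org:3128"  # site Squid proxy (example)
CVMFS_QUOTA_LIMIT=20000                           # local cache limit in MB

# After writing the file, autofs integration and a mount check would be:
#   cvmfs_config setup
#   cvmfs_config probe atlas.cern.ch
```

Worker nodes then see the software under /cvmfs/<repository> and fetch only the files a job actually opens, caching them locally.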

  17. The management of large cabling campaigns during the Long Shutdown 1 of LHC

    NASA Astrophysics Data System (ADS)

    Meroli, S.; Machado, S.; Formenti, F.; Frans, M.; Guillaume, J. C.; Ricci, D.

    2014-03-01

The Large Hadron Collider at CERN entered its first 18-month-long shutdown period in February 2013. During this period the entire CERN accelerator complex will undergo major consolidation and upgrade works, preparing the machines for LHC operation at nominal energy (7 TeV/beam). One of the most challenging activities concerns the cabling infrastructure (copper and optical-fibre cables) serving the CERN data acquisition, networking and control systems. About 1000 kilometres of cables, distributed across different machine areas, will be installed, representing an investment of about 15 MCHF. This implies an extraordinary challenge in terms of project management, including resource and activity planning, work execution and quality control. The preparation phase of this project started well before its implementation, by defining technical solutions and setting financial plans for staff recruitment and material supply. Enhanced task coordination was further implemented by deploying selected competences to form a central support team.

  18. CERN@school: demonstrating physics with the Timepix detector

    NASA Astrophysics Data System (ADS)

    Whyntie, T.; Bithray, H.; Cook, J.; Coupe, A.; Eddy, D.; Fickling, R. L.; McKenna, J.; Parker, B.; Paul, A.; Shearer, N.

    2015-10-01

This article shows how the Timepix hybrid silicon pixel detector, developed by the Medipix2 Collaboration, can be used by students and teachers alike to demonstrate some key aspects of any well-rounded physics curriculum with CERN@school. After an overview of the programme, the detector's capabilities for measuring and visualising ionising radiation are examined. The classification of clusters - groups of adjacent pixels - is discussed with respect to identifying the different types of particles. Three demonstration experiments - background radiation measurements, radiation profiles and the attenuation of radiation - are described; these can be used as part of lessons or as inspiration for independent research projects. Results for exemplar data-sets are presented for reference, as well as details of ongoing research projects inspired by these experiments. Interested readers are encouraged to join the CERN@school Collaboration and so contribute to achieving the programme's aim of inspiring the next generation of scientists and engineers.

  19. CERN's approach to public outreach

    NASA Astrophysics Data System (ADS)

    Landua, Rolf

    2016-03-01

CERN's communication goes beyond publishing scientific results. Education and outreach are equally important ways of communicating with the general public, and in particular with the young generation. Over the last decade, CERN has significantly increased its efforts to accommodate the very large interest of the general public (about 300,000 visit requests per year): by ramping up its capacity for guided tours from 25,000 to more than 100,000 visitors per year, by creating six new state-of-the-art exhibitions on-site, by building and operating a modern physics laboratory for school teachers and students, and by showing several traveling exhibitions in about 10 countries per year. The offer for school teachers has also been expanded, to 35-40 weeks of teacher courses with more than 1000 participants from more than 50 countries per year. The talk will give an overview of these and related activities.

  20. Final Technical Report for Grant # DE-FG02-06ER64169

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Beat Schmid, PI

    2007-07-13

The Atmospheric Radiation Measurement (ARM) program is funding this project to improve the methodology of ground-based remote sensing of the vertical distribution of aerosol and cloud optical properties, and their effect on atmospheric radiative transfer. Remotely sensed and in situ observed aerosol, cloud physical, and optical properties collected during the May 2003 Aerosol Intensive Operational Period (AIOP) and the Aerosol Lidar Validation Experiment (ALIVE), conducted from September 11-22, 2005, are the basis for the investigation. We have used ground-based lidar, airborne sunphotometer and in situ measurements and other data to evaluate the vertical profile of aerosol properties. We have been pursuing research in the following three areas: (1) Aerosol Best Estimate Product--Sensitivity Study: ARM is developing an Aerosol Best Estimate (ABE) Value Added Product (VAP) to provide aerosol optical properties at all times and heights above its sites. The ABE is used as input for the Broadband Heating Rate Profile (BBHRP) VAP, whose output will be used to evaluate the radiative treatment of aerosols and clouds in climate models. ARM needs to assess how much detail is required for the ABE and whether a useful ABE can be derived for the tropical and arctic climate research facilities (CRFs), where only limited aerosol information in the vertical is available. We have been determining the sensitivity of BBHRP to the vertical profile of aerosol optical properties used in ABE. (2) Vertically Resolved Aerosol and Cloud Radiative Properties over the Southern Great Plains (SGP): The AIOP delivered an unprecedented airborne radiometric and in situ data set related to aerosols and clouds. The Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS) Twin Otter aircraft carried solar-pointing, up- and down-looking radiometers (spectral and broadband, visible, and infrared), with the up-looking radiometers mounted on a stabilized platform.
We are performing an integrated analysis of the largely unexploited radiometric data set to provide observation-based quantification of the effect of aerosols and clouds on the radiation field. We will link aerosol and cloud properties measured in situ with the observed radiative fluxes using radiative transfer models. This over-determined dataset will provide validation of the BBHRP VAP. (3) Integrated Analysis of Data from the Aerosol Lidar Validation Experiment: The ABE VAP relies on continuous lidar observations to provide the vertical distribution of the aerosols above the ARM sites. The goal of ALIVE, conducted in September 2005, was to validate the aerosol extinction profiles obtained from the recently refurbished SGP Raman lidar and from the Micro Pulse Lidar, for which a new algorithm to retrieve aerosol profiles has recently been developed, against the National Aeronautics and Space Administration (NASA) Ames Airborne Tracking 14-channel Sun photometer. We are performing and publishing the integrated analysis of the ALIVE data set.
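The connection between the vertically resolved extinction the lidars provide and the column quantity an ABE-style product supplies is a simple vertical integral, AOD = ∫ σ_ext(z) dz. A sketch of that integration, with an invented constant extinction profile purely for illustration:

```python
# Illustrative only: integrate a hypothetical aerosol extinction profile
# to an aerosol optical depth (AOD) by the trapezoidal rule.
def aod_from_profile(z_km, ext_per_km):
    """Trapezoidal integration of extinction [1/km] over altitude [km]."""
    total = 0.0
    for i in range(1, len(z_km)):
        total += 0.5 * (ext_per_km[i] + ext_per_km[i - 1]) * (z_km[i] - z_km[i - 1])
    return total

z = [0.0, 0.5, 1.0, 1.5, 2.0]          # altitude grid [km]
ext = [0.1, 0.1, 0.1, 0.1, 0.1]        # constant extinction 0.1 / km
aod = aod_from_profile(z, ext)         # 0.1 / km over 2 km gives AOD = 0.2
```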

  1. A Digital Knowledge Preservation Platform for Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Aguilar Gómez, Fernando; de Lucas, Jesús Marco; Pertinez, Esther; Palacio, Aida; Perez, David

    2017-04-01

The Digital Knowledge Preservation Platform is the evolution of a pilot project for Open Data supporting the full research data life cycle. It is currently being developed at IFCA (Instituto de Física de Cantabria) as a combination of different open tools that have been extended: DMPTool (https://dmptool.org/) with pilot semantic features (RDF export, parameter definition), a customized version of INVENIO (http://invenio-software.org/) that integrates the entire research data life cycle, and Jupyter (http://jupyter.org/) as a processing tool and reproducibility environment. This complete platform aims to provide an integrated environment for research data management following the FAIR+R principles: Findable - the web portal based on Invenio provides a search engine, and all elements include metadata to make them easily findable; Accessible - both data and software are available online with internal PIDs and DOIs (provided by DataCite); Interoperable - datasets can be combined to perform new analyses, and the OAI-PMH standard is also integrated; Re-usable - different licence types and embargo periods can be defined; +Reproducible - directly integrated with cloud computing resources. The deployment of the entire system over a Cloud framework helps to build a dynamic and scalable solution, not only for managing open datasets but also as a useful tool for the final user, who is able to directly process and analyse the open data. In parallel, the direct use of semantics and metadata is being explored and integrated in the framework. Ontologies, being a knowledge representation, can contribute to defining the elements and relationships of the research data life cycle, including DMPs, datasets, software, etc. The first advantage of developing an ontology of a knowledge domain is that it provides a common vocabulary hierarchy (i.e. a conceptual schema) that can be used and standardized by all the agents interested in the domain (either humans or machines).
This way of using ontologies is one of the foundations of the Semantic Web, where ontologies are set to play a key role in establishing a common terminology between agents. To develop the ontology we are using Protégé, a graphical ontology-development tool that supports a rich knowledge model and is open-source and freely available. However, in order to process and manage the ontology from the web framework, we are using Semantic MediaWiki, which is able to process queries. Semantic MediaWiki is an extension of MediaWiki with which we can perform semantic searches and export data in RDF and CSV formats. This system is used as a testbed for the potential use of semantics in a more general environment. The Digital Knowledge Preservation Platform is closely related to the INDIGO-DataCloud project (https://www.indigo-datacloud.eu), since it adopts the same data life cycle approach (Planning, Collect, Curate, Analyze, Publish, Preserve). INDIGO-DataCloud solutions will be able to support all the different elements in the system, as we showed at the last Research Data Alliance Plenary. This presentation will show the different elements of the system and how they work, as well as the roadmap for their continuous integration.
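As one concrete illustration of the OAI-PMH interoperability mentioned above, a repository typically exposes its records as Dublin Core XML. A minimal sketch with Python's standard library, in which the title, DOI and licence values are invented placeholders rather than records from the IFCA platform:

```python
import xml.etree.ElementTree as ET

# Build a minimal Dublin Core record of the kind an OAI-PMH endpoint serves.
DC = "http://purl.org/dc/elements/1.1/"
OAI_DC = "http://www.openarchives.org/OAI/2.0/oai_dc/"
ET.register_namespace("dc", DC)
ET.register_namespace("oai_dc", OAI_DC)

record = ET.Element(f"{{{OAI_DC}}}dc")
for tag, value in [
    ("title", "Example environmental dataset"),   # placeholder title
    ("identifier", "doi:10.0000/example"),        # placeholder DOI
    ("rights", "CC-BY-4.0"),
    ("type", "Dataset"),
]:
    el = ET.SubElement(record, f"{{{DC}}}{tag}")
    el.text = value

xml_bytes = ET.tostring(record)
# Round-trip: the record parses back and the identifier survives harvesting.
parsed = ET.fromstring(xml_bytes)
ids = [e.text for e in parsed.findall(f"{{{DC}}}identifier")]
```

Any OAI-PMH harvester can consume such records regardless of the repository software behind them, which is what makes the format a practical interoperability layer.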

  2. Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework.

    PubMed

    Khazaei, Hamzeh; McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil

    2015-11-18

Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realizing health analytics as services, in critical care units in particular. To design, implement, evaluate, and deploy an extendable, big-data-compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios by utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed framework through the Artemis project, a customization for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). We demonstrated the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot deployment into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids' NICU has 36 beds and its patients can be classified generally into 5 different types, including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. The mean number of medical monitoring algorithms per patient is 9, which renders 311 live algorithms for the whole NICU running on the framework.
The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. On average, 34.9 patients will be present in the SickKids NICU. Currently, 46% of patients cannot be admitted to the SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients could be accommodated. For such a provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decision support in tertiary care. We demonstrate how to size the equipment needed in the cloud for that architecture based on a realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and, furthermore, that it can be replicated for any critical care setting within a tertiary institution.
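The occupancy and admission figures quoted above can be cross-checked with a classical Erlang-B loss model; this is my own back-of-envelope assumption, not the queueing model the authors report. With 4.5 arrivals/day and a 16-day stay, the offered load is 72 bed-equivalents against 36 beds:

```python
# Erlang-B blocking probability for an M/M/c/c loss system, via the
# standard numerically stable recursion B(n) = a*B(n-1) / (n + a*B(n-1)).
def erlang_b(servers, offered_load):
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

arrival_per_day, los_days, beds = 4.5, 16.0, 36
offered = arrival_per_day * los_days            # 72 "erlangs" of bed demand
blocking = erlang_b(beds, offered)              # roughly half of arrivals blocked
mean_occupied = offered * (1.0 - blocking)      # carried load = mean beds occupied
```

The model gives blocking of roughly 50% and a mean occupancy near 35 beds, broadly consistent with the reported 46% refused admissions and 34.9 average patients; the residual gap plausibly reflects the richer patient-type mix in the authors' model.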

  3. Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework

    PubMed Central

    McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil

    2015-01-01

Background Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realizing health analytics as services, in critical care units in particular. Objective To design, implement, evaluate, and deploy an extendable, big-data-compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. Methods We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios by utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed framework through the Artemis project, a customization for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). Results We demonstrated the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot deployment into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids’ NICU has 36 beds and its patients can be classified generally into 5 different types, including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. The mean number of medical monitoring algorithms per patient is 9, which renders 311 live algorithms for the whole NICU running on the framework.
The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. On average, 34.9 patients will be present in the SickKids NICU. Currently, 46% of patients cannot be admitted to the SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients could be accommodated. For such a provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Conclusions Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decision support in tertiary care. We demonstrate how to size the equipment needed in the cloud for that architecture based on a realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and, furthermore, that it can be replicated for any critical care setting within a tertiary institution. PMID:26582268

  4. International Workshop on Linear Colliders 2010

    ScienceCinema

    Lebrun, Ph.

    2018-06-20

IWLC2010, International Workshop on Linear Colliders 2010. ECFA-CLIC-ILC joint meeting: Monday 18 October - Friday 22 October 2010. Venue: CERN and CICG (International Conference Centre Geneva, Switzerland). This year, the International Workshop on Linear Colliders organized by the European Committee for Future Accelerators (ECFA) will study the physics, detectors and accelerator complex of a linear collider, covering both CLIC and ILC options. Contact: Workshop Secretariat. IWLC2010 is hosted by CERN.

  5. CERN: A global project

    NASA Astrophysics Data System (ADS)

    Voss, Rüdiger

    2017-07-01

In the most significant paradigm shift in its membership rules in 60 years, CERN in 2010 introduced a policy of “Geographical Enlargement” which for the first time opened the door to membership of non-European States in the Organization. This short article briefly reviews the history of CERN’s membership rules, discusses the rationale behind the new policy and its relationship with the emerging global roadmap of particle physics, and gives a short overview of the status of the enlargement process.

  6. International Workshop on Linear Colliders 2010

    ScienceCinema

    Yamada, Sakue

    2018-05-24

IWLC2010, International Workshop on Linear Colliders 2010. ECFA-CLIC-ILC joint meeting: Monday 18 October - Friday 22 October 2010. Venue: CERN and CICG (International Conference Centre Geneva, Switzerland). This year, the International Workshop on Linear Colliders organized by the European Committee for Future Accelerators (ECFA) will study the physics, detectors and accelerator complex of a linear collider, covering both CLIC and ILC options. Contact: Workshop Secretariat. IWLC2010 is hosted by CERN.

  7. Performance of a liquid argon time projection chamber exposed to the CERN West Area Neutrino Facility neutrino beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arneodo, F.; Cavanna, F.; Mitri, I. De

    2006-12-01

We present the results of the first exposure of a liquid argon TPC to a multi-GeV neutrino beam. The data were collected with a 50-litre ICARUS-like chamber located between the CHORUS and NOMAD experiments at the CERN West Area Neutrino Facility (WANF). We discuss both the instrumental performance of the detector and its capability to identify and reconstruct low-multiplicity neutrino interactions.

  8. Upper limits of the proton magnetic form factor in the time-like region from p¯p--> e+e- at the CERN-ISR

    NASA Astrophysics Data System (ADS)

    Baglin, C.; Baird, S.; Bassompierre, G.; Borreani, G.; Brient, J. C.; Broll, C.; Brom, J. M.; Bugge, L.; Buran, T.; Burq, J. P.; Bussière, A.; Buzzo, A.; Cester, R.; Chemarin, M.; Chevallier, M.; Escoubes, B.; Fay, J.; Ferroni, S.; Gracco, V.; Guillaud, J. P.; Khan-Aronsen, E.; Kirsebom, K.; Ille, B.; Lambert, M.; Leistam, L.; Lundby, A.; Macri, M.; Marchetto, F.; Mattera, L.; Menichetti, E.; Mouellic, B.; Pastrone, N.; Petrillo, L.; Pia, M. G.; Poulet, M.; Pozzo, A.; Rinaudo, G.; Santroni, A.; Severi, M.; Skjevling, G.; Stapnes, S.; Stugu, B.; Tomasini, F.; Valbusa, U.

    1985-11-01

    From the measurement of e+e- pairs from the reaction p¯p-->e+e- at the CERN-ISR, using an antiproton beam and a hydrogen jet target, we derived upper limits for the proton magnetic form factor in the time-like region at Q2⋍8.9(GeV/c)2 and Q2⋍12.5(GeV/c)2.

  9. Diffractive Higgs boson production at the Fermilab Tevatron and the CERN Large Hadron Collider.

    PubMed

    Enberg, R; Ingelman, G; Kissavos, A; Tîmneanu, N

    2002-08-19

    Improved possibilities to find the Higgs boson in diffractive events, having less hadronic activity, depend on whether the cross section is large enough. Based on the soft color interaction models that successfully describe diffractive hard scattering at DESY HERA and the Fermilab Tevatron, we find that only a few diffractive Higgs events may be produced at the Tevatron, but we predict a substantial rate at the CERN Large Hadron Collider.

  10. Integrating new Storage Technologies into EOS

    NASA Astrophysics Data System (ADS)

    Peters, Andreas J.; van der Ster, Dan C.; Rocha, Joaquim; Lensing, Paul

    2015-12-01

The EOS[1] storage software was designed to cover CERN disk-only storage use cases in the medium term, trading scalability against latency. To cover and prepare for long-term requirements, the CERN IT data and storage services group (DSS) is actively conducting R&D and making open source contributions to experiment with a next-generation storage software based on CEPH[3] and Ethernet-enabled disk drives. CEPH provides a scale-out object storage system, RADOS, and additionally various optional high-level services such as an S3 gateway, RADOS block devices and a POSIX-compliant file system, CephFS. The acquisition of CEPH by Red Hat underlines the promising role of CEPH as the open source storage platform of the future. CERN IT is running a CEPH service in the context of OpenStack at a moderate scale of 1 PB of replicated storage. Building a 100+ PB storage system based on CEPH will require software and hardware tuning. It is of capital importance to demonstrate feasibility and to iron out bottlenecks and blocking issues beforehand. The main idea behind this R&D is to leverage and contribute to existing building blocks in the CEPH storage stack and to implement a few CERN-specific requirements in a thin, customisable storage layer. A second research topic is the integration of Ethernet-enabled disks. This paper introduces various ongoing open source developments, their status and applicability.
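A relevant piece of back-of-envelope arithmetic when scaling from 1 PB to 100+ PB is the raw-capacity overhead of the data-protection scheme; the comparison below is my own illustration (the abstract does not specify a scheme), using 3x replication versus a hypothetical 8+3 erasure-coding layout:

```python
# Raw disk capacity needed for a given usable capacity under either
# n-way replication or k+m erasure coding (k data + m parity chunks).
def raw_capacity_pb(usable_pb, replicas=None, ec_k=None, ec_m=None):
    if replicas is not None:
        return usable_pb * replicas
    return usable_pb * (ec_k + ec_m) / ec_k

rep3 = raw_capacity_pb(100, replicas=3)          # 3x replication: 300 PB raw
ec83 = raw_capacity_pb(100, ec_k=8, ec_m=3)      # 8+3 erasure coding: 137.5 PB raw
```

The factor-of-two-plus difference in raw disk is a large part of why erasure coding matters at the 100 PB scale, at the cost of reconstruction bandwidth and latency.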

  11. A possible biomedical facility at the European Organization for Nuclear Research (CERN)

    PubMed Central

    Dosanjh, M; Myers, S

    2013-01-01

    A well-attended meeting, called “Brainstorming discussion for a possible biomedical facility at CERN”, was held by the European Organization for Nuclear Research (CERN) at the European Laboratory for Particle Physics on 25 June 2012. This was concerned with adapting an existing, but little used, 78-m circumference CERN synchrotron to deliver a wide range of ion species, preferably from protons to at least neon ions, with beam specifications that match existing clinical facilities. The potential extensive research portfolio discussed included beam ballistics in humanoid phantoms, advanced dosimetry, remote imaging techniques and technical developments in beam delivery, including gantry design. In addition, a modern laboratory for biomedical characterisation of these beams would allow important radiobiological studies, such as relative biological effectiveness, in a dedicated facility with standardisation of experimental conditions and biological end points. A control photon and electron beam would be required nearby for relative biological effectiveness comparisons. Research beam time availability would far exceed that at other facilities throughout the world. This would allow more rapid progress in several biomedical areas, such as in charged hadron therapy of cancer, radioisotope production and radioprotection. The ethos of CERN, in terms of open access, peer-reviewed projects and governance has been so successful for High Energy Physics that application of the same to biomedicine would attract high-quality research, with possible contributions from Europe and beyond, along with potential new funding streams. PMID:23549990

  12. Volcanic ash cloud detection from space: a preliminary comparison between RST approach and water vapour corrected BTD procedure

    NASA Astrophysics Data System (ADS)

    Piscini, Alessandro; Marchese, Francesco; Merucci, Luca; Pergola, Nicola; Corradini, Stefano; Tramutoli, Valerio

    2010-05-01

Volcanic eruptions can inject large amounts (Tg) of gas and particles into the troposphere and, sometimes, into the stratosphere. Besides the main gases (H2O, CO2, SO2 and HCl), volcanic clouds contain a mix of silicate ash particles in sizes ranging from 0.1 μm to millimetres or larger. Interest in detecting the presence of ash is high, in particular because it represents a serious hazard for air traffic. Particles with dimensions of several millimetres can damage the aircraft structure (windows, wings, ailerons), while particles smaller than 10 μm may be extremely dangerous for the jet engines and are undetectable by pilots at night or in low-visibility conditions. Satellite data are useful for measuring volcanic clouds because of the large vertical range of these emissions and their likely large horizontal spread. Moreover, since volcanoes are globally distributed and inherently dangerous, satellite measurements offer a practical and safe platform from which to make observations. Two different techniques used to detect volcanic clouds from satellite data are considered here for a preliminary comparison, with possible implications for quantitative retrievals of plume parameters. In particular, the Robust Satellite Techniques (RST) approach and a water-vapour-corrected version of the Brightness Temperature Difference (BTD) procedure will be compared. The RST approach is based on the multi-temporal analysis of historical, long-term satellite records, devoted to a prior characterization of the measured signal, in terms of expected value and natural variability, and a subsequent recognition of signal anomalies by an automatic, unsupervised change-detection step. The BTD method is based on the difference between the brightness temperatures measured in two channels centred around 11 and 12 μm.
To take into account the atmospheric water vapour differential absorption in the 11-12 μm spectral range, which tends to reduce (and in some cases completely mask) the BTD signal, a water vapour correction procedure, based on measured or synthetic atmospheric profiles, has been applied. Results independently achieved by both methods during recent Mt. Etna eruptions are presented, compared and discussed, also in terms of further implications for quantitative retrievals of plume parameters.
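The split-window logic described above can be sketched in a few lines. Silicate ash tends to make BT(11 μm) - BT(12 μm) negative, while water vapour pushes it positive; the correction removes the modelled water-vapour contribution before thresholding. The threshold, offset and brightness temperatures below are invented for illustration, not values from the paper:

```python
# Water-vapour-corrected BTD ash test (illustrative thresholds only).
def ash_flag(bt11_k, bt12_k, wv_contribution_k, threshold_k=-0.5):
    """True where the corrected split-window BTD falls below the ash threshold."""
    btd_corrected = (bt11_k - bt12_k) - wv_contribution_k
    return btd_corrected < threshold_k

# Synthetic pixels (brightness temperatures in kelvin):
ash_pixel = ash_flag(265.0, 267.0, 0.3)    # BTD = -2.0, corrected -2.3: ash
clear_pixel = ash_flag(280.0, 279.2, 0.3)  # BTD = +0.8, corrected +0.5: clear
```

Without the correction term, a strongly moist atmosphere can lift a weak negative ash BTD above the threshold, which is exactly the masking effect the paper's procedure addresses.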

  13. The environmental virtual observatory pilot (EVOp): a cloud solution demonstrating effective science for efficient decisions

    NASA Astrophysics Data System (ADS)

    Gurney, R. J.; Emmett, B.; McDonald, A.

    2012-12-01

Environmental managers and policy makers face a challenging future trying to accommodate growing expectations of environmental well-being, while subject to maturing regulation, constrained budgets and a public scrutiny that expects easier and more meaningful access to data and decision logic. Supporting such a challenge requires new tools and new approaches. The EVOp is an initiative from the UK Natural Environment Research Council (NERC) designed to deliver proof of concept for these new tools and approaches. A series of exemplar 'big catchment science questions' are posed and the prospects for their solution are assessed. These are then used to develop cloud solutions for serving data, models, visualisation and analysis tools to scientists, regulators, private companies and the public, all of whom have different expectations of what environmental information is important. Approaches are tested regularly with users using SCRUM. The VO vision encompasses seven key ambitions: (i) being driven by the need to contribute to the solution of major environmental issues that impinge on, or link to, catchment science; (ii) having the flexibility and adaptability to address future problems not yet defined or fully clarified; (iii) being able to communicate issues and solutions to a range of audiences; (iv) supporting easy access by a variety of users; (v) drawing meaningful information from data and models and identifying the constraints on application in terms of errors, uncertainties, etc.; (vi) adding value and cost effectiveness to current investigations by supporting transfer and scale adjustment, thus limiting the repetition of expensive field monitoring addressing essentially the same issues in varying locations; (vii) promoting effective interfacing of robust science with a variety of end users by using terminology or measures familiar to the user (or required by regulation), including financial and carbon accounting, whole-life or fixed-period costing, and risk expressed as probability or as disability-adjusted life years, as appropriate. Architectures pivotal to communicating these ambitions are presented. Cloud computing facilitates the required interoperability across data sets, models, visualisations, etc. There are also legal, security, cultural and standards barriers that need to be resolved before such a cloud becomes operational.

  14. The Influence of Visibility, Cloud Ceiling, Financial Incentive, and Personality Factors on General Aviation Pilots’ Willingness to Take Off Into Marginal Weather, Part 1: The Data and Preliminary Conclusions

    DTIC Science & Technology

    2005-04-01

[The abstract for this report did not survive extraction: the text consists of garbled statistical-table fragments referencing, among others, EIS Impulsivity, a Hazardous Events Index, MPQ Control, BART explosion counts and Wald model-fit p-values.]

  15. Degraded visual environment image/video quality metrics

    NASA Astrophysics Data System (ADS)

    Baumgartner, Dustin D.; Brown, Jeremy B.; Jacobs, Eddie L.; Schachter, Bruce J.

    2014-06-01

    A number of image quality metrics (IQMs) and video quality metrics (VQMs) have been proposed in the literature for evaluating techniques and systems for mitigating degraded visual environments. Some require both pristine and corrupted imagery. Others require patterned target boards in the scene. None of these metrics relates well to the task of landing a helicopter in conditions such as a brownout dust cloud. We have developed and used a variety of IQMs and VQMs related to the pilot's ability to detect hazards in the scene and to maintain situational awareness. Some of these metrics can be made agnostic to sensor type. Not only are the metrics suitable for evaluating algorithm and sensor variation, they are also suitable for choosing the most cost effective solution to improve operating conditions in degraded visual environments.
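A minimal example of the "requires both pristine and corrupted imagery" class of metric mentioned above is the peak signal-to-noise ratio; this is a generic illustration, not one of the paper's own hazard-detection metrics, which are task-specific:

```python
import math

# Full-reference PSNR over 8-bit images supplied as flat pixel lists.
def psnr_8bit(reference, degraded):
    """PSNR in dB; higher is better, identical images give infinity."""
    mse = sum((r - d) ** 2 for r, d in zip(reference, degraded)) / len(reference)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(255.0 ** 2 / mse)

ref = [100] * 64                       # synthetic pristine patch
fogged = [116] * 64                    # uniform +16 brightening, MSE = 256
score = psnr_8bit(ref, fogged)         # about 24 dB
```

The limitation the paper points out applies here directly: a pristine reference never exists during a brownout landing, which is why task-based, no-reference metrics are needed instead.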

  16. Liftoff of Space Shuttle Columbia on mission STS-93

    NASA Technical Reports Server (NTRS)

    1999-01-01

The fiery launch of Space Shuttle Columbia casts ghost-like shadows on the clouds of smoke and steam surrounding it. Liftoff occurred at 12:31 a.m. EDT. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The crew numbers five: Commander Eileen M. Collins, Pilot Jeffrey S. Ashby, and Mission Specialists Steven A. Hawley (Ph.D.), Catherine G. Coleman (Ph.D.) and Michel Tognini of France, with the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as commander of a Shuttle mission. The target landing date is July 27, 1999, at 11:20 p.m. EDT.

  17. KSC-99pp0952

    NASA Image and Video Library

    1999-07-23

    KENNEDY SPACE CENTER, FLA. -- The fiery launch of Space Shuttle Columbia casts ghost-like shadows on the clouds of smoke and steam surrounding it. Liftoff occurred at 12:31 a.m. EDT. STS-93 is a five-day mission primarily to release the Chandra X-ray Observatory, which will allow scientists from around the world to study some of the most distant, powerful and dynamic objects in the universe. The crew numbers five: Commander Eileen M. Collins, Pilot Jeffrey S. Ashby, and Mission Specialists Steven A. Hawley (Ph.D.), Catherine G. Coleman (Ph.D.) and Michel Tognini of France, with the Centre National d'Etudes Spatiales (CNES). Collins is the first woman to serve as commander of a Shuttle mission. The target landing date is July 27, 1999, at 11:20 p.m. EDT

  18. A Molecular Line Survey around Orion at Low Frequencies with the MWA

    NASA Astrophysics Data System (ADS)

    Tremblay, C. D.; Jones, P. A.; Cunningham, M.; Hurley-Walker, N.; Jordan, C. H.; Tingay, S. J.

    2018-06-01

    The low-frequency sky may reveal some of the secrets yet to be discovered. Until recently, molecules had never been detected within interstellar clouds at frequencies below 700 MHz. Following the pilot survey toward the Galactic center at 103–133 MHz with the Murchison Widefield Array, we surveyed 400 deg2 centered on the Orion KL nebula from 99 to 170 MHz. Orion is a nearby region of active star formation and known to be a chemically rich environment. In this paper, we present tentative detections of nitric oxide and its isotopologues, singularly deuterated formic acid, molecular oxygen, and several unidentified transitions. The three identified molecules are particularly interesting, as laboratory experiments have suggested that these molecules are precursors to the formation of amines.

  19. STS-72 Flight Day 2

    NASA Technical Reports Server (NTRS)

    1996-01-01

    On this second day of the STS-72 mission, the flight crew, Cmdr. Brian Duffy, Pilot Brent W. Jett, and Mission Specialists Leroy Chiao, Daniel T. Barry, Winston E. Scott, and Koichi Wakata (NASDA), awakened to music from the motion picture 'Star Wars.' The crew performed a systems checkout, prepared for the retrieval of the Japanese Space Flyer Unit (SFU), tested the spacesuits for the EVA, and activated some of the secondary experiments. An in-orbit news interview was conducted with the crew via satellite downlinking. Questions asked ranged from the logistics of the mission to the avoidance procedures the Endeavour Orbiter performed to miss hitting the inactive Air Force satellite, nicknamed 'Misty' (MSTI). Earth views included cloud cover, several storm systems, and various land masses with several views of the shuttle's open cargo bay in the foreground.

  20. STS-73 Flight Day 5

    NASA Technical Reports Server (NTRS)

    1995-01-01

    On this fifth day of the sixteen-day STS-73 mission, the crew, Cmdr. Kenneth Bowersox, Pilot Kent Rominger, Payload Specialists Albert Sacco and Fred Leslie, and Mission Specialists Kathryn Thornton, Catherine 'Cady' Coleman, and Michael Lopez-Alegria, are shown performing several of the spaceborne experiments onboard the United States Microgravity Lab-2 (USML-2). These experiments are downlinked to Mission Control from the Spacelab using the High-Packed Digital Television (HI-PAC) system onboard the Shuttle. The experiments shown include the Drop Physics Module (DPM) experiment, the Surface Tension Driven Convection Experiment (STDCE), the Protein Crystal Growth (PCG) experiment, and a Hand-Held Diffusion Test Cell experiment. Lopez-Alegria is interviewed in Spanish by two Spanish radio show hosts. Earth views include cloud cover, the Earth's horizon and atmospheric boundary layers, and several oceans.

  1. Early in-flight detection of SO2 via Differential Optical Absorption Spectroscopy: a feasible aviation safety measure to prevent potential encounters with volcanic plumes

    NASA Astrophysics Data System (ADS)

    Vogel, L.; Galle, B.; Kern, C.; Delgado Granados, H.; Conde, V.; Norman, P.; Arellano, S.; Landgren, O.; Lübcke, P.; Alvarez Nieves, J. M.; Cárdenas Gonzáles, L.; Platt, U.

    2011-09-01

    Volcanic ash constitutes a risk to aviation, mainly due to its ability to cause jet engines to fail. Other risks include the possibility of abrasion of windshields and potentially serious damage to avionic systems. These hazards have been widely recognized since the early 1980s, when volcanic ash provoked several incidents of engine failure in commercial aircraft. In addition to volcanic ash, volcanic gases also pose a threat. Prolonged and/or cumulative exposure to sulphur dioxide (SO2) or sulphuric acid (H2SO4) aerosols potentially affects e.g. windows and airframe, and may cause permanent damage to engines. SO2 receives most attention among the gas species commonly found in volcanic plumes because its presence above the lower troposphere is a clear proxy for a volcanic cloud and indicates that fine ash could also be present. Up to now, remote sensing of SO2 via Differential Optical Absorption Spectroscopy (DOAS) in the ultraviolet spectral region has been used to measure volcanic clouds from ground-based, airborne and satellite platforms. Attention has been given to volcanic emission strength and to the chemistry inside volcanic clouds, and measurement procedures were adapted accordingly. Here we present a set of experimental and model results highlighting the feasibility of using DOAS as an airborne early detection system for SO2 in two spatial dimensions. In order to prove our new concept, simultaneous airborne and ground-based measurements of the plume of Popocatépetl volcano, Mexico, were conducted in April 2010. The plume extended at an altitude around 5250 m above sea level and was approached and traversed at the same altitude with several forward-looking DOAS systems aboard an airplane. These DOAS systems measured SO2 in the flight direction and at ±40 mrad (2.3°) angles relative to it, in both horizontal and vertical directions. The approaches started at distances of up to 25 km from the plume, and SO2 was measured at all times well above the detection limit. In combination with radiative transfer studies, this study indicates that an extended volcanic cloud with a concentration of 10^12 molecules cm^-3 at typical flight levels of 10 km can be detected unambiguously at distances of up to 80 km away. This range provides enough time (approx. 5 min) for pilots to take action to avoid entering a volcanic cloud in the flight path, suggesting that this technique can be used as an effective aid to prevent dangerous aircraft encounters with potentially ash-rich volcanic clouds.
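The ~5 min figure quoted above follows directly from the 80 km detection range divided by a typical jet ground speed. A quick arithmetic check (the 250 m/s cruise speed is an assumed typical airliner value, not a number from the study):

```python
# Sketch: warning time available to a pilot given a detection range and
# ground speed. The 250 m/s speed is an assumed typical value at 10 km
# flight level, not taken from the abstract above.
def warning_time_minutes(detection_range_km: float, ground_speed_ms: float) -> float:
    """Time until reaching a cloud first detected at detection_range_km."""
    return detection_range_km * 1000.0 / ground_speed_ms / 60.0

t = warning_time_minutes(80.0, 250.0)
print(f"{t:.1f} min")  # ≈ 5.3 min, consistent with the ~5 min quoted
```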

  2. Early in-flight detection of SO2 via Differential Optical Absorption Spectroscopy: a feasible aviation safety measure to prevent potential encounters with volcanic plumes

    NASA Astrophysics Data System (ADS)

    Vogel, L.; Galle, B.; Kern, C.; Delgado Granados, H.; Conde, V.; Norman, P.; Arellano, S.; Landgren, O.; Lübcke, P.; Alvarez Nieves, J. M.; Cárdenas Gonzáles, L.; Platt, U.

    2011-05-01

    Volcanic ash constitutes a risk to aviation, mainly due to its ability to cause jet engines to fail. Other risks include the possibility of abrasion of windshields and potentially serious damage to avionic systems. These hazards have been widely recognized since the early 1980s, when volcanic ash provoked several incidents of engine failure in commercial aircraft. In addition to volcanic ash, volcanic gases also pose a threat. Prolonged and/or cumulative exposure to sulphur dioxide (SO2) or sulphuric acid (H2SO4) aerosols potentially affects e.g. windows and airframe, and may cause permanent damage to engines. SO2 receives most attention among the gas species commonly found in volcanic plumes because its presence above the lower troposphere is a clear proxy for a volcanic cloud and indicates that fine ash could also be present. Up to now, remote sensing of SO2 via Differential Optical Absorption Spectroscopy (DOAS) in the ultraviolet spectral region has been used to measure volcanic clouds from ground-based, airborne and satellite platforms. Attention has been given to volcanic emission strength and to the chemistry inside volcanic clouds, and measurement procedures were adapted accordingly. Here we present a set of experimental and model results highlighting the feasibility of using DOAS as an airborne early detection system for SO2 in two spatial dimensions. In order to prove our new concept, simultaneous airborne and ground-based measurements of the plume of Popocatépetl volcano, Mexico, were conducted in April 2010. The plume extended at an altitude around 5250 m above sea level and was approached and traversed at the same altitude with several forward-looking DOAS systems aboard an airplane. These DOAS systems measured SO2 in the flight direction and at ±40 mrad (2.3°) angles relative to it, in both horizontal and vertical directions. The approaches started at distances of up to 25 km from the plume, and SO2 was measured at all times well above the detection limit. In combination with radiative transfer studies, this study indicates that an extended volcanic cloud with a concentration of 10^12 molecules cm^-3 at typical flight levels of 10 km can be detected unambiguously at distances of up to 80 km away. This range provides enough time (approx. 5 min) for pilots to take action to avoid entering a volcanic cloud in the flight path, suggesting that this technique can be used as an effective aid to prevent dangerous aircraft encounters with potentially ash-rich volcanic clouds.

  3. Early in-flight detection of SO2 via Differential Optical Absorption Spectroscopy: A feasible aviation safety measure to prevent potential encounters with volcanic plumes

    USGS Publications Warehouse

    Vogel, L.; Galle, B.; Kern, C.; Delgado Granados, H.; Conde, V.; Norman, P.; Arellano, S.; Landgren, O.; Lübcke, P.; Alvarez Nieves, J. M.; Cárdenas Gonzáles, L.; Platt, U.

    2011-01-01

    Volcanic ash constitutes a risk to aviation, mainly due to its ability to cause jet engines to fail. Other risks include the possibility of abrasion of windshields and potentially serious damage to avionic systems. These hazards have been widely recognized since the early 1980s, when volcanic ash provoked several incidents of engine failure in commercial aircraft. In addition to volcanic ash, volcanic gases also pose a threat. Prolonged and/or cumulative exposure to sulphur dioxide (SO2) or sulphuric acid (H2SO4) aerosols potentially affects e.g. windows and airframe, and may cause permanent damage to engines. SO2 receives most attention among the gas species commonly found in volcanic plumes because its presence above the lower troposphere is a clear proxy for a volcanic cloud and indicates that fine ash could also be present. Up to now, remote sensing of SO2 via Differential Optical Absorption Spectroscopy (DOAS) in the ultraviolet spectral region has been used to measure volcanic clouds from ground-based, airborne and satellite platforms. Attention has been given to volcanic emission strength and to the chemistry inside volcanic clouds, and measurement procedures were adapted accordingly. Here we present a set of experimental and model results highlighting the feasibility of using DOAS as an airborne early detection system for SO2 in two spatial dimensions. In order to prove our new concept, simultaneous airborne and ground-based measurements of the plume of Popocatépetl volcano, Mexico, were conducted in April 2010. The plume extended at an altitude around 5250 m above sea level and was approached and traversed at the same altitude with several forward-looking DOAS systems aboard an airplane. These DOAS systems measured SO2 in the flight direction and at ±40 mrad (2.3°) angles relative to it, in both horizontal and vertical directions. The approaches started at distances of up to 25 km from the plume, and SO2 was measured at all times well above the detection limit. In combination with radiative transfer studies, this study indicates that an extended volcanic cloud with a concentration of 10^12 molecules cm^-3 at typical flight levels of 10 km can be detected unambiguously at distances of up to 80 km away. This range provides enough time (approx. 5 min) for pilots to take action to avoid entering a volcanic cloud in the flight path, suggesting that this technique can be used as an effective aid to prevent dangerous aircraft encounters with potentially ash-rich volcanic clouds.

  4. Measurements and FLUKA Simulations of Bismuth, Aluminium and Indium Activation at the upgraded CERN Shielding Benchmark Facility (CSBF)

    NASA Astrophysics Data System (ADS)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.; Yashima, H.

    2018-06-01

    The CERN High energy AcceleRator Mixed field (CHARM) facility is situated in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, with 5·10^11 protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7·10^10 protons per second. The extracted proton beam impacts on a cylindrical copper target. The shielding of the CHARM facility includes the CERN Shielding Benchmark Facility (CSBF), situated laterally above the target, which allows deep shielding penetration benchmark studies of various shielding materials. This facility was significantly upgraded during the extended technical stop at the beginning of 2016. It now consists of 40 cm of cast iron shielding, a 200 cm long removable sample holder concrete block with 3 inserts for activation samples, and a material test location that is used for the measurement of the attenuation length of different shielding materials as well as for sample activation at different thicknesses of the shielding materials. Activation samples of bismuth, aluminium and indium were placed in the CSBF in September 2016 to characterize the upgraded version of the CSBF. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields of bismuth isotopes (206Bi, 205Bi, 204Bi, 203Bi, 202Bi, 201Bi) from 209Bi, of 24Na from 27Al and of 115mIn from 115In for these samples. The production yields estimated by FLUKA Monte Carlo simulations are compared to the production yields obtained from γ-spectroscopy measurements of the samples, taking the beam intensity profile into account. The agreement between FLUKA predictions and γ-spectroscopy measurements for the production yields is at the level of a factor of 2.
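The beam parameters quoted above imply a minimum average pulse spacing, which is a quick consistency check using only the numbers stated in the abstract:

```python
# Minimum average pulse period implied by the CHARM beam parameters:
# 5e11 protons per pulse at a maximum average intensity of 6.7e10 protons/s.
protons_per_pulse = 5e11
max_avg_intensity_ps = 6.7e10  # protons per second
min_avg_period_s = protons_per_pulse / max_avg_intensity_ps
print(f"{min_avg_period_s:.1f} s")  # ≈ 7.5 s between pulses, on average
```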

  5. Two-particle correlations in azimuthal angle and pseudorapidity in inelastic p + p interactions at the CERN Super Proton Synchrotron

    DOE PAGES

    Aduszkiewicz, A.; Ali, Y.; Andronov, E.; ...

    2017-01-30

    Results on two-particle Δη–Δφ correlations in inelastic p + p interactions at 20, 31, 40, 80, and 158 GeV/c are presented. The measurements were performed using the large-acceptance NA61/SHINE hadron spectrometer at the CERN Super Proton Synchrotron. The data show structures which can be attributed mainly to effects of resonance decays, momentum conservation, and quantum statistics. Furthermore, the results are compared with the EPOS and UrQMD models.

  6. News UK public libraries offer walk-in access to research Atoms for Peace? The Atomic Weapons Establishment and UK universities Students present their research to academics: CERN@school Science in a suitcase: Marvin and Milo visit Ethiopia Inspiring telescopes A day for everyone teaching physics 2014 Forthcoming Events

    NASA Astrophysics Data System (ADS)

    2014-05-01

    UK public libraries offer walk-in access to research Atoms for Peace? The Atomic Weapons Establishment and UK universities Students present their research to academics: CERN@school Science in a suitcase: Marvin and Milo visit Ethiopia Inspiring telescopes A day for everyone teaching physics 2014 Forthcoming Events

  7. Overview of LHC physics results at ICHEP

    ScienceCinema

    Mangano, Michelangelo

    2018-06-20

    This month's LHC physics day will review the physics results presented by the LHC experiments at the 2010 ICHEP in Paris. The experimental presentations will be preceded by the bi-weekly LHC accelerator status report. The meeting will be broadcast via EVO (detailed info will appear at the time of the meeting in the "Video Services" item on the left menu bar). For those attending, information on accommodation, access to CERN and laptop registration is available from http://cern.ch/lpcc/visits

  8. CERN at 60: giant magnet journeys through Geneva

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2014-07-01

    More than 30,000 people descended onto Geneva's harbour last month to celebrate the bicentenary of the city's integration into Switzerland with a parade through the city. Joining the 1200 participants at the Genève200 celebrations were staff from the CERN particle-physics lab, which is located on the outskirts of Geneva, who paraded a superconducting dipole magnet - similar to the thousands used in the Large Hadron Collider - through the city's narrow streets on a 20 m lorry.

  9. Astronomie, écologie et poésie par Hubert Reeves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-09-21

    Hubert Reeves. The astrophysicist gives a lecture, then talks with the writer François Bon on the theme "Astronomy, ecology and poetry". For more information: http://outreach.web.cern.ch/outreach/FR/evenements/conferences.html. Limited number of seats; reservation required at the CERN Reception: +41 22 767 76 76. The evening will be broadcast live on the Web: http://webcast.cern.ch/

  10. Retirement Kjell Johnsen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2007-12-05

    On the occasion of his 65th birthday, several speakers (including the Ambassador of Norway) thank Kjell Johnsen, born in Norway in June 1921, for his 34 years of service at CERN and retrace his life and work. K. Johnsen took part in the first studies on the accelerators of the future physics centre, and was also the father and first director of the CERN Accelerator School (CAS).

  11. News Conference: Physics brings the community together Training: CERN trains physics teachers Education: World conference fosters physics collaborations Lecture: Physics education live at ASE Prize: Physics teacher wins first Moore medal Festival: European presidents patronize Science on Stage festival Videoconference: Videoconference brings Durban closer to the classroom

    NASA Astrophysics Data System (ADS)

    2012-03-01

    Conference: Physics brings the community together Training: CERN trains physics teachers Education: World conference fosters physics collaborations Lecture: Physics education live at ASE Prize: Physics teacher wins first Moore medal Festival: European presidents patronize Science on Stage festival Videoconference: Videoconference brings Durban closer to the classroom

  12. CERN's Common Unix and X Terminal Environment

    NASA Astrophysics Data System (ADS)

    Cass, Tony

    The Desktop Infrastructure Group of CERN's Computing and Networks Division has developed a Common Unix and X Terminal Environment to ease the migration to Unix-based interactive computing. The CUTE architecture relies on a distributed filesystem (currently Transarc's AFS) to enable essentially interchangeable client workstations to access both "home directory" and program files transparently. Additionally, we provide a suite of programs to configure workstations for CUTE and to ensure continued compatibility. This paper describes the different components and the development of the CUTE architecture.

  13. News Festival: Science on stage deadline approaches Conference: Welsh conference attracts teachers Data: New phase of CERN openlab tackles exascale IT challenges for science Meeting: German Physical Society holds its physics education spring meeting Conference: Association offers golden opportunity in Norway Competition: So what's the right answer then?

    NASA Astrophysics Data System (ADS)

    2012-07-01

    Festival: Science on stage deadline approaches Conference: Welsh conference attracts teachers Data: New phase of CERN openlab tackles exascale IT challenges for science Meeting: German Physical Society holds its physics education spring meeting Conference: Association offers golden opportunity in Norway Competition: So what's the right answer then?

  14. Overview of LHC physics results at ICHEP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2011-02-25

    This month's LHC physics day will review the physics results presented by the LHC experiments at the 2010 ICHEP in Paris. The experimental presentations will be preceded by the bi-weekly LHC accelerator status report. The meeting will be broadcast via EVO (detailed info will appear at the time of the meeting in the "Video Services" item on the left menu bar). For those attending, information on accommodation, access to CERN and laptop registration is available from http://cern.ch/lpcc/visits

  15. Measurement of the antiproton-nucleus annihilation cross-section at low energy

    NASA Astrophysics Data System (ADS)

    Aghai-Khozani, H.; Bianconi, A.; Corradini, M.; Hayano, R.; Hori, M.; Leali, M.; Lodi Rizzini, E.; Mascagna, V.; Murakami, Y.; Prest, M.; Vallazza, E.; Venturelli, L.; Yamada, H.

    2018-02-01

    Systematic measurements of the annihilation cross sections of low-energy antinucleons were performed at CERN in the 1980s and 1990s. However, the antiproton data on medium-heavy and heavy nuclear targets are scarce. The ASACUSA Collaboration at CERN has measured the antiproton annihilation cross section on carbon at 5.3 MeV: the value is (1.73 ± 0.25) barn. The result is compared with the antineutron experimental data and with theoretical predictions.

  16. High Energy Electron Detection with ATIC

    NASA Technical Reports Server (NTRS)

    Chang, J.; Schmidt, W. K. H.; Adams, James H., Jr.; Ahn, H.; Ampe, J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The ATIC (Advanced Thin Ionization Calorimeter) balloon-borne ionization calorimeter is well suited to record and identify high-energy cosmic-ray electrons. The instrument was exposed to high-energy beams at the CERN H2 beamline in September 1999. We have simulated the performance of the instrument and compare the simulations with actual high-energy electron exposures at the CERN accelerator. The simulations and measurements do not agree exactly in every detail, but overall the simulations predicted the measured behavior quite well.

  17. Optical fibres in the radiation environment of CERN

    NASA Astrophysics Data System (ADS)

    Guillermain, E.

    2017-11-01

    CERN, the European Organization for Nuclear Research (in Geneva, Switzerland), is home to a complex scientific instrument: the 27-kilometre Large Hadron Collider (LHC) collides beams of high-energy particles at close to the speed of light. Optical fibres are widely used at CERN, both in surface areas (e.g. for inter-building IT networks) and in the accelerator complex underground (e.g. for cryogenics, vacuum and safety systems). Optical fibres in the accelerator are exposed to mixed radiation fields (mainly composed of protons, pions, neutrons and other hadrons, gamma rays and electrons), with dose rates depending on the particular installation zone and with radiation levels often significantly higher than those encountered in space. In the LHC and its injector chain, radiation levels range from relatively low annual doses of a few Gy up to hundreds of kGy. Optical fibres suffer from Radiation Induced Attenuation (RIA, expressed in dB per unit length), which affects light transmission and depends on the irradiation conditions (e.g. dose rate, total dose, temperature). In the CERN accelerator complex, the failure of an optical link can affect the proper functioning of control or monitoring systems and cause an interruption of accelerator operation. The qualification of optical fibres for installation in critical radiation areas is therefore crucial. Thus, all optical fibre types installed in radiation areas at CERN are subject to laboratory irradiation tests in order to evaluate their RIA at different total doses and dose rates. This allows the selection of the appropriate optical fibre type (conventional or radiation resistant) compliant with the requirements of each installation. Irradiation tests are performed in collaboration with Fraunhofer INT (irradiation facilities and expert team in Euskirchen, Germany). Conventional off-the-shelf optical fibres can be installed for optical links exposed to low radiation levels (i.e. annual doses typically below a few kGy). Nevertheless, conventional optical fibres must be carefully qualified, as a spread in RIA of a factor of 10 is observed among optical fibres of different types and dopants. In higher radiation areas, special radiation-resistant optical fibres are installed. For total doses above 1 kGy, the RIA of these special optical fibres is at least 10 times lower than that of conventional optical fibres under the same irradiation conditions. 2400 km of these special radiation-resistant optical fibres were recently procured at CERN. As part of this procurement process, a quality assurance plan including the irradiation testing of all 65 produced batches was set up. This presentation will review the selection process for the appropriate optical fibre types to be installed in the radiation environment of CERN. The methodology for choosing the irradiation parameters for the laboratory tests will be discussed, together with an overview of the RIA of different optical fibre types under several irradiation conditions.
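Since RIA is quoted in dB per unit length, the total induced loss on a link is simply RIA times fibre length. A minimal sketch, using made-up placeholder RIA values (the abstract states only that radiation-resistant fibre shows at least 10x lower RIA than conventional fibre above 1 kGy):

```python
# Illustrative link-budget check for Radiation Induced Attenuation (RIA).
# The RIA values are hypothetical placeholders, not CERN measurements.
def induced_loss_db(ria_db_per_km: float, length_km: float) -> float:
    """Total radiation-induced loss accumulated over a fibre run."""
    return ria_db_per_km * length_km

conventional = induced_loss_db(ria_db_per_km=50.0, length_km=0.5)  # hypothetical conventional fibre
rad_hard = induced_loss_db(ria_db_per_km=5.0, length_km=0.5)       # 10x lower RIA, per the abstract
print(conventional, rad_hard)  # 25.0 2.5
```

Whether either loss is acceptable then depends on the optical power margin of the particular control or monitoring link.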

  18. LHC, le Big Bang en éprouvette

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Our understanding of the Universe is changing… Science Bar, open to the public. Debate moderated by Marie-Odile Montchicourt, journalist at France Info. Event held by videoconference between the Globe of Science and Innovation, the bar Le Baloard in Montpellier and the Maison des Métallos in Paris. Speakers at CERN: Philippe Charpentier and Daniel Froideveaux, physicists at CERN. Speakers in Paris: Vincent Bontemps, philosopher and researcher at CEA; Jacques Arnould, philosopher, historian of science and theologian; Jean-Jacques Beineix, film director, producer and screenwriter. Speakers in Montpellier (LPTA): André Neveu, theoretical physicist and research director at CNRS; Gilbert Moultaka, theoretical physicist and research fellow at CNRS. Partnership: CERN, CEA, IN2P3, Université MPL2 (LPTA). As part of the Fête de la science 2008.

  19. Disk storage at CERN

    NASA Astrophysics Data System (ADS)

    Mascetti, L.; Cano, E.; Chan, B.; Espinal, X.; Fiorot, A.; González Labrador, H.; Iven, J.; Lamanna, M.; Lo Presti, G.; Mościcki, JT; Peters, AJ; Ponce, S.; Rousseau, H.; van der Ster, D.

    2015-12-01

    CERN IT DSS operates the main storage resources for data taking and physics analysis, mainly via three systems: AFS, CASTOR and EOS. The total usable space available on disk for users is about 100 PB (with relative ratios 1:20:120). EOS actively uses the two CERN Tier-0 centres (Meyrin and Wigner) with a 50:50 ratio. IT DSS also provides sizeable on-demand resources for IT services, most notably OpenStack and NFS-based clients: this is provided by a Ceph infrastructure (3 PB) and a few proprietary servers (NetApp). We describe our operational experience and recent changes to these systems, with special emphasis on the present usage for LHC data taking and the convergence to commodity hardware (nodes with 200 TB each, with optional SSD) shared across all services. We also describe our experience in coupling commodity and home-grown solutions (e.g. CERNBox integration in EOS, Ceph disk pools for AFS, CASTOR and NFS) and, finally, the future evolution of these systems for WLCG and beyond.
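Reading the 1:20:120 figure as the AFS:CASTOR:EOS split of the ~100 PB total (an assumed interpretation of the abstract's parenthetical), the per-system shares work out roughly as follows:

```python
# Rough per-system split of ~100 PB under an assumed AFS:CASTOR:EOS
# ratio of 1:20:120; the abstract gives only the ratio, not the shares.
total_pb = 100.0
ratios = {"AFS": 1, "CASTOR": 20, "EOS": 120}
share = {name: total_pb * r / sum(ratios.values()) for name, r in ratios.items()}
print({name: round(pb, 1) for name, pb in share.items()})
# roughly 0.7 PB AFS, 14.2 PB CASTOR, 85.1 PB EOS
```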

  20. First test of BNL electron beam ion source with high current density electron beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pikin, Alexander, E-mail: pikin@bnl.gov; Alessi, James G.; Beebe, Edward N.

    A new electron gun with electrostatic compression has been installed at the Electron Beam Ion Source (EBIS) Test Stand at BNL. This is a collaborative effort by BNL and CERN teams with the common goal of studying an EBIS with an electron beam current up to 10 A, a current density up to 10,000 A/cm^2 and an energy of more than 50 keV. Intense and pure beams of heavy, highly charged ions with mass-to-charge ratio < 4.5 are requested by many heavy-ion research facilities, including the NASA Space Radiation Laboratory (NSRL) at BNL and HIE-ISOLDE at CERN. With a multiampere electron gun, the EBIS should be capable of delivering highly charged ions both for RHIC facility applications at BNL and for ISOLDE experiments at CERN. Details of the electron gun simulations and design, and the Test EBIS electrostatic and magnetostatic structures with the new electron gun, are presented. Experimental results on the electron beam transmission are given.

  1. Protocols for Scholarly Communication

    NASA Astrophysics Data System (ADS)

    Pepe, A.; Yeomans, J.

    2007-10-01

    CERN, the European Organization for Nuclear Research, has operated an institutional preprint repository for more than 10 years. The repository contains over 850,000 records of which more than 450,000 are full-text OA preprints, mostly in the field of particle physics, and it is integrated with the library's holdings of books, conference proceedings, journals and other grey literature. In order to encourage effective propagation and open access to scholarly material, CERN is implementing a range of innovative library services into its document repository: automatic keywording, reference extraction, collaborative management tools and bibliometric tools. Some of these services, such as user reviewing and automatic metadata extraction, could make up an interesting testbed for future publishing solutions and certainly provide an exciting environment for e-science possibilities. The future protocol for scientific communication should guide authors naturally towards OA publication, and CERN wants to help reach a full open access publishing environment for the particle physics community and related sciences in the next few years.

  2. First experimental evidence of hydrodynamic tunneling of ultra-relativistic protons in extended solid copper target at the CERN HiRadMat facility

    NASA Astrophysics Data System (ADS)

    Schmidt, R.; Blanco Sancho, J.; Burkart, F.; Grenier, D.; Wollmann, D.; Tahir, N. A.; Shutov, A.; Piriz, A. R.

    2014-08-01

    A novel experiment has been performed at the CERN HiRadMat test facility to study the impact of the 440 GeV proton beam generated by the Super Proton Synchrotron on extended solid copper cylindrical targets. Substantial hydrodynamic tunneling of the protons in the target material has been observed that leads to significant lengthening of the projectile range, which confirms our previous theoretical predictions [N. A. Tahir et al., Phys. Rev. Spec. Top.-Accel. Beams 15, 051003 (2012)]. Simulation results show very good agreement with the experimental measurements. These results have very important implications on the machine protection design for powerful machines like the Large Hadron Collider (LHC), the future High Luminosity LHC, and the proposed huge 80 km circumference Future Circular Collider, which is currently being discussed at CERN. Another very interesting outcome of this work is that one may also study the field of High Energy Density Physics at this test facility.

  3. First experience with carbon stripping foils for the 160 MeV H- injection into the CERN PSB

    NASA Astrophysics Data System (ADS)

    Weterings, Wim; Bracco, Chiara; Jorat, Louise; Noulibos, Remy; van Trappen, Pieter

    2018-05-01

    A 160 MeV H- beam will be delivered from the new CERN linear accelerator (Linac4) to the Proton Synchrotron Booster (PSB) using an H- charge-exchange injection system. A 200 µg/cm^2 carbon stripping foil will convert H- into protons by stripping off the electrons. The H- charge-exchange injection principle will be used for the first time in the CERN accelerator complex and involves many challenges. In order to gain experience with the foil changing mechanism and the very fragile foils, a stripping foil test stand was installed in the Linac4 transfer line in 2016, prior to installation in the PSB. In addition, parts of the future PSB injection equipment are also temporarily installed in the Linac4 transfer line for tests with a 160 MeV H- commissioning proton beam. This paper describes the foil changing mechanism and control system, summarizes the practical experience of gluing and handling these foils, and reports on the first results with beam.

  4. Chicago Ebola Response Network (CERN): A Citywide Cross-hospital Collaborative for Infectious Disease Preparedness.

    PubMed

    Lateef, Omar; Hota, Bala; Landon, Emily; Kociolek, Larry K; Morita, Julie; Black, Stephanie; Noskin, Gary; Kelleher, Michael; Curell, Krista; Galat, Amy; Ansell, David; Segreti, John; Weber, Stephen G

    2015-11-15

    The 2014-2015 Ebola virus disease (EVD) epidemic and international public health emergency has been referred to as a "black swan" event, or an event that is unlikely, hard to predict, and highly impactful once it occurs. The Chicago Ebola Response Network (CERN) was formed in response to EVD and is capable of receiving and managing new cases of EVD, while also laying the foundation for a public health network that can anticipate, manage, and prevent the next black swan public health event. By sharing expertise, risk, and resources among 4 major academic centers, Chicago created a sustainable network to respond to the latest in a series of public health emergencies. In this respect, CERN is a roadmap for how a region can prepare to respond to public health emergencies, thereby preventing negative impacts through planning and implementation.

  5. EOS developments

    NASA Astrophysics Data System (ADS)

    Sindrilaru, Elvin A.; Peters, Andreas J.; Adde, Geoffray M.; Duellmann, Dirk

    2017-10-01

    CERN has been developing and operating EOS as a disk storage solution successfully for over 6 years. The CERN deployment provides 135 PB and stores 1.2 billion replicas distributed over two computer centres. The deployment includes four LHC instances, a shared instance for smaller experiments and, since last year, an instance for individual user data as well. The user instance is the backbone of the CERNBOX file-sharing service. New use cases like synchronisation and sharing, the planned migration to reduce AFS usage at CERN, and continuous growth have brought EOS new challenges. Recent developments include the integration and evaluation of various technologies to make the transition from a single active in-memory namespace to a scale-out implementation distributed over many meta-data servers. The new architecture aims to separate the data from the application logic and user interface code, thus providing flexibility and scalability to the namespace component. Another important goal is to provide EOS as a CERN-wide mounted filesystem with strong authentication, making it a single storage repository accessible via various services and front-ends (the /eos initiative). This required new developments in the security infrastructure of the EOS FUSE implementation. Furthermore, there was a series of improvements targeting the end-user experience, such as tighter consistency and latency optimisations. In collaboration with Seagate as an Openlab partner, EOS has a complete integration of an OpenKinetic object drive cluster as a high-throughput, high-availability, low-cost storage solution. This contribution will discuss these three main development projects and present new performance metrics.

  6. An Analysis of Unique Aerial Photographs of Atmospheric Eddies in Marine Stratocumulus Clouds Downwind of Complex Terrain Along the California Coast

    NASA Astrophysics Data System (ADS)

    Muller, B. M.; Herbster, C. G.; Mosher, F. R.

    2013-12-01

    Unique aerial photographs of atmospheric eddies in marine stratocumulus clouds downwind of complex terrain along the California coast are presented and analyzed. While satellite imagery of similar eddies has appeared in the scientific literature since the 1960s, it is believed that these are the first close-up photographs of such eddies, taken from an airplane, to appear in publication. Two photographs by a commercial pilot, flying California coastal routes, are presented: one from July 16, 2006 downwind of Santa Cruz Island, a 740 m peak bordering the Santa Barbara Channel off the California coast; and one from September 12, 2006 near Grover Beach, California, downwind of a headland containing the San Luis Range, a region of complex terrain near San Luis Obispo, California, with ridges ranging approximately from 240 to 550 m elevation. Both eddies occurred in the lee of inversion-penetrating terrain, and were marked by a cyclonic vortex in the clouds with a striking cloud-free 'eye' feature roughly 3 km in diameter. The Santa Cruz Island eddy was 25 km in length and 9-10 km in width, while the Grover Beach eddy was 17 km in length and had a width of 9 km, placing it in the meso-gamma scale of atmospheric features. GOES (Geostationary Operational Environmental Satellite) imagery for both cases was obtained and helps to define the lifecycle and motions of the eddies captured in the snapshots. Relevant meteorological observations for the Santa Cruz Island eddy were not located, but in-situ observations from the Diablo Canyon Nuclear Power Plant, California Polytechnic State University (Cal Poly) pier, and the San Luis Obispo County Air Pollution Control District made possible a more detailed examination of the Grover Beach eddy and its structure. 
Additionally, we offer speculation on an eddy formation mechanism consistent with the satellite and in-situ observations described in this presentation, and hypotheses from the literature on low Froude number, continuously stratified flow. Attempting to analyze and understand the very small scale meteorological features in this case brings to light a variety of issues of increasing importance to modern meteorology and modeling of atmospheric flows near complex terrain. Fig. 1 Aerial photograph of stratocumulus cloud vortex just north of Santa Cruz Island on July 16, 2006 at 11:26 PDT (18:26 UTC), viewing toward the southwest. Photo by 'KB' courtesy of Capt. Peter Weiss of SkyWest Airlines.

  7. Insight into acid-base nucleation experiments by comparison of the chemical composition of positive, negative, and neutral clusters.

    PubMed

    Bianchi, Federico; Praplan, Arnaud P; Sarnela, Nina; Dommen, Josef; Kürten, Andreas; Ortega, Ismael K; Schobesberger, Siegfried; Junninen, Heikki; Simon, Mario; Tröstl, Jasmin; Jokinen, Tuija; Sipilä, Mikko; Adamov, Alexey; Amorim, Antonio; Almeida, Joao; Breitenlechner, Martin; Duplissy, Jonathan; Ehrhart, Sebastian; Flagan, Richard C; Franchin, Alessandro; Hakala, Jani; Hansel, Armin; Heinritzi, Martin; Kangasluoma, Juha; Keskinen, Helmi; Kim, Jaeseok; Kirkby, Jasper; Laaksonen, Ari; Lawler, Michael J; Lehtipalo, Katrianne; Leiminger, Markus; Makhmutov, Vladimir; Mathot, Serge; Onnela, Antti; Petäjä, Tuukka; Riccobono, Francesco; Rissanen, Matti P; Rondo, Linda; Tomé, António; Virtanen, Annele; Viisanen, Yrjö; Williamson, Christina; Wimmer, Daniela; Winkler, Paul M; Ye, Penglin; Curtius, Joachim; Kulmala, Markku; Worsnop, Douglas R; Donahue, Neil M; Baltensperger, Urs

    2014-12-02

    We investigated the nucleation of sulfuric acid together with two bases (ammonia and dimethylamine) at the CLOUD chamber at CERN. The chemical composition of positive, negative, and neutral clusters was studied using three Atmospheric Pressure interface-Time Of Flight (APi-TOF) mass spectrometers: two were operated in positive and negative mode to detect the chamber ions, while the third was equipped with a nitrate ion chemical ionization source allowing detection of neutral clusters. Taking into account the possible fragmentation that can happen during the charging of the ions or within the first stage of the mass spectrometer, the cluster formation proceeded via essentially one-to-one acid-base addition for all of the clusters, independent of the type of the base. For the positive clusters, the charge is carried by one excess protonated base, while for the negative clusters it is carried by a deprotonated acid; the same is true for the neutral clusters after these have been ionized. During the experiments involving sulfuric acid and dimethylamine, it was possible to study the appearance time for all the clusters (positive, negative, and neutral). It appeared that, after the formation of the clusters containing three molecules of sulfuric acid, the clusters grow at a similar speed, independent of their charge. The growth rate is then probably limited by the arrival rate of sulfuric acid or by cluster-cluster collisions.

  8. Atmospheric Radiation Measurement Program facilities newsletter, January 2000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sisterson, D.L.

    2000-02-16

    The subject of this newsletter is the ARM unmanned aerospace vehicle program. The ARM Program's focus is on climate research, specifically research related to solar radiation and its interaction with clouds. The SGP CART site contains highly sophisticated surface instrumentation, but even these instruments cannot gather some crucial climate data from high in the atmosphere. The Department of Energy and the Department of Defense joined together to use a high-tech, high-altitude, long-endurance class of unmanned aircraft known as the unmanned aerospace vehicle (UAV). A UAV is a small, lightweight airplane that is controlled remotely from the ground. A pilot sits in a ground-based cockpit and flies the aircraft as if he were actually on board. The UAV can also fly completely on its own through the use of preprogrammed computer flight routines. The ARM UAV is fitted with payload instruments developed to make highly accurate measurements of atmospheric flux, radiance, and clouds. Using a UAV is beneficial to climate research in many ways. The UAV puts the instrumentation within the environment being studied and gives scientists direct measurements, in contrast to indirect measurements from satellites orbiting high above Earth. The data collected by UAVs can be used to verify and calibrate measurements and calculated values from satellites, therefore making satellite data more useful and valuable to researchers.

  9. Early Wheel Train Damage Detection Using Wireless Sensor Network Antenna

    NASA Astrophysics Data System (ADS)

    Fazilah, A. F. M.; Azemi, S. N.; Azremi, A. A. H.; Soh, P. J.; Kamarudin, L. M.

    2018-03-01

    An antenna for a wireless sensor network for early detection of train wheel damage has been successfully developed and fabricated, with the aim of minimizing risk and increasing safety guarantees for trains. Current antenna designs suffer from low gain and large size, and existing sensors only detect a wheel malfunction after it occurs. Thus, a compact microstrip patch antenna operating at 2.45 GHz with a high gain of 4.95 dB was designed to attach to the wireless sensor device. Simulation results show that the antenna works at 2.45 GHz with a return loss of -34.46 dB, in good agreement with the design; the results also show a good radiation pattern and a nearly ideal VSWR of 1.04. An Arduino Nano, an LM35DZ temperature sensor and an ESP8266-07 Wi-Fi module form the core system, which senses the temperature and sends the data wirelessly to the cloud. An Android application has been created to monitor the temperature reading in real time. Future improvement will mainly focus on reducing the antenna size to make it more compact, and on upgrading the Android application to collect the raw data from the cloud and provide an alarm system to alert the loco pilot.
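    The sensing chain described in this record (an LM35 analog output read by the Arduino's ADC, then forwarded to the cloud with an alarm check) can be sketched for the conversion step. This is an illustrative sketch, not code from the paper: it assumes the Nano's 10-bit ADC with a 5 V reference and the LM35's nominal 10 mV/°C scale factor, and the alarm threshold is hypothetical.

```python
ALARM_THRESHOLD_C = 80.0  # hypothetical overheating threshold for a wheel bearing

def lm35_adc_to_celsius(adc_value, vref=5.0, adc_max=1023):
    """Convert a 10-bit ADC reading of the LM35 output to degrees Celsius.

    The LM35 outputs a nominal 10 mV per degree C, so:
    volts = adc_value / adc_max * vref, and temp_c = volts * 100.
    """
    volts = adc_value / adc_max * vref
    return volts * 100.0

def should_alarm(adc_value):
    """True when the converted temperature exceeds the alarm threshold."""
    return lm35_adc_to_celsius(adc_value) > ALARM_THRESHOLD_C
```

    In a deployment along these lines, the conversion would run on the microcontroller and only the temperature (or the alarm event) would be pushed over the ESP8266 link.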

  10. Projective drawings for assessing stress among subjects with medical symptoms compatible with sick building syndrome, and validation of a modified version of the Stress Load Index from the Drawing Personality Profile: a pilot study.

    PubMed

    Runeson, Roma; Wahlstedt, Kurt; Norbäck, Dan

    2007-02-01

    It was hypothesized that subjects with medical symptoms would show more signs of stress in projective drawings. A Stress Load Index, including five signs of stress in drawings, was evaluated. A questionnaire with an instruction to draw "a person in the rain" was sent to a cohort of 195 subjects, and the drawings were analysed blindly for eight stress items. Men had a higher index than women (p < .05) and drew clouds more often (p < .05). Drawing of clouds was associated with headache (adjOR = 4.28; 95% CI 1.75, 11.68). Drawing of puddles was associated with ocular symptoms (adjOR = 3.22; 95% CI 1.38, 7.50), facial dermal symptoms (adjOR = 2.94; 95% CI 1.28, 6.81), and tiredness (adjOR = 2.44; 95% CI 1.05, 5.67). Drawing of long rain strokes was associated with nasal symptoms (adjOR = 2.28; 95% CI 1.05, 2.06) and headache (adjOR = 3.20; 95% CI 1.28, 8.05). Age and stress load were predictors of sick building syndrome symptoms (p < .05). In conclusion, a nonverbal projective drawing test detected sex differences in directions opposite to those found with verbal methods. These findings need empirical assessment.

  11. A sustainability model based on cloud infrastructures for core and downstream Copernicus services

    NASA Astrophysics Data System (ADS)

    Manunta, Michele; Calò, Fabiana; De Luca, Claudio; Elefante, Stefano; Farres, Jordi; Guzzetti, Fausto; Imperatore, Pasquale; Lanari, Riccardo; Lengert, Wolfgang; Zinno, Ivana; Casu, Francesco

    2014-05-01

    The incoming Sentinel missions have been designed to be the first remote sensing satellite system devoted to operational services. In particular, the Synthetic Aperture Radar (SAR) Sentinel-1 sensor, dedicated to acquiring globally over land in the interferometric mode, guarantees an unprecedented capability to investigate and monitor Earth surface deformations related to natural and man-made hazards. Thanks to the global coverage strategy and 12-day revisit time, together with the free and open access data policy, such a system will allow an extensive application of Differential Interferometric SAR (DInSAR) techniques. In such a framework, the European Commission has been funding several projects through the GMES and Copernicus programs, aimed at preparing the user community for the operational and extensive use of Sentinel-1 products for risk mitigation and management purposes. Among them, the FP7-DORIS, an advanced GMES downstream service coordinated by the Italian National Council of Research (CNR), is based on the full exploitation of advanced DInSAR products in landslide and subsidence contexts. In particular, the DORIS project (www.doris-project.eu) has developed innovative scientific techniques and methodologies to support Civil Protection Authorities (CPA) during the pre-event, event, and post-event phases of the risk management cycle. Nonetheless, the huge data stream expected from the Sentinel-1 satellite may jeopardize the effective use of such data in emergency response and security scenarios. This potential bottleneck can be properly overcome through the development of modern infrastructures, able to efficiently provide computing resources as well as advanced services for big data management, processing and dissemination. 
In this framework, CNR and ESA have established a cooperation to foster the use of GRID and cloud computing platforms for remote sensing data processing, and to make available to a large audience advanced and innovative tools for DInSAR product generation and exploitation. In particular, CNR is porting the multi-temporal DInSAR technique referred to as the Small Baseline Subset (SBAS) onto the ESA G-POD (Grid Processing On Demand) and CIOP (Cloud Computing Operational Pilot) platforms (Elefante et al., 2013) within the SuperSites Exploitation Platform (SSEP) project, whose aim is to contribute to the development of an ecosystem for big geo-data processing and dissemination. This work focuses on presenting the main results that have been achieved by the DORIS project concerning the use of advanced DInSAR products for supporting CPA during the risk management cycle. Furthermore, based on the DORIS experience, a sustainability model for Core and Downstream Copernicus services based on the effective exploitation of cloud platforms is proposed. In this framework, the remote sensing community, both service providers and users, can significantly benefit from the Helix Nebula - The Science Cloud initiative, created by European scientific institutions, agencies, SMEs and enterprises to pave the way for the development and exploitation of a cloud computing infrastructure for science. REFERENCES Elefante, S., Imperatore, P., Zinno, I., Manunta, M., Mathot, E., Brito, F., Farres, J., Lengert, W., Lanari, R., Casu, F., 2013, "SBAS-DINSAR Time series generation on cloud computing platforms". IEEE IGARSS Conference, Melbourne (AU), July 2013.

  12. A Cloud Robotics Based Service for Managing RPAS in Emergency, Rescue and Hazardous Scenarios

    NASA Astrophysics Data System (ADS)

    Silvagni, Mario; Chiaberge, Marcello; Sanguedolce, Claudio; Dara, Gianluca

    2016-04-01

    Cloud robotics and cloud services are revolutionizing not only the ICT world but also the robotics industry, giving robots more computing capability, storage and connection bandwidth while opening new scenarios that blend the physical and digital worlds. In this vision, new IT architectures are required to manage robots, retrieve data from them and create services through which users can interact. Among all robots, this work mainly focuses on flying robots, better known as drones, UAVs (Unmanned Aerial Vehicles) or RPAS (Remotely Piloted Aircraft Systems). The cloud robotics approach shifts from the concept of a single local "intelligence" for every UAV, as a unique device that carries out all computation and storage processes onboard, to a more powerful "centralized brain" located in the cloud. This breakthrough opens new scenarios where UAVs are agents relying on remote servers for most of their computational load and data storage, creating a network of devices that can share knowledge and information. Many applications using UAVs as interesting and suitable devices for environment monitoring are growing. Many services can be built by fetching data from UAVs, such as telemetry, video streaming, pictures or sensor data; these services, part of the IT architecture, can then be accessed via web by other devices or shared with other UAVs. As test cases of the proposed architecture, two examples are reported. In the first, a search-and-rescue or emergency-management scenario where UAVs are required for monitoring intervention is shown. In case of emergency or aggression, the user requests the emergency service from the IT architecture, providing GPS coordinates and an identification number. The IT architecture uses a UAV (choosing among the available ones according to distance, service status, etc.) to reach him/her for monitoring and support operations. 
In the meantime, an officer will use the service to see the current position of the UAV, its telemetry and the video streaming from its camera. Data are stored for further use and documentation and can be shared with all the involved personnel or services. The second case refers to an imaging survey. An investigation area is selected using a map or a set of coordinates by a user who can be in the field or in a management facility. The cloud system processes these data and automatically computes a flight plan that considers the survey data requirements (e.g., picture ground resolution, overlap) but also several environmental constraints (e.g., no-fly zones, potentially hazardous areas, known obstacles, etc.). Once the flight plan is loaded into the selected UAV, the mission starts. During the mission, if suitable data network coverage is available, the UAV transmits the acquired images (typically low-quality images, to limit bandwidth) and shooting poses in order to perform a preliminary check during the mission and minimize survey failures; if not, all data are uploaded asynchronously after the mission. The cloud servers perform all the tasks related to image processing (mosaics, ortho-photos, geo-referencing, 3D models) and data management.
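    The dispatch step in the first test case, choosing among the available UAVs according to distance and service status, could be sketched as follows. This is a minimal illustrative sketch, not the project's code; the fleet record fields (`lat`, `lon`, `status`) are assumed names, and distance is computed with the standard haversine formula.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two GPS coordinates (degrees)."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pick_uav(fleet, user_lat, user_lon):
    """Choose the closest UAV whose status is 'available'; None if none is."""
    candidates = [u for u in fleet if u["status"] == "available"]
    if not candidates:
        return None
    return min(candidates,
               key=lambda u: haversine_km(u["lat"], u["lon"], user_lat, user_lon))
```

    A real dispatcher would also weigh battery level and ongoing service commitments, as the abstract's "service status, etc." suggests.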

  13. News Music: Here comes science that rocks Student trip: Two views of the future of CERN Classroom: Researchers can motivate pupils Appointment: AstraZeneca trust appoints new director Multimedia: Physics Education comes to YouTube Competition: Students compete in European Union Science Olympiad 2010 Physics roadshow: Pupils see wonders of physics

    NASA Astrophysics Data System (ADS)

    2010-07-01

    Music: Here comes science that rocks Student trip: Two views of the future of CERN Classroom: Researchers can motivate pupils Appointment: AstraZeneca trust appoints new director Multimedia: Physics Education comes to YouTube Competition: Students compete in European Union Science Olympiad 2010 Physics roadshow: Pupils see wonders of physics

  14. AMS data production facilities at science operations center at CERN

    NASA Astrophysics Data System (ADS)

    Choutko, V.; Egorov, A.; Eline, A.; Shan, B.

    2017-10-01

    The Alpha Magnetic Spectrometer (AMS) is a high energy physics experiment on board the International Space Station (ISS). This paper presents the hardware and software facilities of the Science Operation Center (SOC) at CERN. Data production is built around the production server, a scalable distributed service which links together a set of different programming modules for science data transformation and reconstruction. The server has the capacity to manage 1000 parallel job producers, i.e. up to 32K logical processors. A monitoring and management tool with a production GUI is also described.
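    The production-server pattern described here, a central service fanning science-data transformation jobs out to many parallel producers and collecting the results, can be illustrated with a much-reduced sketch using Python's standard thread pool. The `reconstruct` stage is a placeholder rather than AMS reconstruction code, and a thread pool on one machine stands in for a distributed service managing thousands of producers.

```python
from concurrent.futures import ThreadPoolExecutor

def reconstruct(run_id):
    """Placeholder for a science-data transformation/reconstruction stage."""
    return {"run": run_id, "status": "done"}

def dispatch(run_ids, max_workers=8):
    """Fan jobs out to a pool of producers and collect results keyed by run."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = list(pool.map(reconstruct, run_ids))
    return {r["run"]: r["status"] for r in results}
```

    The real service would additionally track producer health and retry failed jobs, which is what makes the monitoring GUI mentioned in the abstract necessary.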

  15. Ceremony 25th birthday Cern

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2006-05-08

    Celebration of CERN's 25th anniversary (day by day), with speeches by L. Van Hove and J. B. Adams and musical interludes offered by Mme Mey and her colleagues (opening with the first movement of L. van Beethoven's Piano Quartet No. 3). The Directors-General present the commemorative gift to staff members with 25 years of service in the Organization. A token of appreciation is also given to the interpreter Mme Zwerner.

  16. Experience in running relational databases on clustered storage

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, Ruben; Potocky, Miroslav

    2015-12-01

    For the past eight years, the CERN IT Database group has based its backend storage on a NAS (Network-Attached Storage) architecture, providing database access via the NFS (Network File System) protocol. In the last two and a half years, our storage has evolved from a scale-up architecture to a scale-out one. This paper describes our setup and a set of functionalities providing key features to other services such as Database on Demand [1] or the CERN Oracle backup and recovery service. It also outlines a possible evolution path that storage for databases could follow.

  17. The ISOLDE LEGO® robot: building interest in frontier research

    NASA Astrophysics Data System (ADS)

    Elias Cocolios, Thomas; Lynch, Kara M.; Nichols, Emma

    2017-07-01

    An outreach programme centred around nuclear physics making use of a LEGO® Mindstorms® kit is presented. It consists of a presentation given by trained undergraduate students acting as science ambassadors, followed by a workshop where the target audience programs the LEGO® Mindstorms® robots to familiarise themselves with the concepts in an interactive and exciting way. This programme has been coupled with the CERN-ISOLDE 50th anniversary and the launch of the CERN-MEDICIS facility in Geneva, Switzerland. The modular aspect of the programme readily allows its application to other topics.

  18. Neutron-induced fission cross section measurement of 233U, 241Am and 243Am in the energy range 0.5 MeV ≤ En ≤ 20 MeV at n_TOF at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belloni, F.; Milazzo, P. M.; Calviani, M.

    2012-01-01

    Neutron-induced fission cross section measurements of 233U, 243Am and 241Am relative to 235U have been carried out at the neutron time-of-flight facility n_TOF at CERN. A fast ionization chamber was employed. All samples were located in the same detector; therefore the studied elements and the reference 235U target are subject to the same neutron beam.
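    Because all samples sit in the same beam, the neutron flux cancels in the ratio and the sample cross section follows directly from the reference one. A sketch of this standard ratio method (variable names are illustrative; detection efficiencies are included as optional correction factors):

```python
def sigma_from_ratio(counts_sample, counts_ref, atoms_sample, atoms_ref,
                     sigma_ref, eff_sample=1.0, eff_ref=1.0):
    """Ratio method for targets sharing one beam:

    sigma_s = sigma_ref * (C_s / C_r) * (N_r / N_s) * (eps_r / eps_s)

    where C are fission counts, N the numbers of target atoms,
    and eps the detection efficiencies. The flux term cancels.
    """
    return (sigma_ref * (counts_sample / counts_ref)
            * (atoms_ref / atoms_sample) * (eff_ref / eff_sample))
```

    For example, twice the counts on a sample with the same number of atoms as the reference implies twice the reference cross section at that energy.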

  19. The CERN-EU high-energy Reference Field (CERF) facility: applications and latest developments

    NASA Astrophysics Data System (ADS)

    Silari, Marco; Pozzi, Fabio

    2017-09-01

    The CERF facility at CERN provides an almost unique high-energy workplace reference radiation field for the calibration and testing of radiation protection instrumentation employed at high-energy accelerator facilities and for aircraft and space dosimetry. This paper describes the main features of the facility and supplies a non-exhaustive list of recent applications (as of 2005) for which CERF has been used. Upgrade work, started in 2015 to provide the scientific and industrial communities with a state-of-the-art reference facility, is also discussed.

  20. Windows Terminal Servers Orchestration

    NASA Astrophysics Data System (ADS)

    Bukowiec, Sebastian; Gaspar, Ricardo; Smith, Tim

    2017-10-01

    Windows Terminal Servers provide application gateways for various parts of the CERN accelerator complex, used by hundreds of CERN users every day. The combination of new tools such as Puppet, HAProxy and the Microsoft System Center suite enables the automation of provisioning workflows to provide a terminal server infrastructure that can scale up and down in an automated manner. The orchestration not only reduces the time and effort necessary to deploy new instances, but also facilitates operations such as patching, analysis and recreation of compromised nodes, as well as catering for workload peaks.
