Science.gov

Sample records for cms underground experimental

  1. Underground

    ERIC Educational Resources Information Center

    Vrchota, Janet

    1974-01-01

At a time when the future of New York's subway system looked bleak, new underground zoning legislation (the first ever) was enacted. The law requires buildings constructed near a subway station to provide transit easement space, allowing public access to the subway through the building property. (MA)

  2. EXPERIMENTAL STUDIES ON DIFFICULTY OF EVACUATION FROM UNDERGROUND SPACES UNDER INUNDATED SITUATIONS USING REAL SCALE MODELS

    NASA Astrophysics Data System (ADS)

    Baba, Yasuyuki; Ishigaki, Taisuke; Toda, Keiichi; Nakagawa, Hajime

Many urbanized cities in Japan are located in alluvial plains, and the vulnerability of urbanized areas to flood disaster is highlighted by floods caused by heavy rainfall or typhoons. Underground spaces located in urbanized areas are flood-prone, and the intrusion of flood water into underground space has inflicted severe damage on urban functions and infrastructure. Similarly, low-lying areas such as "bowl-shaped" depressions and underpasses beneath highway and railroad bridges are also prone to floods. Underpasses are common sites of accidents involving submerged vehicles, and severe damage, including loss of life, occasionally occurs under flooding conditions. To reduce the damage due to inundation of underground space, early evacuation is, needless to say, one of the most important countermeasures. This paper presents experimental results of evacuation tests from underground spaces under inundated conditions. The difficulty of evacuation from underground space has been investigated using real-scale models (door, staircase and vehicle), and the limit for safe evacuation is discussed. The results indicate that a water depth of 0.3 - 0.4 m would be critical for evacuation from underground space through staircases and doors, and that a depth of 0.7 - 0.8 m on the ground would likewise be critical for safe evacuation through the doors of a vehicle. These criteria may vary under different inundation conditions and are also influenced by individual variation, such as differences in physical strength. They should therefore be used with caution, although they provide a useful index of the limits of safe evacuation from underground space.
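The reported critical depths can be encoded as a simple screening check. This is an illustrative sketch, not part of the study: the function name and classification labels are assumptions, and only the two threshold bands are taken from the abstract above.

```python
# Illustrative sketch: encode the experimentally reported critical water
# depths as a screening check. Real criteria vary with inundation
# conditions and individual physical strength, as the paper cautions.

def evacuation_risk(depth_m: float, route: str) -> str:
    """Classify evacuation difficulty by inundation depth in metres.

    route: 'staircase_or_door' (depth inside the underground space) or
           'vehicle_door' (depth on the ground around a submerged vehicle).
    """
    limits = {
        "staircase_or_door": (0.3, 0.4),  # critical band for stairs/doors
        "vehicle_door": (0.7, 0.8),       # critical band for vehicle doors
    }
    low, high = limits[route]
    if depth_m < low:
        return "evacuation likely possible"
    if depth_m <= high:
        return "near the critical limit"
    return "evacuation likely impossible"
```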

  3. Top quark physics experimental results at the LHC: Cross section and mass measurements with the CMS experiment

    NASA Astrophysics Data System (ADS)

    Gallinaro, M.

    2016-07-01

The top quark, the heaviest known elementary particle, discovered at the Fermilab Tevatron almost exactly twenty years ago, has taken a central role in the study of fundamental interactions. Its large mass suggests that it may play a special role in Nature. With approximately 25 fb⁻¹ of data collected by the CMS experiment at the Large Hadron Collider in Run 1 (2010-2012), top quark physics is at a turning point from first studies to precision measurements with sensitivity to new physics processes. This report summarizes the latest experimental results on top quark production cross section and mass measurements.

  4. The spectrum of cosmic ray muons obtained with 100-ton scintillation detector underground and the analysis of recent experimental data

    NASA Technical Reports Server (NTRS)

    Khalchukov, F. F.; Korolkova, E. V.; Kudryavtsev, V. A.; Malgin, A. S.; Ryazhskaya, O. G.; Zatsepin, G. T.

    1985-01-01

    The vertical muon spectrum up to 15 TeV obtained with the underground installation is presented. Recent experimental data dealing with horizontal and vertical cosmic ray muon spectra are analyzed and discussed.

  5. Survey of existing underground openings for in-situ experimental facilities

    SciTech Connect

    Wollenberg, H.; Graf, A.; Strisower, B.; Korbin, G.

    1981-07-01

In an earlier project, a literature search identified 60 underground openings in crystalline rock capable of providing access for an in-situ experimental facility to develop geochemical and hydrological techniques for evaluating sites for radioactive waste isolation. As part of the current project, discussions with state geologists, owners, and operators narrowed the original group to 14. Three additional sites in volcanic rock and one site in granite were also identified. Site visits and application of technical criteria, including the geologic and hydrologic settings and depth, extent of the rock unit, and condition and accessibility of underground workings, determined four primary candidate sites: the Helms Pumped Storage Project in granodiorite of the Sierra Nevada, California; the Tungsten Queen Mine in Precambrian granodiorite of the North Carolina Piedmont; the Mount Hope Mine in Precambrian granite and gneiss of northern New Jersey; and the Minnamax Project in the Duluth gabbro complex of northern Minnesota.

  6. Particle Shape Effect on Macroscopic Behaviour of Underground Structures: Numerical and Experimental Study

    NASA Astrophysics Data System (ADS)

    Szarf, Krzysztof; Combe, Gael; Villard, Pascal

    2015-02-01

The mechanical performance of underground flexible structures such as buried pipes or culverts made of plastics depends not only on the properties of the structure, but also on the material surrounding it. Flexible drains can deflect by 30% with the joints staying tight, or even invert. Large deformations of the structure are difficult to model within the framework of the Finite Element Method, but straightforward with Discrete Element Methods. Moreover, the Discrete Element approach provides information about grain-grain and grain-structure interactions at the microscale. This paper presents numerical and experimental investigations of flexible buried pipe behaviour, with the focus placed on load transfer above the buried structure. The numerical model was able to reproduce the experimental results. Load repartition was observed, affected by a number of factors such as particle shape, pipe friction and pipe stiffness.

  7. Detectability of underground electrical cables junction with a ground penetrating radar: electromagnetic simulation and experimental measurements

    NASA Astrophysics Data System (ADS)

Liu, Xiang; Serhir, Mohammed; Kameni, Abelin; Lambert, Marc; Pichon, Lionel

    2016-04-01

For a company like Électricité de France (EDF), being able to accurately detect, using non-destructive methods, the position of the buried junction between two underground cables is a crucial issue. The junction is the linking part where most maintenance operations are carried out. The challenge of this work is to conduct a feasibility study to confirm or deny the relevance of Ground Penetrating Radar (GPR) for detecting these buried junctions in their actual environment against clutter. Indeed, the cables are buried in an inhomogeneous medium at a depth of around 80 cm. To this end, the study is first conducted in a numerical environment. We use the 3D simulation software CST MWS to model a GPR scenario. In this simulation, we place previously optimized bowtie antennas operating in the 0.5 GHz - 3 GHz frequency band in front of wet (dispersive) and dry soil in which the underground cable is placed at 80 cm depth. We collect the amplitude and phase of the reflected waves in order to detect the contrast caused by the variation of the cable's geometric dimensions [1] (the diameter of the cable is 48 mm and the diameter of the junction 74 mm). The use of an ultra-wideband antenna is necessary to reconcile resolution and penetration of electromagnetic waves in the medium to be characterized. We focus on the performance of the GPR method according to the characteristics of the surrounding medium in which the electric cables are buried and the polarization of the Tx and Rx antennas. Experimental measurements collected at the EDF site will be presented. The measured data are processed using a clutter reduction method based on digital filtering [2]. We aim to show that, using the developed bowtie antennas, the GPR technique is well adapted for cable junction localization even in a cluttered environment. References: [1] D. J. Daniels, "Surface-Penetrating Radar", London, IEE 1996. [2] Potin, D.; Duflos, E.; Vanheeghe, P., "Landmines Ground-Penetrating Radar Signal Enhancement by Digital
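The abstract applies a digital-filtering clutter reduction [2] whose exact form is not given here. As a hedged illustration of the general idea, the sketch below uses mean-trace subtraction, a common baseline for GPR clutter removal (not necessarily the authors' filter): the average A-scan across a B-scan approximates the stationary background reflection, and subtracting it leaves the target response.

```python
import numpy as np

# Hedged sketch of GPR clutter reduction by mean-trace subtraction.
# The roughly constant background (ground reflection, antenna coupling)
# appears in every trace, so subtracting the average trace suppresses it
# while localized target echoes (e.g. a cable junction) survive.

def remove_clutter(bscan: np.ndarray) -> np.ndarray:
    """bscan: 2-D array of shape (n_traces, n_samples).
    Returns the B-scan with the mean trace subtracted from each trace."""
    mean_trace = bscan.mean(axis=0, keepdims=True)
    return bscan - mean_trace
```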

  8. Higgs Results from CMS

    NASA Astrophysics Data System (ADS)

    Bornheim, Adolf

    2014-03-01

The Nobel Prize in Physics 2013 was awarded to François Englert and Peter W. Higgs for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles and plays a crucial role in our understanding of electroweak symmetry breaking. I will review the experimental results manifesting the discovery of the so-called Higgs boson from the perspective of the Compact Muon Solenoid (CMS) collaboration. The review is based on the final results from the proton-proton collision data at 7 TeV and 8 TeV centre-of-mass energy, collected in 2011 and 2012 in the initial run of the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN). Results on the properties of the new particle with a mass around 125 GeV, all in agreement with the expectations for the Standard Model (SM) Higgs boson, are highlighted. The latest results on the couplings between the Higgs and fermionic fields, in particular the final results of searches for a Higgs boson decaying into a b-quark or a tau-lepton pair, are presented. Non-SM Higgs searches are briefly summarized. Future perspectives for Higgs physics with CMS at the LHC for the next data-taking period starting in 2015 and beyond are discussed.

  9. High energy cosmic ray physics with underground muons in MACRO. I. Analysis methods and experimental results

    SciTech Connect

    Bellotti, R.; Cafagna, F.; Calicchio, M.; Castellano, M.; De Cataldo, G.; De Marzo, C.; Erriquez, O.; Favuzzi, C.; Fusco, P.; Giglietto, N.; Guarnaccia, P.; Mazziotta, M.N.; Montaruli, T.; Raino, A.; Spinelli, P.; Cecchini, S.; Dekhissi, H.; Fantini, R.; Giacomelli, G.; Mandrioli, G.; Margiotta-Neri, A.; Patrizii, L.; Popa, V.; Serra-Lugaresi, P.; Spurio, M.; Togo, V.; Hong, J.T.; Kearns, E.; Okada, C.; Orth, C.; Stone, J.L.; Sulak, L.R.; Barish, B.C.; Goretti, M.; Katsavounidis, E.; Kyriazopoulou, S.; Michael, D.G.; Nolty, R.; Peck, C.W.; Scholberg, K.; Walter, C.W.; Lane, C.; Steinberg, R.; Battistoni, G.; Bilokon, H.; Bloise, C.; Carboni, M.; Chiarella, V.; Forti, C.; Iarocci, E.; Marini, A.; Patera, V.; Ronga, F.; Satta, L.; Sciubba, A.; Spinetti, M.; Valente, V.; Antolini, R.; Bosio, T.; Di Credico, A.; Grillo, A.; Gustavino, C.; Mikheyev, S.; Parlati, S.; Reynoldson, J.; Scapparone, E.; Bower, C.; Habig, A.; Hawthorne, A.; Heinz, R.; Miller, L.; Mufson, S.; Musser, J.; De Mitri, I.; Monacelli, P.; Bernardini, P.; Mancarella, G.; Martello, D.; Palamara, O.; Petrera, S.; Pistilli, P.; Ricciardi, M.; Surdo, A.; Baker, R.; and others

    1997-08-01

In this paper, the first of a two-part work, we present the reconstruction and measurement of muon events detected underground by the MACRO experiment at Gran Sasso (E_μ ≥ 1.3 TeV in the atmosphere). The main aim of this work is to discuss the muon multiplicity distribution as measured in the detector. The data sample analyzed consists of 4.4×10^6 muon events, of which approximately 263,000 are multiple muons, corresponding to a total live time of 5850 h. In this sample, the observed multiplicities extend above N_μ = 35, with intermuon separations up to 50 m and beyond. Additional complementary measurements, such as the inclusive muon flux, the angular distribution, and the muon separation distribution (decoherence), are also included. The physical interpretation of the results presented here is reported in the following companion paper. © 1997 The American Physical Society

  10. An experimental scale-model study of seismic response of an underground opening in jointed rock mass

    SciTech Connect

    Kana, D.D.; Fox, D.J.; Hsiung, S.; Chowdhury, A.H.

    1997-02-01

    This report describes an experimental investigation conducted by the Center for Nuclear Waste Regulatory Analyses (CNWRA) to (i) obtain a better understanding of the seismic response of an underground opening in a highly-fractured and jointed rock mass and (ii) generate a data set that can be used to evaluate the capabilities (analytical methods) to calculate such response. This report describes the design and implementation of simulated seismic experiments and results for a 1/15 scale model of a jointed rock mass with a circular tunnel in the middle. The discussion on the design of the scale model includes a description of the associated similitude theory, physical design rationale, model material development, preliminary analytical evaluation, instrumentation design and calibration, and model assembly and pretest procedures. The thrust of this discussion is intended to provide the information necessary to understand the experimental setup and to provide the background necessary to understand the experimental results. The discussion on the experimental procedures and results includes the seismic input test procedures, test runs, and measured excitation and response time histories. The closure of the tunnel due to various levels of seismic activity is presented. A threshold level of seismic input amplitude was required before significant rock mass motion occurred. The experiment, though designed as a two-dimensional representation of a rock mass, behaved in a somewhat three-dimensional manner, which will have an effect on subsequent analytical model comparison.

  11. Possibility of experimental validation of criticality safety methodology in support of underground fuel storage efforts

    SciTech Connect

    Nikolaev, M.N.; Briggs, J.B.

    1997-10-01

Critical systems which might be formed in geologic repositories as a result of long-term degradation of the storage media, leaching of plutonium from the storage media, and the redistribution of low concentrations of plutonium into underground sand layers or lenses can be characterized by positive reactivity feedback. Formation of such systems cannot be excluded when considering the burial of highly enriched uranium or plutonium-contaminated wastes or spent nuclear fuels. Although the probability of formation of a critical system under such conditions is very low, the reliable prediction of neutron multiplication properties is of great interest from a criticality safety viewpoint. At the present time, all estimations of criticality are based only on evaluated neutron data, because critical experiments are not available for large systems containing small quantities of plutonium distributed throughout a typically encountered matrix material such as silicon dioxide. The possibility of providing such an experiment using the large Russian critical assemblies BFS-1 or BFS-2 is considered. It is shown that critical systems containing small amounts of hydrogenous material (polyethylene) with positive reactivity feedback can be modeled in the BFS facility.

  12. Numerical and experimental study of strata behavior and land subsidence in an underground coal gasification project

    NASA Astrophysics Data System (ADS)

    Sirdesai, N. N.; Singh, R.; Singh, T. N.; Ranjith, P. G.

    2015-11-01

Underground Coal Gasification (UCG), with enhanced knowledge of hydrogeological, geomechanical and environmental aspects, can be an alternative technique to exploit existing unmineable coal reserves. During the gasification process, petro-physical and geomechanical properties undergo drastic changes due to heating to elevated temperatures. These changes, caused by the thermal anisotropy of various minerals, generate thermal stresses, thereby developing new fracture patterns. These fractures cause the overlying rock strata to cave and fill the gasification chamber, thereby causing subsidence. The degree of subsidence and the changes in fluid transport and geomechanical properties of the rock strata, in and around the subsidence zone, can affect groundwater flow. This study aims to predict the thermo-geomechanical response of the strata during UCG. Petro-physical and geomechanical properties are incorporated into the numerical modelling software COMSOL Multiphysics, and an analytical strength model is developed to validate and further study the mechanical response and heat conduction of the host rock around the gasification chamber. Once these problems are investigated and solved, the enhanced efficiency and economic exploitation of the gasification process would help meet the country's energy demand.

  13. An Experimental and Theoretical Study of Fracture Patterns and Particle Motion Generated by Underground Explosions

    NASA Astrophysics Data System (ADS)

    Mihaly, J. M.; Rosakis, A.; Sammis, C. G.; Bhat, H.

    2013-12-01

    Fracture patterns and local particle velocities produced by point explosions in very brittle 'candy glass' plates are compared to those numerically predicted using a dynamic micro-mechanical damage mechanics model, developed by Bhat, Rosakis and Sammis, J. Appl. Mech., 2012. Empirically measured material properties for candy glass facilitate direct comparison between the numerical simulation and experimental results. The evolution of fracture damage produced in experiments is observed using high-speed digital photography, which also images resultant wave fronts (for both P and S). Local particle velocities are also recorded at up to three points using laser vibrometers. Numerical results for the spatial extent of circumferential and radial cracking, in addition to the growth-rate of individual radial cracks, are representative of experimental observations. Wave reflections from the plate edges are observed in both experiment and numerical simulation to affect the expansion of radial cracks. Numerically predicted wave-forms and arrivals compare well with experimental results observed at select points.

  14. The CMS Reconstruction Software

    NASA Astrophysics Data System (ADS)

    Lange, David J.; CMS Collaboration

    2011-12-01

    We report on the status and plans for the event reconstruction software of the CMS experiment. The CMS reconstruction algorithms are the basis for a wide range of data analysis approaches currently under study by the CMS collaboration using the first high-energy run of the LHC. These algorithms have been primarily developed and validated using simulated data samples, and are now being commissioned with LHC proton-proton collision data samples. The CMS reconstruction is now operated routinely on all events triggered by the CMS detector, both in a close to real-time prompt reconstruction processing and in frequent passes over the full recorded CMS data set. We discuss the overall software design, development cycle, computational requirements and performance, recent operational performance, and planned improvements of the CMS reconstruction software.

  15. An Experimental and Theoretical Study of Fracture Patterns Generated by Underground Explosions

    NASA Astrophysics Data System (ADS)

    Bhat, H.; Mihaly, J. M.; Rosakis, A.; Sammis, C. G.

    2012-12-01

A dynamic micro-mechanical damage mechanics model, developed by Bhat, Rosakis and Sammis, J. Appl. Mech., 2012, is used to simulate two-dimensional explosions in a brittle material. The theoretical patterns of circumferential and radial fractures are quantitatively compared with those produced by point explosions in very brittle "candy glass" plates. In these experiments the evolution of the fracture pattern is monitored using high-speed digital photography, which also images the resultant elastic waves (P and S). Theoretical estimates of the spatial extent of circumferential and radial cracking, as well as the propagation speed of the comminution front and the growth rate of individual radial cracks, all compare well with the experimental observations. The wave-forms of the P and S waves, specifically the local particle velocities, are also recorded at selected points using laser vibrometers. Asymmetric fracture patterns caused by an anisotropic pre-stress, the preferred orientation of initial flaws (a rift plane), or a lithostatic gradient lead to the generation of strong S-waves from the otherwise spherically symmetric point source.

  16. Measurement of the charge ratio of atmospheric muons with the CMS detector

    SciTech Connect

    Khachatryan, Vardan; et al.

    2010-08-01

We present a measurement of the ratio of positive to negative muon fluxes from cosmic ray interactions in the atmosphere, using data collected by the CMS detector both at ground level and in the underground experimental cavern at the CERN LHC. Muons were detected in the momentum range from 5 GeV/c to 1 TeV/c. The surface flux ratio is measured to be 1.2766 ± 0.0032 (stat.) ± 0.0032 (syst.), independent of the muon momentum, below 100 GeV/c. This is the most precise measurement to date. At higher momenta the data are consistent with an increase of the charge ratio, in agreement with cosmic ray shower models and compatible with previous measurements by deep-underground experiments.
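As a small worked check of the quoted result (not taken from the paper's text): since the statistical and systematic uncertainties are equal, the standard quadrature combination gives a total uncertainty of about 0.0045.

```python
import math

# Combine the quoted statistical and systematic uncertainties of the
# surface muon charge ratio in quadrature (standard practice for
# independent error sources).
R = 1.2766
stat = 0.0032
syst = 0.0032
total = math.sqrt(stat**2 + syst**2)
print(f"R = {R} +/- {total:.4f} (total)")
```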

  17. CMS Analysis School Model

    NASA Astrophysics Data System (ADS)

    Malik, S.; Shipsey, I.; Cavanaugh, R.; Bloom, K.; Chan, Kai-Feng; D'Hondt, J.; Klima, B.; Narain, M.; Palla, F.; Rolandi, G.; Schörner-Sadenius, T.

    2014-06-01

To impart hands-on training in physics analysis, the CMS experiment initiated the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data taking, the nature of the training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start their contributions to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS schools around the globe, CMS is engaging the collaboration in its discovery potential and maximizing physics output. As a larger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service work, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  19. CMS tracker visualization tools

    NASA Astrophysics Data System (ADS)

    Mennea, M. S.; Osborne, I.; Regano, A.; Zito, G.

    2005-08-01

This document reviews the design considerations, implementations and performance of the CMS Tracker Visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  20. CMS Space Monitoring

    SciTech Connect

    Ratnikova, N.; Huang, C.-H.; Sanchez-Hernandez, A.; Wildish, T.; Zhang, X.

    2014-01-01

During the first LHC run, CMS stored about one hundred petabytes of data. Storage accounting and monitoring help to meet the challenges of storage management, such as efficient space utilization, fair sharing between users and groups, and resource planning. We present a newly developed CMS space monitoring system based on the storage metadata dumps produced at the sites. The information extracted from the storage dumps is aggregated and uploaded to a central database. A web-based data service is provided to retrieve the information for a given time interval and a range of sites, so it can be further aggregated and presented in the desired format. The system has been designed based on an analysis of CMS monitoring requirements and the experience of the other LHC experiments. In this paper, we demonstrate how existing software components of the CMS data placement system, PhEDEx, have been re-used, dramatically reducing the development effort.
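The aggregation step described above can be sketched in a few lines. This is a hypothetical illustration only: the record layout (site, timestamp, bytes) and function name are assumptions for clarity, not PhEDEx's actual schema or API.

```python
from collections import defaultdict

# Hypothetical sketch of the aggregation step: storage-dump records are
# summed per site over a requested time window, as a central service might
# do before presenting totals for a range of sites.

def aggregate_usage(records, t_start, t_end):
    """records: iterable of (site, timestamp, bytes_used) tuples.
    Returns {site: total_bytes} for dumps with t_start <= timestamp <= t_end."""
    totals = defaultdict(int)
    for site, ts, nbytes in records:
        if t_start <= ts <= t_end:
            totals[site] += nbytes
    return dict(totals)
```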

  1. CMS investigates outlier payments.

    PubMed

    Brock, Thomas H

    2003-02-01

CMS is increasing its scrutiny of hospital billing practices in the wake of excessive claims for outlier payments by some healthcare organizations. Hospitals should review their billing practices to ensure that they are using a charge schedule that complies with Medicare regulations. Hospitals also should conduct ongoing reviews of their outlier cases to ensure that their charge structures are appropriate and their outlier services are medically necessary. Hospitals can expect CMS to implement changes to the outlier regulations. PMID:12602315

  2. CMS Geometry Through 2020

    NASA Astrophysics Data System (ADS)

    Osborne, I.; Brownson, E.; Eulisse, G.; Jones, C. D.; Lange, D. J.; Sexton-Kennedy, E.

    2014-06-01

CMS faces real challenges with the upgrade of the CMS detector through 2020 and beyond. One of the challenges, from the software point of view, is managing upgrade simulations within the same software release as the 2013 scenario. We present the CMS geometry description software model and its integration with the CMS event setup and core software. The CMS geometry configuration and selection are implemented in Python. The tools collect the Python configuration fragments into a script used in the CMS workflow. This flexible and automated geometry configuration allows choosing either a transient or a persistent version of a scenario, as well as a specific version of a given scenario. We describe how the geometries are integrated and validated, and how we define and handle different geometry scenarios in simulation and reconstruction. We discuss how to transparently manage multiple incompatible geometries in the same software release. Several examples are shown based on the current implementation, ensuring a consistent choice of scenario conditions. The consequences and implications for multiple/different code algorithms are discussed.

  3. Background Underground at WIPP

    NASA Astrophysics Data System (ADS)

    Esch, Ernst-Ingo; Hime, A.; Bowles, T. J.

    2001-04-01

Recent interest in establishing a dedicated underground laboratory in the United States prompted an experimental program to quantify the environmental backgrounds underground at the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. An outline of this program is provided along with recent experimental data on the cosmic ray muon flux at the 650 meter level of WIPP. The implications of the cosmic ray muon and fast neutron backgrounds at WIPP are discussed in the context of the new generation of low-background experiments envisioned for the future.

  4. Experimental Plan of the 25Mg(p, γ)26Al Resonance Capture Reaction at Jinping Underground Laboratory

    NASA Astrophysics Data System (ADS)

    Li, Z. H.; Su, J.; Li, Y. J.; Guo, B.; Yan, S. Q.; Wang, Y. B.; Lian, G.; Zeng, S.; Zhang, Q. W.; He, G. Z.; Gan, L.; Zhou, C.; Liu, W. P.; Li, K. A.; Yu, X. Q.; Tang, X. D.; He, J. J.; Qian, Y. Z.

The observation of 26Al is a useful tool for γ-ray astronomy and for studies of galactic chemical evolution. The most likely mechanism for 26Al nucleosynthesis is the hydrogen-burning MgAl cycle, and the 26Al production from the 25Mg(p, γ)26Al reaction in the important temperature range below T = 0.2 is still not well known. We present a proposal to measure the strength of the 58 keV resonance of the 25Mg(p, γ)26Al reaction, and the effective counting rate is estimated for a direct measurement at the Jinping underground laboratory.

  5. CMS analysis operations

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Calloni, M.; Colling, D.; Fanzago, F.; D'Hondt, J.; Klem, J.; Maier, G.; Letts, J.; Maes, J.; Padhi, S.; Sarkar, S.; Spiga, D.; Van Mulders, P.; Villella, I.

    2010-04-01

    During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.

  6. The CMS pixel system

    NASA Astrophysics Data System (ADS)

    Bortoletto, Daniela; CMS Collaboration

    2007-09-01

    The CMS hybrid pixel detector is located at the core of the CMS tracker and will contribute significantly to track and vertex reconstruction. The detector is subdivided into a three-layer barrel, and two end-cap disks on either side of the interaction region. The system operating in the 25-ns beam crossing time of the LHC must be radiation hard, low mass, and robust. The construction of the barrel modules and the forward disks has started after extensive R&D. The status of the project is reported.

  7. Automating the CMS DAQ

    SciTech Connect

    Bauer, G.; et al.

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  8. The CMS Electromagnetic Calorimeter

    SciTech Connect

    Paramatti, Riccardo

    2005-10-12

The electromagnetic calorimeter of the CMS experiment at the LHC will consist of about 76000 Lead Tungstate crystals. Its main purpose is the very precise energy measurement of electrons and photons produced at 14 TeV centre-of-mass energy. A review of its performance and construction status is given, and the calibration strategy is then described in detail.

  9. Underground Mathematics

    ERIC Educational Resources Information Center

    Hadlock, Charles R

    2013-01-01

    The movement of groundwater in underground aquifers is an ideal physical example of many important themes in mathematical modeling, ranging from general principles (like Occam's Razor) to specific techniques (such as geometry, linear equations, and the calculus). This article gives a self-contained introduction to groundwater modeling with…

  10. The CMS muon detector

    NASA Astrophysics Data System (ADS)

    Giacomelli, P.

    2002-02-01

The muon detection system of the Compact Muon Solenoid experiment is described. It consists of three different detector technologies: drift tubes in the barrel region, cathode strip chambers in the endcap region and resistive plate chambers in both barrel and endcap regions. The CMS muon detection system ensures excellent muon detection and efficient triggering in the pseudorapidity range |η| < 2.4. The most recent developments and some results from the R&D program are also discussed.

  11. CMS electromagnetic calorimeter readout

    SciTech Connect

    Denes, P.; Wixted, R.

    1997-12-31

The CMS Electromagnetic Calorimeter will consist of 109,008 crystals of Lead Tungstate (PbWO{sub 4}) arranged in a barrel (92880 crystals) and 2 endcaps (8064 crystals each). The crystals will be 25 radiation lengths long and cut in tapered shapes to make a hermetic calorimeter. The scintillation light from the crystals is captured by a photodetector, amplified and digitized. The properties of PbWO{sub 4}, which is a new crystal still very much under development, are also reviewed.

  12. The Muon Detector of Cms

    NASA Astrophysics Data System (ADS)

    Jiang, Chunhua

    2005-04-01

Muons are an unmistakable signature of much of the physics the LHC is designed to explore. The ability to trigger on and reconstruct muons at the highest luminosities is central to the concept of CMS. CMS is characterized by simplicity of design, with one magnet whose solenoidal field facilitates precision tracking in the central barrel region and triggering on muons through their bending in the transverse and side views. The CMS muon system has three purposes: muon identification, muon triggering and muon momentum measurement.

  13. Using fullscreen CMS at CERN

    SciTech Connect

    White, B.

    1991-05-01

    Fullscreen CMS is an optional console environment introduced in Release 5 of CMS which maintains the context of a VM session across invocations of full screen commands like XEDIT, FILELIST or MAIL. In addition it allows limited scrolling and windowing capabilities. This write-up provides CERNVM users who are interested in Fullscreen CMS with an overview of the concepts and operations which are involved. In that it is an optional environment, this write-up does not constitute an endorsement of Fullscreen CMS.

  14. CMS Frailty Adjustment Model

    PubMed Central

    Kautter, John; Pope, Gregory C.

    2004-01-01

    The authors document the development of the CMS frailty adjustment model, a Medicare payment approach that adjusts payments to a Medicare managed care organization (MCO) according to the functional impairment of its community-residing enrollees. Beginning in 2004, this approach is being applied to certain organizations, such as Program of All-Inclusive Care for the Elderly (PACE), that specialize in providing care to the community-residing frail elderly. In the future, frailty adjustment could be extended to more Medicare managed care organizations. PMID:25372243

  15. Opportunistic Resource Usage in CMS

    SciTech Connect

    Kreuzer, Peter; Hufnagel, Dirk; Dykstra, D.; Gutsche, O.; Tadel, M.; Sfiligoi, I.; Letts, J.; Wuerthwein, F.; McCrea, A.; Bockelman, B.; Fajardo, E.; Linares, L.; Wagner, R.; Konstantinov, P.; Blumenfeld, B.; Bradley, D.

    2014-01-01

    CMS is using a tiered setup of dedicated computing resources provided by sites distributed over the world and organized in WLCG. These sites pledge resources to CMS and are preparing them especially for CMS to run the experiment's applications. But there are more resources available opportunistically both on the GRID and in local university and research clusters which can be used for CMS applications. We will present CMS' strategy to use opportunistic resources and prepare them dynamically to run CMS applications. CMS is able to run its applications on resources that can be reached through the GRID, through EC2 compliant cloud interfaces. Even resources that can be used through ssh login nodes can be harnessed. All of these usage modes are integrated transparently into the GlideIn WMS submission infrastructure, which is the basis of CMS' opportunistic resource usage strategy. Technologies like Parrot to mount the software distribution via CVMFS and xrootd for access to data and simulation samples via the WAN are used and will be described. We will summarize the experience with opportunistic resource usage and give an outlook for the restart of LHC data taking in 2015.

  16. Opportunistic Resource Usage in CMS

    NASA Astrophysics Data System (ADS)

    Kreuzer, Peter; Hufnagel, Dirk; Dykstra, D.; Gutsche, O.; Tadel, M.; Sfiligoi, I.; Letts, J.; Wuerthwein, F.; McCrea, A.; Bockelman, B.; Fajardo, E.; Linares, L.; Wagner, R.; Konstantinov, P.; Blumenfeld, B.; Bradley, D.; Cms Collaboration

    2014-06-01

    CMS is using a tiered setup of dedicated computing resources provided by sites distributed over the world and organized in WLCG. These sites pledge resources to CMS and are preparing them especially for CMS to run the experiment's applications. But there are more resources available opportunistically both on the GRID and in local university and research clusters which can be used for CMS applications. We will present CMS' strategy to use opportunistic resources and prepare them dynamically to run CMS applications. CMS is able to run its applications on resources that can be reached through the GRID, through EC2 compliant cloud interfaces. Even resources that can be used through ssh login nodes can be harnessed. All of these usage modes are integrated transparently into the GlideIn WMS submission infrastructure, which is the basis of CMS' opportunistic resource usage strategy. Technologies like Parrot to mount the software distribution via CVMFS and xrootd for access to data and simulation samples via the WAN are used and will be described. We will summarize the experience with opportunistic resource usage and give an outlook for the restart of LHC data taking in 2015.

  17. CMS computing model evolution

    NASA Astrophysics Data System (ADS)

    Grandi, C.; Bonacorsi, D.; Colling, D.; Fisk, I.; Girone, M.

    2014-06-01

The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015, including changes in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and making more intelligent use of the networking.

  18. Hazardous gases and oxygen depletion in a wet paddy pile: an experimental study in a simulating underground rice mill pit, Thailand.

    PubMed

    Yenjai, Pornthip; Chaiear, Naesinee; Charerntanyarak, Lertchai; Boonmee, Mallika

    2012-01-01

    During the rice harvesting season in Thailand, large amounts of fresh paddy are sent to rice mills immediately after harvesting due to a lack of proper farm storage space. At certain levels of moisture content, rice grains may generate hazardous gases, which can replace oxygen (O(2)) in the confined spaces of underground rice mill pits. This phenomenon has been observed in a fatal accident in Thailand. Our study aimed to investigate the type of gases and their air concentrations emitted from the paddy piles at different levels of moisture content and duration of piling time. Four levels of moisture content in the paddy piles were investigated, including dry paddy group (< 14% wet basis (wb)), wet paddy groups (22-24, 25-27 and 28-30%wb). Our measurements were conducted in 16 experimental concrete pits 80 × 80 cm wide by 60 cm high. Gases emitted were measured with an infrared spectrophotometer and a multi-gas detector every 12 h for 5 days throughout the experiment. The results revealed high levels of carbon dioxide (CO(2)) (range 5,864-8,419 ppm) in all wet paddy groups, which gradually increased over time. The concentration of carbon monoxide (CO), methane (CH(4)), nitromethane (CH(3)NO(2)) and nitrous oxide (N(2)O) in all wet paddy groups increased with piling time and with moisture content, with ranges of 11-289; 2-8; 36-374; and 4-26 ppm, respectively. The highest levels of moisture content in the paddy piles were in the range 28-30%wb. Nitrogen dioxide (NO(2)) concentrations were low in all paddy groups. The percentage of O(2) in the wet paddy groups decreased with piling time and moisture content (from 18.7% to 4.1%). This study suggested that hazardous gases could be emitted in moist paddy piles, and their concentrations could increase with increasing moisture content and piling time period. PMID:23047081

  19. Recent CMS results on diffraction

    NASA Astrophysics Data System (ADS)

    Benoît, Roland

    2015-03-01

    Recent CMS results on diffraction are presented. These include the measurements of the soft diffractive cross sections, of the forward rapidity gap cross section, of the diffractive dijet cross section, the measurement of a large rapidity gap in W and Z boson events and the measurement of the pseudorapidity distribution of charged particles in a single diffractive enhanced sample. This last measurement is the first common result of the CMS and TOTEM collaborations. Some prospects of common CMS-TOTEM data taking are also discussed.

  20. Water underground

    NASA Astrophysics Data System (ADS)

    de Graaf, Inge

    2015-04-01

The world's largest accessible source of freshwater is hidden underground, but we do not yet know what is happening to it. In many places of the world groundwater is abstracted at unsustainable rates: more water is used than is recharged, leading to decreasing river discharges and declining groundwater levels. It is predicted that for many regions of the world unsustainable water use will increase, due to increasing human water use under a changing climate. It would not be long before shortage causes widespread droughts and the first water war begins. Improving our knowledge about our hidden water is the first step to stop this. The world's largest aquifers are mapped, but these maps do not mention how much water they contain or how fast water levels decline. If we can add a third dimension to the aquifer maps, i.e. a thickness, and add geohydrological information, we can estimate how much water is stored. Data on groundwater age and how fast it is refilled are also needed to predict the impact of human water use and climate change on the groundwater resource.

  1. Water Underground

    NASA Astrophysics Data System (ADS)

    de Graaf, I. E. M.

    2014-12-01

The world's largest accessible source of freshwater is hidden underground. However it remains difficult to estimate its volume, and we still cannot answer the question: will there be enough for everybody? In many places of the world groundwater abstraction is unsustainable: more water is used than refilled, leading to decreasing river discharges and declining groundwater levels. It is predicted that for many regions in the world unsustainable water use will increase in the coming decades, due to rising human water use under a changing climate. It would not take long before water shortage causes widespread droughts and the first water war begins. Improving our knowledge about our hidden water is the first step to prevent such large water conflicts. The world's largest aquifers are mapped, but these maps do not mention how much water these aquifers contain or how fast water levels decline. If we can add thickness and geohydrological information to these aquifer maps, we can estimate how much water is stored and its flow direction. Also, data on groundwater age and how fast the aquifer is refilled is needed to predict the impact of human water use and climate change on the groundwater resource. Ultimately, if we can provide this knowledge water conflicts will focus more on a fair distribution instead of absolute amounts of water.
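The storage estimate sketched in these two records reduces to simple arithmetic once a thickness and an effective porosity are attached to an aquifer map. A minimal illustration, with invented numbers that describe no real aquifer:

```python
def stored_volume_km3(area_km2, thickness_m, porosity):
    """Groundwater stored in an aquifer: area x saturated thickness x porosity."""
    return area_km2 * 1e6 * thickness_m * porosity / 1e9  # m^3 -> km^3

def years_to_depletion(volume_km3, abstraction_km3_yr, recharge_km3_yr):
    """Crude depletion horizon under a constant net overdraft."""
    net = abstraction_km3_yr - recharge_km3_yr
    if net <= 0:
        return float("inf")  # recharge keeps up: use is sustainable
    return volume_km3 / net

# Invented example: a 100,000 km^2 aquifer, 200 m thick, 10% porosity,
# overdrawn by a net 20 km^3 per year.
v = stored_volume_km3(100_000, 200, 0.1)
print(v)                             # 2000.0 km^3 stored
print(years_to_depletion(v, 25, 5))  # ~100 years to depletion
```

The point of the toy is the one the abstracts make: without the thickness and porosity inputs, neither the stored volume nor the depletion horizon can be computed at all.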

  2. FIRE_AX_CMS_SOLAR_WK

    Atmospheric Science Data Center

    2015-11-24

FIRE_AX_CMS_SOLAR_WK. Project Title: FIRE II ASTEX. Parameters: Solar Irradiance. Data and readme files (CMS_SOLAR_WK) available via Earthdata Search.

  3. The CMS central hadron calorimeter

    SciTech Connect

    Freeman, J.

    1998-11-01

    The CMS central hadron calorimeter is a brass absorber/scintillator sampling structure. We describe details of the mechanical and optical structure. We also discuss calibration techniques, and finally the anticipated construction schedule. {copyright} {ital 1998 American Institute of Physics.}

  4. Heavy quark physics in CMS

    NASA Astrophysics Data System (ADS)

    Fedi, G.; CMS Collaboration

    2016-07-01

The most recent results on heavy quark hadrons from the CMS experiment are reported. They span heavy quark spectroscopy, production cross sections, beauty meson decay properties, rare decays, and CP violation.

  5. CMS multicore scheduling strategy

    SciTech Connect

    Perez-Calero Yzquierdo, Antonio; Hernandez, Jose; Holzman, Burt; Majewski, Krista; McCrea, Alison

    2014-01-01

In the coming years, processor architectures based on much larger numbers of cores will most likely be the model to continue 'Moore's Law' style throughput gains. This not only results in many more jobs running the LHC Run 1 era monolithic applications in parallel, but the memory requirements of these processes also push the worker-node architectures to the limit. One solution is parallelizing the application itself, through forking and memory sharing or through threaded frameworks. CMS is following all of these approaches and has a comprehensive strategy to schedule multicore jobs on the GRID based on the glideinWMS submission infrastructure. The main component of the scheduling strategy, a pilot-based model with dynamic partitioning of resources that allows the transition to multicore or whole-node scheduling without disallowing the use of single-core jobs, is described. This contribution also presents the experience gained with the proposed multicore scheduling schema and gives an outlook of further developments working towards the restart of the LHC in 2015.

  6. CMS multicore scheduling strategy

    NASA Astrophysics Data System (ADS)

    Pérez-Calero Yzquierdo, Antonio; Hernández, Jose; Holzman, Burt; Majewski, Krista; McCrea, Alison; Cms Collaboration

    2014-06-01

In the coming years, processor architectures based on much larger numbers of cores will most likely be the model to continue "Moore's Law" style throughput gains. This not only results in many more jobs running the LHC Run 1 era monolithic applications in parallel, but the memory requirements of these processes also push the worker-node architectures to the limit. One solution is parallelizing the application itself, through forking and memory sharing or through threaded frameworks. CMS is following all of these approaches and has a comprehensive strategy to schedule multicore jobs on the GRID based on the glideinWMS submission infrastructure. The main component of the scheduling strategy, a pilot-based model with dynamic partitioning of resources that allows the transition to multicore or whole-node scheduling without disallowing the use of single-core jobs, is described. This contribution also presents the experience gained with the proposed multicore scheduling schema and gives an outlook of further developments working towards the restart of the LHC in 2015.
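The dynamic partitioning of pilot resources described in these records can be caricatured with a greedy allocator: fill whole multicore slots first, then hand leftover cores to single-core jobs so nothing idles. This is an illustrative sketch, not the actual glideinWMS algorithm, and the core counts are made up:

```python
def partition_pilot(total_cores, multicore_jobs, cores_per_mc=8):
    """Greedy split of one pilot's cores: as many whole multicore slots as
    demand and capacity allow, with the remaining cores given to
    single-core jobs."""
    mc_slots = min(multicore_jobs, total_cores // cores_per_mc)
    single_slots = total_cores - mc_slots * cores_per_mc
    return mc_slots, single_slots

print(partition_pilot(32, multicore_jobs=3))   # (3, 8): 3 x 8-core + 8 single
print(partition_pilot(32, multicore_jobs=10))  # (4, 0): capacity-limited
print(partition_pilot(7, multicore_jobs=1))    # (0, 7): too small for a slot
```

The key property the abstract highlights is visible in the last line: when no whole multicore slot fits, the cores are not wasted but fall back to single-core work.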

  7. The CMS dataset bookkeeping service

    SciTech Connect

    Afaq, Anzar,; Dolgert, Andrew; Guo, Yuyi; Jones, Chris; Kosyakov, Sergey; Kuznetsov, Valentin; Lueking, Lee; Riley, Dan; Sekhri, Vijay; /Fermilab

    2007-10-01

    The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and Detector sources. It provides the ability to identify MC or trigger source, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels and the DBS system provides support for localized processing or private analysis, as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via a Python API, Command Line, and a Discovery web page interfaces. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.

  8. Searching for extra-dimensions at CMS

    NASA Astrophysics Data System (ADS)

    Benucci, Leonardo

    2009-06-01

A possible solution to the hierarchy problem is the presence of extra space dimensions beyond the three known from our everyday experience. The phenomenological ADD model of large extra dimensions predicts an ETmiss + jet signature. Randall-Sundrum-type extra dimensions predict di-lepton and di-jet resonances. This contribution presents an overview of experimental issues and the discovery potential for these new particles at the LHC, focusing on perspectives with the CMS detector during early data taking.

  9. The cathode strip chamber data acquisition electronics for CMS

    NASA Astrophysics Data System (ADS)

    Bylsma, B. G.; Durkin, L. S.; Gilmore, J.; Gu, J.; Ling, T. Y.; Rush, C.

    2009-03-01

Data Acquisition (DAQ) electronics for Cathode Strip Chambers (CSC) [CMS Collaboration, The Muon Project Technical Design Report, CERN/LHCC 97-32, CMS TDR3, 1997] in the Compact Muon Solenoid (CMS) [CMS Collaboration, The Compact Muon Solenoid Technical Proposal, CERN/LHCC 94-38, 1994] experiment at the Large Hadron Collider (LHC) [The LHC study group, The Large Hadron Collider: Conceptual Design, CERN/AC 1995-05, 1995] is described. The CSC DAQ system [B. Bylsma, et al., in: Proceedings of the Topical Workshop on Electronics for Particle Physics, Prague, Czech Republic, CERN-2007-007, 2007, pp. 195-198] includes on-detector and off-detector electronics, encompassing five different types of custom circuit boards designed to handle the high event rate at the LHC. The on-detector electronics includes Cathode Front End Boards (CFEB) [R. Breedon, et al., Nucl. Instr. and Meth. A 471 (2001) 340], which amplify, shape, store, and digitize chamber cathode signals; Anode Front End Boards (AFEB) [T. Ferguson, et al., Nucl. Instr. and Meth. A 539 (2005) 386], which amplify, shape and discriminate chamber anode signals; and Data Acquisition Motherboards (DAQMB), which control the on-chamber electronics and the readout of the chamber. The off-detector electronics, located in the underground service cavern, includes Detector Dependent Unit (DDU) boards, which perform real time data error checking, electronics reset requests and data concentration; and Data Concentrator Card (DCC) boards, which further compact the data and send it to the CMS DAQ System [CMS Collaboration, The TriDAS Project Technical Design Report, Volume 2: Data Acquisition and High-level Trigger, CERN/LHCC 2002-26, 2002], and serve as an interface to the CMS Trigger Timing Control (TTC) system. Application Specific Integrated Circuits (ASIC) are utilized for analog signal processing on front end boards. Field Programmable Gate Arrays (FPGA) are utilized

  10. Underground Layout Configuration

    SciTech Connect

    A. Linden

    2003-09-25

    The purpose of this analysis was to develop an underground layout to support the license application (LA) design effort. In addition, the analysis will be used as the technical basis for the underground layout general arrangement drawings.

  11. Vitrified underground structures

    DOEpatents

    Murphy, Mark T.; Buelt, James L.; Stottlemyre, James A.; Tixier, Jr., John S.

    1992-01-01

    A method of making vitrified underground structures in which 1) the vitrification process is started underground, and 2) a thickness dimension is controlled to produce substantially planar vertical and horizontal vitrified underground structures. Structures may be placed around a contaminated waste site to isolate the site or may be used as aquifer dikes.

  12. A novel totivirus and piscine reovirus (PRV) in Atlantic salmon (Salmo salar) with cardiomyopathy syndrome (CMS)

    PubMed Central

    2010-01-01

Background Cardiomyopathy syndrome (CMS) is a severe disease affecting large farmed Atlantic salmon. Mortality often appears without prior clinical signs, typically shortly before slaughter. We recently reported the finding and the complete genomic sequence of a novel piscine reovirus (PRV), which is associated with another cardiac disease in Atlantic salmon: heart and skeletal muscle inflammation (HSMI). In the present work we have studied whether PRV or other infectious agents may be involved in the etiology of CMS. Results Using high throughput sequencing on heart samples from natural outbreaks of CMS and from fish experimentally challenged with material from fish diagnosed with CMS, a high number of sequence reads identical to the PRV genome were identified. In addition, a sequence contig from a novel totivirus could also be constructed. Using RT-qPCR, levels of PRV in tissue samples were quantified and the totivirus was detected in all samples tested from CMS fish but not in controls. In situ hybridization supported this pattern indicating a possible association between CMS and the novel piscine totivirus. Conclusions Although causality for CMS in Atlantic salmon could not be proven for either of the two viruses, our results are compatible with a hypothesis where, in the experimental challenge studied, PRV behaves as an opportunist whereas the totivirus might be more directly linked with the development of CMS. PMID:21067578

  13. Status of the CMS Detector

    NASA Astrophysics Data System (ADS)

    Focardi, Ettore

The Compact Muon Solenoid (CMS) detector is one of the two largest and most powerful particle physics detectors ever built. CMS is installed in P5 at CERN's Large Hadron Collider (LHC) and as of early 2011 has completed nearly a year of operation in which it recorded products of interactions produced in proton-proton collisions at a center-of-mass energy of 7 TeV. The 2010 proton-proton run lasted 7 months and was followed by Pb-Pb ion collisions in November. During the first few months of 2011 the LHC has delivered higher luminosity. The LHC machine is performing extremely well, allowing CMS to record enough data to perform a large number of studies of the Standard Model (SM) of particle physics in this new energy domain for the first time and to search for evidence of new physics in regions of phase space that have never before been entered. The CMS detector components, the operational experience and the performance with colliding beams will be described.

  14. The CMS DBS query language

    NASA Astrophysics Data System (ADS)

    Kuznetsov, Valentin; Riley, Daniel; Afaq, Anzar; Sekhri, Vijay; Guo, Yuyi; Lueking, Lee

    2010-04-01

    The CMS experiment has implemented a flexible and powerful system enabling users to find data within the CMS physics data catalog. The Dataset Bookkeeping Service (DBS) comprises a database and the services used to store and access metadata related to CMS physics data. To this, we have added a generalized query system in addition to the existing web and programmatic interfaces to the DBS. This query system is based on a query language that hides the complexity of the underlying database structure by discovering the join conditions between database tables. This provides a way of querying the system that is simple and straightforward for CMS data managers and physicists to use without requiring knowledge of the database tables or keys. The DBS Query Language uses the ANTLR tool to build the input query parser and tokenizer, followed by a query builder that uses a graph representation of the DBS schema to construct the SQL query sent to underlying database. We will describe the design of the query system, provide details of the language components and overview of how this component fits into the overall data discovery system architecture.
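The join-condition discovery that this record describes can be illustrated with a toy version: model the schema as a graph whose edges carry join conditions, search for a path between the tables a query touches, and emit the SQL joins along that path. The table and column names below are invented for illustration, not the actual DBS schema, and the ANTLR parsing stage is omitted:

```python
from collections import deque

# Toy schema graph: nodes are tables, edges carry a join condition.
# These names are invented; the real DBS schema is not reproduced here.
EDGES = {
    "dataset": [("block", "dataset.id = block.dataset_id")],
    "block":   [("dataset", "dataset.id = block.dataset_id"),
                ("files",   "block.id = files.block_id")],
    "files":   [("block",   "block.id = files.block_id")],
}

def join_path(start, goal):
    """Breadth-first search over the schema graph; returns the chain of
    (table, join_condition) pairs linking start to goal."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        table, path = queue.popleft()
        if table == goal:
            return path
        for nxt, cond in EDGES[table]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [(nxt, cond)]))
    raise ValueError(f"no join path from {start} to {goal}")

def build_sql(select_expr, start, goal):
    """Hide the joins from the user: derive them from the schema graph."""
    sql = f"SELECT {select_expr} FROM {start}"
    for table, cond in join_path(start, goal):
        sql += f" JOIN {table} ON {cond}"
    return sql

print(build_sql("files.name", "dataset", "files"))
```

The user names only the tables of interest; the join conditions, which is the part the abstract says physicists should not have to know, come entirely from the graph.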

  15. The CMS central hadron calorimeter

    SciTech Connect

    Freeman, J.; E892 Collaboration

    1996-12-31

    The CMS central hadron calorimeter is a copper absorber/ scintillator sampling structure. We describe design choices that led us to this concept, details of the mechanical and optical structure, and test beam results. We discuss calibration techniques, and finally the anticipated construction schedule.

  16. The CMS high level trigger

    NASA Astrophysics Data System (ADS)

    Gori, Valentina

    2014-05-01

    The CMS experiment has been designed with a 2-level trigger system: the Level 1 Trigger, implemented on custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a tradeoff between the complexity of the algorithms running on the available computing power, the sustainable output rate, and the selection efficiency. Here we will present the performance of the main triggers used during the 2012 data taking, ranging from simpler single-object selections to more complex algorithms combining different objects, and applying analysis-level reconstruction and selection. We will discuss the optimisation of the triggers and the specific techniques to cope with the increasing LHC pile-up, reducing its impact on the physics performance.

  17. The CMS High Level Trigger

    NASA Astrophysics Data System (ADS)

    Trocino, Daniele

    2014-06-01

    The CMS experiment has been designed with a two-level trigger system: the Level-1 Trigger, implemented in custom-designed electronics, and the High-Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a tradeoff between the complexity of the algorithms running with the available computing power, the sustainable output rate, and the selection efficiency. We present the performance of the main triggers used during the 2012 data taking, ranging from simple single-object selections to more complex algorithms combining different objects, and applying analysis-level reconstruction and selection. We discuss the optimisation of the trigger and the specific techniques to cope with the increasing LHC pile-up, reducing its impact on the physics performance.
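The tradeoff described in these two HLT records, between algorithm complexity, available computing power, and sustainable output rate, can be made concrete with back-of-envelope arithmetic. The farm size, input rate and accept fraction below are illustrative assumptions, not CMS figures:

```python
def hlt_budget(input_rate_hz, n_cores, accept_fraction):
    """Two first-order HLT constraints: the farm must keep up with the input
    rate (bounding the average CPU time per event), and the accepted rate
    must fit the downstream storage budget."""
    max_cpu_time_per_event_s = n_cores / input_rate_hz
    output_rate_hz = input_rate_hz * accept_fraction
    return max_cpu_time_per_event_s, output_rate_hz

# Invented numbers: 100 kHz Level-1 output into a 13,000-core farm,
# with an overall 0.5% accept fraction.
t_max, r_out = hlt_budget(100_000, 13_000, 0.005)
print(t_max, r_out)  # 0.13 s/event on average, 500 Hz to storage
```

Pile-up tightens exactly this budget: busier events take more CPU time per event while the left-hand side of the constraint stays fixed, which is why the abstracts emphasize optimisation against increasing LHC pile-up.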

  18. Virtual data in CMS production

    SciTech Connect

    Arbree, A. et al.

    2004-08-26

    Initial applications of the GriPhyN Chimera Virtual Data System have been performed within the context of CMS Production of Monte Carlo Simulated Data. The GriPhyN Chimera system consists of four primary components: (1) a Virtual Data Language, which is used to describe virtual data products, (2) a Virtual Data Catalog, which is used to store virtual data entries, (3) an Abstract Planner, which resolves all dependencies of a particular virtual data product and forms a location and existence independent plan, (4) a Concrete Planner, which maps an abstract, logical plan onto concrete, physical grid resources accounting for staging in/out files and publishing results to a replica location service. A CMS Workflow Planner, MCRunJob, is used to generate virtual data products using the Virtual Data Language. Subsequently, a prototype workflow manager, known as WorkRunner, is used to schedule the instantiation of virtual data products across a grid.
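The Abstract Planner's job of resolving all dependencies of a virtual data product is essentially a topological sort of a derivation graph. A minimal sketch, with invented product names rather than the real Virtual Data Language:

```python
from graphlib import TopologicalSorter

# Toy derivation graph for virtual data products: each product lists the
# products it is derived from. Product names are invented for illustration.
DERIVATIONS = {
    "ntuple":     {"reco"},
    "reco":       {"digis"},
    "digis":      {"gen_events"},
    "gen_events": set(),
}

def abstract_plan(target):
    """Resolve all dependencies of `target` and return a location- and
    existence-independent execution order, as an abstract planner would."""
    needed, stack = set(), [target]
    while stack:                      # collect the subgraph `target` needs
        node = stack.pop()
        if node not in needed:
            needed.add(node)
            stack.extend(DERIVATIONS[node])
    sub = {n: DERIVATIONS[n] & needed for n in needed}
    return list(TopologicalSorter(sub).static_order())

print(abstract_plan("reco"))  # ['gen_events', 'digis', 'reco']
```

A concrete planner would then map each step of this order onto physical grid resources, adding stage-in/stage-out of files, which is the part the toy deliberately leaves out.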

  19. The CMS pixel luminosity telescope

    NASA Astrophysics Data System (ADS)

    Kornmayer, A.

    2016-07-01

The Pixel Luminosity Telescope (PLT) is a new complement to the CMS detector for the LHC Run II data taking period. It consists of eight 3-layer telescopes based on silicon pixel detectors that are placed around the beam pipe on each end of CMS viewing the interaction point at small angle. A fast 3-fold coincidence of the pixel planes in each telescope will provide a bunch-by-bunch measurement of the luminosity. Particle tracking allows collision products to be distinguished from beam background, provides a self-alignment of the detectors, and a continuous in-time monitoring of the efficiency of each telescope plane. The PLT is an independent luminometer, essential to enhance the robustness of the measurement of the delivered luminosity and to reduce its systematic uncertainties. This will allow production cross-sections, and hence couplings, to be determined with high precision and more stringent limits to be set on new particle production.
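Counting-based luminometers like the PLT commonly infer the mean number of interactions per bunch crossing from the fraction of crossings with no observed coincidence ("zero counting"). The sketch below shows that generic Poisson relation, not the PLT's actual calibrated algorithm, and the counts are synthetic:

```python
import math

def mu_from_zero_counting(n_crossings, n_empty, efficiency=1.0):
    """Mean interactions per bunch crossing from the fraction of crossings
    with no observed coincidence, assuming Poisson statistics:
    P(0) = exp(-eff * mu)  =>  mu = -ln(n_empty / n_crossings) / eff."""
    return -math.log(n_empty / n_crossings) / efficiency

# Synthetic counts chosen so that the empty fraction is ~exp(-2):
mu = mu_from_zero_counting(n_crossings=100_000, n_empty=13_534)
print(round(mu, 3))  # 2.0
```

With a known visible cross-section from a calibration (e.g. a van der Meer scan), mu per bunch converts directly into an instantaneous luminosity per bunch, which is the bunch-by-bunch measurement the abstract refers to.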

  20. Underground laboratories in Asia

    SciTech Connect

    Lin, Shin Ted; Yue, Qian

    2015-08-17

Deep underground laboratories in Asia have been making huge progress recently because underground sites provide unique opportunities to explore rare-event phenomena for the study of dark matter searches, neutrino physics and nuclear astrophysics, as well as multi-disciplinary research based on low-radioactivity environments. The status and perspectives of the Kamioka underground observatories in Japan, the existing Y2L and the planned CUP in Korea, the India-based Neutrino Observatory (INO) in India and the China JinPing Underground Laboratory (CJPL) in China will be surveyed.

  1. Evidence from the Soudan 1 experiment for underground muons associated with Cygnus X-3

    NASA Technical Reports Server (NTRS)

    Ayres, D. S. E.

    1986-01-01

    The Soudan 1 experiment has yielded evidence for an average underground muon flux of approximately 7 × 10^-11 cm^-2 s^-1 which points back to the X-ray binary Cygnus X-3, and which exhibits the 4.8 h periodicity observed for other radiation from this source. Underground muon events which seem to be associated with Cygnus X-3 also show evidence for longer-term variability of the flux. Such underground muons cannot be explained by any conventional models of the propagation and interaction of cosmic rays.

  2. Upgrade of the CMS tracker

    NASA Astrophysics Data System (ADS)

    Tricomi, A.

    2014-03-01

    The LHC machine is planning an upgrade program which will smoothly bring the luminosity up to or above 5 × 10^34 cm^-2 s^-1 sometime after 2020, to possibly reach an integrated luminosity of 3000 fb^-1 by the end of that decade. The foreseen increases of both the instantaneous and the integrated luminosity delivered by the LHC during the next ten years will necessitate a stepwise upgrade of the CMS tracking detector. During the extended end-of-year shutdown 2016-2017 the pixel detector will be exchanged for a new one. The so-called Phase-1 pixel detector foresees one additional barrel layer and one additional end-cap disk, a new readout chip, a reduction of material, and the installation of more efficient cooling and powering systems. In the so-called Phase 2, when the LHC reaches the High Luminosity (HL-LHC) phase, CMS will need a completely new tracker in order to fully exploit the highly demanding operating conditions and the delivered luminosity. The new tracker should also have trigger capabilities. To achieve these goals, R&D activities are ongoing to explore options and develop solutions that would allow tracking information to be included at Level-1. The design choices for the CMS pixel and outer tracker upgrades are discussed along with some highlights of the R&D activities.

  3. The CMS integration grid testbed

    SciTech Connect

    Graham, Gregory E.

    2004-08-26

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Grid-wide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent-based MonALISA. Domain-specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two-month span in the fall of 2002, over 1 million official CMS GEANT-based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids.

  4. Distributed data transfers in CMS

    NASA Astrophysics Data System (ADS)

    Magini, Nicolo; Ratnikova, Natalia; Rossman, Paul; Sánchez-Hernández, Alberto; Wildish, Tony

    2011-12-01

    The multi-tiered computing infrastructure of the CMS experiment at the LHC depends on the reliable and fast transfer of data between the different CMS computing sites. Data have to be transferred from the Tier-0 to the Tier-1 sites for archival in a timely manner to avoid overflowing disk buffers at CERN. Data have to be transferred in bursts to all Tier-2 level sites for analysis, as well as synchronized between the different Tier-1 sites. The data transfer system is the key ingredient which enables the optimal usage of all distributed resources. The operation of the transfer system consists of monitoring and debugging transfer issues to guarantee a timely delivery of data to all corners of the CMS computing infrastructure. A further task of transfer operations is to guarantee the consistency of the data at all sites, both on disk and on tape. Procedures to verify the consistency and to debug and repair problems will be discussed.
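The consistency task described above amounts to comparing what a transfer catalog believes a site hosts against what the site's storage actually reports. A minimal sketch, assuming simple file lists rather than the actual PhEDEx tooling:

```python
# Illustrative consistency check between a transfer catalog and site storage:
# files in the catalog but absent from storage are "missing" (data loss),
# files in storage but absent from the catalog are "orphans" (wasted space).

def check_consistency(catalog_files, storage_files):
    catalog, storage = set(catalog_files), set(storage_files)
    return {
        "missing": sorted(catalog - storage),   # cataloged but lost from disk/tape
        "orphans": sorted(storage - catalog),   # on disk but unknown to the catalog
        "consistent": catalog == storage,
    }

report = check_consistency(
    ["/store/data/run1.root", "/store/data/run2.root"],
    ["/store/data/run2.root", "/store/tmp/stale.root"],
)
```

Missing files trigger re-transfer or invalidation, while orphans are candidates for cleanup; either way the catalog, not the storage listing, is treated as the bookkeeping authority.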

  5. HAWAII UNDERGROUND STORAGE TANKS

    EPA Science Inventory

    This is a point coverage of underground storage tanks(UST) for the state of Hawaii. The original database was developed and is maintained by the State of Hawaii, Dept. of Health. The point locations represent facilities where one or more underground storage tanks occur. Each fa...

  6. CMS Centres Worldwide - a New Collaborative Infrastructure

    NASA Astrophysics Data System (ADS)

    Taylor, Lucas

    2011-12-01

    The CMS Experiment at the LHC has established a network of more than fifty inter-connected "CMS Centres" at CERN and in institutes in the Americas, Asia, Australasia, and Europe. These facilities are used by people doing CMS detector and computing grid operations, remote shifts, data quality monitoring and analysis, as well as education and outreach. We present the computing, software, and collaborative tools and videoconferencing systems. These include permanently running "telepresence" video links (hardware-based H.323, EVO and Vidyo), Webcasts, and generic Web tools such as CMS-TV for broadcasting live monitoring and outreach information. Being Web-based and experiment-independent, these systems could easily be extended to other organizations. We describe the experiences of using CMS Centres Worldwide in the CMS data-taking operations as well as for major media events with several hundred TV channels, radio stations, and many more press journalists simultaneously around the world.

  7. CMS Full Simulation for Run-2

    NASA Astrophysics Data System (ADS)

    Hildreth, M.; Ivanchenko, V. N.; Lange, D. J.; Kortelainen, M. J.

    2015-12-01

    During the LHC shutdown between Run 1 and Run 2, intensive development was carried out to improve the performance of the CMS simulation. For physics improvements, a migration from Geant4 9.4p03 to Geant4 10.0p02 was performed. CPU performance was improved by the introduction of the Russian roulette method inside the CMS calorimeters, optimization of the CMS simulation sub-libraries, and the use of a static build of the simulation executable. As a result of these efforts, the CMS simulation has been sped up by about a factor of two. In this work we describe the updates to the different software components of the CMS simulation. The development of a multi-threaded (MT) simulation approach for CMS is also discussed.

  8. Enabling opportunistic resources for CMS Computing Operations

    NASA Astrophysics Data System (ADS)

    Hufnagel, D.; CMS Collaboration

    2015-12-01

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize opportunistic resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  9. The CMS Journey to LHC Physics

    ScienceCinema

    None

    2011-10-06

    An overview of the design, the construction and physics of CMS will be given. A history of construction, encompassing the R&D and challenges faced over the last decade and a half, will be recalled using selected examples. CMS is currently in the final stages of installation and commissioning is gathering pace. After a short status report of where CMS stands today some of the expected (great) physics to come will be outlined. * Tea & coffee will be served at 16:00.

  10. Readiness of CMS Simulation Towards LHC Startup

    SciTech Connect

    Banerjee, Sunanda; /Fermilab

    2007-11-01

    The CMS experiment has used detector simulation software in its conceptual as well as technical design. With detector construction nearing completion, the role of simulation has shifted toward understanding the collision data to be collected by CMS in the near future. The CMS simulation software is becoming a data-driven, realistic and accurate Monte Carlo program. The software architecture is described in some detail, covering the framework as well as detector-specific components. Performance issues are discussed as well.

  11. The CMS Journey to LHC Physics

    SciTech Connect

    2011-02-09

    An overview of the design, the construction and physics of CMS will be given. A history of construction, encompassing the R&D and challenges faced over the last decade and a half, will be recalled using selected examples. CMS is currently in the final stages of installation and commissioning is gathering pace. After a short status report of where CMS stands today some of the expected (great) physics to come will be outlined. * Tea & coffee will be served at 16:00.

  12. Experimental study of effectiveness of four radon mitigation solutions, based on underground depressurization, tested in prototype housing built in a high radon area in Spain.

    PubMed

    Frutos Vázquez, Borja; Olaya Adán, Manuel; Quindós Poncela, Luis Santiago; Sainz Fernandez, Carlos; Fuente Merino, Ismael

    2011-04-01

    The present paper discusses the results of an empirical study of four approaches to reducing indoor radon concentrations based on depressurization techniques in underground sumps. The experiments were conducted in prototype housing built in an area of Spain where the average radon concentration at a depth of 1 m is 250 kBq m^-3. Sump effectiveness was analysed in two locations: underneath the basement, which involved cutting openings into the foundation, ground storey and roof slabs, and outside the basement walls, which entailed digging a pit alongside the building exterior. The effectiveness of both sumps was likewise tested with passive and forced ventilation methods. The systems proved to be highly efficient, lowering radon levels by 91-99%, except in the solution involving passive ventilation and the outside sump, where radon levels were reduced by 53-55%. At wind speeds of over 8 m/s, however, passive ventilation across an outside sump lowered radon levels by 95% due to a Venturi-effect-induced drop in pressure. PMID:21382656

  13. An integrated environment monitoring system for underground coal mines--Wireless Sensor Network subsystem with multi-parameter monitoring.

    PubMed

    Zhang, Yu; Yang, Wei; Han, Dongsheng; Kim, Young-Il

    2014-01-01

    Environment monitoring is important for the safety of underground coal mine production, and it is also an important application of Wireless Sensor Networks (WSNs). We put forward an integrated environment monitoring system for underground coal mine, which uses the existing Cable Monitoring System (CMS) as the main body and the WSN with multi-parameter monitoring as the supplementary technique. As CMS techniques are mature, this paper mainly focuses on the WSN and the interconnection between the WSN and the CMS. In order to implement the WSN for underground coal mines, two work modes are designed: periodic inspection and interrupt service; the relevant supporting technologies, such as routing mechanism, collision avoidance, data aggregation, interconnection with the CMS, etc., are proposed and analyzed. As WSN nodes are limited in energy supply, calculation and processing power, an integrated network management scheme is designed in four aspects, i.e., topology management, location management, energy management and fault management. Experiments were carried out both in a laboratory and in a real underground coal mine. The test results indicate that the proposed integrated environment monitoring system for underground coal mines is feasible and all designs performed well as expected. PMID:25051037

  14. An Integrated Environment Monitoring System for Underground Coal Mines—Wireless Sensor Network Subsystem with Multi-Parameter Monitoring

    PubMed Central

    Zhang, Yu; Yang, Wei; Han, Dongsheng; Kim, Young-Il

    2014-01-01

    Environment monitoring is important for the safety of underground coal mine production, and it is also an important application of Wireless Sensor Networks (WSNs). We put forward an integrated environment monitoring system for underground coal mine, which uses the existing Cable Monitoring System (CMS) as the main body and the WSN with multi-parameter monitoring as the supplementary technique. As CMS techniques are mature, this paper mainly focuses on the WSN and the interconnection between the WSN and the CMS. In order to implement the WSN for underground coal mines, two work modes are designed: periodic inspection and interrupt service; the relevant supporting technologies, such as routing mechanism, collision avoidance, data aggregation, interconnection with the CMS, etc., are proposed and analyzed. As WSN nodes are limited in energy supply, calculation and processing power, an integrated network management scheme is designed in four aspects, i.e., topology management, location management, energy management and fault management. Experiments were carried out both in a laboratory and in a real underground coal mine. The test results indicate that the proposed integrated environment monitoring system for underground coal mines is feasible and all designs performed well as expected. PMID:25051037
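The two WSN work modes described in the abstracts above (periodic inspection and interrupt service) can be sketched as a simple reporting policy. This is an illustration under assumptions, not the paper's protocol: the threshold value, period, and function names are hypothetical.

```python
# Illustrative sketch of a sensor node's reporting policy: routine readings
# are reported on a fixed schedule (periodic inspection), while an
# out-of-range reading is reported immediately (interrupt service).

GAS_ALARM = 1.0     # hypothetical methane alarm threshold, % by volume
PERIOD = 60         # seconds between routine inspection reports

def process_readings(readings):
    """readings: list of (timestamp_s, gas_concentration). Returns emitted reports."""
    reports, last_periodic = [], None
    for t, gas in readings:
        if gas >= GAS_ALARM:                      # interrupt service: report now
            reports.append(("interrupt", t, gas))
        elif last_periodic is None or t - last_periodic >= PERIOD:
            reports.append(("periodic", t, gas))  # routine inspection on schedule
            last_periodic = t
    return reports

reports = process_readings([(0, 0.2), (30, 0.3), (45, 1.4), (70, 0.2)])
```

Suppressing in-range readings between periodic reports is what saves the node's limited energy budget, while the interrupt path preserves safety-critical latency.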

  15. The CMS tracker control system

    NASA Astrophysics Data System (ADS)

    Dierlamm, A.; Dirkes, G. H.; Fahrer, M.; Frey, M.; Hartmann, F.; Masetti, L.; Militaru, O.; Shah, S. Y.; Stringer, R.; Tsirou, A.

    2008-07-01

    The Tracker Control System (TCS) is a distributed control software to operate about 2000 power supplies for the silicon modules of the CMS Tracker and monitor its environmental sensors. TCS must thus be able to handle about 10^4 power supply parameters, about 10^3 environmental probes from the Programmable Logic Controllers of the Tracker Safety System (TSS), and about 10^5 parameters read via DAQ from the DCUs in all front-end hybrids and from CCUs in all control groups. TCS is built on top of an industrial SCADA program (PVSS) extended with a framework developed at CERN (JCOP) and used by all LHC experiments. The logical partitioning of the detector is reflected in the hierarchical structure of the TCS, where commands move down to the individual hardware devices, while states are reported up to the root which is interfaced to the broader CMS control system. The system computes and continuously monitors the mean and maximum values of critical parameters and updates the percentage of currently operating hardware. Automatic procedures switch off selected parts of the detector using detailed granularity and avoiding widespread TSS intervention.
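The hierarchical pattern described above (commands propagating down, states summarized upward) can be sketched with a small tree. This is a hedged illustration, not the PVSS/JCOP API: the node names, states, and severity ordering are assumptions made for the example.

```python
# Illustrative command/state tree: a command issued at any node reaches all
# hardware leaves below it, while a node reports the "worst" state among its
# children, so problems surface at the root.

SEVERITY = {"ON": 0, "OFF": 1, "ERROR": 2}

class Node:
    def __init__(self, name, children=None):
        self.name, self.children = name, children or []
        self.state = "OFF"

    def command(self, cmd):
        """Commands move down the tree to every leaf device."""
        if not self.children:
            self.state = "ON" if cmd == "switch_on" else "OFF"
        for child in self.children:
            child.command(cmd)

    def report(self):
        """States move up: a parent reports the worst state of its children."""
        if not self.children:
            return self.state
        return max((c.report() for c in self.children), key=SEVERITY.get)

tracker = Node("TCS", [Node("TIB", [Node("psu1"), Node("psu2")]),
                       Node("TOB", [Node("psu3")])])
tracker.command("switch_on")
state = tracker.report()                           # all leaves on
tracker.children[1].children[0].state = "ERROR"    # fault one power supply
degraded = tracker.report()                        # worst child state wins
```

Because aggregation is recursive, the root state changes as soon as any leaf degrades, mirroring how a partitioned detector control tree surfaces faults.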

  16. 42 CFR 422.510 - Termination of contract by CMS.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CONTINUED) MEDICARE PROGRAM MEDICARE ADVANTAGE PROGRAM Application Procedures and Contracts for Medicare Advantage Organizations § 422.510 Termination of contract by CMS. (a) Termination by CMS. CMS may at...

  17. Low energy neutron background in deep underground laboratories

    NASA Astrophysics Data System (ADS)

    Best, Andreas; Görres, Joachim; Junker, Matthias; Kratz, Karl-Ludwig; Laubenstein, Matthias; Long, Alexander; Nisi, Stefano; Smith, Karl; Wiescher, Michael

    2016-03-01

    The natural neutron background influences the maximum achievable sensitivity in most deep underground nuclear, astroparticle and double-beta decay physics experiments. Reliable neutron flux numbers are an important ingredient in the design of the shielding of new large-scale experiments as well as in the analysis of experimental data. Using a portable setup of ^3He counters we measured the thermal neutron flux at the Kimballton Underground Research Facility, the Soudan Underground Laboratory, on the 4100 ft and the 4850 ft levels of the Sanford Underground Research Facility, at the Waste Isolation Pilot Plant and at the Gran Sasso National Laboratory. Absolute neutron fluxes at these laboratories are presented.

  18. Conservation with underground power lines

    SciTech Connect

    Graneau, P.

    1980-01-01

    The following aspects of underground power transmission lines are discussed: their contribution to area beautification; line losses and their causes; the energy conservation potential of large-conductor underground cables; reliability and outage advantages as compared with overhead lines; the history of underground systems; problems with polyethylene insulation; and the development and performance of sodium conductors for underground cables. (LCL)

  19. The Status of the Cms Experiment

    NASA Astrophysics Data System (ADS)

    Green, Dan

    The CMS experiment was completely assembled in the fall of 2008 after a decade of design, construction and installation. During the last two years, cosmic ray data were taken on a regular basis. These data have enabled CMS to align the detector components, both spatially and temporally. Initial use of muons has also established the relative alignment of the CMS tracking and muon systems. In addition, the CMS calorimetry has been crosschecked with test beam data, thus providing an initial energy calibration of CMS calorimetry to about 5%. The CMS magnet has been powered and field mapped. The trigger and data acquisition systems have been installed and run at full speed. The tiered data analysis system has been exercised at full design bandwidth for Tier0, Tier1 and Tier2 sites. Monte Carlo simulation of the CMS detector has been constructed at a detailed geometric level and has been tuned to test beam and other production data to provide a realistic model of the CMS detector prior to first collisions.

  20. The Diverse use of Clouds by CMS

    NASA Astrophysics Data System (ADS)

    Andronis, Anastasios; Bauer, Daniela; Chaze, Olivier; Colling, David; Dobson, Marc; Fayer, Simon; Girone, Maria; Grandi, Claudio; Huffman, Adam; Hufnagel, Dirk; Aftab Khan, Farrukh; Lahiff, Andrew; McCrae, Alison; Rand, Duncan; Sgaravatto, Massimo; Tiradani, Anthony; Zhang, Xiaomei

    2015-12-01

    The resources CMS is using are increasingly being offered as clouds. In Run 2 of the LHC the majority of CMS CERN resources, both in Meyrin and at the Wigner Computing Centre, will be presented as cloud resources on which CMS will have to build its own infrastructure. This infrastructure will need to run all of the CMS workflows including: Tier 0, production and user analysis. In addition, the CMS High Level Trigger will provide a compute resource comparable in scale to the total offered by the CMS Tier 1 sites, when it is not running as part of the trigger system. During these periods a cloud infrastructure will be overlaid on this resource, making it accessible for general CMS use. Finally, CMS is starting to utilise cloud resources being offered by individual institutes and is gaining experience to facilitate the use of opportunistically available cloud resources. We present a snapshot of this infrastructure and its operation at the time of the CHEP2015 conference.

  1. CMS: The Midwife of Instruction and Learning.

    ERIC Educational Resources Information Center

    Maxwell, Valerie

    1989-01-01

    Gifted students may exhibit a deficit in ability to follow a sequence of verbal instructions, termed Cognition of SeMantic Systems (CMS). Three types of low-CMS students are described, and counseling strategies are outlined. Achievement of academic success and emotional health calls for teachers to build students' self-esteem and be patient. (JDD)

  2. The Diverse use of Clouds by CMS

    SciTech Connect

    Andronis, Anastasios; Bauer, Daniela; Chaze, Olivier; Colling, David; Dobson, Marc; Fayer, Simon; Girone, Maria; Grandi, Claudio; Huffman, Adam; Hufnagel, Dirk; Khan, Farrukh Aftab; Lahiff, Andrew; McCrae, Alison; Rand, Duncan; Sgaravatto, Massimo; Tiradani, Anthony; Zhang, Xiaomei

    2015-12-23

    The resources CMS is using are increasingly being offered as clouds. In Run 2 of the LHC the majority of CMS CERN resources, both in Meyrin and at the Wigner Computing Centre, will be presented as cloud resources on which CMS will have to build its own infrastructure. This infrastructure will need to run all of the CMS workflows including: Tier 0, production and user analysis. In addition, the CMS High Level Trigger will provide a compute resource comparable in scale to the total offered by the CMS Tier 1 sites, when it is not running as part of the trigger system. During these periods a cloud infrastructure will be overlaid on this resource, making it accessible for general CMS use. Finally, CMS is starting to utilise cloud resources being offered by individual institutes and is gaining experience to facilitate the use of opportunistically available cloud resources. We present a snapshot of this infrastructure and its operation at the time of the CHEP2015 conference.

  3. The diverse use of clouds by CMS

    DOE PAGESBeta

    Andronis, Anastasios; Bauer, Daniela; Chaze, Olivier; Colling, David; Dobson, Marc; Fayer, Simon; Girone, Maria; Grandi, Claudio; Huffman, Adam; Hufnagel, Dirk; et al

    2015-01-01

    The resources CMS is using are increasingly being offered as clouds. In Run 2 of the LHC the majority of CMS CERN resources, both in Meyrin and at the Wigner Computing Centre, will be presented as cloud resources on which CMS will have to build its own infrastructure. This infrastructure will need to run all of the CMS workflows including: Tier 0, production and user analysis. In addition, the CMS High Level Trigger will provide a compute resource comparable in scale to the total offered by the CMS Tier 1 sites, when it is not running as part of the trigger system. During these periods a cloud infrastructure will be overlaid on this resource, making it accessible for general CMS use. Finally, CMS is starting to utilise cloud resources being offered by individual institutes and is gaining experience to facilitate the use of opportunistically available cloud resources. Lastly, we present a snapshot of this infrastructure and its operation at the time of the CHEP2015 conference.

  4. The diverse use of clouds by CMS

    SciTech Connect

    Andronis, Anastasios; Bauer, Daniela; Chaze, Olivier; Colling, David; Dobson, Marc; Fayer, Simon; Girone, Maria; Grandi, Claudio; Huffman, Adam; Hufnagel, Dirk; Khan, Farrukh Aftab; Lahiff, Andrew; McCrae, Alison; Rand, Duncan; Sgaravatto, Massimo; Tiradani, Anthony; Zhang, Xiaomei

    2015-01-01

    The resources CMS is using are increasingly being offered as clouds. In Run 2 of the LHC the majority of CMS CERN resources, both in Meyrin and at the Wigner Computing Centre, will be presented as cloud resources on which CMS will have to build its own infrastructure. This infrastructure will need to run all of the CMS workflows including: Tier 0, production and user analysis. In addition, the CMS High Level Trigger will provide a compute resource comparable in scale to the total offered by the CMS Tier 1 sites, when it is not running as part of the trigger system. During these periods a cloud infrastructure will be overlaid on this resource, making it accessible for general CMS use. Finally, CMS is starting to utilise cloud resources being offered by individual institutes and is gaining experience to facilitate the use of opportunistically available cloud resources. Lastly, we present a snapshot of this infrastructure and its operation at the time of the CHEP2015 conference.

  5. Forward Physics Results from ATLAS and CMS

    NASA Astrophysics Data System (ADS)

    Sen, Sercan

    2013-12-01

    We present recent forward and diffractive physics results from the ATLAS and CMS experiments. Mainly, physics results on diffraction, the underlying event at forward rapidity and forward jet measurements are discussed. Also, using the combined CMS and TOTEM detectors, we show the first event candidate for central jet production with two leading protons detected in the TOTEM Roman Pot stations.

  6. Final Technical Report CMS fast optical calorimetry

    SciTech Connect

    Winn, David R.

    2012-07-12

    This is the final report of CMS FAST OPTICAL CALORIMETRY, a grant to Fairfield University for development, construction, installation and operation of the forward calorimeter on CMS, and for upgrades of the forward and endcap calorimeters for higher luminosity and radiation damage amelioration.

  7. Science Goes Underground.

    ERIC Educational Resources Information Center

    Naylor, Stuart; Keogh, Brenda

    1999-01-01

    Cartoons illustrating scientific concepts were used in the London Underground and informal learning sites to raise adult awareness of science and promote lifelong learning. They are also being used in formal learning situations such as adult-literacy classes. (SK)

  8. Science Center Goes Underground

    ERIC Educational Resources Information Center

    Modern Schools, 1977

    1977-01-01

    A unique underground science center at Bluffton College, designed to save energy and preserve trees, rolling landscape, and other environmental features of the campus, is under construction in Bluffton, Ohio. (Author)

  9. Discovery Reach of Charged MSSM Higgs Bosons at CMS

    SciTech Connect

    Heinemeyer, S.; Nikitenko, A.; Weiglein, G.

    2008-11-23

    We review the 5σ discovery contours for the charged MSSM Higgs boson at the CMS experiment with 30 fb^-1 for the two cases M_H± < m_t and M_H± > m_t. In order to analyze the search reach we combine the latest results for the CMS experimental sensitivities based on full simulation studies with state-of-the-art theoretical predictions of MSSM Higgs-boson production and decay properties. Special emphasis is put on the SUSY parameter dependence of the 5σ contours. The variation of μ can shift the prospective discovery reach in tan β by up to Δtan β = 40.

  10. Heavy neutrinos and the pp → lljj CMS data

    NASA Astrophysics Data System (ADS)

    Gluza, Janusz; Jeliński, Tomasz

    2015-09-01

    We show that the excess in the pp → eejj CMS data can be naturally interpreted within the Minimal Left-Right Symmetric Model (MLRSM), keeping g_L = g_R, if CP phases and non-degenerate masses of the heavy neutrinos are taken into account. As an additional benefit, a natural interpretation of the reported ratio (14 : 1) of the opposite-sign (OS) pp → l±l∓jj to the same-sign (SS) pp → l±l±jj lepton signals is possible. Finally, a suppression of muon pairs with respect to electron pairs in the pp → lljj data is obtained, in accordance with experimental data. If the excess in the CMS data survives in the future, it would be a first clear hint towards the presence of heavy neutrinos in right-handed charged currents with specific CP phases, mixing angles and masses, which would have far-reaching consequences for particle physics.

  11. The CMS Data Management System

    NASA Astrophysics Data System (ADS)

    Giffels, M.; Guo, Y.; Kuznetsov, V.; Magini, N.; Wildish, T.

    2014-06-01

    The data management elements in CMS are scalable, modular, and designed to work together. The main components are PhEDEx, the data transfer and location system; the Data Bookkeeping Service (DBS), a metadata catalog; and the Data Aggregation Service (DAS), designed to aggregate views and provide them to users and services. Tens of thousands of samples have been cataloged and petabytes of data have been moved since the run began. The modular system has allowed the optimal use of appropriate underlying technologies. In this contribution we will discuss the use of both Oracle and NoSQL databases to implement the data management elements as well as the individual architectures chosen. We will discuss how the data management system functioned during the first run, and what improvements are planned in preparation for 2015.

  12. CMS centres worldwide: A new collaborative infrastructure

    SciTech Connect

    Taylor, Lucas; Gottschalk, Erik; /Fermilab

    2010-01-01

    The CMS Experiment at the LHC is establishing a global network of inter-connected 'CMS Centres' for controls, operations and monitoring. These support: (1) CMS data quality monitoring, detector calibrations, and analysis; and (2) computing operations for the processing, storage and distribution of CMS data. We describe the infrastructure, computing, software, and communications systems required to create an effective and affordable CMS Centre. We present our highly successful operations experiences with the major CMS Centres at CERN, Fermilab, and DESY during the LHC first beam data-taking and cosmic ray commissioning work. The status of the various centres already operating or under construction in Asia, Europe, Russia, South America, and the USA is also described. We emphasise the collaborative communications aspects. For example, virtual co-location of experts in CMS Centres Worldwide is achieved using high-quality permanently-running 'telepresence' video links. Generic Web-based tools have been developed and deployed for monitoring, control, display management and outreach.

  13. The CMS Masterclass and Particle Physics Outreach

    SciTech Connect

    Cecire, Kenneth; Bardeen, Marjorie; McCauley, Thomas

    2014-01-01

    The CMS Masterclass enables high school students to analyse authentic CMS data. Students can draw conclusions on key ratios and particle masses by combining their analyses. In particular, they can use the ratio of W^+ to W^- candidates to probe the structure of the proton, they can find the mass of the Z boson, and they can identify additional particles including, tentatively, the Higgs boson. In the United States, masterclasses are part of QuarkNet, a long-term program that enables students and teachers to use cosmic ray and particle physics data for learning with an emphasis on data from CMS.

  14. The CMS Masterclass and Particle Physics Outreach

    NASA Astrophysics Data System (ADS)

    Cecire, Kenneth; Bardeen, Marjorie; McCauley, Thomas

    2014-04-01

    The CMS Masterclass enables high school students to analyse authentic CMS data. Students can draw conclusions on key ratios and particle masses by combining their analyses. In particular, they can use the ratio of W+ to W- candidates to probe the structure of the proton, they can find the mass of the Z boson, and they can identify additional particles including, tentatively, the Higgs boson. In the United States, masterclasses are part of QuarkNet, a long-term program that enables students and teachers to use cosmic ray and particle physics data for learning with an emphasis on data from CMS.

  15. Underground mine communications: a survey

    SciTech Connect

    Yarkan, S.; Guzelgoz, S.; Arslan, H.; Murphy, R.R.

    2009-07-01

    After a recent series of unfortunate underground mining disasters, the vital importance of communications for underground mining has been underscored once again. Establishing reliable communication is a very difficult task in underground mining due to the extreme environmental conditions. Until now, no single communication system exists that can solve all of the problems and difficulties encountered in underground mine communications. However, combining research with previous experience might help improve existing systems, if not completely solve all of the problems. In this survey, underground mine communication is investigated. Major issues which underground mine communication systems must take into account are discussed. Communication types, methods, and their significance are presented.

  16. Leaking underground storage tanks

    SciTech Connect

    Dowd, R.M.

    1984-10-01

    The problems associated with leaking underground storage tanks are discussed. An estimated 10-30% of the 3.5 million or more underground tanks now used to store petroleum products and other liquids may be leaking their contents to the surrounding environment. The EPA is initiating a national field survey of tanks used for the storing of engine fuels. The first phase of the survey will cover a representative sample of 1050 facilities and approximately 2800 tanks. EPA will analyze the questionnaires and then select a sub-sample of about 500 tanks to examine leakage problems in more detail. In the absence of specific groundwater protection legislation or regulation, EPA is planning to use the Toxic Substances Control Act to regulate underground tanks.

  17. Underground physics with DUNE

    NASA Astrophysics Data System (ADS)

    Kudryavtsev, Vitaly A.; DUNE Collaboration

    2016-05-01

    The Deep Underground Neutrino Experiment (DUNE) is a project to design, construct and operate a next-generation long-baseline neutrino detector with a liquid argon (LAr) target capable also of searching for proton decay and supernova neutrinos. It is a merger of previous efforts of the LBNE and LBNO collaborations, as well as other interested parties to pursue a broad programme with a staged 40-kt LAr detector at the Sanford Underground Research Facility (SURF) 1300 km from Fermilab. This programme includes studies of neutrino oscillations with a powerful neutrino beam from Fermilab, as well as proton decay and supernova neutrino burst searches. In this paper we will focus on the underground physics with DUNE.

  18. Underground mineral extraction

    NASA Technical Reports Server (NTRS)

    Miller, C. G.; Stephens, J. B.

    1980-01-01

    A method was developed for extracting underground minerals such as coal, which avoids the need for sending personnel underground and which enables the mining of steeply pitched seams of the mineral. The method includes the use of a narrow vehicle which moves underground along the mineral seam and which is connected by pipes or hoses to water pumps at the surface of the Earth. The vehicle hydraulically drills pilot holes during its entrances into the seam, and then directs sideward jets at the seam during its withdrawal from each pilot hole to comminute the mineral surrounding the pilot hole and combine it with water into a slurry, so that the slurried mineral can flow to a location where a pump raises the slurry to the surface.

  19. Underground physics with DUNE

    SciTech Connect

    Kudryavtsev, Vitaly A.

    2016-01-01

    The Deep Underground Neutrino Experiment (DUNE) is a project to design, construct and operate a next-generation long-baseline neutrino detector with a liquid argon (LAr) target capable also of searching for proton decay and supernova neutrinos. It is a merger of previous efforts of the LBNE and LBNO collaborations, as well as other interested parties to pursue a broad programme with a staged 40-kt LAr detector at the Sanford Underground Research Facility (SURF) 1300 km from Fermilab. This programme includes studies of neutrino oscillations with a powerful neutrino beam from Fermilab, as well as proton decay and supernova neutrino burst searches. In this study, we will focus on the underground physics with DUNE.

  20. Underground physics with DUNE

    DOE PAGESBeta

    Kudryavtsev, Vitaly A.

    2016-01-01

    The Deep Underground Neutrino Experiment (DUNE) is a project to design, construct and operate a next-generation long-baseline neutrino detector with a liquid argon (LAr) target capable also of searching for proton decay and supernova neutrinos. It is a merger of previous efforts of the LBNE and LBNO collaborations, as well as other interested parties to pursue a broad programme with a staged 40-kt LAr detector at the Sanford Underground Research Facility (SURF) 1300 km from Fermilab. This programme includes studies of neutrino oscillations with a powerful neutrino beam from Fermilab, as well as proton decay and supernova neutrino burst searches. In this study, we will focus on the underground physics with DUNE.

  1. CASPAR - Nuclear Astrophysics Underground

    NASA Astrophysics Data System (ADS)

    Strieder, Frank; Robertson, Daniel; Couder, Manoel; Greife, Uwe; Wells, Doug; Wiescher, Michael

    2015-10-01

    The work of the LUNA Collaboration at the Laboratori Nazionali del Gran Sasso demonstrated the research potential of an underground accelerator for the field of nuclear astrophysics. Several key reactions could be studied at LUNA, some directly at the Gamow peak for solar hydrogen burning. The CASPAR (Compact Accelerator System for Performing Astrophysical Research) Collaboration will implement a high-intensity 1 MV accelerator at the Sanford Underground Research Facility (SURF) and overcome the current limitations at LUNA. The installation of the accelerator in the recently rehabilitated underground cavity at SURF started in Summer 2015, and first beam should be delivered by the end of the year. This project will primarily focus on the neutron sources for the s-process, e.g. 13C(α,n)16O and 22Ne(α,n)25Mg, and lead to measurements of unprecedented sensitivity compared to previous studies. A detailed overview of the science goals of CASPAR will be presented.

  2. 42 CFR 405.800 - Appeals of CMS or a CMS contractor.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 2 2014-10-01 2014-10-01 false Appeals of CMS or a CMS contractor. 405.800 Section 405.800 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE PROGRAM FEDERAL HEALTH INSURANCE FOR THE AGED AND DISABLED Appeals Under the Medicare Part B Program § 405.800 Appeals of CMS or a...

  3. Underground radial pipe network

    SciTech Connect

    Peterson, D.L.

    1984-04-24

    The network, useful in conducting fluids to underground sites, is an assembly of flexible pipes or tubes, suspended from and connected to a drill pipe. The flexible pipes, assembled in a bundle, are spring biased to flare outwardly in an arcuate manner when a releasable cap on the distal end of the bundle is removed. The assembled bundle is inserted into and lowered down a bore hole. When the cap is released, the pipes flare radially and outwardly. Fluid, pumped into and through the assembly, can be directed into the underground formation for various purposes.

  4. 23 CFR 500.109 - CMS.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... future demand management strategies and operational improvements that will maintain the functional... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION TRANSPORTATION INFRASTRUCTURE MANAGEMENT MANAGEMENT AND MONITORING SYSTEMS Management Systems § 500.109 CMS. (a) For purposes of this part, congestion means the level...

  5. The CMS central hadron calorimeter: Update

    SciTech Connect

    Freeman, J.

    1998-06-01

    The CMS central hadron calorimeter is a brass absorber/ scintillator sampling structure. We describe details of the mechanical and optical structure. We also discuss calibration techniques, and finally the anticipated construction schedule.

  6. File level provenance tracking in CMS

    SciTech Connect

    Jones, C.D.; Kowalkowski, J.; Paterno, M.; Sexton-Kennedy, L.; Tanenbaum, W.; Riley, D.S.; /Cornell U., LEPP

    2009-05-01

    The CMS off-line framework stores provenance information within CMS's standard ROOT event data files. The provenance information is used to track how each data product was constructed, including what other data products were read to do the construction. We will present how the framework gathers the provenance information, the efforts necessary to minimize the space used to store the provenance in the file and the tools that will be available to use the provenance.
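The core idea described above, recording for each data product which other products were read to construct it and then tracing that chain back, can be illustrated with a toy provenance graph. The functions and product names below are hypothetical and are not part of the CMS framework API.

```python
# Toy file-level provenance tracker, loosely modeled on the idea described
# above (hypothetical names; not the actual CMS framework API).

provenance = {}  # product name -> list of input product names it was built from

def register(product, inputs):
    """Record that `product` was constructed by reading `inputs`."""
    provenance[product] = list(inputs)

def trace(product):
    """Return every upstream product that contributed to `product`."""
    seen = []
    stack = [product]
    while stack:
        for parent in provenance.get(stack.pop(), []):
            if parent not in seen:
                seen.append(parent)
                stack.append(parent)
    return seen

# Illustrative construction chain: raw hits -> clusters -> tracks.
register("rawHits", [])
register("clusters", ["rawHits"])
register("tracks", ["clusters", "rawHits"])
print(sorted(trace("tracks")))  # full upstream history of "tracks"
```

Storing only this compact mapping alongside the event data is one way to keep the space overhead small, in the spirit of the minimization effort mentioned in the abstract.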

  7. Virtual data in CMS analysis

    SciTech Connect

    A. Arbree et al.

    2003-10-01

    The use of virtual data for enhancing the collaboration between large groups of scientists is explored in several ways: by defining "virtual" parameter spaces which can be searched and shared in an organized way by a collaboration of scientists in the course of their analysis; by providing a mechanism to log the provenance of results and the ability to trace them back to the various stages in the analysis of real or simulated data; by creating "check points" in the course of an analysis to permit collaborators to explore their own analysis branches by refining selections, improving the signal-to-background ratio, varying the estimation of parameters, etc.; by facilitating the audit of an analysis and the reproduction of its results by a different group, or in a peer review context. We describe a prototype for the analysis of data from the CMS experiment based on the virtual data system Chimera and the object-oriented data analysis framework ROOT. The Chimera system is used to chain together several steps in the analysis process including the Monte Carlo generation of data, the simulation of detector response, the reconstruction of physics objects and their subsequent analysis, histogramming and visualization using the ROOT framework.

  8. The CMS Condition Database System

    NASA Astrophysics Data System (ADS)

    Di Guida, S.; Govi, G.; Ojeda, M.; Pfeiffer, A.; Sipos, R.

    2015-12-01

    The Condition Database plays a key role in the CMS computing infrastructure. The complexity of the detector and the variety of the sub-systems involved set tight requirements for handling the Conditions. In the last two years the collaboration has put substantial effort into the re-design of the Condition Database system, with the aim of improving the scalability and the operability for the data taking starting in 2015. The re-design has focused on simplifying the architecture, using the lessons learned during the operation of the Run I data-taking period (2009-2013). In the new system the relational features of the database schema are mainly exploited to handle the metadata (Tag and Interval of Validity), allowing for a limited and controlled set of queries. The bulk condition data (Payloads) are stored as unstructured binary data, allowing the storage in a single table with a common layout for all of the condition data types. In this paper, we describe the full architecture of the system, including the services implemented for uploading payloads and the tools for browsing the database. Furthermore, the implementation choices for the core software will be discussed.
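The split described above, relational metadata (Tag and Interval of Validity) on one side and opaque payload blobs in a single common table on the other, can be sketched with a minimal assumed schema. The table and column names below are invented for illustration and are not the actual CMS schema.

```python
# Minimal sketch (assumed schema) of tag/IOV metadata plus opaque payloads.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE payload (hash TEXT PRIMARY KEY, data BLOB);      -- opaque blobs
CREATE TABLE iov     (tag TEXT, since INTEGER, hash TEXT,     -- metadata
                      PRIMARY KEY (tag, since));
""")

def store(tag, since, hash_, blob):
    """Attach a payload to a tag, valid from run `since` onwards."""
    db.execute("INSERT OR IGNORE INTO payload VALUES (?, ?)", (hash_, blob))
    db.execute("INSERT INTO iov VALUES (?, ?, ?)", (tag, since, hash_))

def lookup(tag, run):
    """Controlled query: the IOV with the largest `since` not exceeding `run`."""
    row = db.execute(
        "SELECT p.data FROM iov i JOIN payload p ON p.hash = i.hash "
        "WHERE i.tag = ? AND i.since <= ? ORDER BY i.since DESC LIMIT 1",
        (tag, run)).fetchone()
    return row[0] if row else None

store("Pedestals_v1", 1,   "h1", b"pedestals-A")
store("Pedestals_v1", 500, "h2", b"pedestals-B")
print(lookup("Pedestals_v1", 300))  # payload valid from run 1
```

Because every condition type lives in the same blob table, only the small metadata tables need relational queries, which is what keeps the query set limited and controlled.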

  9. Mars Underground News.

    NASA Astrophysics Data System (ADS)

    Edgett, K.

    Contents: Ten years Underground. Rover roundup (International Conference on Mobile Planetary Robots and Rover Roundup, Santa Monica, CA (USA), 29 Jan - 4 Feb 1997). Reaching the Red. Schedule of missions to Mars (as of April 1, 1997). Mars on the Web.

  10. Underground Tank Management.

    ERIC Educational Resources Information Center

    Bednar, Barbara A.

    1990-01-01

    The harm to human health and our environment caused by leaking underground storage tanks can be devastating. Schools can meet new federal waste management standards by instituting daily inventory monitoring, selecting a reliable volumetric testing company, locating and repairing leaks promptly, and removing and installing tanks appropriately. (MLH)

  11. Underground Coal Mining

    NASA Technical Reports Server (NTRS)

    Hill, G. M.

    1980-01-01

    Computer program models coal-mining production, equipment failure and equipment repair. Underground mine is represented as collection of work stations requiring service by production and repair crews alternately. Model projects equipment availability and productivity, and indicates proper balance of labor and equipment. Program is in FORTRAN IV for batch execution; it has been implemented on UNIVAC 1108.
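A central quantity such a model projects, equipment availability from alternating production and repair cycles, can be estimated with a simple steady-state formula; the function and numbers below are an invented illustration, not the FORTRAN program's algorithm.

```python
# Toy availability estimate in the spirit of the model described above:
# a station alternates between production (up) and repair (down) periods.

def availability(mtbf_hours, mttr_hours):
    """Steady-state fraction of time a work station is productive,
    given mean time between failures and mean time to repair."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Invented example: fails every 40 h on average, takes 8 h to repair.
print(availability(40.0, 8.0))
```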

  12. Collaborations in Underground Laboratories

    NASA Astrophysics Data System (ADS)

    Wang, Joseph S. Y.

    2011-04-01

    Underground physics laboratories are extending their programs into other fields of study. The Gran Sasso laboratory, with large halls and dedicated tunnels in Italy, and the Canfranc laboratory, with newly completed space in Spain, host geodynamic experiments (A. Bettini, communication, 2011). The Low Noise Underground Lab (LSBB of Rustrel-Pays d'Apt) converted a former French missile-launching command center to house a SQUID, electromagnetically shielded above 10 Hz, for global ionosphere and earthquake observations (G. Waysand et al., 2010). The China JinPing Lab has a new physics room and tunnels excavated under 2.5 km of overburden, with rock-mechanics changes evaluated (X. Feng, 2011). These are examples associated with tunnels through mountain ranges. In North America, we have Canada's SNO in an active mine with new space, and the U.S. effort to re-enter the abandoned Homestake mine levels for physics and bio-geo-engineering studies. We also have underground research labs dedicated to nuclear waste research in Sweden, Switzerland, France, and Germany, and candidate sites in Japan and China. All these underground labs are engaging in international collaborations to develop interdisciplinary studies, and linkage and networking with the international physics community is being pursued.

  13. Global Pursuits: The Underground Railroad

    ERIC Educational Resources Information Center

    School Arts: The Art Education Magazine for Teachers, 2004

    2004-01-01

    This brief article describes Charles T. Webber's oil on canvas painting, "The Underground Railroad, 1893." The subject of this painting is the Underground Railroad, which today has become an American legend. The Underground Railroad was not a systematic means of transportation, but rather a secretive process that allowed fugitive slaves to escape…

  14. The CMS hadron calorimeter detector control system upgrade

    NASA Astrophysics Data System (ADS)

    Sahin, M. O.; Behrens, U.; Campbell, A.; Martens, I.; Melzer-Pellmann, I. A.; Saxena, P.

    2015-04-01

    The detector control system of the CMS hadron calorimeter provides the 40.0788 MHz LHC clock to the front-end electronics and supplies synchronization signals and I2C communication. Pedestals and diagnostic bits are controlled, and temperatures and voltages are read out. SiPM temperatures are actively stabilized by temperature readback and generation of correction voltages to drive the Peltier regulation system. Overall control and interfacing to databases and experimental DAQ software is provided by the CCM Server software. We report on the design and development status, and the implementation schedule, of this system.

  15. Higgs in bosonic channels (CMS)

    NASA Astrophysics Data System (ADS)

    Gori, Valentina

    2015-05-01

    The main Higgs boson decays into bosonic channels will be considered, presenting and discussing results from the latest reprocessing of data collected by the CMS experiment at the LHC, using the full dataset recorded at centre-of-mass energies of 7 and 8 TeV. For this purpose, results from the final Run-I papers for the H → ZZ → 4ℓ, H → γγ and H → WW analyses are presented, focusing on the Higgs boson properties: the mass, the signal strength, the couplings to fermions and vector bosons, and the spin and parity properties. Furthermore, the Higgs boson width measurement exploiting the on-shell versus the off-shell cross section (in the H → ZZ → 4ℓ and H → ZZ → 2ℓ2ν decay channels) will be shown. All the investigated properties are found to be fully consistent with the SM predictions: the signal strength and the signal strength modifiers are consistent with unity in all the bosonic channels considered; the hypothesis of a scalar particle is strongly favored, against the pseudoscalar, vector/pseudovector, and spin-2 boson hypotheses (all excluded at 99% CL or higher in the H → ZZ → 4ℓ channel). The Higgs boson mass measurement from the combination of the H → ZZ → 4ℓ and H → γγ channels gives a value mH = 125.03 +0.26/-0.27 (stat.) +0.13/-0.15 (syst.) GeV. An upper limit ΓH < 22 MeV can be placed on the Higgs boson width thanks to the new indirect method.
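The idea of combining two channel measurements into a single mass value can be illustrated with simple inverse-variance weighting. The symmetrized toy uncertainties below only loosely mimic per-channel inputs; the real CMS result comes from a full likelihood scan, not this formula.

```python
# Hedged sketch: inverse-variance-weighted combination of two measurements
# with symmetrized toy uncertainties (not the actual CMS likelihood method).

def combine(m1, s1, m2, s2):
    """Combine two measurements (value, uncertainty) by inverse-variance weights."""
    w1, w2 = 1.0 / s1**2, 1.0 / s2**2
    m = (w1 * m1 + w2 * m2) / (w1 + w2)   # weighted mean
    s = (w1 + w2) ** -0.5                 # combined uncertainty
    return m, s

# Toy per-channel inputs, roughly in the spirit of 4-lepton and diphoton masses.
m, s = combine(125.6, 0.45, 124.7, 0.34)
print(f"combined mass: {m:.2f} +/- {s:.2f} GeV")
```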

  16. Underground Nuclear Astrophysics at LUNA

    SciTech Connect

    Junker, Matthias

    2008-01-24

    Nuclear cross sections play a key role in understanding stellar evolution and elemental synthesis. Also in the field of astroparticle physics, precise knowledge of thermonuclear cross sections is needed to extract the particle properties from the experimental data. While it is desirable to directly measure the relevant cross sections in the energy range of interest for the specific stellar environment, this proves to be difficult, if not impossible, due to the effect of the Coulomb barrier, which causes an exponential drop of the cross sections at stellar energies. Consequently, direct measurements are hampered by low counting rates and background caused by cosmic rays and environmental radioactivity. In addition, background induced by the beam or the target itself can disturb the measurements. In this contribution I will discuss some of the reactions studied by LUNA in the past years to illustrate important aspects of underground nuclear astrophysics.

  17. The CMS Beam Halo Monitor electronics

    NASA Astrophysics Data System (ADS)

    Tosi, N.; Dabrowski, A. E.; Fabbri, F.; Grassi, T.; Hughes, E.; Mans, J.; Montanari, A.; Orfanelli, S.; Rusack, R.; Torromeo, G.; Stickland, D. P.; Stifter, K.

    2016-02-01

    The CMS Beam Halo Monitor has been successfully installed in the CMS cavern in LHC Long Shutdown 1 for measuring the machine-induced background for LHC Run II. The system is based on 40 detector units composed of synthetic quartz Cherenkov radiators coupled to fast photomultiplier tubes (PMTs). The readout electronics chain uses many components developed for the Phase 1 upgrade to the CMS Hadronic Calorimeter electronics, with dedicated firmware and readout adapted to the beam monitoring requirements. The PMT signal is digitized by a charge-integrating ASIC (QIE10), providing both the signal rise time, with few-nanosecond resolution, and the charge integrated over one bunch crossing. The backend electronics uses MicroTCA technology and receives data via a high-speed 5 Gbps asynchronous link. It records histograms with sub-bunch-crossing timing resolution and is read out via IPbus using the newly designed CMS data acquisition for non-event-based data. The data is processed in real time and published to CMS and the LHC, providing online feedback on the beam quality. A dedicated calibration monitoring system has been designed to generate short triggered pulses of light to monitor the efficiency of the system. The electronics has been in operation since the first LHC beams of Run II and has served as the first demonstration of the new QIE10, the Microsemi Igloo2 FPGA and the high-speed 5 Gbps link with LHC data.

  18. Underground waste barrier structure

    DOEpatents

    Saha, Anuj J.; Grant, David C.

    1988-01-01

    Disclosed is an underground waste barrier structure that consists of waste material, a first container formed of activated carbonaceous material enclosing the waste material, a second container formed of zeolite enclosing the first container, and clay covering the second container. The underground waste barrier structure is constructed by forming a recessed area within the earth, lining the recessed area with a layer of clay, lining the clay with a layer of zeolite, lining the zeolite with a layer of activated carbonaceous material, placing the waste material within the lined recessed area, forming a ceiling over the waste material of a layer of activated carbonaceous material, a layer of zeolite, and a layer of clay, the layers in the ceiling cojoining with the respective layers forming the walls of the structure, and finally, covering the ceiling with earth.

  19. Leaking underground storage tanks

    SciTech Connect

    McLearn, M.E.; Miller, M.J.; Kostecki, P.T.; Calabrese, E.J.; Presio, L.M.; Suyama, W.; Kucharski, W.A.

    1988-04-01

    Remedial options for leaking underground storage tanks were investigated in a joint project of the Electric Power Research Institute and the Underground Storage Tank Committee of the Utility Solid Waste Activities Group. Both existing and emerging technologies were examined. Thirteen remedial techniques were identified and initially characterized as in situ or non-in situ. In situ methods include volatilization, biodegradation, leaching and chemical reaction, vitrification, passive remediation, and isolation or containment. Non-in situ techniques include land treatment, thermal treatment, asphalt incorporation, solidification and stabilization, groundwater extraction and treatment, chemical extraction, and excavation. Soil and groundwater remediation problems involve many site-specific factors that must be considered in choosing an appropriate remedial option; these include cleanup goals, site and contaminant characteristics, cost, exposure pathways, and others. Appropriate remedial techniques are chosen by assessing the technical, implementational, environmental and economic considerations of each available option to achieve the desired cleanup goal at the specified site.

  20. LUNA: Nuclear astrophysics underground

    SciTech Connect

    Best, A.

    2015-02-24

    Underground nuclear astrophysics with LUNA at the Laboratori Nazionali del Gran Sasso spans a history of 20 years. By using the rock overburden of the Gran Sasso mountain chain as a natural cosmic-ray shield, signal rates far lower than in an experiment on the surface can be tolerated. The cross sections of important astrophysical reactions have been successfully measured directly in the stellar energy range. In this proceeding we give an overview of the key accomplishments of the experiment and an outlook on its future with the planned addition of a second accelerator to the underground facilities, enabling the coverage of a wider energy range and the measurement of previously inaccessible reactions.

  1. In focus: Underground haulage

    SciTech Connect

    Not Available

    1994-10-01

    New ideas to improve mining techniques and equipment play a vital part in achieving the productivity improvements and cost reductions necessary for profitability and, often in hard times, for the survival of mining operations. This paper reviews the development and design of rubber-tired underground haulage equipment currently used in the US northwest. It then goes on to discuss new developments in communication and computerized control systems for these haulage units.

  2. Underground mining methods handbook

    SciTech Connect

    Hustrulid, W.A.

    1982-01-01

    Sections discuss: mine design considerations; stopes requiring minimum support (includes room-and-pillar mining and sublevel stoping); stopes requiring some additional support other than pillars (includes shrinkage stoping, cut-and-fill stoping, undercut-and-fill mining, timber-supported system, top-slice mining, longwall mining and shortwall mining); caving methods (sublevel and block caving); underground equipment; financial considerations; design; and mine ventilation.

  3. Outlook for Underground Science

    NASA Astrophysics Data System (ADS)

    Bowles, Thomas

    2003-04-01

    Nuclear and particle physics has a long history of carrying out experiments deep underground to search for rare processes such as proton decay and double beta decay and to observe neutrinos from a variety of astrophysical sources. This science program has recently resulted in remarkable evidence for neutrino mass, as seen in atmospheric, solar, and terrestrial neutrino experiments. These discoveries have fueled a renewed effort in the United States to create a National Underground Science and Engineering Laboratory (NUSEL) that would provide the basis for an expanded program of science underground. The research issues that would be addressed at a NUSEL include not only nuclear and particle physics, but also a broad range of topics in geology, geoengineering, and geobiology. A NUSEL would also create new resources for applications of interest to industry and national defense as well as providing a significant new opportunity for education and outreach. In this talk I will present an overview of the scientific opportunities that could be addressed at a NUSEL. I will also provide an update on the status of efforts to create a NUSEL at various possible sites in the U.S.

  4. Radiation experience with the CMS pixel detector

    NASA Astrophysics Data System (ADS)

    Veszpremi, V.

    2015-04-01

    The CMS pixel detector is the innermost component of the CMS tracker occupying the region around the centre of CMS, where the LHC beams are crossed, between 4.3 cm and 30 cm in radius and 46.5 cm along the beam axis. It operates in a high-occupancy and high-radiation environment created by particle collisions. Studies of radiation damage effects to the sensors were performed throughout the first running period of the LHC. Leakage current, depletion voltage, pixel readout thresholds, and hit finding efficiencies were monitored as functions of the increasing particle fluence. The methods and results of these measurements will be described together with their implications to detector operation as well as to performance parameters in offline hit reconstruction.

  5. Calorimeter Simulation with Hadrons in CMS

    SciTech Connect

    Piperov, Stefan; /Sofiya, Inst. Nucl. Res. /Fermilab

    2008-11-01

    CMS is using Geant4 to simulate the detector setup for the forthcoming data from the LHC. Validation of physics processes inside Geant4 is a major concern in view of getting a proper description of jets and missing energy for signal and background events. This is done by carrying out extensive studies with test beams using the prototypes or real detector modules of the CMS calorimeter. These data are matched with Geant4 predictions using the same framework that is used for the entire CMS detector. Tuning of the Geant4 models is carried out, and steps to be used in reproducing detector signals are defined, in view of measurements of energy response, energy resolution, and transverse and longitudinal shower profiles for a variety of hadron beams over a broad energy spectrum between 2 and 300 GeV/c. The tuned Monte Carlo predictions match many of these measurements within systematic uncertainties.

  6. Commissioning of the Cms Tracker Outer Barrel

    NASA Astrophysics Data System (ADS)

    Bloch, Christoph

    2006-04-01

    Fully equipped final substructures of the CMS Tracker are installed in a dedicated mechanical support, the Cosmic Rack, providing a geometry suitable for tracking cosmic muons, and equipped with a dedicated trigger that allows the selection of tracks synchronous with the fast readout electronics. Data collected at room temperature and at the tracker operating temperature of -10°C can be used to test reconstruction and alignment algorithms for the tracker, as well as to perform a detailed qualification of the geometry and the functionality of the structures at different temperatures. The CMS Monte Carlo simulation has been adapted to the geometry of the cosmic rack, and the comparison with the data will provide a valuable test to improve the tracker simulation in CMS.

  7. The Physics of the CMS Experiment

    SciTech Connect

    Sanabria, J. C.

    2007-10-26

    The Large Hadron Collider (LHC) at CERN will start running in 2008, producing proton-proton collisions with a center-of-mass energy of 14 TeV. Four large experiments will operate together with this accelerator: ALICE, ATLAS, CMS and LHCb. The main scientific goal of this project is to understand in detail the mechanism of electro-weak symmetry breaking and to search for physics beyond the standard model of particle physics. ATLAS and CMS are general-purpose detectors designed for the search and discovery of new physics, and optimized to search for the Higgs boson and signals of supersymmetric matter (SUSY). In this paper the main features of the CMS detector will be presented and its potential for Higgs and SUSY discoveries will be discussed.

  8. Plans for Jet Energy Corrections at CMS

    NASA Astrophysics Data System (ADS)

    Mishra, Kalanand

    2009-05-01

    We present a plan for Jet Energy Corrections at CMS. Jet corrections at CMS will come initially from simulation tuned on test beam data, directly from collision data when available, and ultimately from a simulation tuned on collision data. The corrections will be factorized into a fixed sequence of sub-corrections associated with different detector and physics effects. The following three factors are minimum requirements for most analyses: offset corrections for pile-up and noise; correction for the response of the calorimeter as a function of jet pseudorapidity relative to the barrel; and correction for the absolute response as a function of transverse momentum in the barrel. The required correction gives a jet Lorentz vector equivalent to the sum of particles in the jet cone emanating from a QCD hard collision. We discuss the status of these corrections, the planned data-driven techniques for their derivation, and their anticipated evolution with the stages of the CMS experiment.
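The factorized chain described above, an offset subtraction followed by a relative pseudorapidity correction and an absolute transverse-momentum correction, can be sketched as a fixed sequence of functions. The functional forms and constants below are toy values invented for illustration, not actual CMS corrections.

```python
# Hedged sketch of a factorized jet-energy-correction sequence
# (toy parametrizations; not real CMS correction constants).

def offset_correction(pt, pileup_offset=2.0):
    """Subtract an average pile-up/noise offset (GeV), never going negative."""
    return max(pt - pileup_offset, 0.0)

def relative_correction(pt, eta):
    """Flatten the calorimeter response versus pseudorapidity,
    relative to the barrel (toy linear form)."""
    return pt * (1.0 + 0.02 * abs(eta))

def absolute_correction(pt):
    """Correct the absolute barrel response (toy constant scale factor)."""
    return pt * 1.05

def correct_jet_pt(raw_pt, eta):
    """Apply the three factors in their fixed order: offset, relative, absolute."""
    pt = offset_correction(raw_pt)
    pt = relative_correction(pt, eta)
    return absolute_correction(pt)

print(correct_jet_pt(raw_pt=50.0, eta=1.5))
```

Keeping the factors as separate functions mirrors the design goal stated in the abstract: each sub-correction can be re-derived from data independently without touching the others.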

  9. Fireworks: A physics event display for CMS

    SciTech Connect

    Kovalskyi, D.; Tadel, M.; Mrak-Tadel, A.; Bellenot, B.; Kuznetsov, V.; Jones, C.D.; Bauerdick, L.; Case, M.; Mulmenstadt, J.; Yagil, A.; /UC, San Diego

    2010-01-01

    Fireworks is a CMS event display which is specialized for the physics studies case. This specialization allows us to use a stylized rather than 3D-accurate representation when appropriate. Data handling is greatly simplified by using only reconstructed information and ideal geometry. Fireworks provides an easy-to-use interface which allows a physicist to concentrate only on the data in which he is interested. Data is presented via graphical and textual views. Fireworks is built using the Eve subsystem of the CERN ROOT project and CMS's FWLite project. The FWLite project was part of CMS's recent code redesign which separates data classes into libraries separate from algorithms producing the data and uses ROOT directly for C++ object storage, thereby allowing the data classes to be used directly in ROOT.

  10. Power Studies for the CMS Pixel Tracker

    SciTech Connect

    Todri, A.; Turqueti, M.; Rivera, R.; Kwan, S.; /Fermilab

    2009-01-01

    The Electronic Systems Engineering Department of the Computing Division at the Fermi National Accelerator Laboratory is carrying out R&D investigations for the upgrade of the power distribution system of the Compact Muon Solenoid (CMS) Pixel Tracker at the Large Hadron Collider (LHC). Among the goals of this effort is that of analyzing the feasibility of alternative powering schemes for the forward tracker, including DC to DC voltage conversion techniques using commercially available and custom switching regulator circuits. Tests of these approaches are performed using the PSI46 pixel readout chip currently in use at the CMS Tracker. Performance measures of the detector electronics will include pixel noise and threshold dispersion results. Issues related to susceptibility to switching noise will be studied and presented. In this paper, we describe the current power distribution network of the CMS Tracker, study the implications of the proposed upgrade with DC-DC converters powering scheme and perform noise susceptibility analysis.

  11. Power distribution studies for CMS forward tracker

    SciTech Connect

    Todri, A.; Turqueti, M.; Rivera, R.; Kwan, S.

    2009-01-01

    The Electronic Systems Engineering Department of the Computing Division at the Fermi National Accelerator Laboratory is carrying out R&D investigations for the upgrade of the power distribution system of the Compact Muon Solenoid (CMS) Pixel Tracker at the Large Hadron Collider (LHC). Among the goals of this effort is that of analyzing the feasibility of alternative powering schemes for the forward tracker, including DC to DC voltage conversion techniques using commercially available and custom switching regulator circuits. Tests of these approaches are performed using the PSI46 pixel readout chip currently in use at the CMS Tracker. Performance measures of the detector electronics will include pixel noise and threshold dispersion results. Issues related to susceptibility to switching noise will be studied and presented. In this paper, we describe the current power distribution network of the CMS Tracker, study the implications of the proposed upgrade with DC-DC converters powering scheme and perform noise susceptibility analysis.

  12. Remote Operations for LHC and CMS

    SciTech Connect

    Gottschalk, E.E.; /Fermilab

    2007-04-01

    Commissioning the Large Hadron Collider (LHC) and its experiments will be a vital part of the worldwide high energy physics program beginning in 2007. A remote operations center has been built at Fermilab to contribute to commissioning and operations of the LHC and the Compact Muon Solenoid (CMS) experiment, and to develop new capabilities for real-time data analysis and monitoring for LHC, CMS, and grid computing. Remote operations will also be essential to a future International Linear Collider with its multiple, internationally distributed control rooms. In this paper we present an overview of Fermilab's LHC@FNAL remote operations center for LHC and CMS, describe what led up to the development of the center, and describe noteworthy features of the center.

  13. The CMS CERN Analysis Facility (CAF)

    NASA Astrophysics Data System (ADS)

    Buchmüller, O.; Bonacorsi, D.; Fanzago, F.; Gowdy, S.; Kreuzer, P.; Malgeri, L.; Mankel, R.; Metson, S.; Panzer-Steindel, B.; Afonso Sanches, J.; Schwickerath, U.; Spiga, D.; Teodoro, D.; Többicke, Rainer

    2010-04-01

    The CMS CERN Analysis Facility (CAF) was primarily designed to host a large variety of latency-critical workflows. These break down into alignment and calibration, detector commissioning and diagnosis, and high-interest physics analyses requiring fast turnaround. In addition to the low-latency requirement on the batch farm, another mandatory condition is efficient access to the RAW detector data stored at the CERN Tier-0 facility. The CMS CAF also foresees resources for interactive login by a large number of CMS collaborators located at CERN, as an entry point for their day-to-day analysis. These resources will run on a separate partition in order to protect the high-priority use cases described above. While the CMS CAF represents only a modest fraction of the overall CMS resources on the WLCG GRID, an appropriately sized user-support service needs to be provided. We will describe the building, commissioning and operation of the CMS CAF during the year 2008. The facility was heavily and routinely used by almost 250 users during multiple commissioning and data challenge periods. It reached a CPU capacity of 1.4 MSI2K and a disk capacity at the petabyte scale. In particular, we will focus on the performance in terms of networking, disk access and job efficiency, and extrapolate prospects for the upcoming first year of LHC data taking. We will also present the experience gained and the limitations observed in operating such a large facility, in which well-controlled workflows are combined with more chaotic analysis activity by a large number of physicists.

  14. Test beam analysis of the effect of highly ionizing particles on the CMS Silicon Strip Tracker

    NASA Astrophysics Data System (ADS)

    De Filippis, N.; CMS Collaboration

    2004-09-01

    Highly ionizing particles (HIPs) created by nuclear interactions in the silicon sensors cause a large signal which can saturate the APV readout chip used in the CMS Silicon Tracker system. This phenomenon was studied in two different beam tests performed at the PSI and CERN X5 experimental areas in 2002. The probability of a HIP-like event occurring per incident pion was measured, and the dependence of the APV capability to detect a MIP signal on the time required to recover from such an event is derived. From these results, the expected inefficiency of the CMS Tracker due to HIPs is inferred.
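    A back-of-envelope model of this inefficiency can be sketched as follows; the HIP probability, particle rate and recovery time used here are illustrative placeholders, not the measured values from the beam tests.

```python
# Hedged sketch (illustrative numbers, not the paper's measurements): if a HIP
# occurs with probability p_hip per traversing particle and the APV chip needs
# t_rec seconds to recover, the channels served by that chip are blind for
# t_rec after each HIP.
def expected_inefficiency(p_hip, particles_per_chip_per_s, t_rec_s):
    """Fraction of time a readout chip spends recovering from HIP events.

    Assumes HIPs are rare enough that dead periods do not overlap.
    """
    hip_rate = p_hip * particles_per_chip_per_s  # HIP events per second per chip
    return min(1.0, hip_rate * t_rec_s)          # fraction of time lost
```

    With, say, a 10⁻³ HIP probability, 10⁵ particles per second per chip and a 150 ns recovery, the lost fraction is of order 10⁻⁵; the actual analysis measured each of these inputs from test-beam data.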

  15. Experience with the CMS Event Data Model

    SciTech Connect

    Elmer, P.; Hegner, B.; Sexton-Kennedy, L.; /Fermilab

    2009-06-01

    The re-engineered CMS EDM was presented at CHEP in 2006. Since that time we have gained a lot of operational experience with the chosen model. We will present some of our findings, and attempt to evaluate how well it is meeting its goals. We will discuss some of the new features that have been added since 2006 as well as some of the problems that have been addressed. Also discussed is the level of adoption throughout CMS, which spans the trigger farm up to the final physics analysis. Future plans, in particular dealing with schema evolution and scaling, will be discussed briefly.

  16. 30 CFR 75.343 - Underground shops.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Underground shops. 75.343 Section 75.343... MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Ventilation § 75.343 Underground shops. (a) Underground...-3 through § 75.1107-16, or be enclosed in a noncombustible structure or area. (b) Underground...

  17. 45 CFR 150.203 - Circumstances requiring CMS enforcement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Determining Whether States Are Failing To Substantially Enforce HIPAA Requirements § 150.203 Circumstances requiring CMS enforcement. CMS enforces HIPAA requirements to the extent warranted (as determined by CMS) in... enacted legislation to enforce or that it is not otherwise enforcing HIPAA requirements. (b)...

  18. 45 CFR 150.203 - Circumstances requiring CMS enforcement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Determining Whether States Are Failing To Substantially Enforce HIPAA Requirements § 150.203 Circumstances requiring CMS enforcement. CMS enforces HIPAA requirements to the extent warranted (as determined by CMS) in... enacted legislation to enforce or that it is not otherwise enforcing HIPAA requirements. (b)...

  19. 45 CFR 150.203 - Circumstances requiring CMS enforcement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Determining Whether States Are Failing To Substantially Enforce HIPAA Requirements § 150.203 Circumstances requiring CMS enforcement. CMS enforces HIPAA requirements to the extent warranted (as determined by CMS) in... enacted legislation to enforce or that it is not otherwise enforcing HIPAA requirements. (b)...

  20. 42 CFR 426.517 - CMS' statement regarding new evidence.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false CMS' statement regarding new evidence. 426.517... DETERMINATIONS Review of an NCD § 426.517 CMS' statement regarding new evidence. (a) CMS may review any new... limited to new evidence: (1) Submitted with the initial complaint; (2) Submitted with an amended...

  1. Commissioning of CMS Endcap Muon System

    NASA Astrophysics Data System (ADS)

    Brownell, Elizabeth

    2009-05-01

    This talk is an overview of the evolution and current state of commissioning work on the CMS endcap muon system. I intend to highlight the progress in operating the detector, some problems encountered and solutions developed, lessons learned in the process, points which still require action to be taken, and data-taking results.

  2. Monte Carlo Production Management at CMS

    NASA Astrophysics Data System (ADS)

    Boudoul, G.; Franzoni, G.; Norkus, A.; Pol, A.; Srimanobhas, P.; Vlimant, J.-R.

    2015-12-01

    The analysis of the LHC data at the Compact Muon Solenoid (CMS) experiment requires the production of a large number of simulated events. During Run I of the LHC (2010-2012), CMS produced over 12 billion simulated events, organized in approximately sixty different campaigns, each emulating specific detector conditions and LHC running conditions (pile-up). In order to aggregate the information needed for the configuration and prioritization of the event production, ensure the book-keeping of all processing requests placed by the physics analysis groups, and interface with the CMS production infrastructure, the web-based service Monte Carlo Management (McM) was developed and put in production in 2013. McM is based on recent server infrastructure technology (CherryPy + AngularJS) and relies on a CouchDB database back-end. This contribution covers the one and a half years of operational experience managing samples of simulated events for CMS, the evolution of McM's functionalities, and the extension of its capability to monitor the status and advancement of event production.
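    Since McM keeps its book-keeping in CouchDB, each production request is ultimately a JSON document. The sketch below shows what such a record might look like; every field name here is a hypothetical illustration, not the real McM schema.

```python
import json

# Hypothetical shape of a production-request record in a CouchDB back-end;
# the field names are illustrative, not McM's actual schema.
request = {
    "_id": "HYPOTHETICAL-Campaign-00001",
    "campaign": "RunISummer12",      # which detector/pile-up conditions to emulate
    "requested_events": 5_000_000,   # size of the simulated sample
    "priority": 80_000,              # used to order work on the production system
    "status": "submitted",           # book-keeping state for the analysis groups
}

# CouchDB stores documents as JSON, so the record must survive a round-trip.
assert json.loads(json.dumps(request)) == request
```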

  3. The CMS Remote Analysis Builder (CRAB)

    SciTech Connect

    Spiga, D.; Cinquilli, M.; Servoli, L.; Lacaprara, S.; Fanzago, F.; Dorigo, A.; Merlo, M.; Farina, F.; Fanfani, A.; Codispoti, G.; Bacchi, W.; /INFN, Bologna /Bologna U /CERN /INFN, CNAF /INFN, Trieste /Fermilab

    2008-01-22

    The CMS experiment will produce several PBytes of data every year, to be distributed over many computing centers located in different countries. Analysis of this data will also be performed in a distributed way, using grid infrastructure. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that gives end physicists transparent access to distributed data. Very limited knowledge of the underlying technicalities is required of the user. CRAB interacts with the local user environment, the CMS Data Management services and the Grid middleware. It is able to use WLCG, gLite and OSG middleware. CRAB has been in production and in routine use by end-users since Spring 2004. It has been extensively used in studies to prepare the Physics Technical Design Report (PTDR) and in the analysis of reconstructed event samples generated during the Computing Software and Analysis Challenge (CSA06). This involved generating thousands of jobs per day at peak rates. In this paper we discuss the current implementation of CRAB, the experience with using it in production, and the plans to improve it in the immediate future.

  4. The Tracker of the CMS Experiment

    SciTech Connect

    Migliore, Ernesto

    2005-10-12

    With more than 200 m² of silicon, the Silicon Strip Tracker of the Compact Muon Solenoid (CMS) experiment will be the largest silicon detector ever built. In this contribution the main design considerations and the status of the construction, about one and a half years after the start of module production, are reviewed.

  5. CMS results on exclusive and diffractive production

    SciTech Connect

    Alves, Gilvan A.

    2015-04-10

    We present recent CMS measurements of diffractive and exclusive processes, using data collected at 7 TeV at the LHC. Measurements of soft single- and double-diffractive cross sections are presented, as well as measurements of photon-induced processes including studies of exclusive WW production via photon-photon exchange.

  6. WLCG scale testing during CMS data challenges

    NASA Astrophysics Data System (ADS)

    Gutsche, O.; Hajdu, C.

    2008-07-01

    The CMS computing model to process and analyze LHC collision data follows a data-location driven approach and is using the WLCG infrastructure to provide access to GRID resources. As a preparation for data taking, CMS tests its computing model during dedicated data challenges. An important part of the challenges is the test of the user analysis which poses a special challenge for the infrastructure with its random distributed access patterns. The CMS Remote Analysis Builder (CRAB) handles all interactions with the WLCG infrastructure transparently for the user. During the 2006 challenge, CMS set its goal to test the infrastructure at a scale of 50,000 user jobs per day using CRAB. Both direct submissions by individual users and automated submissions by robots were used to achieve this goal. A report will be given about the outcome of the user analysis part of the challenge using both the EGEE and OSG parts of the WLCG. In particular, the difference in submission between both GRID middlewares (resource broker vs. direct submission) will be discussed. In the end, an outlook for the 2007 data challenge is given.

  7. Commissioning of the CMS Forward Pixel Detector

    SciTech Connect

    Kumar, Ashish; /SUNY, Buffalo

    2008-12-01

    The Compact Muon Solenoid (CMS) experiment is scheduled for physics data taking in summer 2009 after the commissioning of high-energy proton-proton collisions at the Large Hadron Collider (LHC). At the core of the CMS all-silicon tracker is the silicon pixel detector, comprising three barrel layers and two pixel disks in the forward and backward regions, accounting for a total of 66 million channels. The pixel detector will provide high-resolution 3D tracking points, essential for pattern recognition and precise vertexing, while being embedded in a hostile radiation environment. The end disks of the pixel detector, known as the Forward Pixel detector, have been assembled and tested at Fermilab, USA. They comprise 18 million pixel cells of dimensions 100 × 150 μm². The complete forward pixel detector was shipped to CERN in December 2007, where it underwent extensive system tests for commissioning prior to installation. The pixel system was put in its final place inside CMS following the installation and bake-out of the LHC beam pipe in July 2008. It has been integrated with the other sub-detectors in the readout since September 2008 and has participated in cosmic data taking. This report covers the strategy and results from commissioning of the CMS forward pixel detector at CERN.

  8. 23 CFR 500.109 - CMS.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... provides accurate, up-to-date information on transportation system operations and performance and assesses... and local officials may vary by type of transportation facility, geographic location (metropolitan... SYSTEMS Management Systems § 500.109 CMS. (a) For purposes of this part, congestion means the level...

  9. 23 CFR 500.109 - CMS.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... provides accurate, up-to-date information on transportation system operations and performance and assesses... and local officials may vary by type of transportation facility, geographic location (metropolitan... SYSTEMS Management Systems § 500.109 CMS. (a) For purposes of this part, congestion means the level...

  10. 23 CFR 500.109 - CMS.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... provides accurate, up-to-date information on transportation system operations and performance and assesses... and local officials may vary by type of transportation facility, geographic location (metropolitan... SYSTEMS Management Systems § 500.109 CMS. (a) For purposes of this part, congestion means the level...

  11. 23 CFR 500.109 - CMS.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... provides accurate, up-to-date information on transportation system operations and performance and assesses... and local officials may vary by type of transportation facility, geographic location (metropolitan... SYSTEMS Management Systems § 500.109 CMS. (a) For purposes of this part, congestion means the level...

  12. North American deep underground laboratories: Soudan Underground Laboratory, SNOLab, and the Sanford Underground Research Facility

    NASA Astrophysics Data System (ADS)

    Lesko, Kevin T.

    2015-08-01

    Over the past several decades, fundamental physics experiments have required access to deep underground laboratories to satisfy the increasingly strict requirements for ultra-low background environments and shielding from cosmic rays. In this presentation, I summarize the existing and anticipated physics programs and laboratory facilities of North America's deep facilities: The Soudan Underground Laboratory in Minnesota, SNOLab in Ontario, Canada, and the Sanford Underground Research Facility in Lead, South Dakota.

  13. Multinational underground nuclear parks

    SciTech Connect

    Myers, C.W.; Giraud, K.M.

    2013-07-01

    Newcomer countries expected to develop new nuclear power programs by 2030 are being encouraged by the International Atomic Energy Agency to explore the use of shared facilities for spent fuel storage and geologic disposal. Multinational underground nuclear parks (M-UNPs) are an option for sharing such facilities. Newcomer countries with suitable bedrock conditions could volunteer to host M-UNPs. M-UNPs would include back-end fuel cycle facilities, in open or closed fuel cycle configurations, with sufficient capacity to enable M-UNP host countries to provide for-fee waste management services to partner countries, and to manage waste from the M-UNP power reactors. M-UNP potential advantages include: the option for decades of spent fuel storage; fuel-cycle policy flexibility; increased proliferation resistance; high margin of physical security against attack; and high margin of containment capability in the event of beyond-design-basis accidents, thereby reducing the risk of Fukushima-like radiological contamination of surface lands. A hypothetical M-UNP in crystalline rock with facilities for small modular reactors, spent fuel storage, reprocessing, and geologic disposal is described using a room-and-pillar reference-design cavern. Underground construction cost is judged tractable through use of modern excavation technology and careful site selection. (authors)

  14. Kimballton Underground Research Facility

    NASA Astrophysics Data System (ADS)

    Rountree, Steven Derek

    2014-03-01

    The Kimballton Underground Research Facility (KURF) is an operating deep underground research facility with six active projects and more than 50 trained researchers. KURF is 30 minutes from the Virginia Tech (VT) campus in an operating limestone mine with drive-in access (e.g., roll-back truck, motor coach), over 50 miles of drifts (all 40' × 20' or larger; the current lab is 35' × 22' × 100'), and 1700' of overburden (1450 m.w.e.). The laboratory was built in 2007 and offers fiber-optic internet, LN2, 480/220/110 V power, ample water, filtered air, a constant 55 °F temperature, low Rn levels, low rock background activity, and a muon flux of only ~0.004 muons per square meter, per second, per steradian. The current users are funded by NSF, DOE, and NNSA. Current user groups: 1) mini-LENS (VT, Louisiana State University, BNL); 2) Double Beta Decay to Excited States (Duke University); 3) HPGe Low-Background Screening (University of North Carolina (UNC), VT); 4) MALBEK (UNC); 5&6) Watchman - 5) Radionuclide Detector and 6) MARS detector (LLNL, SNL, UC-Davis, UC-Berkeley, UH, Hawaii Pacific, UC-Irvine, VT).
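    To put the quoted flux in perspective, it can be folded with a detector area and an acceptance solid angle to estimate a raw muon rate. The sketch below assumes, purely for illustration, a flux uniform over the upper hemisphere; the true underground angular distribution is strongly peaked near vertical.

```python
import math

# Rough muon-rate estimate from an omnidirectional flux quoted in
# muons m^-2 s^-1 sr^-1. Treating the flux as uniform over the upper
# hemisphere (2*pi sr) is a simplification made only for illustration.
def muons_per_day(flux_per_m2_s_sr, area_m2, solid_angle_sr=2 * math.pi):
    return flux_per_m2_s_sr * area_m2 * solid_angle_sr * 86_400  # seconds/day
```

    With the quoted ~0.004 m⁻² s⁻¹ sr⁻¹, a 1 m² detector would see on the order of a couple of thousand muons per day under this crude assumption, compared with millions per day at the surface.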

  15. Underground nuclear astrophysics: Why and how

    NASA Astrophysics Data System (ADS)

    Best, A.; Caciolli, A.; Fülöp, Zs.; Gyürky, Gy.; Laubenstein, M.; Napolitani, E.; Rigato, V.; Roca, V.; Szücs, T.

    2016-04-01

    The goal of nuclear astrophysics is to measure the cross-sections of nuclear reactions of interest in astrophysics. At stellar temperatures, these cross-sections are very low due to the suppression by the Coulomb barrier. Cosmic-ray-induced background can seriously limit the determination of reaction cross-sections at energies relevant to astrophysical processes, and experimental setups must be arranged so as to improve the signal-to-noise ratio. Placing experiments in underground sites reduces this background, opening the way towards ultra-low cross-section determination. LUNA (Laboratory for Underground Nuclear Astrophysics) was a pioneer in this respect. Two accelerators were mounted at the INFN National Laboratories of Gran Sasso (LNGS), allowing nuclear reactions to be studied close to stellar energies. A summary of the relevant technology used, including accelerators, target production and characterisation, and background treatment, is given.
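    The Coulomb suppression mentioned above is conventionally written as σ(E) = S(E)/E · exp(−2πη), where S(E) is the astrophysical S-factor and η the Sommerfeld parameter. A minimal sketch of the exponential factor, using the standard numerical form 2πη ≈ 31.29 Z₁Z₂√(μ/E) with the reduced mass μ in amu and E in keV:

```python
import math

# Barrier-penetration (Gamow) factor exp(-2*pi*eta) for two nuclei of charges
# z1, z2 with reduced mass mu_amu (amu) at center-of-mass energy e_kev (keV).
# 31.29 is the standard numerical constant for these units.
def gamow_factor(z1, z2, mu_amu, e_kev):
    two_pi_eta = 31.29 * z1 * z2 * math.sqrt(mu_amu / e_kev)
    return math.exp(-two_pi_eta)

# The suppression deepens rapidly toward stellar energies, which is why the
# cosmic-ray background reduction of an underground site becomes decisive.
suppression_low = gamow_factor(1, 1, 0.5, 10.0)     # deep sub-barrier
suppression_high = gamow_factor(1, 1, 0.5, 1000.0)  # near-barrier
assert suppression_low < suppression_high
```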

  16. Underground nuclear astrophysics studies with CASPAR

    NASA Astrophysics Data System (ADS)

    Robertson, Daniel; Couder, Manoel; Greife, Uwe; Strieder, Frank; Wiescher, Michael

    2016-02-01

    The drive of low-energy nuclear astrophysics laboratories is to study the reactions important to stellar burning processes and to elemental production through stellar nucleosynthesis, over the energy range of astrophysical interest. As laboratory measurements approach the stellar burning window, the rapid drop-off of cross-sections is a significant barrier and drives the need to lower background interference. The natural background suppression of underground accelerator facilities enables the extension of current experimental data to lower energies. Examples of such reactions of interest are those thought to be sources of neutrons for the s-process, the major production mechanism for elements above the iron peak. The reactions 13C(α,n)16O and 22Ne(α,n)25Mg are the proposed initial focus of the new nuclear astrophysics accelerator laboratory (CASPAR) currently under construction at the Sanford Underground Research Facility, Lead, South Dakota.

  17. One Year of FOS Measurements in CMS Experiment at CERN

    NASA Astrophysics Data System (ADS)

    Szillási, Zoltán; Buontempo, Salvatore; Béni, Noémi; Breglio, Giovanni; Cusano, Andrea; Laudati, Armando; Giordano, Michele; Saccomanno, Andrea; Druzhkin, Dmitry; Tsirou, Andromachi

    Results are presented on the activity carried out by our research group, in collaboration with the SME Optosmart s.r.l. (an Italian spin-off company), on the application of Fiber Optic Sensor (FOS) techniques to monitor high-energy physics (HEP) detectors. Since the radiation hardness of Fiber Bragg Grating (FBG) sensors has already been studied in depth for other fields of application, we have applied FBG technology to the HEP research domain. We present experimental evidence that this class of sensors can also be used under the very complex environmental conditions of HEP detectors. In particular, we present more than one year of FBG measurement data from the Compact Muon Solenoid (CMS) experiment at CERN, where we have monitored temperatures (within the CMS core) and strains at different locations using FBG sensors during detector operation with Large Hadron Collider (LHC) collisions and high magnetic field. The stability and reliability of the FOS data and readout system are demonstrated, with continuous 24/7 data taking under severe and complex conditions.

  18. The bakelite for the RPCs of the experiment CMS

    NASA Astrophysics Data System (ADS)

    Altieri, S.; Belli, G.; Bruno, G.; Guida, R.; Merlo, M.; Ratti, S. P.; Riccardi, C.; Torre, P.; Vitulo, P.; Mognaschi, E. R.; Abbrescia, M.; Colaleo, A.; Iaselli, G.; Loddo, F.; Maggi, M.; Marangelli, B.; Natali, S.; Nuzzo, S.; Pugliese, G.; Ranieri, A.; Romano, F.

    2000-12-01

    Results from aging tests on the bakelite used for the CMS RPCs are presented. Samples of melaminic bakelite were exposed to heavy gamma and neutron radiation. Data on the bulk resistivity were collected while accumulating gamma and neutron doses and particle fluences up to values well beyond those expected in 10 years of RPC operation in the barrel region of CMS. The test with gamma radiation was performed at the CERN Gamma Irradiation Facility (GIF) with a 20 Ci 137Cs source. A total absorbed dose of 5 Gy was accumulated during an irradiation period of about one month. The test with both neutron and gamma radiation was held at the Triga Mark II 250 kW reactor located in Pavia. A total of 80 h of exposure was accumulated, integrating a neutron and gamma dose of about 80 Gy and a fast neutron fluence of some 10¹¹ cm⁻². Experimental data on the dose rate in both test facilities have been compared to simulation output and show good agreement.

  19. CMS dashboard for monitoring of the user analysis activities

    NASA Astrophysics Data System (ADS)

    Karavakis, Edward; Andreeva, Julia; Maier, Gerhild; Khan, Akram

    2012-12-01

    The CMS Virtual Organisation (VO) uses various fully distributed job submission methods and execution backends. CMS jobs are processed on several middleware platforms such as gLite, ARC and OSG. Up to 200,000 CMS jobs are submitted daily to the Worldwide LHC Computing Grid (WLCG) infrastructure, and this number is steadily growing. These factors increase the complexity of monitoring the user analysis activities within the CMS VO. Reliable monitoring is of particular importance: it is a vital factor for the overall improvement of the quality of the CMS VO infrastructure.

  20. HAWAII LEAKING UNDERGROUND STORAGE TANKS

    EPA Science Inventory

    Point coverage of leaking underground storage tanks (LUST) for the state of Hawaii. The original database was developed and is maintained by the State of Hawaii, Dept. of Health. The point locations represent facilities where one or more leaking underground storage tanks exist. ...

  1. A Course on Underground Processing.

    ERIC Educational Resources Information Center

    Miller, Clarence A.

    1981-01-01

    Discusses a one-semester course on recovering fossil fuels and minerals from underground formations. Includes course outline and information of its major divisions: (1) Geological Background; (2) Flow, Transport, and Interfacial Phenomena in Porous Media; and (3) Description of Underground Processes. (SK)

  2. A Case for Underground Schools.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Education, Oklahoma City.

    The underground school offers several advantages. Preliminary studies in Oklahoma have shown that these schools perform exceptionally well as learning environments. The lack of noise and distractions helps teachers keep the attention of their students. Underground structures can protect people against a broad range of natural and man-made…

  3. Trends in underground mining

    SciTech Connect

    Not Available

    1983-11-01

    This article presents some of the recent equipment developments in underground mining. Recent improvements in the transmission and braking systems of electric load-haul-dump (LHD) machines are improving cycle times and lowering maintenance costs. Torque produced by electric motors increases in response to load, which results in very high stall torque and fast-cycling machines. Other features becoming available include: improved switchgear and adjustable solid-state overload relays; a new across-the-line motor starter with vacuum contactors, power-factor correction capacitors, and a DC operating coil to eliminate contact chatter at low voltage; remote radio control of LHDs in mine areas; water-cooled and air-cooled engines; and hydraulic braking systems. The advantages of hydraulic drifters and roof bolters are presented.

  4. Underground pumped hydroelectric storage

    NASA Astrophysics Data System (ADS)

    Allen, R. D.; Doherty, T. J.; Kannberg, L. D.

    1984-07-01

    Underground pumped hydroelectric energy storage was conceived as a modification of surface pumped storage to eliminate dependence upon fortuitous topography, provide higher hydraulic heads, and reduce environmental concerns. A UPHS plant offers substantial savings in investment cost over coal-fired cycling plants and savings in system production costs over gas turbines. Potential location near load centers lowers transmission costs and line losses. Environmental impact is less than that for a coal-fired cycling plant. The inherent benefits include those of all pumped storage (i.e., rapid load response, emergency capacity, improvement in efficiency as pumps improve, and capacity for voltage regulation). A UPHS plant would be powered by either a coal-fired or nuclear baseload plant. The economic capacity of a UPHS plant would be in the range of 1000 to 3000 MW. This storage level is compatible with the load-velocity requirements of a greater metropolitan area with population of 1 million or more.
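    The economics above rest on the basic pumped-storage energy relation E = ρ g h V η; a minimal sketch, with illustrative reservoir numbers rather than figures from the study:

```python
# Stored energy of a pumped-hydro plant: E = rho * g * head * volume * efficiency.
# The head, reservoir volume and round-trip efficiency below are illustrative only.
def stored_energy_mwh(head_m, volume_m3, efficiency=0.75,
                      rho=1000.0, g=9.81):
    joules = rho * g * head_m * volume_m3 * efficiency
    return joules / 3.6e9  # J -> MWh

# A deep-underground 1400 m head over a 1e7 m^3 reservoir stores roughly
# 2.9e4 MWh, i.e. enough to run a plant in the 1000-3000 MW range discussed
# above for on the order of ten hours.
```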

  5. Muon Reconstruction and Identification in CMS

    SciTech Connect

    Everett, A.

    2010-02-10

    We present the design strategies and status of the CMS muon reconstruction and identification software. Muon reconstruction and identification is accomplished through a variety of complementary algorithms. The CMS muon reconstruction software is based on a Kalman filter technique; it reconstructs muons in the standalone muon system, using information from all three types of muon detectors, and links the resulting muon tracks with tracks reconstructed in the silicon tracker. In addition, a muon identification algorithm has been developed which aims to identify muons with high efficiency while maintaining a low probability of misidentification. The identification algorithm is complementary by design to the reconstruction algorithm that starts track reconstruction in the muon detectors: it accepts reconstructed tracks from the inner tracker and attempts to quantify the muon compatibility of each track using associated calorimeter and muon detector hit information. The performance status is based on detailed detector simulations as well as initial studies using cosmic muon data.
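    The Kalman filter at the heart of such track fitting repeatedly blends a predicted state with each new detector measurement, weighted by their uncertainties. A one-dimensional didactic sketch of the update step (an illustration of the technique, not CMS code):

```python
# One-dimensional Kalman update: combine a state estimate (x, variance p)
# with a measurement (z, variance r). Track fitters apply the same idea in
# several dimensions, once per detector layer.
def kalman_update(x, p, z, r):
    k = p / (p + r)            # Kalman gain: how much to trust the measurement
    x_new = x + k * (z - x)    # state pulled toward the measurement
    p_new = (1.0 - k) * p      # uncertainty always shrinks after an update
    return x_new, p_new

# A vague prior (variance 4) updated with a precise hit (variance 1):
x, p = kalman_update(0.0, 4.0, 1.0, 1.0)   # gain 0.8 -> x ~ 0.8, p ~ 0.8
```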

  6. Observation of hard diffraction with CMS

    SciTech Connect

    Obertino, M. M.

    2009-03-23

    Diffraction with a hard scale can be observed in the first LHC data. We present studies of single diffractive W-boson production (pp → Xp, with X including a W boson) and of Υ photoproduction (pp → Υp, with Υ → μ⁺μ⁻). The feasibility of observing these processes with the CMS detector using the first 100 pb⁻¹ of collected integrated luminosity for single interactions is discussed.

  7. New Forward and Diffractive Physics at CMS

    NASA Astrophysics Data System (ADS)

    Santoro, Alberto

    2011-04-01

    Forward and diffractive physics (FWP) at the LHC opens a new window onto this type of strong interaction. We will present a didactic description of the topics being developed at CMS. There are as yet no new FWP results to present; data are being accumulated, and new results are expected soon. We will show a number of topics and the detector capabilities needed to observe several topologies. We expect to give an optimistic view of the area.

  8. Studies of vector boson production at CMS

    NASA Astrophysics Data System (ADS)

    Dordevic, Milos; CMS Collaboration

    2016-07-01

    The most recent diboson production and electroweak physics results from CMS are presented. This overview is focused on the precise measurement of WW, WZ, ZZ and γγ production, as well as W or Z production in association with a photon. These results are interpreted in terms of constraints on anomalous triple gauge couplings, while the study of WWγ and WZγ production is used to set limits on anomalous quartic gauge couplings. A selection of the latest electroweak results is also presented.

  9. Operation of the CMS silicon strip tracker

    NASA Astrophysics Data System (ADS)

    Yuri, Gotra; CMS Collaboration

    2011-10-01

    The CMS Silicon Strip Tracker (SST), comprising 9.6 million readout channels from 15,148 modules covering an area of about 200 m², needs to be precisely calibrated in order to correctly interpret and reconstruct the events recorded from the detector, ensuring that the SST performance fully meets the physics research program of the CMS experiment. Calibration constants may be derived from promptly reconstructed events as well as from pedestal runs gathered just before the acquisition of physics runs. These calibration procedures were exercised in summer and winter 2009, when the CMS detector was commissioned using cosmic muons and proton-proton collisions at center-of-mass energies of 900 GeV and 2.36 TeV. During these data-taking periods the performance of the SST was carefully studied: the noise of the detector, the data integrity, the signal-to-noise ratio, the hit reconstruction efficiency and the calibration workflows were all checked for stability under different conditions, at the module level. The calibration procedures and the detector performance results from recent physics runs are described.

  10. CMS High Level Trigger Timing Measurements

    NASA Astrophysics Data System (ADS)

    Richardson, Clint

    2015-12-01

    The two-level trigger system employed by CMS consists of the Level 1 (L1) Trigger, which is implemented using custom-built electronics, and the High Level Trigger (HLT), a farm of commercial CPUs running a streamlined version of the offline CMS reconstruction software. The operational L1 output rate of 100 kHz, together with the number of CPUs in the HLT farm, imposes a fundamental constraint on the amount of time available for the HLT to process events. Exceeding this limit impacts the experiment's ability to collect data efficiently. Hence, there is a critical need to characterize the performance of the HLT farm, as well as of the algorithms it runs, prior to startup in order to ensure optimal data taking. Additional complications arise from the fact that the HLT farm consists of multiple generations of hardware and there can be subtleties in machine performance. We present our methods of measuring the timing performance of the CMS HLT, including the challenges of making such measurements. Results for the performance of various Intel Xeon architectures from 2009 to 2014 and different data taking scenarios are also presented.
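
    The constraint described above can be made concrete with a back-of-envelope calculation: the average time each event may take in the HLT is set by the L1 accept rate and the farm size. The following sketch uses the 100 kHz rate quoted in the abstract, but the core count is a hypothetical figure for illustration, not an official CMS number.

```python
# Back-of-envelope HLT timing budget. The L1 rate is from the text;
# the farm core count is an assumed, illustrative value.

L1_RATE_HZ = 100_000   # L1 output rate quoted in the abstract
FARM_CORES = 13_000    # hypothetical number of CPU cores in the HLT farm

def hlt_budget_ms(l1_rate_hz, farm_cores):
    """Average per-event processing budget, in milliseconds:
    each core must on average finish before the next event arrives."""
    return farm_cores / l1_rate_hz * 1000.0

print(f"{hlt_budget_ms(L1_RATE_HZ, FARM_CORES):.0f} ms per event")  # → 130 ms per event
```

    Exceeding this average budget means events queue up faster than the farm can drain them, which is why sustained per-algorithm timing measurements matter.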

  11. X13CMS: Global Tracking of Isotopic Labels in Untargeted Metabolomics

    PubMed Central

    2015-01-01

    Studies of isotopically labeled compounds have been fundamental to understanding metabolic pathways and fluxes. They have traditionally, however, been used in conjunction with targeted analyses that identify and quantify a limited number of labeled downstream metabolites. Here we describe an alternative workflow that leverages recent advances in untargeted metabolomic technologies to track the fates of isotopically labeled metabolites in a global, unbiased manner. This untargeted approach can be applied to discover novel biochemical pathways and characterize changes in the fates of labeled metabolites as a function of altered biological conditions such as disease. To facilitate the data analysis, we introduce X13CMS, an extension of the widely used mass spectrometry-based metabolomic software package XCMS. X13CMS uses the XCMS platform to detect metabolite peaks and perform retention-time alignment in liquid chromatography/mass spectrometry (LC/MS) data. With the use of the XCMS output, the program then identifies isotopologue groups that correspond to isotopically labeled compounds. The retrieval of these groups is done without any a priori knowledge besides the following input parameters: (i) the mass difference between the unlabeled and labeled isotopes, (ii) the mass accuracy of the instrument used in the analysis, and (iii) the estimated retention-time reproducibility of the chromatographic method. Despite its name, X13CMS can be used to track any isotopic label. Additionally, it detects differential labeling patterns in biological samples collected from parallel control and experimental conditions. We validated the ability of X13CMS to accurately retrieve labeled metabolites from complex biological matrices both with targeted LC/MS/MS analysis of a subset of the hits identified by the program and with labeled standards spiked into cell extracts. 
We demonstrate the full functionality of X13CMS with an analysis of cultured rat astrocytes treated with uniformly
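
    The core isotopologue search the abstract describes takes only three inputs: the unlabeled-to-labeled mass difference, the instrument's mass accuracy, and the retention-time reproducibility. A minimal sketch of that pairing logic follows; X13CMS itself is an R package operating on XCMS peak tables, so all names, parameter values and data structures here are illustrative assumptions, not its actual API.

```python
# Hypothetical sketch of isotopologue-pair detection as described in
# the abstract. Parameter values and structures are illustrative only.

DELTA_M = 1.00336   # 13C - 12C mass difference (Da): input parameter (i)
PPM_TOL = 5.0       # instrument mass accuracy (ppm): input parameter (ii)
RT_TOL = 10.0       # retention-time reproducibility (s): input parameter (iii)

def find_isotopologue_pairs(peaks, max_labels=6):
    """peaks: list of (mz, rt) tuples from peak detection.
    Returns (i, j, n) triples where peak j could be peak i carrying
    n heavy labels, requiring co-elution and an m/z match within ppm."""
    pairs = []
    for i, (mz_i, rt_i) in enumerate(peaks):
        for j, (mz_j, rt_j) in enumerate(peaks):
            if j == i or abs(rt_i - rt_j) > RT_TOL:
                continue  # isotopologues of one compound must co-elute
            for n in range(1, max_labels + 1):
                expected = mz_i + n * DELTA_M
                if abs(mz_j - expected) <= expected * PPM_TOL * 1e-6:
                    pairs.append((i, j, n))
    return pairs

# e.g. glucose-like peaks: unlabeled, +1 label, +6 labels (values made up)
peaks = [(180.0634, 120.0), (181.0667, 121.0), (186.0835, 120.5)]
print(find_isotopologue_pairs(peaks))
```

    A real implementation additionally filters spurious pairings (e.g. a +1 peak also matching a +6 peak) using intensity patterns and the comparison between labeled and unlabeled sample groups.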

  12. Open beauty measurements in pPb collisions with CMS

    NASA Astrophysics Data System (ADS)

    Kim, Hyunchul

    2014-11-01

    The B+, B0 and Bs0 mesons are exclusively reconstructed in proton-lead (pPb) collisions at √{sNN} = 5.02 TeV by the CMS Collaboration at the Large Hadron Collider (LHC). The cross sections are measured in the range of transverse momentum of 10 to 60 GeV / c and the center-of-mass rapidity smaller than 1.93. The nuclear modification factor, which is the ratio of the cross section in between pPb and proton-proton (pp) collisions, is estimated using the experimental data from pPb collisions and theoretical calculations as the pp reference. The calculated nuclear modification factors for each particle species are consistent with unity within the current uncertainties. The forward-to-backward asymmetry of B+ is also analyzed and does not show any nuclear effect in the measured rapidity range.
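
    The nuclear modification factor mentioned above is conventionally the pPb cross section divided by the pp reference scaled by the number of nucleons in the nucleus; a value of unity signals the absence of nuclear effects. A minimal sketch, with invented cross-section values for illustration:

```python
# Illustrative nuclear modification factor:
#   R_pPb = (dσ/dpT)_pPb / (A · (dσ/dpT)_pp),  A = 208 for lead.
# The cross-section numbers below are made up for the example.

A_PB = 208  # mass number of the lead nucleus

def r_ppb(xsec_ppb_nb, xsec_pp_nb):
    """Nuclear modification factor; 1.0 means no nuclear effect."""
    return xsec_ppb_nb / (A_PB * xsec_pp_nb)

print(r_ppb(13.0, 0.0625))  # → 1.0, i.e. consistent with unity
```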

  13. Pumping carbon out of underground coal deposits

    SciTech Connect

    Steinberg, M.

    1999-07-01

    Thin-seam and deep coal deposits are difficult and costly to mine. Underground coal gasification (UCG) with air or oxygen was thought to alleviate this problem. Experimental field tests were conducted in Wyoming and Illinois. Problems were encountered concerning a clear path for the steam gasification to take place and removal of the gas. The high endothermic heat of reaction, requiring large quantities of steam and oxygen, makes the process expensive. Safety problems due to incomplete reaction are also of concern. A new approach is proposed which can remedy most of these drawbacks for extracting energy from underground coal deposits. It is proposed to hydrogasify the coal underground with a heated hydrogen gas stream under pressure to produce a methane-rich gas effluent stream. The hydrogasification of coal is essentially exothermic, so no steam or oxygen is required. The gases formed are always in a reducing atmosphere, making the process safe. The hydrogen is obtained by thermally decomposing the effluent methane above ground into elemental carbon and hydrogen. The hydrogen is returned underground for further hydrogasification of the coal seam. The small amounts of oxygen and sulfur in the coal can be processed out above ground by removal as water and H{sub 2}S. Any CO can be removed by a methanation step, returning the methane to the process. The ash remains in the ground, and the elemental carbon produced is the purest form of coal. The particulate carbon can be slurried with water to produce a fuel stream that can be fed to a turbine for efficient combined-cycle power plants with lower CO{sub 2} emissions. Coal itself cannot be used for combined cycle because its ash and sulfur content destroys the gas turbine. Depending on the composition of the coal seam, some excess hydrogen is also produced. Hydrogen is thus used to pump pure carbon out of the ground.
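
    The above-ground decomposition step in this scheme follows the simple stoichiometry CH4 → C + 2 H2. A mass balance on that reaction (idealized, ignoring the oxygen, sulfur and CO side streams the abstract mentions) shows the split between carbon withdrawn and hydrogen recycled underground:

```python
# Idealized mass balance for the methane decomposition step,
# CH4 -> C + 2 H2. Side streams (H2O, H2S, CO) are neglected here.

M_CH4, M_C, M_H2 = 16.04, 12.01, 2.016  # molar masses, g/mol

def decompose_methane(kg_ch4):
    """Return (kg carbon, kg hydrogen) from fully decomposing kg_ch4."""
    mol = kg_ch4 * 1000.0 / M_CH4
    carbon_kg = mol * M_C / 1000.0
    hydrogen_kg = mol * 2 * M_H2 / 1000.0
    return carbon_kg, hydrogen_kg

c, h2 = decompose_methane(100.0)
print(f"{c:.1f} kg carbon, {h2:.1f} kg hydrogen per 100 kg CH4")
```

    Roughly three quarters of the methane mass comes out as solid carbon, which is why the process can be viewed as "pumping carbon out of the ground" with hydrogen as the carrier.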

  14. German contributions to the CMS computing infrastructure

    NASA Astrophysics Data System (ADS)

    Scheurer, A.; German CMS Community

    2010-04-01

    The CMS computing model anticipates various hierarchically linked tier centres to counter the challenges posed by the enormous amounts of data which will be collected by the CMS detector at the Large Hadron Collider, LHC, at CERN. During the past years, various computing exercises were performed to test the readiness of the computing infrastructure, the Grid middleware and the experiment's software for the startup of the LHC, which took place in September 2008. In Germany, several tier sites have been set up to allow for an efficient and reliable way to simulate possible physics processes as well as to reprocess, analyse and interpret the numerous stored collision events of the experiment. It will be shown that the German computing sites played an important role during the experiment's preparation phase and during data-taking of CMS and, therefore, scientific groups in Germany will be ready to compete for discoveries in this new era of particle physics. This presentation focuses on the German Tier-1 centre GridKa, located at Forschungszentrum Karlsruhe, and the German CMS Tier-2 federation DESY/RWTH with installations at the University of Aachen and the research centre DESY. In addition, various local computing resources in Aachen, Hamburg and Karlsruhe are briefly introduced as well. It will be shown that excellent cooperation between the different German institutions and physicists led to well-established computing sites which cover all parts of the CMS computing model. Therefore, the following topics are discussed and the achieved goals and the gained knowledge are depicted: data management and distribution among the different tier sites, Grid-based Monte Carlo production at the Tier-2s as well as Grid-based and locally submitted inhomogeneous user analyses at the Tier-3s. Another important task is to ensure proper and reliable operation 24 hours a day, especially during the time of data-taking. For this purpose, the meta-monitoring tool "HappyFace", which was

  15. Assessment of Cosmic Background Attenuation at Building 3425 (Underground Laboratory)

    SciTech Connect

    Kouzes, Richard T.; Borgardt, James D.; Lintereur, Azaree T.; Panisko, Mark E.

    2009-10-01

    Specifications for the Underground Facility (building 3425) in the Radiation Detection and Nuclear Sciences complex presently under construction at Pacific Northwest National Laboratory mandate 30 meters of water-equivalent shielding for cosmic background attenuation at the 30-foot underground depth of the laboratory. A set thickness of a specified fill material was determined; however, a smaller thickness of a higher-density material was used for the earthen bunker. Questions arose as to whether this altered configuration met the required shielding specifications. A series of measurements was made to address this concern using a 4”x4”x16” NaI(Tl) detector (Scionix Holland, 3.5N-E2-X). Cosmic ray data were taken at the surface and at several locations within the underground facility in order to obtain an experimental value for the attenuation of the cosmic radiation. This experimental result was compared with the contracted attenuation.
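
    The trade-off at issue, less thickness of a denser fill, follows from the water-equivalent conversion: overburden of thickness t and density ρ provides t·ρ/ρ_water meters of water equivalent. A minimal sketch, with assumed (not measured) fill densities:

```python
# Water-equivalent conversion behind the 30 m.w.e. shielding spec.
# Fill densities below are assumed for illustration, not measurements.

RHO_WATER = 1.0  # g/cm^3

def meters_water_equivalent(thickness_m, density_g_cm3):
    """Shielding in meters of water equivalent (m.w.e.)."""
    return thickness_m * density_g_cm3 / RHO_WATER

# The same 30 m.w.e. can be met with less of a denser material:
print(round(meters_water_equivalent(15.0, 2.0), 1))   # → 30.0
print(round(meters_water_equivalent(11.5, 2.6), 1))   # → 29.9, slightly short
```

    A calculation like this only sets the expectation; the muon-flux measurements described in the abstract are what verify the as-built configuration.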

  16. Environmental benefits of underground coal gasification.

    PubMed

    Liu, Shu-qin; Liu, Jun-hua; Yu, Li

    2002-04-01

    Environmental benefits of underground coal gasification are evaluated. The results showed that through underground coal gasification, gangue discharge is eliminated, sulfur emission is reduced, and the amounts of ash, mercury, and tar discharged are decreased. Moreover, the effect of underground gasification on groundwater is analyzed and a CO2 disposal method is put forward. PMID:12046301

  17. Underground pumped hydroelectric storage

    SciTech Connect

    Allen, R.D.; Doherty, T.J.; Kannberg, L.D.

    1984-07-01

    Underground pumped hydroelectric energy storage was conceived as a modification of surface pumped storage to eliminate dependence upon fortuitous topography, provide higher hydraulic heads, and reduce environmental concerns. A UPHS plant offers substantial savings in investment cost over coal-fired cycling plants and savings in system production costs over gas turbines. Potential location near load centers lowers transmission costs and line losses. Environmental impact is less than that for a coal-fired cycling plant. The inherent benefits include those of all pumped storage (i.e., rapid load response, emergency capacity, improvement in efficiency as pumps improve, and capacity for voltage regulation). A UPHS plant would be powered by either a coal-fired or nuclear baseload plant. The economic capacity of a UPHS plant would be in the range of 1000 to 3000 MW. This storage level is compatible with the load-leveling requirements of a greater metropolitan area with population of 1 million or more. The technical feasibility of UPHS depends upon excavation of a subterranean powerhouse cavern and reservoir caverns within a competent, impervious rock formation, and upon selection of reliable and efficient turbomachinery - pump-turbines and motor-generators - all remotely operable.
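
    The appeal of the higher hydraulic heads mentioned above can be seen from the basic stored-energy relation E = ρ·g·h·V·η: capacity scales linearly with head, so a deep underground lower reservoir multiplies the energy stored per unit of excavated volume. A rough sketch with illustrative numbers (not figures from the report):

```python
# Rough pumped-hydro capacity estimate, E = ρ g h V η.
# Head, volume and efficiency below are illustrative assumptions.

RHO = 1000.0    # water density, kg/m^3
G = 9.81        # gravitational acceleration, m/s^2

def storage_energy_mwh(head_m, volume_m3, efficiency=0.75):
    """Recoverable energy in MWh for one reservoir drawdown."""
    joules = RHO * G * head_m * volume_m3 * efficiency
    return joules / 3.6e9  # J -> MWh

# e.g. a 1400 m head with a 10 million m^3 reservoir:
print(f"{storage_energy_mwh(1400.0, 1e7):.0f} MWh")  # on the order of 30 GWh
```

    At that scale a plant could deliver the 1000 to 3000 MW quoted in the abstract for many hours per cycle, which is what makes the concept suited to metropolitan load-leveling.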

  18. Muon tracking underground

    NASA Astrophysics Data System (ADS)

    Battistoni, G.; Campana, P.; Chiarella, V.; Denni, U.; Iarocci, E.

    1986-04-01

    The design and performance of plastic streamer tubes for use in large underground particle-physics experiments such as the muon, astrophysics, and cosmic-ray observatory (MACRO) being developed for Gran Sasso Laboratory are reported. The large (1000 sq m or more) detector area required to achieve high-angular-resolution muon tracking in MACRO is covered by modules with eight 3 x 3-cm-cross-section active streamer-tube cells each, similar to those used in the Mt. Blanc Laboratory detector. The MACRO modules have a maximum length of 12 m, and the cells have 60-micron-diameter wires, two conducting graphite sides, and two insulating sides (electrodeless electric-field shaping). The results of performance tests with a 3:1 He:n-pentane mixture flowing through a tube module are presented graphically. A spatial resolution of 1 cm and a time resolution of 100 ns are obtained, and the ability of the streamer tubes to detect large ionization losses with respect to the minimum is demonstrated.

  19. Research on Joint Parameter Inversion for an Integrated Underground Displacement 3D Measuring Sensor

    PubMed Central

    Shentu, Nanying; Qiu, Guohua; Li, Qing; Tong, Renyuan; Shentu, Nankai; Wang, Yanjie

    2015-01-01

    Underground displacement monitoring is a key means to monitor and evaluate geological disasters and geotechnical projects. Few practical instruments are able to monitor subsurface horizontal and vertical displacements simultaneously, owing to the invisibility and complexity of subsurface monitoring. A novel underground displacement 3D measuring sensor was proposed in our previous studies, and great effort has been devoted to basic theoretical research on its underground displacement sensing and measuring characteristics by means of modeling, simulation and experiments. This paper presents an innovative underground displacement joint inversion method that combines a specific forward modeling approach with an approximate optimization inversion procedure. It can realize a joint inversion of underground horizontal and vertical displacement for the proposed 3D sensor. Comparative studies have been conducted between the measured and inverted parameters of underground horizontal and vertical displacements under a variety of experimental and inverse conditions. The results showed that when experimentally measured horizontal and vertical displacements are both varied within 0 ~ 30 mm, horizontal and vertical displacement inversion discrepancies are generally less than 3 mm and 1 mm, respectively, under three kinds of simulated underground displacement monitoring circumstances. This implies that our proposed underground displacement joint inversion method is robust and efficient in predicting the measured values of underground horizontal and vertical displacements for the proposed sensor. PMID:25871714
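
    The "forward modeling + approximate optimization inversion" strategy can be sketched generically: a forward model maps the displacement pair to predicted sensor readings, and the inversion searches the 0 to 30 mm range for the pair that minimizes the misfit to the measured readings. The linear forward model and all coefficients below are invented for illustration; the paper's model is specific to its sensor.

```python
# Generic sketch of forward-model-based joint inversion.
# The forward model here is hypothetical, chosen only to be invertible.

def forward(h, v):
    """Map (horizontal, vertical) displacement in mm to two readings."""
    return (0.8 * h + 0.3 * v, 0.2 * h + 0.9 * v)

def invert(measured, lo=0.0, hi=30.0, steps=301):
    """Brute-force grid search over the 0-30 mm range from the text,
    minimizing the squared misfit between predicted and measured data."""
    best, best_err = (lo, lo), float("inf")
    for i in range(steps):
        h = lo + (hi - lo) * i / (steps - 1)
        for j in range(steps):
            v = lo + (hi - lo) * j / (steps - 1)
            p = forward(h, v)
            err = (p[0] - measured[0]) ** 2 + (p[1] - measured[1]) ** 2
            if err < best_err:
                best, best_err = (h, v), err
    return best

h, v = invert(forward(12.0, 5.0))
print(round(h, 1), round(v, 1))  # recovers the true (12.0, 5.0)
```

    A practical scheme would replace the grid search with a gradient-based or heuristic optimizer, but the structure, forward prediction inside a misfit-minimization loop, is the same.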

  20. Underground caverns for hydrocarbon storage

    SciTech Connect

    Barron, T.F.

    1998-12-31

    Large, international gas processing projects and growing LPG imports in developing countries are driving the need to store large quantities of hydrocarbon liquids. Even though underground storage is common in the US, many people outside the domestic industry are not familiar with the technology and the benefits underground storage can offer. The latter include lower construction and operating costs than surface storage, added safety, security and greater environmental acceptance.

  1. CMS Jet and Missing $E_T$ Commissioning

    SciTech Connect

    Elvira, V.Daniel; /Fermilab

    2009-01-01

    We describe how jets and E{sub T} are defined, reconstructed, and calibrated in CMS, as well as how the CMS detector performs in measuring these physics objects. Performance results are derived from the CMS simulation application, based on Geant4, and also from noise and cosmic commissioning data taken before the first collision event was recorded by CMS in November 2009. A jet and E{sub T} startup plan is in place which includes a data quality monitoring and prompt analysis task force to identify and fix problems as they arise.

  2. Using the CMS threaded framework in a production environment

    SciTech Connect

    Jones, C. D.; Contreras, L.; Gartung, P.; Hufnagel, D.; Sexton-Kennedy, L.

    2015-12-23

    During 2014, the CMS Offline and Computing Organization completed the necessary changes to use the CMS threaded framework in the full production environment. We will briefly discuss the design of the CMS Threaded Framework, in particular how the design affects scaling performance. We will then cover the effort involved in getting both the CMSSW application software and the workflow management system ready for using multiple threads for production. Finally, we will present metrics on the performance of the application and workflow system as well as the difficulties which were uncovered. We will end with CMS' plans for using the threaded framework to do production for LHC Run 2.

  3. Grid Interoperation with ARC middleware for the CMS experiment

    NASA Astrophysics Data System (ADS)

    Edelmann, Erik; Field, Laurence; Frey, Jaime; Grønager, Michael; Happonen, Kalle; Johansson, Daniel; Kleist, Josva; Klem, Jukka; Koivumäki, Jesper; Lindén, Tomas; Pirinen, Antti; Qing, Di

    2010-04-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows ARC resources to be used in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  4. Using the CMS Threaded Framework In A Production Environment

    NASA Astrophysics Data System (ADS)

    Jones, C. D.; Contreras, L.; Gartung, P.; Hufnagel, D.; Sexton-Kennedy, L.

    2015-12-01

    During 2014, the CMS Offline and Computing Organization completed the necessary changes to use the CMS threaded framework in the full production environment. We will briefly discuss the design of the CMS Threaded Framework, in particular how the design affects scaling performance. We will then cover the effort involved in getting both the CMSSW application software and the workflow management system ready for using multiple threads for production. Finally, we will present metrics on the performance of the application and workflow system as well as the difficulties which were uncovered. We will end with CMS' plans for using the threaded framework to do production for LHC Run 2.

  5. Underground Coal Gasification Program

    Energy Science and Technology Software Center (ESTSC)

    1994-12-01

    CAVSIM is a three-dimensional, axisymmetric model for resource recovery and cavity growth during underground coal gasification (UCG). CAVSIM is capable of following the evolution of the cavity from near startup to exhaustion, and couples explicitly wall and roof surface growth to material and energy balances in the underlying rubble zones. Growth mechanisms are allowed to change smoothly as the system evolves from a small, relatively empty cavity low in the coal seam to a large, almost completely rubble-filled cavity extending high into the overburden rock. The model is applicable to nonswelling coals of arbitrary seam thickness and can handle a variety of gas injection flow schedules or compositions. Water influx from the coal aquifer is calculated by a gravity drainage-permeation submodel which is integrated into the general solution. The cavity is considered to consist of up to three distinct rubble zones and a void space at the top. Resistance to gas flow injected from a stationary source at the cavity floor is assumed to be concentrated in the ash pile, which builds up around the source, and also the overburden rubble which accumulates on top of this ash once overburden rock is exposed at the cavity top. Char rubble zones at the cavity side and edges are assumed to be highly permeable. Flow of injected gas through the ash to char rubble piles and the void space is coupled by material and energy balances to cavity growth at the rubble/coal, void/coal and void/rock interfaces. One preprocessor and two postprocessor programs are included: SPALL calculates one-dimensional mean spalling rates of coal or rock surfaces exposed to high temperatures and generates CAVSIM input; TAB reads CAVSIM binary output files and generates ASCII tables of selected data for display; and PLOT produces dot matrix printer or HP printer plots from TAB output.

  6. The evolution of CMS software performance studies

    NASA Astrophysics Data System (ADS)

    Kortelainen, M. J.; Elmer, P.; Eulisse, G.; Innocente, V.; Jones, C. D.; Tuura, L.

    2011-12-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn and unnecessary copies/temporaries, and on tools to routinely monitor these issues. Over the past 1.5 years, however, the transition to 64-bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.

  7. Laser monitoring for the CMS ECAL

    NASA Astrophysics Data System (ADS)

    Rogan, Christopher; CMS ECAL Group

    2010-11-01

    The Compact Muon Solenoid (CMS) detector at the LHC is equipped with a high precision lead tungstate crystal electromagnetic calorimeter (ECAL). To ensure the stability of the calorimetric response at the level of a few per mille, every channel of the detector is monitored with a laser system. This system enables corrections for fluctuations in the detector response with high precision, in particular the expected radiation induced changes in the crystal transparency. We describe the implementation of the laser monitoring system and report results from tests on the fully equipped supermodules of the ECAL. Specifically, we discuss results concerning the dynamics of crystal transparency change from dedicated irradiation studies in test beams.

  8. The CMS High-Level Trigger

    SciTech Connect

    Covarelli, R.

    2009-12-17

    At the startup of the LHC, the CMS data acquisition is expected to be able to sustain an event readout rate of up to 100 kHz from the Level-1 trigger. These events will be read into a large processor farm which will run the 'High-Level Trigger'(HLT) selection algorithms and will output a rate of about 150 Hz for permanent data storage. In this report HLT performances are shown for selections based on muons, electrons, photons, jets, missing transverse energy, {tau} leptons and b quarks: expected efficiencies, background rates and CPU time consumption are reported as well as relaxation criteria foreseen for a LHC startup instantaneous luminosity.

  9. SUSY Search Strategies at Atlas and CMS

    SciTech Connect

    Autermann, Christian

    2008-11-23

    Supersymmetry is regarded as the most promising candidate for physics beyond the Standard Model. Various search strategies for SUSY are conducted at the Atlas and CMS experiments. In the early data, inclusive searches with different lepton multiplicities are most sensitive; these will be discussed here. The reach of both experiments is interpreted within the mSUGRA model. The LHC has started operation and the experiments are expected to collect of the order of 100 pb{sup -1} of integrated luminosity within the first year.

  10. The upgrade of the CMS Global Trigger

    NASA Astrophysics Data System (ADS)

    Wittmann, J.; Arnold, B.; Bergauer, H.; Jeitler, M.; Matsushita, T.; Rabady, D.; Rahbaran, B.; Wulz, C.-E.

    2016-02-01

    The Global Trigger is the final step of the CMS Level-1 Trigger. Previously implemented in VME, it has been redesigned and completely rebuilt in MicroTCA technology, using the Virtex-7 FPGA chip family. It will allow trigger algorithms close to the final physics selection to be implemented. The new system is presented, together with performance tests undertaken in parallel operation with the legacy system during the initial months of Run II of the LHC at a centre-of-mass energy of 13 TeV.

  11. CMS Online Web-Based Monitoring

    NASA Astrophysics Data System (ADS)

    Badgett, William; Chakaberia, Irakli; Lopez-Perez, Juan Antonio; Maeshima, Kaori; Maruyama, Sho; Soha, Aron; Sulmanas, Balys; Wan, Zongru

    For large international High Energy Physics experiments, modern web technologies make the online monitoring of detector status, data acquisition status, trigger rates, luminosity, etc., accessible for the collaborators anywhere and anytime. This helps the collaborating experts monitor the status of the experiment, identify the problems and improve data taking efficiency. We present the online Web-Based Monitoring project of the CMS experiment at the LHC at CERN. The data sources are relational databases and various messaging systems. The project provides a vast amount of in-depth information including real-time data, historical trends and correlations in a user-friendly way.

  12. Advancing Underground Nuclear Astrophysics with CASPAR

    NASA Astrophysics Data System (ADS)

    Robertson, Daniel; Couder, Manoel; Greife, Uwe; Strieder, Frank; Wells, Doug; Wiescher, Michael

    2015-04-01

    The advancement of experimental nuclear astrophysics techniques and the need of astrophysical network models for further nuclear data over greater energy ranges have led to the requirement for a better understanding of nuclear reactions in stellar burning regimes. For those reactions of importance to stellar burning processes and elemental production through stellar nucleosynthesis, the energy range of astrophysical interest is always problematic to probe. As reaction measurements approach the burning window of interest, the rapid drop-off in cross-section hampers laboratory investigation. The natural background suppression of underground accelerator facilities enables the extension of current experimental data to lower energies. Examples of such reactions of interest are those thought to be sources of neutrons for the s-process, the major production mechanism for elements above the iron peak. The reactions 13C(α,n)16O and 22Ne(α,n)25Mg are the proposed initial focus of the new nuclear astrophysics accelerator laboratory (CASPAR) currently under construction at the Sanford Underground Research Facility, Lead, SD. Funding is provided by the South Dakota Science and Technology Authority and the NSF under Grant Number PHY-1419765.

  13. CMS distributed data analysis with CRAB3

    DOE PAGESBeta

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; et al

    2015-12-23

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. Furthermore, the new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  14. Status of the CMS Detector Control System

    NASA Astrophysics Data System (ADS)

    Bauer, Gerry; Behrens, Ulf; Bowen, Matthew; Branson, James; Bukowiec, Sebastian; Cittolin, Sergio; Coarasa, Jose Antonio; Deldicque, Christian; Dobson, Marc; Dupont, Aymeric; Erhan, Samim; Flossdorf, Alexander; Gigi, Dominique; Glege, Frank; Gomez-Reino, Robert; Hartl, Christian; Hegeman, Jeroen; Holzner, Andre; Hwong, Yi Ling; Masetti, Lorenzo; Meijers, Frans; Meschi, Emilio; Mommsen, Remigius K.; O'Dell, Vivian; Orsini, Luciano; Paus, Christoph; Petrucci, Andrea; Pieri, Marco; Polese, Giovanni; Racz, Attila; Raginel, Olivier; Sakulin, Hannes; Sani, Matteo; Schwick, Christoph; Shpakov, Dennis; Simon, Michal; Cristian Spataru, Andrei; Sumorok, Konstanty

    2012-12-01

    The Compact Muon Solenoid (CMS) is a CERN multi-purpose experiment that exploits the physics of the Large Hadron Collider (LHC). The Detector Control System (DCS) is responsible for ensuring the safe, correct and efficient operation of the experiment, and has contributed to the recording of high quality physics data. The DCS is programmed to automatically react to the LHC operational mode. CMS sub-detectors’ bias voltages are set depending on the machine mode and particle beam conditions. An operator provided with a small set of screens supervises the system status summarized from the approximately 6M monitored parameters. Using the experience of nearly two years of operation with beam the DCS automation software has been enhanced to increase the system efficiency by minimizing the time required by sub-detectors to prepare for physics data taking. From the infrastructure point of view the DCS will be subject to extensive modifications in 2012. The current rack mounted control PCs will be replaced by a redundant pair of DELL Blade systems. These blade servers are a high-density modular solution that incorporates servers and networking into a single chassis that provides shared power, cooling and management. This infrastructure modification associated with the migration to blade servers will challenge the DCS software and hardware factorization capabilities. The on-going studies for this migration together with the latest modifications are discussed in the paper.

  15. CMS distributed data analysis with CRAB3

    NASA Astrophysics Data System (ADS)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  16. CMS distributed data analysis with CRAB3

    SciTech Connect

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-23

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. Furthermore, the new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  17. Performance of the CMS High Level Trigger

    NASA Astrophysics Data System (ADS)

    Perrotta, Andrea

    2015-12-01

    The CMS experiment has been designed with a 2-level trigger system. The first level is implemented using custom-designed electronics. The second level is the so-called High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. For Run II of the Large Hadron Collider, the increases in center-of-mass energy and luminosity will raise the event rate to a level challenging for the HLT algorithms. The increase in the number of interactions per bunch crossing, on average 25 in 2012, and expected to be around 40 in Run II, will be an additional complication. We present here the expected performance of the main triggers that will be used during the 2015 data taking campaign, paying particular attention to the new approaches that have been developed to cope with the challenges of the new run. This includes improvements in HLT electron and photon reconstruction as well as better performing muon triggers. We will also present the performance of the improved tracking and vertexing algorithms, discussing their impact on the b-tagging performance as well as on the jet and missing energy reconstruction.
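The early-rejection logic that keeps HLT processing affordable under rising rates and pile-up can be sketched as a chain of filter steps run cheapest-first, so expensive reconstruction only runs on events that survive. All thresholds and event fields below are invented for illustration.

```python
# Minimal sketch of an HLT-path idea: apply selections in order of cost and
# stop at the first failure. Thresholds and event fields are hypothetical.

def hlt_path(event, steps):
    """Apply filter steps in order; reject at the first failing step."""
    for step in steps:
        if not step(event):
            return False
    return True

# Invented example path: a fast calorimeter cut before a slow tracking cut.
steps = [
    lambda e: e["calo_et"] > 30.0,   # cheap: calorimeter energy threshold
    lambda e: e["track_pt"] > 25.0,  # expensive: full track reconstruction
]

events = [
    {"calo_et": 45.0, "track_pt": 32.0},  # passes both steps
    {"calo_et": 12.0, "track_pt": 40.0},  # rejected early by the calo step
]
accepted = [e for e in events if hlt_path(e, steps)]
print(len(accepted))  # 1
```

Ordering cheap filters first is what keeps the average per-event CPU cost bounded even when the input rate grows.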

  18. Status of the CMS pixel project

    SciTech Connect

    Uplegger, Lorenzo; /Fermilab

    2008-01-01

    The Compact Muon Solenoid Experiment (CMS) will start taking data at the Large Hadron Collider (LHC) in 2008. The closest detector to the interaction point is the silicon pixel detector, which is the heart of the tracking system. It consists of three barrel layers and two pixel disks on each side of the interaction point, for a total of 66 million channels. Its proximity to the interaction point means there will be very large particle fluences, and therefore a radiation-tolerant design is necessary. The pixel detector will be crucial to achieving a good vertex resolution and will play a key role in pattern recognition and track reconstruction. The results from test beam runs prove that the expected performance can be achieved. The detector is currently being assembled and will be ready for insertion into CMS in early 2008. During the assembly phase, a thorough electronic test is being done to check the functionality of each channel, to guarantee the performance required to achieve the physics goals. This report will present the final detector design and the status of production, as well as results from test beam runs that validate the expected performance.

  19. Underground tank leak detection methods

    SciTech Connect

    Niaki, Shahzad; Broscious, J.A.

    1987-01-01

    In recent years, the increase in leaks from underground gasoline storage tanks has had a significant adverse environmental impact on the US. Current estimates from government and industry sources are that between 1.5 and 3.5 million underground storage tanks exist in the nation. Estimates of the number of leaking tanks range from 75,000 to 100,000, and 350,000 others may develop leaks within the next five years. The 1983 National Petroleum News Factbook Issue forecasts the existence of approximately 140,000 gasoline service stations in the US at the end of 1983. New York State estimates that 19% of its 83,000 active underground gasoline tanks are now leaking. Maine estimates that 25% of its 1,600 retail gasoline underground tanks are leaking approximately 11 million gallons yearly. In Michigan, 39% of ground water contamination incidents are attributed to storage tanks. One of the primary causes of tank leakage is corrosion of the storage tanks. Product loss from leaking tanks may cause an adverse effect on the environment, endanger lives, reduce income, and require the expenditure of millions of dollars for cleanup. To prevent or reduce the adverse effects of gasoline leakage, an accurate method must be used to determine whether or not an underground tank is leaking.

  20. Low-cost, high-precision propagation delay measurement of 12-fibre MPO cables for the CMS DT electronics upgrade

    NASA Astrophysics Data System (ADS)

    Navarro-Tobar, Á.; Fernández-Bedoya, C.; Redondo, I.

    2013-02-01

    The CMS DT electronics upgrade involves laying 3500 optical links from the CMS experimental cavern to the service cavern, whose lengths must be matched to minimize skew, so that the present upstream electronics can be reused at an initial stage. In order to assess the cables' compliance, a high-resolution and cost-effective system has been developed to measure the length uniformity of these fibres. The transit-time oscillation method has been implemented with a matched MTP 12-channel fibre-optic transmitter and receiver and a Spartan-6 FPGA. After proper corrections and averaging, millimetre-range accuracy has been achieved.
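The millimetre-range accuracy requirement can be put in perspective with a quick conversion from measured propagation delay to fibre length, assuming a group refractive index of roughly 1.47 for standard fibre (an assumption; the actual cable index may differ slightly).

```python
# Back-of-the-envelope conversion: delay difference -> length difference,
# assuming a fibre group index of ~1.47 (assumed, not from the paper).

C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
GROUP_INDEX = 1.47         # assumed group refractive index of the fibre

def delay_to_length(delay_s: float) -> float:
    """Fibre length (m) corresponding to a one-way propagation delay (s)."""
    return delay_s * C_VACUUM / GROUP_INDEX

# A 5 ps skew between two fibres corresponds to roughly 1 mm of length
# mismatch, which is why millimetre-range accuracy matters for 3500 links.
print(round(delay_to_length(5e-12) * 1000, 2), "mm")
```

The same relation, read in reverse, gives the timing resolution the measurement system must reach to certify a given length tolerance.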

  1. Geotechnical basis for underground energy storage in hard rock

    NASA Astrophysics Data System (ADS)

    Farquhar, O. C.

    1982-03-01

    Underground pumped hydroelectric storage requires the excavation of caverns in hard rock. Hard rock caverns are also one option for compressed air storage. Preliminary design studies for both technologies at a specific site were completed. The geotechnical aspects of these storage systems are discussed from a generic viewpoint. Information about effective use of hard rock openings, including tunnels and shafts, comes mainly from other types of underground projects. These are power houses for hydroelectric and conventional pumped storage schemes, as well as transportation facilities and mines. Rock strength, support, instrumentation, costs, management, and experimental work are among the items considered. Mapping of geologic structures, rock fragmentation, and rock mass properties is also discussed. The general conclusions are that rock types favorable for underground energy storage are present at suitable depths in many areas and that they can be identified by adequate geotechnical exploration prior to detailed design.

  2. 42 CFR 405.1834 - CMS reviewing official procedure.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 2 2011-10-01 2011-10-01 false CMS reviewing official procedure. 405.1834 Section 405.1834 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE PROGRAM FEDERAL HEALTH INSURANCE FOR THE AGED AND DISABLED Provider Reimbursement Determinations and Appeals § 405.1834 CMS...

  3. 42 CFR 422.2264 - Guidelines for CMS review.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 3 2014-10-01 2014-10-01 false Guidelines for CMS review. 422.2264 Section 422.2264 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... § 422.2264 Guidelines for CMS review. In reviewing marketing material or election forms under §...

  4. 42 CFR 423.2264 - Guidelines for CMS review.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 3 2012-10-01 2012-10-01 false Guidelines for CMS review. 423.2264 Section 423.2264 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Requirements § 423.2264 Guidelines for CMS review. In reviewing marketing material or enrollment forms...

  5. 42 CFR 423.2264 - Guidelines for CMS review.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 3 2014-10-01 2014-10-01 false Guidelines for CMS review. 423.2264 Section 423.2264 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Requirements § 423.2264 Guidelines for CMS review. In reviewing marketing material or enrollment forms...

  6. 42 CFR 422.2264 - Guidelines for CMS review.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 3 2012-10-01 2012-10-01 false Guidelines for CMS review. 422.2264 Section 422.2264 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... § 422.2264 Guidelines for CMS review. In reviewing marketing material or election forms under §...

  7. 42 CFR 411.379 - When CMS accepts a request.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 2 2013-10-01 2013-10-01 false When CMS accepts a request. 411.379 Section 411.379 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE... receiving a request for an advisory opinion, CMS promptly makes an initial determination of whether...

  8. 42 CFR 411.379 - When CMS accepts a request.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 2 2014-10-01 2014-10-01 false When CMS accepts a request. 411.379 Section 411.379 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE... receiving a request for an advisory opinion, CMS promptly makes an initial determination of whether...

  9. 42 CFR 411.379 - When CMS accepts a request.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false When CMS accepts a request. 411.379 Section 411.379 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE... receiving a request for an advisory opinion, CMS promptly makes an initial determination of whether...

  10. 42 CFR 411.379 - When CMS accepts a request.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 2 2011-10-01 2011-10-01 false When CMS accepts a request. 411.379 Section 411.379 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE... receiving a request for an advisory opinion, CMS promptly makes an initial determination of whether...

  11. 42 CFR 411.379 - When CMS accepts a request.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 2 2012-10-01 2012-10-01 false When CMS accepts a request. 411.379 Section 411.379 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE... receiving a request for an advisory opinion, CMS promptly makes an initial determination of whether...

  12. 42 CFR 422.210 - Assurances to CMS.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Assurances to CMS. 422.210 Section 422.210 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICARE PROGRAM MEDICARE ADVANTAGE PROGRAM Relationships With Providers § 422.210 Assurances to CMS....

  13. Higgs Boson Search at LHC (and LHC/CMS status)

    SciTech Connect

    Korytov, Andrey

    2008-11-23

    Presented are the results of the most recent studies by the CMS and ATLAS collaborations on the expected sensitivity of their detectors to observing a Higgs boson at the LHC. The overview is preceded by a brief summary of the status of the LHC and the CMS experiment.

  14. Prospects for SUSY searches in CMS and ATLAS

    SciTech Connect

    Jong, Paul de

    2008-11-23

    We discuss how the CMS and ATLAS experiments are preparing for the analysis of first LHC data with emphasis on the search for supersymmetry. We will show the importance of the understanding of detector, trigger, reconstruction and backgrounds, and we will present realistic estimates of the reach of CMS and ATLAS.

  15. Proteomic Analysis of Male-Fertility Restoration in CMS Onion

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The production of hybrid-onion seed is dependent on cytoplasmic-genic male sterility (CMS) systems. For the most commonly used CMS, male-sterile (S) cytoplasm interacts with a dominant allele at one nuclear male-fertility restoration locus (Ms) to condition male fertility. We are using proteomics ...

  16. Proteomic analyses of male-fertility restoration in CMS onion

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The production of hybrid-onion seed is dependent on cytoplasmic-genic male sterility (CMS) systems. For the most commonly used CMS, male-sterile (S) cytoplasm interacts with a dominant allele at one nuclear male-fertility restoration locus (Ms) to condition male fertility. We are using a proteomics ...

  17. Displacement Parameter Inversion for a Novel Electromagnetic Underground Displacement Sensor

    PubMed Central

    Shentu, Nanying; Li, Qing; Li, Xiong; Tong, Renyuan; Shentu, Nankai; Jiang, Guoqing; Qiu, Guohua

    2014-01-01

    Underground displacement monitoring is an effective method to explore deep into rock and soil masses for execution of subsurface displacement measurements. It is not only an important means of geological hazards prediction and forecasting, but also a prominent and challenging subject in current geological disaster monitoring. In previous research, the authors designed a novel electromagnetic underground horizontal displacement sensor (called the H-type sensor) by combining basic electromagnetic induction principles with modern sensing techniques, and established a mutual voltage measurement theoretical model called the Equation-based Equivalent Loop Approach (EELA). Based on that work, this paper presents an underground displacement inversion approach named the “EELA forward modeling-approximate inversion method”. Combining the EELA forward simulation approach with the approximate optimization inversion theory, it can deduce the underground horizontal displacement through parameter inversion of the H-type sensor. Comprehensive and comparative studies have been conducted between the experimentally measured and theoretically inverted values of horizontal displacement under counterpart conditions. The results show that when the measured horizontal displacements are in the 0–100 mm range, the horizontal displacement inversion discrepancy is generally less than 3 mm under varied tilt angles and initial axial distances, which indicates that the proposed parameter inversion method can predict underground horizontal displacement measurements effectively and robustly for the H-type sensor, and that the technique is applicable for practical geo-engineering applications. PMID:24858960
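The forward modeling plus approximate inversion pattern described above can be sketched generically: given a forward model mapping displacement to a predicted measurement, search for the displacement whose prediction best matches the observation. The quadratic model below is a stand-in for illustration, not the sensor's actual EELA model.

```python
# Generic sketch of forward-model inversion: recover the input whose
# predicted output best matches a measurement. The model is hypothetical.

def forward_model(displacement_mm: float) -> float:
    """Stand-in forward model: predicted mutual voltage vs displacement."""
    return 0.5 + 0.01 * displacement_mm + 1e-4 * displacement_mm ** 2

def invert(measured_v: float, lo=0.0, hi=100.0, steps=10000) -> float:
    """Grid-search inversion: displacement minimizing the model mismatch."""
    best_d, best_err = lo, float("inf")
    for i in range(steps + 1):
        d = lo + (hi - lo) * i / steps
        err = abs(forward_model(d) - measured_v)
        if err < best_err:
            best_d, best_err = d, err
    return best_d

if __name__ == "__main__":
    true_d = 42.0
    recovered = invert(forward_model(true_d))
    print(round(recovered, 2))  # ~42.0
```

A grid search is the crudest "approximate inversion"; gradient-based or surrogate-model optimizers follow the same structure with a smarter search step.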

  18. Displacement parameter inversion for a novel electromagnetic underground displacement sensor.

    PubMed

    Shentu, Nanying; Li, Qing; Li, Xiong; Tong, Renyuan; Shentu, Nankai; Jiang, Guoqing; Qiu, Guohua

    2014-01-01

    Underground displacement monitoring is an effective method to explore deep into rock and soil masses for execution of subsurface displacement measurements. It is not only an important means of geological hazards prediction and forecasting, but also a prominent and challenging subject in current geological disaster monitoring. In previous research, the authors designed a novel electromagnetic underground horizontal displacement sensor (called the H-type sensor) by combining basic electromagnetic induction principles with modern sensing techniques, and established a mutual voltage measurement theoretical model called the Equation-based Equivalent Loop Approach (EELA). Based on that work, this paper presents an underground displacement inversion approach named the "EELA forward modeling-approximate inversion method". Combining the EELA forward simulation approach with the approximate optimization inversion theory, it can deduce the underground horizontal displacement through parameter inversion of the H-type sensor. Comprehensive and comparative studies have been conducted between the experimentally measured and theoretically inverted values of horizontal displacement under counterpart conditions. The results show that when the measured horizontal displacements are in the 0-100 mm range, the horizontal displacement inversion discrepancy is generally less than 3 mm under varied tilt angles and initial axial distances, which indicates that the proposed parameter inversion method can predict underground horizontal displacement measurements effectively and robustly for the H-type sensor, and that the technique is applicable for practical geo-engineering applications. PMID:24858960

  19. Underground at Black Diamond Mines

    SciTech Connect

    Higgins, C.T.

    1989-10-01

    Although California is noted for its mining history and annually leads the nation in total monetary value of minerals produced, there are few opportunities for the public to tour underground mines. One reason is that nearly all mining in the state today is done above ground in open pits. Another reason is that active underground mines are generally not suitable for public tours. There is one place, Black Diamond Mines Regional Preserve, where the public can safely tour a formerly active underground mine. Black Diamond Mines Regional Preserve is a 3,600-acre parkland about 5 miles southwest of Antioch in Contra Costa County. The Preserve was established in the early 1970s and is administered by the East Bay Regional Park District. Black Diamond Mines Preserve is noteworthy for its mining history as well as its natural history, both of which are briefly described here.

  20. Logistics background study: underground mining

    SciTech Connect

    Hanslovan, J. J.; Visovsky, R. G.

    1982-02-01

    Logistical functions that are normally associated with US underground coal mining are investigated and analyzed. These functions encompass all activities and services that support the producing sections of the mine. The report provides a better understanding of how these functions impact coal production in terms of time, cost, and safety. Major underground logistics activities are analyzed, including: transportation of personnel, supplies, and equipment; transportation of coal and rock; electrical distribution and communications systems; water handling; hydraulics; and ventilation systems. Recommended areas for future research are identified and prioritized.

  1. Initial Measurement of the Inclusive Jet Cross Section at 10 TeV with CMS

    NASA Astrophysics Data System (ADS)

    Rose, Keith

    2010-02-01

    A plan for the measurement of the differential inclusive jet production cross section at the Compact Muon Solenoid experiment (CMS) assuming 10/pb of integrated luminosity from proton-proton collisions at a center of mass energy of 10 TeV is presented. The reach in transverse jet momentum is beyond any previous collider experiment and the TeV scale of jet physics can be probed. The analysis is performed on fully simulated CMS events which are adopted as pseudo data. Jets are reconstructed from calorimeter energy depositions with two different algorithms; Inclusive kT and Seedless Infrared-Safe Cone. The steps for the spectrum construction from triggered events are described in detail and the major experimental and theoretical uncertainties are discussed. A simple noise rejection cut is also proposed for the purpose of event cleanup.

  2. Underground Coal Thermal Treatment

    SciTech Connect

    Smith, P.; Deo, M.; Eddings, E.; Sarofim, A.; Gueishen, K.; Hradisky, M.; Kelly, K.; Mandalaparty, P.; Zhang, H.

    2012-01-11

    The long-term objective of this work is to develop a transformational energy production technology by in-situ thermal treatment of a coal seam for the production of substitute natural gas (SNG) while leaving much of the coal's carbon in the ground. This process converts coal to a high-efficiency, low-GHG-emitting gas fuel. It holds the potential of providing environmentally acceptable access to previously unusable coal resources. This topical report discusses the development of experimental capabilities, the collection of available data, and the development of simulation tools to obtain process thermo-chemical and geo-thermal parameters in preparation for the eventual demonstration in a coal seam. It also includes experimental and modeling studies of CO2 sequestration.

  3. Deployment of the CMS Tracker AMC as backend for the CMS pixel detector

    NASA Astrophysics Data System (ADS)

    Auzinger, G.

    2016-01-01

    The silicon pixel detector of the CMS experiment at CERN will be replaced with an upgraded version at the beginning of 2017 with the new detector featuring an additional barrel- and end-cap layer resulting in an increased number of fully digital read-out links running at 400 Mbps. New versions of the PSI46 Read-Out Chip and Token Bit Manager have been developed to operate at higher rates and reduce data loss. Front-End Controller and Front-End Driver boards, based on the μTCA compatible CMS Tracker AMC, a variant of the FC7 card, are being developed using different mezzanines to host the optical links for the digital read-out and control system. An overview of the system architecture is presented, with details on the implementation, and first results obtained from test systems.

  4. Optimizing CMS build infrastructure via Apache Mesos

    NASA Astrophysics Data System (ADS)

    Abdurachmanov, David; Degano, Alessandro; Elmer, Peter; Eulisse, Giulio; Mendez, David; Muzaffar, Shahzad

    2015-12-01

    The Offline Software of the CMS Experiment at the Large Hadron Collider (LHC) at CERN consists of 6M lines of in-house code, developed over a decade by nearly 1000 physicists, as well as a comparable amount of general use open-source code. A critical ingredient to the success of the construction and early operation of the WLCG was the convergence, around the year 2000, on the use of a homogeneous environment of commodity x86-64 processors and Linux. Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run Hadoop, Jenkins, Spark, Aurora, and other applications on a dynamically shared pool of nodes. We present how we migrated our continuous integration system to schedule jobs on a relatively small Apache Mesos enabled cluster and how this resulted in better resource usage, higher peak performance and lower latency thanks to the dynamic scheduling capabilities of Mesos.

  5. Radiation hard avalanche photodiodes for CMS ECAL

    NASA Astrophysics Data System (ADS)

    Grahl, J.; Kronquist, I.; Rusack, R.; Singovski, A.; Kuznetsov, A.; Musienko, Y.; Reucroft, S.; Swain, J.; Deiters, K.; Ingram, Q.; Renker, D.; Sakhelashvili, T.

    2003-05-01

    The photo detectors of the CMS electromagnetic calorimeter have to operate in a rather hostile environment, in a strong magnetic field of 4 T and under unprecedented radiation levels. Avalanche Photo Diodes (APDs) have been chosen to detect the scintillation light of the 62,000 lead tungstate crystals in the barrel part of the calorimeter. After six years of R&D, Hamamatsu Photonics produces APDs with a structure that is fundamentally radiation hard. Only a few percent of the delivered APDs are weak, due to surface defects caused by dust particles in the production process. Since a reliability of 99.9% is required, a method to detect weak APDs before they are built into the detector had to be developed. The screening method described here is a combination of 60Co irradiations and annealing under bias of all APDs, plus irradiations with hadrons on a sampling basis.

  6. Confusion ahead as CMS changes inpatient criteria.

    PubMed

    2013-10-01

    In the Inpatient Prospective Payment System final rule for 2014, the Centers for Medicare & Medicaid Services established a benchmark of two midnights for an inpatient admission and issued robust requirements for documentation. Case managers must work closely with physicians to ensure that the documentation includes the expected length of stay, the rationale for hospital treatment, the treatment plan, and a written order for admission. Case managers must review every admission within 24 hours to make sure the hospital doesn't lose reimbursement. Auditors will be looking for incidents where hospitals keep patients over two midnights when it's not medically necessary in order to get inpatient reimbursement. CMS continues to emphasize quality in care. PMID:24195133

  7. CMS data quality monitoring web service

    NASA Astrophysics Data System (ADS)

    Tuura, L.; Eulisse, G.; Meyer, A.

    2010-04-01

    A central component of the data quality monitoring system of the CMS experiment at the Large Hadron Collider is a web site for browsing data quality histograms. The production servers in data taking provide access to several hundred thousand histograms per run, both live during online data taking and for up to several terabytes of archived histograms from online running, Tier-0 prompt reconstruction, prompt calibration and analysis activities, re-reconstruction at Tier-1s, and release validation. At the present usage level the servers currently handle in total around a million authenticated HTTP requests per day. We describe the main features and components of the system, our implementation for web-based interactive rendering, and the server design. We give an overview of the deployment and maintenance procedures. We discuss the main technical challenges and our solutions to them, with emphasis on functionality, long-term robustness and performance.

  8. Estimating job runtime for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Sfiligoi, I.

    2014-06-01

    The basic premise of pilot systems is to create an overlay scheduling system on top of leased resources. And by definition, leases have a limited lifetime, so any job that is scheduled on such resources must finish before the lease is over, or it will be killed and all the computation is wasted. In order to effectively schedule jobs to resources, the pilot system thus requires the expected runtime of the users' jobs. Past studies have shown that relying on user provided estimates is not a valid strategy, so the system should try to make an estimate by itself. This paper provides a study of the historical data obtained from the Compact Muon Solenoid (CMS) experiment's Analysis Operations submission system. Clear patterns are observed, suggesting that making prediction of an expected job lifetime range is achievable with high confidence level in this environment.
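The paper's premise, estimating an expected runtime range from historical data rather than trusting user-supplied estimates, can be sketched with a simple percentile bound over past runtimes of similar jobs. The percentile choices and sample data below are arbitrary illustrations, not values from the study.

```python
# Sketch: derive a (low, high) runtime range for scheduling from the
# historical runtimes of similar jobs. Percentiles are arbitrary choices.

def runtime_range(history_s, low_pct=0.05, high_pct=0.95):
    """Return (low, high) runtime bounds in seconds from past runtimes."""
    data = sorted(history_s)
    def pct(p):
        # Nearest-rank-style percentile; adequate for a scheduling hint.
        return data[int(p * (len(data) - 1))]
    return pct(low_pct), pct(high_pct)

if __name__ == "__main__":
    # Invented history: mostly ~25-minute jobs plus one long outlier.
    history = [1200, 1500, 1400, 1350, 9000, 1450, 1300, 1550, 1250, 1500]
    lo, hi = runtime_range(history)
    print(lo, hi)  # 1200 1550
```

A scheduler using the high bound can then avoid placing a job on a lease shorter than that bound, which is exactly the waste the abstract describes.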

  9. Earthquake damage to underground facilities

    SciTech Connect

    Pratt, H.R.; Stephenson, D.E.; Zandt, G.; Bouchon, M.; Hustrulid, W.A.

    1980-01-01

    In order to assess the seismic risk for an underground facility, a data base was established and analyzed to evaluate the potential for seismic disturbance. Substantial damage to underground facilities is usually the result of displacements primarily along pre-existing faults and fractures, or at the surface entrance to these facilities. Evidence of this comes from both earthquakes and large explosions. Therefore, the displacement due to earthquakes as a function of depth is important in the evaluation of the hazard to underground facilities. To evaluate potential displacements due to seismic effects of block motions along pre-existing or induced fractures, the displacement fields surrounding two types of faults were investigated. Analytical models were used to determine relative displacements of shafts and near-surface displacement of large rock masses. Numerical methods were used to determine the displacement fields associated with pure strike-slip and vertical normal faults. Results are presented as displacements for various fault lengths as a function of depth and distance. This provides input to determine potential displacements in terms of depth and distance for underground facilities, important for assessing potential sites and design parameters.

  10. High Temperature Superconducting Underground Cable

    SciTech Connect

    Farrell, Roger, A.

    2010-02-28

    The purpose of this Project was to design, build, install and demonstrate the technical feasibility of an underground high temperature superconducting (HTS) power cable installed between two utility substations. In the first phase two HTS cables, 320 m and 30 m in length, were constructed using 1st generation BSCCO wire. The two 34.5 kV, 800 Arms, 48 MVA sections were connected together using a superconducting joint in an underground vault. In the second phase the 30 m BSCCO cable was replaced by one constructed with 2nd generation YBCO wire. 2nd generation wire is needed for commercialization because of inherent cost and performance benefits. Primary objectives of the Project were to build and operate an HTS cable system which demonstrates significant progress towards commercial progress and addresses real world utility concerns such as installation, maintenance, reliability and compatibility with the existing grid. Four key technical areas addressed were the HTS cable and terminations (where the cable connects to the grid), cryogenic refrigeration system, underground cable-to-cable joint (needed for replacement of cable sections) and cost-effective 2nd generation HTS wire. This was the world’s first installation and operation of an HTS cable underground, between two utility substations as well as the first to demonstrate a cable-to-cable joint, remote monitoring system and 2nd generation HTS.

  11. Underground technology benefits surface operations

    SciTech Connect

    Swaim, M.

    2008-09-15

    Sensitive ground fault relays (GFRs) on high voltage underground electrical equipment have been in use for a number of years to improve mine safety. Advanced GFRs do more than just interrupt fault current flow. They can also reveal leakage as it develops, so ground faults are detected before they become critical. 3 figs.

  12. CALIFORNIA LEAKING UNDERGROUND STORAGE TANKS

    EPA Science Inventory

    Points represent Leaking Underground Storage Tanks (LUST) for the State of California. This database was developed and is maintained by the California State Water Resources Control Board (SWRCB). Point locations represent tanks where leak events have occurred. Tank latitude-long...

  13. Slavery and the Underground Railroad.

    ERIC Educational Resources Information Center

    Anderson, Nancy Comfort

    2000-01-01

    Presents a bibliography of sources to help children understand slavery and the Underground Railroad and recommends a combination of fiction and nonfiction for a better understanding. Includes picture books, biographies of people who played prominent roles during the time of slavery, nonfiction books for older readers, and videotape. (LRW)

  14. 49 CFR 192.325 - Underground clearance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Underground clearance. 192.325 Section 192.325... Lines and Mains § 192.325 Underground clearance. (a) Each transmission line must be installed with at least 12 inches (305 millimeters) of clearance from any other underground structure not associated...

  15. 30 CFR 57.4761 - Underground shops.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Underground shops. 57.4761 Section 57.4761... SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Fire Prevention and Control Ventilation Control Measures § 57.4761 Underground shops. To confine or prevent the spread...

  16. CMS centres for control, monitoring, offline operations and prompt analysis

    NASA Astrophysics Data System (ADS)

    Taylor, L.; Gottschalk, E.; Maeshima, K.; McBride, P.

    2008-07-01

    The CMS experiment is about to embark on its first physics run at the LHC. To maximize the effectiveness of physicists and technical experts at CERN and worldwide and to facilitate their communications, CMS has established several dedicated and inter-connected operations and monitoring centres. These include a traditional 'Control Room' at the CMS site in France, a 'CMS Centre' for up to fifty people on the CERN main site in Switzerland, and remote operations centres, such as the 'LHC@FNAL' centre at Fermilab. We describe how this system of centres coherently supports the following activities: (1) CMS data quality monitoring, prompt sub-detector calibrations, and time-critical data analysis of express-line and calibration streams; and (2) operation of the CMS computing systems for processing, storage and distribution of real CMS data and simulated data, both at CERN and at offsite centres. We describe the physical infrastructure that has been established, the computing and software systems, the operations model, and the communications systems that are necessary to make such a distributed system coherent and effective.

  17. Experimental investigations and geochemical modelling of site-specific fluid-fluid and fluid-rock interactions in underground storage of CO2/H2/CH4 mixtures: the H2STORE project

    NASA Astrophysics Data System (ADS)

    De Lucia, Marco; Pilz, Peter

    2015-04-01

    Underground gas storage is increasingly regarded as a technically viable option for meeting the energy demand and environmental targets of many industrialized countries. Besides long-term CO2 sequestration, energy can be chemically stored in the form of CO2/CH4/H2 mixtures, for example produced from excess wind energy. Gas storage in salt caverns is nowadays a mature technology; in regions where favorable geologic structures such as salt diapirs are not available, however, gas storage can only be implemented in porous media such as depleted gas and oil reservoirs or suitable saline aquifers. In such settings, a significant amount of in-situ gas components such as CO2, CH4 (and N2) will always be present, making the CO2/CH4/H2 system of particular interest. A precise estimation of the impact of such gas mixtures on the mineralogical, geochemical and petrophysical properties of specific reservoirs and caprocks is therefore crucial for site selection and optimization of storage depth. In the framework of the collaborative research project H2STORE, the feasibility of industrial-scale gas storage in porous media is being investigated by means of experiments and modelling on actual core materials from several potential siliciclastic sites, comprising depleted gas and oil reservoirs and suitable saline aquifers. Among them are the Altmark depleted gas reservoir in Saxony-Anhalt and the Ketzin pilot site for CO2 storage in Brandenburg (Germany); further sites are located in the Molasse basin in southern Germany and Austria. 
In particular, two

  18. Quantum cryptography over underground optical fibers

    SciTech Connect

    Hughes, R.J.; Luther, G.G.; Morgan, G.L.; Peterson, C.G.; Simmons, C.

    1996-05-01

    Quantum cryptography is an emerging technology in which two parties may simultaneously generate shared, secret cryptographic key material by transmitting quantum states of light; its security is based on the inviolability of the laws of quantum mechanics. An adversary can neither successfully tap the key transmissions nor evade detection, owing to Heisenberg's uncertainty principle. In this paper the authors describe the theory of quantum cryptography and the most recent results from their experimental system, with which they are generating key material over 14 km of underground optical fiber. These results show that optical-fiber based quantum cryptography could allow secure, real-time key generation over "open" multi-km node-to-node optical fiber communications links between secure "islands."
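
    The key-generation scheme described belongs to the BB84 family of quantum key distribution protocols. A minimal classical simulation of BB84's prepare-measure-sift cycle (an illustrative sketch under a noiseless-channel assumption, not the paper's apparatus) shows why an intercept-resend eavesdropper is detectable: Eve's randomly chosen measurement bases disturb the states, injecting roughly a 25% error rate into the sifted key.

```python
import random

def bb84_sift(n, eavesdrop=False, seed=0):
    """Simulate BB84 sifting: Alice sends n qubits in random bases,
    Bob measures in random bases, and they keep bits where bases match.
    Returns (number of sifted bits, number of errors among them)."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:
            # Intercept-resend attack: Eve measures in a random basis,
            # collapsing the state, then resends in her basis.
            e_basis = rng.randint(0, 1)
            bit = bit if e_basis == a_basis else rng.randint(0, 1)
            a_basis = e_basis  # Bob now receives a state prepared in Eve's basis
        # Bob reads the incoming bit exactly if his basis matches, else randomly.
        bob_bits.append(bit if b_basis == a_basis else rng.randint(0, 1))

    # Sifting: keep only positions where Alice's and Bob's bases agreed.
    sifted = [(a, b) for a, b, ab, bb
              in zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    errors = sum(a != b for a, b in sifted)
    return len(sifted), errors

kept, errs = bb84_sift(4000)
kept_e, errs_e = bb84_sift(4000, eavesdrop=True)
print(errs / kept)      # 0.0 on a noiseless channel with no eavesdropper
print(errs_e / kept_e)  # ~0.25 under an intercept-resend attack
```

    Comparing a random subset of the sifted key over the public channel reveals this elevated error rate, which is the detection mechanism the abstract attributes to the uncertainty principle.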

  19. Numerical Study on the Effects of pH Buffering Minerals for CO2 Sequestration in Geology by CMS Technology

    NASA Astrophysics Data System (ADS)

    Miyoshi, S.; Suzuki, K.

    2014-12-01

    Among the technical alternatives within CCS, Koide and Xue (2009) proposed CO2 Microbubble Storage (CMS), in which CO2 is injected into shallow geology as microbubbles in water. Suzuki et al. (2012) studied the legal, economic, and operational aspects of CMS and proposed an appropriate scheme as follows. CO2 microbubbles are injected into a vertical injection well together with water withdrawn from the surrounding wells. When the microbubbles are mixed with water, the water is quickly saturated with CO2 because of the large surface area of the bubbles; as a result, CO2 enters the geology as a solute, mostly as bicarbonate ion, and the cost efficiency of this operation is reasonable. One issue must be assessed when CO2-dissolved water is injected into shallow geology: the migration of low-pH groundwater, since a low-pH pore solution might affect the underground environment. Here a preliminary numerical study on the pH buffering capacity relevant to CMS was performed. The study considered groundwater migration governed by Darcy's law, advective and dispersive transport of the ions, and reactions between pore water and minerals, using a one-dimensional geological model. Various mechanisms can produce a pH buffering effect in an underground environment; here, calcite dissolution by low-pH groundwater was considered because it is one of the most common phenomena providing such an effect. Case studies assuming different mass fractions of calcite in the model layer show that the pore solution was neutralized in the cases with calcite and was not in the case without calcite. They also show that the mass fraction of calcite does not change the pH buffering effect much. These results follow from the principle that the pH of the pore solution is controlled by the equilibrium constant of the reaction between the minerals and the pore solution. 
That is, the excess portion of calcite is
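
    The calcite buffering principle stated above can be illustrated with a simple open-system carbonate equilibrium calculation. The sketch below is our own illustration, not the authors' model: it uses standard 25 °C equilibrium constants, equates activities with concentrations, and ignores transport. It solves the charge balance for water in contact with CO2 gas, with and without calcite, showing that calcite-free CO2-charged water is strongly acidic while calcite saturation buffers the pH at a value fixed by the equilibrium constants alone, independent of how much calcite is present.

```python
import math

# Equilibrium constants at 25 degC (log10 values from standard water-chemistry texts):
KH  = 10**-1.47   # Henry's law: CO2(g) <-> H2CO3*  [mol/L/atm]
K1  = 10**-6.35   # H2CO3* <-> H+ + HCO3-
K2  = 10**-10.33  # HCO3- <-> H+ + CO3--
KW  = 10**-14.0   # H2O <-> H+ + OH-
KSP = 10**-8.48   # calcite: CaCO3 <-> Ca2+ + CO3--

def equilibrium_ph(p_co2, with_calcite):
    """pH of water open to CO2 at partial pressure p_co2 (atm), solved from the
    charge balance by bisection; activities are approximated by concentrations."""
    def imbalance(h):  # positive minus negative charge at [H+] = h
        hco3 = K1 * KH * p_co2 / h
        co3  = K2 * hco3 / h
        ca   = KSP / co3 if with_calcite else 0.0  # calcite saturation fixes [Ca2+]
        return 2 * ca + h - (hco3 + 2 * co3 + KW / h)
    lo, hi = 1e-12, 1e-1  # bracket [H+] between pH 12 and pH 1
    for _ in range(100):
        mid = math.sqrt(lo * hi)  # geometric midpoint suits the log scale
        if imbalance(mid) > 0:
            hi = mid              # surplus positive charge: [H+] is too high
        else:
            lo = mid
    return -math.log10(math.sqrt(lo * hi))

print(round(equilibrium_ph(1.0, False), 2))  # ~3.9: CO2-charged water is acidic
print(round(equilibrium_ph(1.0, True), 2))   # ~5.9: calcite buffers the same water
```

    Because the buffered pH depends only on the equilibrium constants and the CO2 pressure, any excess calcite beyond what the solution can dissolve has no further effect, matching the abstract's finding that the calcite mass fraction hardly changes the buffering.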

  20. Using the CMS High Level Trigger as a Cloud Resource

    NASA Astrophysics Data System (ADS)

    Colling, David; Huffman, Adam; McCrae, Alison; Lahiff, Andrew; Grandi, Claudio; Cinquilli, Mattia; Gowdy, Stephen; Coarasa, Jose Antonio; Tiradani, Anthony; Ozga, Wojciech; Chaze, Olivier; Sgaravatto, Massimo; Bauer, Daniela

    2014-06-01

    The CMS High Level Trigger is a compute farm of more than 10,000 cores. During data taking this resource is heavily used and is an integral part of the experiment's triggering system. However, outside of data taking periods this resource is largely unused. We describe why CMS wants to use the HLT as a cloud resource (outside of data taking periods) and how this has been achieved. In doing this we have turned a single-use cluster into an agile resource for CMS production computing. While we are able to use the HLT as a production cloud resource, there is still considerable further work that CMS needs to carry out before this resource can be used with the desired agility. This report, therefore, represents a snapshot of this activity at the time of CHEP 2013.

  1. 42 CFR 460.18 - CMS evaluation of applications.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CONTINUED) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PACE Organization Application and Waiver Process § 460.18 CMS evaluation of...

  2. 42 CFR 460.20 - Notice of CMS determination.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CONTINUED) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PACE Organization Application and Waiver Process § 460.20 Notice of CMS determination....

  3. 42 CFR 460.18 - CMS evaluation of applications.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... (CONTINUED) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PACE Organization Application and Waiver Process § 460.18 CMS evaluation of...

  4. 42 CFR 460.20 - Notice of CMS determination.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... (CONTINUED) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PACE Organization Application and Waiver Process § 460.20 Notice of CMS determination....

  5. 42 CFR 460.20 - Notice of CMS determination.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... (CONTINUED) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PACE Organization Application and Waiver Process § 460.20 Notice of CMS determination....

  6. 42 CFR 460.18 - CMS evaluation of applications.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... (CONTINUED) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PACE Organization Application and Waiver Process § 460.18 CMS evaluation of...

  7. 42 CFR 460.20 - Notice of CMS determination.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... (CONTINUED) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PACE Organization Application and Waiver Process § 460.20 Notice of CMS determination....

  8. 42 CFR 460.18 - CMS evaluation of applications.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... (CONTINUED) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PACE Organization Application and Waiver Process § 460.18 CMS evaluation of...

  9. 42 CFR 460.18 - CMS evaluation of applications.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... (CONTINUED) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PACE Organization Application and Waiver Process § 460.18 CMS evaluation of...

  10. 42 CFR 460.20 - Notice of CMS determination.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... (CONTINUED) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PROGRAMS OF ALL-INCLUSIVE CARE FOR THE ELDERLY (PACE) PACE Organization Application and Waiver Process § 460.20 Notice of CMS determination....