Sample records for CMS software framework

  1. Using the CMS threaded framework in a production environment

    DOE PAGES

    Jones, C. D.; Contreras, L.; Gartung, P.; ...

    2015-12-23

    During 2014, the CMS Offline and Computing Organization completed the necessary changes to use the CMS threaded framework in the full production environment. We will briefly discuss the design of the CMS Threaded Framework, in particular how the design affects scaling performance. We will then cover the effort involved in getting both the CMSSW application software and the workflow management system ready for using multiple threads for production. Finally, we will present metrics on the performance of the application and workflow system, as well as the difficulties that were uncovered. We will conclude with CMS' plans for using the threaded framework to do production for LHC Run 2.
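
    The processing model such a threaded framework rests on is task-based concurrency over independent events. A minimal sketch, assuming Intel's TBB library (which the CMS framework builds on) and hypothetical EventData/reconstruct stand-ins rather than real CMSSW classes:

      // Sketch: events are independent units of work, so a task scheduler
      // can farm them out across all available cores.
      #include <tbb/parallel_for.h>
      #include <cstdio>
      #include <vector>

      struct EventData { int id; double energy; };          // hypothetical

      static double reconstruct(const EventData& ev) {      // hypothetical
          return ev.energy * 0.98;                          // stand-in work
      }

      int main() {
          std::vector<EventData> events(1000);
          for (int i = 0; i < 1000; ++i) events[i] = {i, 50.0 + i % 7};

          std::vector<double> results(events.size());
          // TBB schedules the per-event tasks onto a shared thread pool.
          tbb::parallel_for(std::size_t(0), events.size(), [&](std::size_t i) {
              results[i] = reconstruct(events[i]);
          });
          std::printf("processed %zu events\n", events.size());
      }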

  2. Event Display for the Visualization of CMS Events

    NASA Astrophysics Data System (ADS)

    Bauerdick, L. A. T.; Eulisse, G.; Jones, C. D.; Kovalskyi, D.; McCauley, T.; Mrak Tadel, A.; Muelmenstaedt, J.; Osborne, I.; Tadel, M.; Tu, Y.; Yagil, A.

    2011-12-01

    During the last year the CMS experiment engaged in consolidation of its existing event display programs. The core of the new system is based on the Fireworks event display program, which was by design directly integrated with the CMS Event Data Model (EDM) and the light version of the software framework (FWLite). The Event Visualization Environment (EVE) of the ROOT framework is used to manage a consistent set of 3D and 2D views, selection, user feedback and user interaction with the graphics windows; several EVE components were developed by CMS in collaboration with the ROOT project. In event display operation, simple plugins are registered into the system to perform conversion from EDM collections into their visual representations, which are then managed by the application. Full event navigation and filtering, as well as collection-level filtering, is supported. The same data-extraction principle can also be applied when Fireworks eventually operates as a service within the full software framework.
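
    The plugin-registration idea described above can be sketched as a registry mapping a collection type name to a conversion function. A minimal, hypothetical illustration (not Fireworks' actual API):

      // Sketch: a registry from EDM collection type name to a converter
      // producing visual items (hypothetical types, not Fireworks' API).
      #include <functional>
      #include <iostream>
      #include <map>
      #include <string>
      #include <vector>

      struct VisualItem { std::string shape; double x, y, z; };
      using Converter = std::function<std::vector<VisualItem>(const void*)>;

      static std::map<std::string, Converter>& registry() {
          static std::map<std::string, Converter> r;  // one global registry
          return r;
      }

      struct TrackCollection { std::vector<double> pt; };  // hypothetical

      int main() {
          // A plugin registers a converter for one collection type...
          registry()["TrackCollection"] = [](const void* p) {
              auto* tracks = static_cast<const TrackCollection*>(p);
              std::vector<VisualItem> items;
              for (double pt : tracks->pt) items.push_back({"line", pt, 0, 0});
              return items;
          };
          // ...and the application looks it up by type at display time.
          TrackCollection tc{{1.2, 3.4}};
          auto items = registry()["TrackCollection"](&tc);
          std::cout << items.size() << " visual items\n";
      }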

  3. Monitoring the CMS strip tracker readout system

    NASA Astrophysics Data System (ADS)

    Mersi, S.; Bainbridge, R.; Baulieu, G.; Bel, S.; Cole, J.; Cripps, N.; Delaere, C.; Drouhin, F.; Fulcher, J.; Giassi, A.; Gross, L.; Hahn, K.; Mirabito, L.; Nikolic, M.; Tkaczyk, S.; Wingham, M.

    2008-07-01

    The CMS Silicon Strip Tracker at the LHC comprises a sensitive area of approximately 200 m² and 10 million readout channels. Its data acquisition system is based around a custom analogue front-end chip. Both the control and the readout of the front-end electronics are performed by off-detector VME boards in the counting room, which digitise the raw event data and perform zero-suppression and formatting. The data acquisition system uses the CMS online software framework to configure, control and monitor the hardware components and steer the data acquisition. The first data analysis is performed online within the official CMS reconstruction framework, which provides many services, such as distributed analysis, access to geometry and conditions data, and a Data Quality Monitoring tool based on the online physics reconstruction. The data acquisition monitoring of the Strip Tracker uses both the data acquisition and the reconstruction software frameworks in order to provide real-time feedback to shifters on the operational state of the detector, to archive data for later analysis, and possibly to trigger automatic recovery actions in case of errors. Here we review the proposed architecture of the monitoring system and we describe its software components, which are already in place, the various monitoring streams available, and our experiences of operating and monitoring a large-scale system.

  4. The GMOD Drupal bioinformatic server framework.

    PubMed

    Papanicolaou, Alexie; Heckel, David G

    2010-12-15

    Next-generation sequencing technologies have led to the widespread use of -omic applications. As a result, there is now a pronounced bioinformatic bottleneck. The general model organism database (GMOD) tool kit (http://gmod.org) has produced a number of resources aimed at addressing this issue. It lacks, however, a robust online solution that can deploy heterogeneous data and software within a Web content management system (CMS). We present a bioinformatic framework for the Drupal CMS. It consists of three modules. First, GMOD-DBSF is an application programming interface module for the Drupal CMS that simplifies the programming of bioinformatic Drupal modules. Second, the Drupal Bioinformatic Software Bench (biosoftware_bench) allows for a rapid and secure deployment of bioinformatic software. An innovative graphical user interface (GUI) guides both use and administration of the software, including the secure provision of pre-publication datasets. Third, we present genes4all_experiment, which exemplifies how our work supports the wider research community. Given the infrastructure presented here, the Drupal CMS may become a powerful new tool set for bioinformaticians. The GMOD-DBSF base module is an expandable community resource that decreases development time of Drupal modules for bioinformatics. The biosoftware_bench module can already enhance biologists' ability to mine their own data. The genes4all_experiment module has already been responsible for archiving of more than 150 studies of RNAi from Lepidoptera, which were previously unpublished. Implemented in PHP and Perl. Freely available under the GNU Public License 2 or later from http://gmod-dbsf.googlecode.com.

  5. The CMS High Level Trigger System: Experience and Future Development

    NASA Astrophysics Data System (ADS)

    Bauer, G.; Behrens, U.; Bowen, M.; Branson, J.; Bukowiec, S.; Cittolin, S.; Coarasa, J. A.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Flossdorf, A.; Gigi, D.; Glege, F.; Gomez-Reino, R.; Hartl, C.; Hegeman, J.; Holzner, A.; Hwong, Y. L.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Polese, G.; Racz, A.; Raginel, O.; Sakulin, H.; Sani, M.; Schwick, C.; Shpakov, D.; Simon, S.; Spataru, A. C.; Sumorok, K.

    2012-12-01

    The CMS experiment at the LHC features a two-level trigger system. Events accepted by the first level trigger, at a maximum rate of 100 kHz, are read out by the Data Acquisition system (DAQ), and subsequently assembled in memory in a farm of computers running a software high-level trigger (HLT), which selects interesting events for offline storage and analysis at a rate of a few hundred Hz. The HLT algorithms consist of sequences of offline-style reconstruction and filtering modules, executed on a farm of O(10000) CPU cores built from commodity hardware. Experience from the operation of the HLT system in the collider run 2010/2011 is reported. The current architecture of the CMS HLT and its integration with the CMS reconstruction framework and the CMS DAQ are discussed in the light of future development. The possible short- and medium-term evolution of the HLT software infrastructure to support extensions of the HLT computing power, and to address remaining performance and maintenance issues, is discussed.
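
    The path structure described above, a sequence of filter steps that stops at the first rejection, can be sketched as follows; the Event fields and thresholds are hypothetical, not real HLT modules:

      // Sketch: an HLT-style path rejects the event at the first failing
      // filter, skipping the remaining (more expensive) steps.
      #include <functional>
      #include <iostream>
      #include <vector>

      struct Event { double muonPt; double met; };   // hypothetical content
      using Filter = std::function<bool(const Event&)>;

      static bool runPath(const Event& ev, const std::vector<Filter>& path) {
          for (const auto& f : path)
              if (!f(ev)) return false;   // early rejection
          return true;                    // accepted for offline storage
      }

      int main() {
          std::vector<Filter> singleMuPath = {
              [](const Event& e) { return e.muonPt > 24.0; },  // hypothetical cut
              [](const Event& e) { return e.met < 500.0; },    // hypothetical cut
          };
          Event ev{31.0, 120.0};
          std::cout << (runPath(ev, singleMuPath) ? "accept" : "reject") << "\n";
      }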

  6. The GMOD Drupal Bioinformatic Server Framework

    PubMed Central

    Papanicolaou, Alexie; Heckel, David G.

    2010-01-01

    Motivation: Next-generation sequencing technologies have led to the widespread use of -omic applications. As a result, there is now a pronounced bioinformatic bottleneck. The general model organism database (GMOD) tool kit (http://gmod.org) has produced a number of resources aimed at addressing this issue. It lacks, however, a robust online solution that can deploy heterogeneous data and software within a Web content management system (CMS). Results: We present a bioinformatic framework for the Drupal CMS. It consists of three modules. First, GMOD-DBSF is an application programming interface module for the Drupal CMS that simplifies the programming of bioinformatic Drupal modules. Second, the Drupal Bioinformatic Software Bench (biosoftware_bench) allows for a rapid and secure deployment of bioinformatic software. An innovative graphical user interface (GUI) guides both use and administration of the software, including the secure provision of pre-publication datasets. Third, we present genes4all_experiment, which exemplifies how our work supports the wider research community. Conclusion: Given the infrastructure presented here, the Drupal CMS may become a powerful new tool set for bioinformaticians. The GMOD-DBSF base module is an expandable community resource that decreases development time of Drupal modules for bioinformatics. The biosoftware_bench module can already enhance biologists' ability to mine their own data. The genes4all_experiment module has already been responsible for archiving of more than 150 studies of RNAi from Lepidoptera, which were previously unpublished. Availability and implementation: Implemented in PHP and Perl. Freely available under the GNU Public License 2 or later from http://gmod-dbsf.googlecode.com Contact: alexie@butterflybase.org PMID:20971988

  7. Achieving High Performance With TCP Over 40 GbE on NUMA Architectures for CMS Data Acquisition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bawej, Tomasz; et al.

    2014-01-01

    TCP and the socket abstraction have barely changed over the last two decades, but at the network layer there has been a giant leap from a few megabits to 100 gigabits in bandwidth. At the same time, CPU architectures have evolved into the multicore era and applications are expected to make full use of all available resources. Applications in the data acquisition domain based on the standard socket library running in a Non-Uniform Memory Access (NUMA) architecture are unable to reach full efficiency and scalability without the software being adequately aware of the IRQ (Interrupt Request), CPU and memory affinities. During the first long shutdown of the LHC, the CMS DAQ system is going to be upgraded for operation from 2015 onwards, and a new software component has been designed and developed in the CMS online framework for transferring data with sockets. This software wraps the low-level socket library to ease higher-level programming with an API based on an asynchronous event-driven model similar to the DAT uDAPL API. It is an event-based application with NUMA optimizations that allows for a high throughput of data across a large distributed system. This paper describes the architecture, the technologies involved and the performance measurements of the software in the context of the CMS distributed event building.
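
    One concrete piece of the affinity awareness mentioned above is pinning a receiver thread to a core local to the NIC's NUMA node. A minimal Linux sketch using the glibc affinity call (the core number and receiver body are hypothetical):

      // Sketch: pin the receiving thread to a core assumed local to the
      // NIC's NUMA node, so interrupt handling, the socket event loop and
      // the data buffers stay on the same memory domain.
      #ifndef _GNU_SOURCE
      #define _GNU_SOURCE
      #endif
      #include <pthread.h>
      #include <sched.h>
      #include <cstdio>

      static void* receiver(void*) {
          // ... asynchronous, event-driven socket loop would run here ...
          return nullptr;
      }

      int main() {
          pthread_t t;
          pthread_create(&t, nullptr, receiver, nullptr);

          cpu_set_t set;
          CPU_ZERO(&set);
          CPU_SET(2, &set);   // core 2: hypothetical choice local to the NIC
          if (pthread_setaffinity_np(t, sizeof(set), &set) != 0)
              std::fprintf(stderr, "pthread_setaffinity_np failed\n");

          pthread_join(t, nullptr);
      }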

  8. Challenges to Software/Computing for Experimentation at the LHC

    NASA Astrophysics Data System (ADS)

    Banerjee, Sunanda

    The demands of future high energy physics experiments towards software and computing have led the experiments to plan the related activities as a full-fledged project and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently put together in an LHC-wide framework based on Grid technology. The current status and understandings have been broadly outlined.

  9. LCG Persistency Framework (CORAL, COOL, POOL): Status and Outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valassi, A.; Clemencic, M.

    2012-04-19

    The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.

  10. NoSQL technologies for the CMS Conditions Database

    NASA Astrophysics Data System (ADS)

    Sipos, Roland

    2015-12-01

    With the restart of the LHC in 2015, the growth of the CMS Conditions dataset will continue; the need for consistent and highly available access to the Conditions therefore makes a strong case for revisiting different aspects of the current data storage solutions. We present a study of alternative data storage backends for the Conditions Databases, evaluating some of the most popular NoSQL databases to support a key-value representation of the CMS Conditions. The definition of the database infrastructure is based on the need to store the conditions as BLOBs. Because of this, each condition can reach a size that may require special treatment (splitting) in these NoSQL databases. As big binary objects may be problematic in several database systems, and also to give an accurate baseline, a testing framework extension was implemented to measure the characteristics of the handling of arbitrary binary data in these databases. Based on the evaluation, prototypes of a document store, a column-oriented store and a plain key-value store are deployed. An adaptation layer to access the backends in the CMS Offline software was developed to provide transparent support for these NoSQL databases in the CMS context. Additional data modelling approaches and considerations in the software layer, deployment and automation of the databases are also covered in the research. In this paper we present the results of the evaluation as well as a performance comparison of the prototypes studied.
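
    The splitting treatment mentioned above can be sketched as chunking an oversized BLOB into numbered keys plus a small index entry. A minimal illustration with a std::map standing in for the NoSQL backend (the key scheme and names are hypothetical):

      // Sketch: store an oversized conditions BLOB as numbered chunk keys
      // plus an index entry recording the chunk count.
      #include <algorithm>
      #include <iostream>
      #include <map>
      #include <string>
      #include <vector>

      using Blob = std::vector<char>;
      static std::map<std::string, Blob> kvStore;  // stand-in for the backend

      static void putCondition(const std::string& key, const Blob& payload,
                               std::size_t chunkSize) {
          const std::size_t nChunks = (payload.size() + chunkSize - 1) / chunkSize;
          for (std::size_t i = 0; i < nChunks; ++i) {
              const std::size_t begin = i * chunkSize;
              const std::size_t end = std::min(begin + chunkSize, payload.size());
              kvStore[key + "#" + std::to_string(i)] =
                  Blob(payload.begin() + begin, payload.begin() + end);
          }
          const std::string meta = std::to_string(nChunks);  // index entry
          kvStore[key] = Blob(meta.begin(), meta.end());
      }

      int main() {
          // A 10 MiB payload split into 1 MiB chunks: 10 chunks + 1 index key.
          putCondition("EcalPedestals_v1", Blob(10 * 1024 * 1024, 'x'), 1 << 20);
          std::cout << kvStore.size() << " keys written\n";
      }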

  11. LCG Persistency Framework (CORAL, COOL, POOL): Status and Outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valassi, A.; Clemencic, M.; Dykstra, D.

    The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.

  12. Evaluation of Using Course-Management Software: Supplementing a Course that Requires a Group Research Project

    ERIC Educational Resources Information Center

    Korchmaros, Josephine D.; Gump, Nathaniel W.

    2009-01-01

    The benefits of course-management software (CMS) will not be realized if it is underused. The authors investigated one possible barrier to CMS use, students' perceptions of using CMS. After taking a course requiring a group research project, college students reported their perceptions of the use of CMS for the course. Overall, students did not…

  13. Development of an expert system prototype for determining software functional requirements for command management activities at NASA Goddard

    NASA Technical Reports Server (NTRS)

    Liebowitz, J.

    1985-01-01

    The development of an expert system prototype for determining software functional requirements for NASA Goddard's Command Management System (CMS) is described. The role of the CMS is to transform general requests into specific spacecraft commands with command execution conditions. The CMS is part of the NASA Data System which entails the downlink of science and engineering data from NASA near-earth satellites to the user, and the uplink of command and control data to the spacecraft. Subjects covered include: the problem environment of determining CMS software functional requirements; the expert system approach for handling CMS requirements development; validation and evaluation procedures for the expert system.

  14. Unclassified Information Sharing and Coordination in Security, Stabilization, Transition and Reconstruction Efforts

    DTIC Science & Technology

    2008-03-01

    is implemented using the Drupal (2007) content management system (CMS), and many of the baseline information sharing and collaboration tools have...been contributed through the Drupal open source community. Drupal is a very modular open source software package written in the PHP hypertext preprocessor...needed to suit the particular problem domain. While other frameworks have the potential to provide similar advantages ("Ruby," 2007), Drupal was

  15. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    NASA Astrophysics Data System (ADS)

    Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.

    2011-12-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and conclude that it is most cost-effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.

  16. tkLayout: a design tool for innovative silicon tracking detectors

    NASA Astrophysics Data System (ADS)

    Bianchi, G.

    2014-03-01

    A new CMS tracker is scheduled to become operational for the LHC Phase 2 upgrade in the early 2020s. tkLayout is a software package developed to create 3D models for the design of the CMS tracker and to evaluate its fundamental performance figures. The new tracker will have to cope with much higher luminosity conditions, resulting in increased track density, harsher radiation exposure and, especially, much higher data acquisition bandwidth, such that equipping the tracker with triggering capabilities is envisaged. The design of an innovative detector involves deciding on an architecture offering the best trade-off among many figures of merit, such as tracking resolution, power dissipation, bandwidth, cost and so on. Quantitatively evaluating these figures of merit as early as possible in the design phase is of capital importance, and it is best done with the aid of software models. tkLayout is a flexible modeling tool: new performance estimates and support for different detector geometries can be quickly added, thanks to its modular structure. Moreover, the software executes very quickly (in about two minutes), so that many possible architectural variations can be rapidly modeled and compared, to help in the choice of a viable detector layout and then to optimize it. A tracker geometry is generated from simple configuration files defining the module types, layout and materials. Support structures are automatically added and services routed to provide a realistic tracker description. The tracker geometries thus generated can be exported to the standard CMS simulation framework (CMSSW) for full Monte Carlo studies. tkLayout has proven essential in giving guidance to CMS in studying different detector layouts and exploring the feasibility of innovative solutions for tracking detectors, in terms of design, performance and projected costs. This tool has been one of the keys to making important design decisions for over five years now and has also enabled project engineers and simulation experts to focus their efforts on other important or specific issues. Even if tkLayout was designed for the CMS tracker upgrade project, its flexibility makes it experiment-agnostic, so that it could easily be adapted to model other tracking detectors. The technology behind tkLayout is presented, as well as some of the results obtained in the context of the CMS silicon tracker design studies.

  17. Intelligent Systems and Advanced User Interfaces for Design, Operation, and Maintenance of Command Management Systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1998-01-01

    Historically, Command Management Systems (CMS) have been large, expensive, spacecraft-specific software systems that were costly to build, operate, and maintain. Current and emerging hardware, software, and user interface technologies may offer an opportunity to facilitate the initial formulation and design of a spacecraft-specific CMS as well as to develop a more generic CMS or a set of core components for CMS systems. Current MOC (mission operations center) hardware and software include Unix workstations, the C/C++ and Java programming languages, and X and Java window interfaces. This configuration provides the power and flexibility to support sophisticated systems and intelligent user interfaces that exploit state-of-the-art technologies in human-machine systems engineering, decision making, artificial intelligence, and software engineering. One of the goals of this research is to explore the extent to which technologies developed in the research laboratory can be productively applied in a complex system such as spacecraft command management. Initial examination of some of the issues in CMS design and operation suggests that application of technologies such as intelligent planning, case-based reasoning, design and analysis tools from a human-machine systems engineering point of view (e.g., operator and designer models) and human-computer interaction tools (e.g., graphics, visualization, and animation) may provide significant savings in the design, operation, and maintenance of a spacecraft-specific CMS, as well as continuity for CMS design and development across spacecraft with varying needs. The savings in this case is in software reuse at all stages of the software engineering process.

  18. Mapping modern software process engineering techniques onto an HEP development environment

    NASA Astrophysics Data System (ADS)

    Wellisch, J. P.

    2003-04-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create de facto software process standards within the CMS off-line community, as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned, and mistakes made. We will demonstrate the benefits gained, and the current status of the software processes established in CMS off-line software.

  19. Development and Evaluation of Vectorised and Multi-Core Event Reconstruction Algorithms within the CMS Software Framework

    NASA Astrophysics Data System (ADS)

    Hauth, T.; Innocente, V.; Piparo, D.

    2012-12-01

    The processing of data acquired by the CMS detector at LHC is carried out with an object-oriented C++ software framework: CMSSW. With the increasing luminosity delivered by the LHC, the treatment of recorded data requires extraordinarily large computing resources, also in terms of CPU usage. A possible solution to cope with this task is the exploitation of the features offered by the latest microprocessor architectures. Modern CPUs present several vector units, the capacity of which is growing steadily with the introduction of new processor generations. Moreover, an increasing number of cores per die is offered by the main vendors, even on consumer hardware. Most recent C++ compilers provide facilities to take advantage of such innovations, either by explicit statements in the program sources or by automatically adapting the generated machine instructions to the available hardware, without the need of modifying the existing code base. Programming techniques to implement reconstruction algorithms and optimised data structures are presented that aim at scalable vectorization and parallelization of the calculations. One of their features is the usage of new language features of the C++11 standard. Portions of the CMSSW framework are illustrated which have been found to be especially profitable for the application of vectorization and multi-threading techniques. Specific utility components have been developed to help vectorization and parallelization. They can easily become part of a larger common library. To conclude, careful measurements are described, which show the execution speedups achieved via vectorised and multi-threaded code in the context of CMSSW.
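
    A minimal sketch of the kind of data-structure optimisation the paper describes: a structure-of-arrays layout keeps each quantity contiguous so the compiler can auto-vectorize a simple loop (illustrative only, not actual CMSSW code; compile with optimisation, e.g. -O3):

      // Sketch: structure-of-arrays keeps each quantity contiguous, so the
      // inner loop maps directly onto SIMD loads/stores.
      #include <cstddef>
      #include <iostream>
      #include <vector>

      struct Hits {                      // SoA: one array per quantity
          std::vector<float> x, y, z;
      };

      static void shiftZ(Hits& h, float dz) {
          const std::size_t n = h.z.size();
          for (std::size_t i = 0; i < n; ++i)   // trivially auto-vectorizable
              h.z[i] += dz;
      }

      int main() {
          Hits h{std::vector<float>(1024, 1.f),
                 std::vector<float>(1024, 2.f),
                 std::vector<float>(1024, 3.f)};
          shiftZ(h, 0.5f);
          std::cout << h.z[0] << "\n";   // 3.5
      }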

  20. Implementation of an object oriented track reconstruction model into multiple LHC experiments*

    NASA Astrophysics Data System (ADS)

    Gaines, Irwin; Gonzalez, Saul; Qian, Sijin

    2001-10-01

    An Object Oriented (OO) model (Gaines et al., 1996; 1997; Gaines and Qian, 1998; 1999) for track reconstruction by the Kalman filtering method has been designed for high energy physics experiments at high luminosity hadron colliders. The model has been coded in the C++ programming language and has been successfully implemented into the OO computing environments of both the CMS (1994) and ATLAS (1994) experiments at the future Large Hadron Collider (LHC) at CERN. We shall report: how the OO model was adapted, with largely the same code, to different scenarios and serves different reconstruction aims in different experiments (i.e. the level-2 trigger software for ATLAS and the offline software for CMS); how the OO model has been incorporated into different OO environments with a similar integration structure (demonstrating the ease of re-use of OO programs); the OO model's performance, including execution time, memory usage, track-finding efficiency and ghost rate; and additional physics performance based on use of the OO tracking model. We shall also mention the experience and lessons learned from the implementation of the OO model into the general OO software framework of the experiments. In summary, our practice shows that OO technology really makes software development and integration straightforward and convenient; this may be particularly beneficial for non-computer-professional physicists.
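
    The heart of the Kalman filtering method the model implements is the gain-weighted update of a state estimate with each new hit. A one-dimensional sketch (real track fits work with five-parameter state vectors and covariance matrices):

      // Sketch: the one-dimensional Kalman update combining a prediction
      // with a measurement, weighted by their variances.
      #include <iostream>

      struct State { double x; double P; };   // estimate and its variance

      static State kalmanUpdate(State pred, double m, double R) {
          const double K = pred.P / (pred.P + R);   // Kalman gain
          return { pred.x + K * (m - pred.x),       // pulled toward m
                   (1.0 - K) * pred.P };            // variance shrinks
      }

      int main() {
          State s{0.0, 4.0};                    // vague prior
          for (double m : {1.1, 0.9, 1.05})     // successive measurements
              s = kalmanUpdate(s, m, 0.25);
          std::cout << "x = " << s.x << ", P = " << s.P << "\n";
      }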

  1. CMS Distributed Computing Integration in the LHC sustained operations era

    NASA Astrophysics Data System (ADS)

    Grandi, C.; Bockelman, B.; Bonacorsi, D.; Fisk, I.; González Caballero, I.; Farina, F.; Hernández, J. M.; Padhi, S.; Sarkar, S.; Sciabà, A.; Sfiligoi, I.; Spiga, F.; Úbeda García, M.; Van Der Ster, D. C.; Zvada, M.

    2011-12-01

    After many years of preparation, the CMS computing system has reached a situation where stability in operations limits the possibility to introduce innovative features. Nevertheless, it is the same need for stability and smooth operations that requires the introduction of features that were considered not strategic in the previous phases. Examples are: adequate authorization to control and prioritize the access to storage and computing resources; improved monitoring to investigate problems and identify bottlenecks on the infrastructure; increased automation to reduce the manpower needed for operations; and an effective process to deploy new releases of the software tools in production. We present the work of the CMS Distributed Computing Integration Activity, which is responsible for providing a liaison between the CMS distributed computing infrastructure and the software providers, both internal and external to CMS. In particular we describe the introduction of new middleware features during the last 18 months as well as the requirements placed on Grid and Cloud software developers for the future.

  2. Uses of the Drupal CMS Collaborative Framework in the Woods Hole Scientific Community (Invited)

    NASA Astrophysics Data System (ADS)

    Maffei, A. R.; Chandler, C. L.; Work, T. T.; Shorthouse, D.; Furfey, J.; Miller, H.

    2010-12-01

    Organizations that comprise the Woods Hole scientific community (Woods Hole Oceanographic Institution, Marine Biological Laboratory, USGS Woods Hole Coastal and Marine Science Center, Woods Hole Research Center, NOAA NMFS Northeast Fisheries Science Center, SEA Education Association) have a long history of collaborative activity regarding computing, computer network and information technologies that support common, inter-disciplinary science needs. Over the past several years there has been growing interest in the use of the Drupal Content Management System (CMS) playing a variety of roles in support of research projects resident at several of these organizations. Many of these projects are part of science programs that are national and international in scope. Here we survey the current uses of Drupal within the Woods Hole scientific community and examine the reasons it has been adopted. The promise of emerging semantic features in the Drupal framework is examined, and projections are made of how pre-existing Drupal-based websites might benefit. Closer examination of Drupal software design exposes it as more than simply a content management system. The flexibility of its architecture; the power of its taxonomy module; the care taken in nurturing the open-source developer community that surrounds it (including organized and often well-attended code sprints); the ability to bind emerging software technologies as Drupal modules; the careful selection process used in adopting core functionality; multi-site hosting and cross-site deployment of updates; and a recent trend towards development of use-case-inspired Drupal distributions cast Drupal as a general-purpose application deployment framework. Recent work in the semantic arena casts Drupal as an emerging RDF framework as well. Examples of roles played by Drupal-based websites within the Woods Hole scientific community that will be discussed include: science data metadata database, organization main website, biological taxonomy development, bibliographic database, physical media data archive inventory manager, disaster-response website development framework, science project task management, science conference planning, and spreadsheet-to-database converter.

  3. GSC configuration management plan

    NASA Technical Reports Server (NTRS)

    Withers, B. Edward

    1990-01-01

    The tools and methods used for the configuration management of the artifacts (including software and documentation) associated with the Guidance and Control Software (GCS) project are described. The GCS project is part of a software error studies research program. Three implementations of GCS are being produced in order to study the fundamental characteristics of the software failure process. The Code Management System (CMS) is used to track and retrieve versions of the documentation and software. Application of the CMS for this project is described and the numbering scheme is delineated for the versions of the project artifacts.

  4. Intelligent systems and advanced user interfaces for design, operation, and maintenance of command management systems

    NASA Technical Reports Server (NTRS)

    Potter, William J.; Mitchell, Christine M.

    1993-01-01

    Historically, command management systems (CMS) have been large and expensive spacecraft-specific software systems that were costly to build, operate, and maintain. Current and emerging hardware, software, and user interface technologies may offer an opportunity to facilitate the initial formulation and design of a spacecraft-specific CMS as well as to develop a more generic CMS system. New technologies, in addition to a core CMS common to a range of spacecraft, may facilitate the training and enhance the efficiency of CMS operations. Current mission operations center (MOC) hardware and software include Unix workstations, the C/C++ programming languages, and an X window interface. This configuration provides the power and flexibility to support sophisticated and intelligent user interfaces that exploit state-of-the-art technologies in human-machine interaction, artificial intelligence, and software engineering. One of the goals of this research is to explore the extent to which technologies developed in the research laboratory can be productively applied in a complex system such as spacecraft command management. Initial examination of some of these issues in CMS design and operation suggests that application of technologies such as intelligent planning, case-based reasoning, human-machine systems design and analysis tools (e.g., operator and designer models), and human-computer interaction tools (e.g., graphics, visualization, and animation) may provide significant savings in the design, operation, and maintenance of the CMS for a specific spacecraft as well as continuity for CMS design and development across spacecraft. The first six months of this research saw a broad investigation by Georgia Tech researchers into the function, design, and operation of current and planned command management systems at Goddard Space Flight Center. As the first step, the researchers attempted to understand the current and anticipated horizons of command management systems at Goddard. Preliminary results are given on CMS commonalities and causes of low re-use, and methods are proposed to facilitate increased re-use.

  5. The CMS tracker control system

    NASA Astrophysics Data System (ADS)

    Dierlamm, A.; Dirkes, G. H.; Fahrer, M.; Frey, M.; Hartmann, F.; Masetti, L.; Militaru, O.; Shah, S. Y.; Stringer, R.; Tsirou, A.

    2008-07-01

    The Tracker Control System (TCS) is a distributed control software to operate about 2000 power supplies for the silicon modules of the CMS Tracker and monitor its environmental sensors. TCS must thus be able to handle about 10⁴ power supply parameters, about 10³ environmental probes from the Programmable Logic Controllers of the Tracker Safety System (TSS), and about 10⁵ parameters read via DAQ from the DCUs in all front-end hybrids and from CCUs in all control groups. TCS is built on top of an industrial SCADA program (PVSS) extended with a framework developed at CERN (JCOP) and used by all LHC experiments. The logical partitioning of the detector is reflected in the hierarchical structure of the TCS, where commands move down to the individual hardware devices, while states are reported up to the root, which is interfaced to the broader CMS control system. The system computes and continuously monitors the mean and maximum values of critical parameters and updates the percentage of currently operating hardware. Automatic procedures switch off selected parts of the detector using detailed granularity, avoiding widespread TSS intervention.
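
    The bubbling-up of states through the hierarchy can be sketched as each node reporting the worst state found among itself and its children. A minimal, hypothetical model (the real TCS is built on PVSS/JCOP, not hand-written C++):

      // Sketch: a hierarchical control tree where each node's summary state
      // is the worst state in its subtree, propagated up toward the root.
      #include <algorithm>
      #include <iostream>
      #include <vector>

      enum State { OK = 0, WARNING = 1, ERROR = 2 };

      struct Node {
          State own;
          std::vector<Node> children;
      };

      // Each node reports the worst state found in its subtree.
      static State summarize(const Node& n) {
          State worst = n.own;
          for (const auto& c : n.children)
              worst = std::max(worst, summarize(c));
          return worst;
      }

      int main() {
          Node tracker{OK, {Node{OK, {}}, Node{WARNING, {}}, Node{OK, {}}}};
          std::cout << "root state: " << summarize(tracker) << "\n";  // 1
      }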

  6. CMS Analysis School Model

    NASA Astrophysics Data System (ADS)

    Malik, S.; Shipsey, I.; Cavanaugh, R.; Bloom, K.; Chan, Kai-Feng; D'Hondt, J.; Klima, B.; Narain, M.; Palla, F.; Rolandi, G.; Schörner-Sadenius, T.

    2014-06-01

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre), Fermilab, and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  7. Transitions in Classroom Technology: Instructor Implementation of Classroom Management Software

    ERIC Educational Resources Information Center

    Ackerman, David; Chung, Christina; Sun, Jerry Chih-Yuan

    2014-01-01

    The authors look at how business instructor needs are fulfilled by classroom management software (CMS), such as Moodle, and why instructors are sometimes slow to implement it. Instructors at different universities provided both qualitative and quantitative responses regarding their use of CMS. The results indicate that the top needs fulfilled by…

  8. CMS Configuration Editor: GUI based application for user analysis job

    NASA Astrophysics Data System (ADS)

    de Cosa, A.

    2011-12-01

    We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way, integrated within the CMS framework, which organizes user analysis code in a flexible way. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. It can be a challenging task for users, especially newcomers, to develop analysis jobs managing the configuration of the many required modules. For this reason a graphical tool has been conceived in order to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analysis while visualizing in real time the effects of their actions. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies existing among the modules and check the data flow. They can see the values at which parameters are set and change them according to what is required by their analysis task. The integration of common tools in the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and the definition of a layer of abstraction from which all PAT tools inherit.

  9. Data Quality Monitoring System for New GEM Muon Detectors for the CMS Experiment Upgrade

    NASA Astrophysics Data System (ADS)

    King, Robert; CMS Muon Group Team

    2017-01-01

    The Gas Electron Multiplier (GEM) detectors are novel detectors designed to improve the muon trigger and tracking performance of the CMS experiment for the high luminosity upgrade of the LHC. Partial installation of GEM detectors is planned during the 2016-2017 technical stop. Before the GEM system is installed underground, its data acquisition (DAQ) electronics must be thoroughly tested. The DAQ system includes several commercial and custom-built electronic boards running custom firmware. The front-end electronics are radiation-hard and communicate via optical fibers. The data quality monitoring (DQM) software framework has been designed to provide online verification of the integrity of the data produced by the detector electronics, and to promptly identify potential hardware or firmware malfunctions in the system. Local hit reconstruction and clustering algorithms allow quality control of the data produced by each GEM chamber. Once the new detectors are installed, the DQM will monitor the stability and performance of the system during normal data-taking operations. We discuss the design of the DQM system, the software being developed to read out and process the detector data, and the methods used to identify and report hardware and firmware malfunctions of the system.

  10. Optimizing CMS build infrastructure via Apache Mesos

    NASA Astrophysics Data System (ADS)

    Abdurachmanov, David; Degano, Alessandro; Elmer, Peter; Eulisse, Giulio; Mendez, David; Muzaffar, Shahzad

    2015-12-01

    The Offline Software of the CMS Experiment at the Large Hadron Collider (LHC) at CERN consists of 6M lines of in-house code, developed over a decade by nearly 1000 physicists, as well as a comparable amount of general use open-source code. A critical ingredient to the success of the construction and early operation of the WLCG was the convergence, around the year 2000, on the use of a homogeneous environment of commodity x86-64 processors and Linux. Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run Hadoop, Jenkins, Spark, Aurora, and other applications on a dynamically shared pool of nodes. We present how we migrated our continuous integration system to schedule jobs on a relatively small Apache Mesos enabled cluster and how this resulted in better resource usage, higher peak performance and lower latency thanks to the dynamic scheduling capabilities of Mesos.

  11. CMS Analysis School Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malik, S.; Shipsey, I.; Cavanaugh, R.

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre), Fermilab, and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  12. LCG Persistency Framework (CORAL, COOL, POOL): Status and Outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trentadue, R.; Clemencic, M.; Dykstra, D.

    The LCG Persistency Framework consists of three software packages (CORAL, COOL and POOL) that address the data access requirements of the LHC experiments in several different areas. The project is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that are using some or all of the Persistency Framework components to access their data. POOL is a hybrid technology store for C++ objects, using a mixture of streaming and relational technologies to implement both object persistency and object metadata catalogs and collections. CORAL is an abstraction layer with an SQL-free API for accessing data stored using relational database technologies. COOL provides specific software components and tools for the handling of the time variation and versioning of the experiment conditions data. This presentation reports on the status and outlook in each of the three sub-projects at the time of the CHEP2012 conference, reviewing the usage of each package in the three LHC experiments.

  13. A Content Markup Language for Data Services

    NASA Astrophysics Data System (ADS)

    Noviello, C.; Acampa, P.; Mango Furnari, M.

    Network content delivery and document sharing are possible using a variety of technologies, such as distributed databases, service-oriented applications, and so forth. The development of such systems is a complex job, because the document life cycle involves strong cooperation between domain experts and software developers. Furthermore, emerging software methodologies, such as service-oriented architecture and knowledge organization (e.g., the semantic web), did not really solve the problems faced in a real distributed and cooperative setting. In this chapter the authors' efforts to design and deploy a distributed and cooperative content management system are described. The main features of the system are a user-configurable document type definition and a management middleware layer. It allows CMS developers to orchestrate the composition of specialized software components around the structure of a document. This chapter also reports some of the experience gained in deploying the developed framework in a cultural heritage dissemination setting.

  14. One lens missing? Clarifying the clinical microsystem framework with learning theories.

    PubMed

    Norman, Ann-Charlott; Fritzen, Lena; Fridh, Marianne Lindblad

    2013-01-01

    The clinical microsystem (CMS) approach is widely used and is perceived as helpful in practice, but we ask the question: "Is its learning potential sufficiently utilized?" Our aim is to scrutinize aspects of learning within the CMS framework and to clarify which learning aspects the framework includes, thereby supporting the framework with the enhanced learning perspective that becomes visible. Literature on the CMS framework was systematically searched and selected using inclusion criteria. An analytical tool was constructed in the form of a theoretical lens that was used to clarify learning aspects associated with the framework. The analysis revealed three learning aspects: (1) the CMS framework describes individual and social learning but not how to adapt learning strategies for purposes of change; (2) the metaphorical language of how to reach a holistic health care system for each patient has developed over time but can still be improved by naming the social interactions needed to transcend organizational boundaries; (3) power structures are recognized but not as a characteristic that restricts learning due to asymmetric communication. The "lens" perspective reveals new meanings of learning that enhance our understanding of health care as a social system and provides new practical learning strategies.

  15. Storage element performance optimization for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.

    2012-12-01

    Tier-2 computing sites in the Worldwide LHC Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data from the Large Hadron Collider (LHC) experiments that needs to be processed requires good and efficient use of the available resources. Achieving good CPU efficiency for end-user analysis jobs requires that the performance of the storage system is able to scale with the I/O requests from hundreds or even thousands of simultaneous jobs. In this presentation we report on the work to improve the SE performance at the Helsinki Institute of Physics (HIP) Tier-2, used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows for easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework CMS uses the JobRobot, which sends 100 analysis jobs to each site every four hours. CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the performance due to site configuration changes, since the analysis workflow is kept the same for all sites and for months at a time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90%, by tuning the SE and by improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done at other CMS Tier sites, since on average the CPU efficiency for CMSSW jobs increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high. The next storage upgrade at HIP consists of SAS disk enclosures, which can be stress-tested on demand with HammerCloud workflows to make sure that the I/O performance is good.

  16. The evolution of CMS software performance studies

    NASA Astrophysics Data System (ADS)

    Kortelainen, M. J.; Elmer, P.; Eulisse, G.; Innocente, V.; Jones, C. D.; Tuura, L.

    2011-12-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn and unnecessary copies/temporaries, and on tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64-bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.

  17. MonALISA, an agent-based monitoring and control system for the LHC experiments

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.

    2017-10-01

    MonALISA, which stands for Monitoring Agents using a Large Integrated Services Architecture, has been developed over the last fifteen years by the California Institute of Technology (Caltech) and its partners with the support of the software and computing program of the CMS and ALICE experiments at the Large Hadron Collider (LHC). The framework is based on a Dynamic Distributed Service Architecture and is able to provide complete system monitoring, performance metrics of applications, jobs or services, system control and global optimization services for complex systems. A short overview and status of MonALISA is given in this paper.

  18. Optimizing CMS build infrastructure via Apache Mesos

    DOE PAGES

    Abdurachmanov, David; Degano, Alessandro; Elmer, Peter; ...

    2015-12-23

    The Offline Software of the CMS Experiment at the Large Hadron Collider (LHC) at CERN consists of 6M lines of in-house code, developed over a decade by nearly 1000 physicists, as well as a comparable amount of general use open-source code. A critical ingredient to the success of the construction and early operation of the WLCG was the convergence, around the year 2000, on the use of a homogeneous environment of commodity x86-64 processors and Linux. Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run Hadoop, Jenkins, Spark, Aurora, and other applications on a dynamically shared pool of nodes. Lastly, we present how we migrated our continuous integration system to schedule jobs on a relatively small Apache Mesos enabled cluster and how this resulted in better resource usage, higher peak performance and lower latency thanks to the dynamic scheduling capabilities of Mesos.

  19. Optimizing CMS build infrastructure via Apache Mesos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdurachmanov, David; Degano, Alessandro; Elmer, Peter

    The Offline Software of the CMS Experiment at the Large Hadron Collider (LHC) at CERN consists of 6M lines of in-house code, developed over a decade by nearly 1000 physicists, as well as a comparable amount of general use open-source code. A critical ingredient to the success of the construction and early operation of the WLCG was the convergence, around the year 2000, on the use of a homogeneous environment of commodity x86-64 processors and Linux. Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run Hadoop, Jenkins, Spark, Aurora, and other applications on a dynamically shared pool of nodes. Lastly, we present how we migrated our continuous integration system to schedule jobs on a relatively small Apache Mesos enabled cluster and how this resulted in better resource usage, higher peak performance and lower latency thanks to the dynamic scheduling capabilities of Mesos.

  20. CMS-2 Reverse Engineering and ENCORE/MODEL Integration

    DTIC Science & Technology

    1992-05-01

    Automated extraction of design information from an existing software system written in CMS-2 can be used to document that system as-built. The...extracted information is provided by a commercially available CASE tool. Information describing software system design is automatically extracted...the displays in Figures 1, 2, and 3.

  1. A simulation framework for the CMS Track Trigger electronics

    NASA Astrophysics Data System (ADS)

    Amstutz, C.; Magazzù, G.; Weber, M.; Palla, F.

    2015-03-01

    A simulation framework has been developed to test and characterize algorithms, architectures and hardware implementations of the vastly complex CMS Track Trigger for the high luminosity upgrade of the CMS experiment at the Large Hadron Collider in Geneva. High-level SystemC models of all system components have been developed to simulate a portion of the track trigger. The simulation of the system components together with input data from physics simulations allows evaluating figures of merit, like delays or bandwidths, under realistic conditions. The use of SystemC for high-level modelling allows co-simulation with models developed in Hardware Description Languages, e.g. VHDL or Verilog. Therefore, the simulation framework can also be used as a test bench for digital modules developed for the final system.
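
    A minimal flavour of such high-level SystemC modelling: a clocked module that forwards a stub with one cycle of latency, the kind of delay figure these models are used to evaluate (illustrative only; the real Track Trigger models are far richer):

      // Sketch: a clocked SystemC module forwarding a stub with one cycle
      // of latency (requires the SystemC library).
      #include <systemc.h>
      #include <iostream>

      SC_MODULE(StubBuffer) {
          sc_in<bool> clk;
          sc_in<int>  stub_in;
          sc_out<int> stub_out;
          int pending = 0;

          void tick() {                  // one cycle of buffering latency
              stub_out.write(pending);
              pending = stub_in.read();
          }

          SC_CTOR(StubBuffer) {
              SC_METHOD(tick);
              sensitive << clk.pos();
          }
      };

      int sc_main(int, char**) {
          sc_clock clk("clk", 10, SC_NS);
          sc_signal<int> in, out;
          StubBuffer buf("buf");
          buf.clk(clk);
          buf.stub_in(in);
          buf.stub_out(out);
          in.write(42);
          sc_start(50, SC_NS);           // run five clock cycles
          std::cout << "out = " << out.read() << std::endl;  // 42
          return 0;
      }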

  2. Integration and validation testing for PhEDEx, DBS and DAS with the PhEDEx LifeCycle agent

    NASA Astrophysics Data System (ADS)

    Boeser, C.; Chwalek, T.; Giffels, M.; Kuznetsov, V.; Wildish, T.

    2014-06-01

    The ever-increasing amount of data handled by the CMS dataflow and workflow management tools poses new challenges for cross-validation among different systems within the CMS experiment at the LHC. To approach this problem we developed an integration test suite based on the LifeCycle agent, a tool originally conceived for stress-testing new releases of PhEDEx, the CMS data-placement tool. The LifeCycle agent provides a framework for customising the test workflow in arbitrary ways, and can scale to levels of activity well beyond those seen in normal running. This means we can run realistic performance tests at scales not likely to be seen by the experiment for some years, or with custom topologies to examine particular situations that may cause concern some time in the future. The LifeCycle agent has recently been enhanced to become a general-purpose integration and validation testing tool for major CMS services. It allows cross-system integration tests of all three components to be performed in controlled environments, without interfering with production services. In this paper we discuss the design and implementation of the LifeCycle agent. We describe how it is used for small-scale debugging and validation tests, and how we extend that to large-scale tests of whole groups of sub-systems. We show how the LifeCycle agent can emulate the actions of operators, physicists, or software agents external to the system under test, and how it can be scaled to large and complex systems.

  3. CMS Centres Worldwide - a New Collaborative Infrastructure

    NASA Astrophysics Data System (ADS)

    Taylor, Lucas

    2011-12-01

    The CMS Experiment at the LHC has established a network of more than fifty inter-connected "CMS Centres" at CERN and in institutes in the Americas, Asia, Australasia, and Europe. These facilities are used by people doing CMS detector and computing grid operations, remote shifts, data quality monitoring and analysis, as well as education and outreach. We present the computing, software, and collaborative tools and videoconferencing systems. These include permanently running "telepresence" video links (hardware-based H.323, EVO and Vidyo), Webcasts, and generic Web tools such as CMS-TV for broadcasting live monitoring and outreach information. Being Web-based and experiment-independent, these systems could easily be extended to other organizations. We describe the experiences of using CMS Centres Worldwide in the CMS data-taking operations as well as for major media events with several hundred TV channels, radio stations, and many more press journalists simultaneously around the world.

  4. "Dropbox" Brings Course Management Back to Teachers

    ERIC Educational Resources Information Center

    Niles, Thaddeus M.

    2013-01-01

    Course management software (CMS) allows teachers to deliver content electronically and manage collaborative coursework, either blending with face-to-face interactions or as the core of an entirely virtual classroom environment. CMS often takes the form of an electronic storehouse of course materials with which students can interact, a virtual…

  5. XRootD popularity on hadoop clusters

    NASA Astrophysics Data System (ADS)

    Meoni, Marco; Boccali, Tommaso; Magini, Nicolò; Menichetti, Luca; Giordano, Domenico; CMS Collaboration

    2017-10-01

    Performance data and metadata of the computing operations at the CMS experiment are collected through a distributed monitoring infrastructure, currently relying on a traditional Oracle database system. This paper shows how to harness Big Data architectures in order to improve the throughput and the efficiency of such monitoring. A large set of operational data - user activities, job submissions, resources, file transfers, site efficiencies, software releases, network traffic, machine logs - is being injected into a readily available Hadoop cluster via several data streamers. The collected metadata can then be organized and queried quickly and flexibly; this offers the ability to test several MapReduce-based frameworks and to measure the system speed-up compared to the original database infrastructure. By leveraging a quality Hadoop data store and enabling an analytics framework on top, it is possible to design a mining platform to predict dataset popularity and discover patterns and correlations.
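
    As an illustration of the kind of aggregation such a platform enables, the sketch below computes a simple dataset-popularity ranking with Spark's Python API. The input path and the field names (dataset, user, read_bytes) are assumptions for illustration, not the actual CMS monitoring schema.

        # Sketch of a popularity query over JSON monitoring records in Spark.
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("xrootd-popularity").getOrCreate()

        logs = spark.read.json("hdfs:///monitoring/xrootd/*.json")  # placeholder path

        popularity = (logs
                      .groupBy("dataset")
                      .agg(F.countDistinct("user").alias("n_users"),
                           F.count("*").alias("n_accesses"),
                           F.sum("read_bytes").alias("bytes_read"))
                      .orderBy(F.desc("n_accesses")))

        popularity.show(20)   # top-20 most accessed datasets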

  6. CMS event processing multi-core efficiency status

    NASA Astrophysics Data System (ADS)

    Jones, C. D.; CMS Collaboration

    2017-10-01

    In 2015, CMS was the first LHC experiment to begin using a multi-threaded framework for doing event processing. This new framework utilizes Intel's Threading Building Blocks (TBB) library to manage concurrency via a task-based processing model. During the 2015 LHC run period, CMS only ran reconstruction jobs using multiple threads, because only those jobs were sufficiently thread-efficient. Recent work now allows simulation and digitization to be thread-efficient. In addition, during 2015 the multi-threaded framework could run events in parallel but could only use one thread per event. Work done in 2016 now allows multiple threads to be used while processing one event. In this presentation we will show how these recent changes have improved CMS's overall threading and memory efficiency, and we will discuss work to be done to further increase those efficiencies.
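
    The two levels of concurrency described above (events in parallel, plus multiple threads within one event) can be pictured as nested task pools. The Python below is only a schematic analogue; the real framework is C++ built on TBB, and the module names are invented.

        # Schematic analogue of event-level and intra-event concurrency.
        from concurrent.futures import ThreadPoolExecutor

        def run_module(event, module):
            return f"{module}(evt {event})"         # stand-in for a reco step

        def process_event(event, mod_pool):
            modules = ["tracking", "calo", "muon"]  # independent, run in parallel
            futures = [mod_pool.submit(run_module, event, m) for m in modules]
            return [f.result() for f in futures]

        with ThreadPoolExecutor(max_workers=4) as events, \
             ThreadPoolExecutor(max_workers=8) as mods:
            done = [events.submit(process_event, e, mods) for e in range(4)]
            for f in done:
                print(f.result())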

  7. A Bayesian approach for calibrating probability judgments

    NASA Astrophysics Data System (ADS)

    Firmino, Paulo Renato A.; Santana, Nielson A.

    2012-10-01

    Eliciting experts' opinions has been one of the main alternatives for addressing paucity of data. In the vanguard of this area is the development of calibration models (CMs). CMs are models dedicated to overcome miscalibration, i.e. judgment biases reflecting deficient strategies of reasoning adopted by the expert when inferring about an unknown. One of the main challenges of CMs is to determine how and when to intervene against miscalibration, in order to enhance the tradeoff between costs (time spent with calibration processes) and accuracy of the resulting models. The current paper dedicates special attention to this issue by presenting a dynamic Bayesian framework for monitoring, diagnosing, and handling miscalibration patterns. The framework is based on Beta-, Uniform, or Triangular-Bernoulli models and classes of judgmental calibration theories. Issues regarding the usefulness of the proposed framework are discussed and illustrated via simulation studies.
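
    A minimal instance of the Beta-Bernoulli idea mentioned above: treat each stated probability category as a Bernoulli process whose unknown true rate carries a Beta prior, and recalibrate from observed outcomes via the conjugate update. This is a toy version on invented data, not the authors' full monitoring framework.

        # Recalibrate an expert's "80% sure" judgments from observed outcomes.
        a, b = 1.0, 1.0                    # Beta(1,1) = uniform prior on the rate

        outcomes = [1, 1, 0, 1, 0, 1, 1, 0]    # did the predicted event occur?

        for hit in outcomes:               # conjugate update: Beta stays Beta
            a, b = a + hit, b + (1 - hit)

        print(f"stated 0.80 -> recalibrated {a / (a + b):.2f} "
              f"after {len(outcomes)} outcomes")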

  8. Opportunistic Resource Usage in CMS

    NASA Astrophysics Data System (ADS)

    Kreuzer, Peter; Hufnagel, Dirk; Dykstra, D.; Gutsche, O.; Tadel, M.; Sfiligoi, I.; Letts, J.; Wuerthwein, F.; McCrea, A.; Bockelman, B.; Fajardo, E.; Linares, L.; Wagner, R.; Konstantinov, P.; Blumenfeld, B.; Bradley, D.; Cms Collaboration

    2014-06-01

    CMS is using a tiered setup of dedicated computing resources provided by sites distributed over the world and organized in WLCG. These sites pledge resources to CMS and are preparing them especially for CMS to run the experiment's applications. But there are more resources available opportunistically both on the GRID and in local university and research clusters which can be used for CMS applications. We will present CMS' strategy to use opportunistic resources and prepare them dynamically to run CMS applications. CMS is able to run its applications on resources that can be reached through the GRID, through EC2 compliant cloud interfaces. Even resources that can be used through ssh login nodes can be harnessed. All of these usage modes are integrated transparently into the GlideIn WMS submission infrastructure, which is the basis of CMS' opportunistic resource usage strategy. Technologies like Parrot to mount the software distribution via CVMFS and xrootd for access to data and simulation samples via the WAN are used and will be described. We will summarize the experience with opportunistic resource usage and give an outlook for the restart of LHC data taking in 2015.

  9. An expert system prototype for aiding in the development of software functional requirements for NASA Goddard's command management system: A case study and lessons learned

    NASA Technical Reports Server (NTRS)

    Liebowitz, Jay

    1986-01-01

    At NASA Goddard, the role of the command management system (CMS) is to transform general requests for spacecraft operations into detailed operational plans to be uplinked to the spacecraft. The CMS is part of the NASA Data System, which entails the downlink of science and engineering data from NASA near-earth satellites to the user, and the uplink of command and control data to the spacecraft. Presently, it takes one to three years, with meetings once or twice a week, to determine functional requirements for CMS software design. As an alternative approach to the present technique of developing CMS software functional requirements, an expert system prototype was developed to aid in this function. Specifically, the knowledge base was formulated through interactions with domain experts, and was then linked to an existing expert system application generator called 'Knowledge Engineering System (Version 1.3).' Knowledge base development focused on four major steps: (1) develop the problem-oriented attribute hierarchy; (2) determine the knowledge management approach; (3) encode the knowledge base; and (4) validate, test, certify, and evaluate the knowledge base and the expert system prototype as a whole. Backcasting was used for validating and testing the expert system prototype. Knowledge refinement, evaluation, and implementation procedures of the expert system prototype were then carried out.

  10. Novel Analysis Software for Detecting and Classifying Ca2+ Transient Abnormalities in Stem Cell-Derived Cardiomyocytes

    PubMed Central

    Penttinen, Kirsi; Siirtola, Harri; Àvalos-Salguero, Jorge; Vainio, Tiina; Juhola, Martti; Aalto-Setälä, Katriina

    2015-01-01

    Comprehensive functioning of Ca2+ cycling is crucial for excitation–contraction coupling of cardiomyocytes (CMs). Abnormal Ca2+ cycling is linked to arrhythmogenesis, which is associated with cardiac disorders and heart failure. Accordingly, we have generated spontaneously beating CMs from induced pluripotent stem cells (iPSC) derived from patients with catecholaminergic polymorphic ventricular tachycardia (CPVT), which is an inherited and severe cardiac disease. Ca2+ cycling studies have revealed substantial abnormalities in these CMs. Ca2+ transient analysis performed manually lacks accepted analysis criteria, and has both low throughput and high variability. To overcome these issues, we have developed a software tool, AnomalyExplorer based on interactive visualization, to assist in the classification of Ca2+ transient patterns detected in CMs. Here, we demonstrate the usability and capability of the software, and we also compare the analysis efficiency to manual analysis. We show that AnomalyExplorer is suitable for detecting normal and abnormal Ca2+ transients; furthermore, this method provides more defined and consistent information regarding the Ca2+ abnormality patterns and cell line specific differences when compared to manual analysis. This tool will facilitate and speed up the analysis of CM Ca2+ transients, making it both more accurate and user-independent. AnomalyExplorer can be exploited in Ca2+ cycling analysis to study basic disease pathology and the effects of different drugs. PMID:26308621

  11. Novel Analysis Software for Detecting and Classifying Ca2+ Transient Abnormalities in Stem Cell-Derived Cardiomyocytes.

    PubMed

    Penttinen, Kirsi; Siirtola, Harri; Àvalos-Salguero, Jorge; Vainio, Tiina; Juhola, Martti; Aalto-Setälä, Katriina

    2015-01-01

    Comprehensive functioning of Ca2+ cycling is crucial for excitation-contraction coupling of cardiomyocytes (CMs). Abnormal Ca2+ cycling is linked to arrhythmogenesis, which is associated with cardiac disorders and heart failure. Accordingly, we have generated spontaneously beating CMs from induced pluripotent stem cells (iPSC) derived from patients with catecholaminergic polymorphic ventricular tachycardia (CPVT), which is an inherited and severe cardiac disease. Ca2+ cycling studies have revealed substantial abnormalities in these CMs. Ca2+ transient analysis performed manually lacks accepted analysis criteria, and has both low throughput and high variability. To overcome these issues, we have developed a software tool, AnomalyExplorer based on interactive visualization, to assist in the classification of Ca2+ transient patterns detected in CMs. Here, we demonstrate the usability and capability of the software, and we also compare the analysis efficiency to manual analysis. We show that AnomalyExplorer is suitable for detecting normal and abnormal Ca2+ transients; furthermore, this method provides more defined and consistent information regarding the Ca2+ abnormality patterns and cell line specific differences when compared to manual analysis. This tool will facilitate and speed up the analysis of CM Ca2+ transients, making it both more accurate and user-independent. AnomalyExplorer can be exploited in Ca2+ cycling analysis to study basic disease pathology and the effects of different drugs.

  12. Opportunistic Resource Usage in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kreuzer, Peter; Hufnagel, Dirk; Dykstra, D.

    2014-01-01

    CMS is using a tiered setup of dedicated computing resources provided by sites distributed over the world and organized in WLCG. These sites pledge resources to CMS and are preparing them especially for CMS to run the experiment's applications. But there are more resources available opportunistically both on the GRID and in local university and research clusters which can be used for CMS applications. We will present CMS' strategy to use opportunistic resources and prepare them dynamically to run CMS applications. CMS is able to run its applications on resources that can be reached through the GRID, through EC2 compliant cloud interfaces. Even resources that can be used through ssh login nodes can be harnessed. All of these usage modes are integrated transparently into the GlideIn WMS submission infrastructure, which is the basis of CMS' opportunistic resource usage strategy. Technologies like Parrot to mount the software distribution via CVMFS and xrootd for access to data and simulation samples via the WAN are used and will be described. We will summarize the experience with opportunistic resource usage and give an outlook for the restart of LHC data taking in 2015.

  13. Software Description for the O’Hare Runway Configuration Management System. Volume I. Technical Description,

    DTIC Science & Technology

    1982-10-01

    spent in preparing this document. EXECUTIVE SUMMARY: The O'Hare Runway Configuration Management System (CMS) is an interactive multi-user computer...MITRE Washington's Computer Center. Currently, CMS is housed in an IBM 4341 computer with the VM/SP operating system. CMS employs the IBM Display...At O'Hare, it will operate on a dedicated mini-computer which permits multi-tasking (that is, multiple users...

  14. Volcano Modelling and Simulation gateway (VMSg): A new web-based framework for collaborative research in physical modelling and simulation of volcanic phenomena

    NASA Astrophysics Data System (ADS)

    Esposti Ongaro, T.; Barsotti, S.; de'Michieli Vitturi, M.; Favalli, M.; Longo, A.; Nannipieri, L.; Neri, A.; Papale, P.; Saccorotti, G.

    2009-12-01

    Physical and numerical modelling is becoming of increasing importance in volcanology and volcanic hazard assessment. However, new interdisciplinary problems arise when dealing with complex mathematical formulations, numerical algorithms and their implementations on modern computer architectures. Therefore new frameworks are needed for sharing knowledge, software codes, and datasets among scientists. Here we present the Volcano Modelling and Simulation gateway (VMSg, accessible at http://vmsg.pi.ingv.it), a new electronic infrastructure for promoting knowledge growth and transfer in the field of volcanological modelling and numerical simulation. The new web portal, developed in the framework of former and ongoing national and European projects, is based on a dynamic Content Management System (CMS) and was developed to host and present numerical models of the main volcanic processes and relationships including magma properties, magma chamber dynamics, conduit flow, plume dynamics, pyroclastic flows, lava flows, etc. Model applications, numerical code documentation, simulation datasets as well as model validation and calibration test-cases are also part of the gateway material.

  15. Open access to high-level data and analysis tools in the CMS experiment at the LHC

    DOE PAGES

    Calderon, A.; Colling, D.; Huffman, A.; ...

    2015-12-23

    The CMS experiment, in recognition of its commitment to data preservation and open access as well as to education and outreach, has made its first public release of high-level data under the CC0 waiver: up to half of the proton-proton collision data (by volume) at 7 TeV from 2010 in CMS Analysis Object Data format. CMS has prepared, in collaboration with CERN and the other LHC experiments, an open-data web portal based on Invenio. The portal provides access to CMS public data as well as to analysis tools and documentation for the public. The tools include an event display and histogram application that run in the browser. In addition a virtual machine containing a CMS software environment along with XRootD access to the data is available. Within the virtual machine the public can analyse CMS data; example code is provided. Finally, we describe the accompanying tools and documentation and discuss the first experiences of data use.

  16. A Population WB-PBPK Model of Colistin and its Prodrug CMS in Pigs: Focus on the Renal Distribution and Excretion.

    PubMed

    Viel, Alexis; Henri, Jérôme; Bouchène, Salim; Laroche, Julian; Rolland, Jean-Guy; Manceau, Jacqueline; Laurentie, Michel; Couet, William; Grégoire, Nicolas

    2018-03-12

    The objective was the development of a whole-body physiologically-based pharmacokinetic (WB-PBPK) model for colistin, and its prodrug colistimethate sodium (CMS), in pigs to explore their tissue distribution, especially in kidneys. Plasma and tissue concentrations of CMS and colistin were measured after systemic administrations of different dosing regimens of CMS in pigs. The WB-PBPK model was developed based on these data according to a non-linear mixed effect approach and using NONMEM software. A detailed sub-model was implemented for kidneys to handle the complex disposition of CMS and colistin within this organ. The WB-PBPK model well captured the kinetic profiles of CMS and colistin in plasma. In kidneys, an accumulation and slow elimination of colistin were observed and well described by the model. Kidneys seemed to have a major role in the elimination processes, through tubular secretion of CMS and intracellular degradation of colistin. Lastly, to illustrate the usefulness of the PBPK model, an estimation of the withdrawal periods after veterinary use of CMS in pigs was made. The WB-PBPK model gives an insight into the renal distribution and elimination of CMS and colistin in pigs; it may be further developed to explore the colistin-induced nephrotoxicity in humans.
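
    For orientation, the prodrug-to-drug kinetics at the heart of such a model can be reduced to a pair of coupled ODEs: CMS is cleared and partly hydrolysed into colistin, which is cleared in turn. The sketch below uses invented rate constants and is far simpler than the published WB-PBPK model.

        # Two-state reduction of CMS -> colistin kinetics (illustrative only).
        import numpy as np
        from scipy.integrate import odeint

        k_cl_cms, k_hyd, k_cl_col = 0.8, 0.3, 0.4   # 1/h, made-up values

        def rhs(y, t):
            cms, col = y
            d_cms = -(k_cl_cms + k_hyd) * cms        # clearance + hydrolysis
            d_col = k_hyd * cms - k_cl_col * col     # formation - clearance
            return [d_cms, d_col]

        t = np.linspace(0, 24, 200)                  # hours after an IV bolus
        y = odeint(rhs, [100.0, 0.0], t)             # dose normalised to 100

        print(f"peak colistin at t = {t[y[:, 1].argmax()]:.1f} h")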

  17. Collision models in quantum optics

    NASA Astrophysics Data System (ADS)

    Ciccarello, Francesco

    2017-12-01

    Quantum collision models (CMs) provide advantageous case studies for investigating major issues in open quantum systems theory, and especially quantum non-Markovianity. After reviewing their general definition and distinctive features, we illustrate the emergence of a CM in a familiar quantum optics scenario. This task is carried out by highlighting the close connection between the well-known input-output formalism and CMs. Within this quantum optics framework, usual assumptions in the CMs' literature - such as considering a bath of noninteracting yet initially correlated ancillas - have a clear physical origin.
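
    For reference, one step of the memoryless collision model alluded to above is conventionally written as the reduced dynamics obtained by coupling the system to a fresh ancilla in state \eta through a unitary U and tracing the ancilla out (standard textbook form, in LaTeX notation):

        \rho_{n+1} = \mathrm{Tr}_{\mathrm{anc}}\!\left[ U \,(\rho_n \otimes \eta)\, U^{\dagger} \right]

    Initially correlated ancillas, as discussed in the abstract, break the simple iteration of this map and provide one route to non-Markovian reduced dynamics.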

  18. Simulation of Top Quark Pair Production as a Background for Higgs Events at the Compact Muon Solenoid

    NASA Astrophysics Data System (ADS)

    Justus, Christopher

    2005-04-01

    In this study, we simulated top-antitop (tt-bar) quark events at the Compact Muon Solenoid (CMS), an experiment presently being constructed at the Large Hadron Collider in Geneva, Switzerland. The tt-bar process is an important background for Higgs events. We used a chain of software to simulate and reconstruct processes that will occur inside the detector. CMKIN was used to generate and store Monte Carlo Events. OSCAR, a GEANT4 based CMS detector simulator, was used to simulate the CMS detector and how particles would interact with the detector. Next, we used ORCA to simulate the response of the readout electronics at CMS. Last, we used the Jet/MET Root maker to create root files of jets and missing energy. We are now using this software analysis chain to complete a systematic study of initial state radiation at hadron colliders. This study is essential because tt-bar is the main background for the Higgs boson and these processes are extremely sensitive to initial state radiation. Results of our initial state radiation study will be presented. We started this study at the new LHC Physics Center (LPC) located at Fermi National Accelerator Laboratory, and we are now completing the study at the University of Rochester.

  19. Education in the Workplace for the Physician: Clinical Management States as an Organizing Framework.

    ERIC Educational Resources Information Center

    Greenes, Robert A.

    2000-01-01

    Trends in health information technology include (1) improved access to patient care information; (2) methods for patient-doctor interaction and decision making; (3) computerized practice guidelines; and (4) the concept of patients being in clinical management states (CMS). Problem-specific environments and CMS-related resources should be the focus…

  20. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short and long term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.

  1. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE PAGES

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    2018-03-19

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short and long term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.

  2. Implementation of a multi-threaded framework for large-scale scientific applications

    DOE PAGES

    Sexton-Kennedy, E.; Gartung, Patrick; Jones, C. D.; ...

    2015-05-22

    The CMS experiment has recently completed the development of a multi-threaded capable application framework. In this paper, we will discuss the design, implementation and application of this framework to production applications in CMS. For the 2015 LHC run, this functionality is particularly critical for both our online and offline production applications, which depend on faster turn-around times and a reduced memory footprint relative to previous runs. These applications are complex codes, each including a large number of physics-driven algorithms. While the framework is capable of running a mix of thread-safe and 'legacy' modules, algorithms running in our production applications need to be thread-safe for optimal use of this multi-threaded framework at a large scale. Towards this end, we discuss the types of changes that were necessary for our algorithms to achieve good performance of our multi-threaded applications in a full-scale application. Lastly, performance numbers for what has been achieved for the 2015 run are presented.

  3. Benchmarking high performance computing architectures with CMS’ skeleton framework

    NASA Astrophysics Data System (ADS)

    Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.

    2017-10-01

    In 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high-throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel's Threading Building Blocks (TBB) library, based on the measured overheads in both memory and CPU on the different technologies benchmarked. In 2016 CMS will get access to high performance computing resources that use new many-core architectures; machines such as Cori Phase 1&2, Theta, Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk will discuss the results of this exercise.

  4. Study of inclusive single-jet production in the framework of kt-factorization unintegrated parton distributions

    NASA Astrophysics Data System (ADS)

    Aminzadeh Nik, R.; Modarres, M.; Masouminia, M. R.

    2018-05-01

    The present work is intended to study the double-differential cross section of inclusive single-jet production as functions of the transverse momentum and the rapidity of the jet in high-energy hadron-hadron collisions. The angular-ordering-constraint kt-factorization framework is used to calculate the above cross section, which is available experimentally. The conditions are taken in accordance with the LHC experiments. The results are compared and analyzed using the existing CMS LHC data. The scheme-dependent unintegrated parton distribution functions (UPDF) of Kimber-Martin-Ryskin (KMR) and Martin-Ryskin-Watt (MRW) in the leading order and the next-to-leading order (NLO) are used to predict the input partonic UPDF. The utilized phenomenological frameworks prove to be relatively successful in generating satisfactory results compared to the different experimental data, such as CMS (8 and 13 TeV). Extensive discussions and comparisons are made regarding the behavior of the contributing partonic subprocesses. Finally, it is shown that the application of the KMR UPDF to the single-jet differential cross sections yields better agreement with the CMS data; on the other hand, they are very similar to those of the NLO-MRW.
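
    Schematically, a kt-factorization cross section of the kind computed above convolutes two unintegrated parton distributions with an off-shell partonic cross section (generic form, normalisation factors omitted; LaTeX notation):

        \sigma = \sum_{a,b} \int \frac{dx_1}{x_1}\, \frac{dx_2}{x_2}\, d^2k_{1t}\, d^2k_{2t}\;
                 f_a(x_1, k_{1t}^2, \mu^2)\, f_b(x_2, k_{2t}^2, \mu^2)\;
                 \hat{\sigma}_{ab}(x_1, x_2, k_{1t}, k_{2t}; \mu)

    The f(x, k_t^2, mu^2) here are the scheme-dependent KMR or MRW unintegrated distributions compared in the paper.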

  5. Bringing Together Users and Developers of Forest Biomass Maps

    NASA Technical Reports Server (NTRS)

    Brown, Molly Elizabeth; Macauley, Molly K.

    2012-01-01

    Forests store carbon and thus represent important sinks for atmospheric carbon dioxide. Reducing uncertainty in current estimates of the amount of carbon in standing forests will improve precision of estimates of anthropogenic contributions to carbon dioxide in the atmosphere due to deforestation. Although satellite remote sensing has long been an important tool for mapping land cover, until recently aboveground forest biomass estimates have relied mostly on systematic ground sampling of forests. In alignment with fiscal year 2010 congressional direction, NASA has initiated work toward a carbon monitoring system (CMS) that includes both maps of forest biomass and total carbon flux estimates. A goal of the project is to ensure that the products are useful to a wide community of scientists, managers, and policy makers, as well as to carbon cycle scientists. Understanding the needs and requirements of these data users is helpful not just to the NASA CMS program but also to the entire community working on carbon-related activities. To that end, this meeting brought together a small group of natural resource managers and policy makers who use information on forests in their work with NASA scientists who are working to create aboveground forest biomass maps. These maps, derived from combining remote sensing and ground plots, aim to be more accurate than current inventory approaches when applied at local and regional scales. Meeting participants agreed that users of biomass information will look to the CMS effort not only to provide basic data for carbon or biomass measurements but also to provide data to help serve a broad range of goals, such as forest watershed management for water quality, habitat management for biodiversity and ecosystem services, and potential use for developing payments for ecosystem service projects. Participants also reminded the CMS group that potential users include not only public sector agencies and nongovernmental organizations but also the private sector because much forest acreage in the United States is privately held and needs data for forest management. Additional key outcomes identified by meeting participants include the following: (1) Priority should be given to building into the biomass product ease of use and low costs (including costs of hardware, software, and analysis requirements), (2) CMS products should also be relevant to other biomass measures for forest watershed management, habitat protection for biodiversity, and assessment of markets for ecosystem services, (3) CMS leadership should engage with the Subsidiary Body for Scientific and Technological Advice of the United Nations Framework Convention on Climate Change as they establish measuring, reporting, and verification standards, and (4) CMS leadership should continue to keep sister agencies and other organizations informed as CMS develops, particularly via the agencies active in the U.S. Global Change Research Program Carbon Cycle Interagency Working Group (U.S. Geological Survey, U.S. Department of Agriculture, and National Oceanic and Atmospheric Administration) and nongovernmental organizations.

  6. Efficient monitoring of CRAB jobs at CMS

    NASA Astrophysics Data System (ADS)

    Silva, J. M. D.; Balcas, J.; Belforte, S.; Ciangottini, D.; Mascheroni, M.; Rupeika, E. A.; Ivanov, T. T.; Hernandez, J. M.; Vaandering, E.

    2017-10-01

    CRAB is a tool used for distributed analysis of CMS data. Users can submit sets of jobs with similar requirements (tasks) with a single request. CRAB uses a client-server architecture, where a lightweight client, a server, and ancillary services work together and are maintained by CMS operators at CERN. As with most complex software, good monitoring tools are crucial for efficient use and long-term maintainability. This work gives an overview of the monitoring tools developed to ensure the CRAB server and infrastructure are functional, help operators debug user problems, and minimize overhead and operating cost. This work also illustrates the design choices and gives a report on our experience with the tools we developed and the external ones we used.

  7. Efficient Monitoring of CRAB Jobs at CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, J. M.D.; Balcas, J.; Belforte, S.

    CRAB is a tool used for distributed analysis of CMS data. Users can submit sets of jobs with similar requirements (tasks) with a single request. CRAB uses a client-server architecture, where a lightweight client, a server, and ancillary services work together and are maintained by CMS operators at CERN. As with most complex software, good monitoring tools are crucial for efficient use and long-term maintainability. This work gives an overview of the monitoring tools developed to ensure the CRAB server and infrastructure are functional, help operators debug user problems, and minimize overhead and operating cost. This work also illustrates the design choices and gives a report on our experience with the tools we developed and the external ones we used.

  8. Benchmarking high performance computing architectures with CMS’ skeleton framework

    DOE PAGES

    Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.

    2017-11-23

    Here, in 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high-throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel's Threading Building Blocks (TBB) library, based on the measured overheads in both memory and CPU on the different technologies benchmarked. In 2016 CMS will get access to high performance computing resources that use new many-core architectures; machines such as Cori Phase 1&2, Theta, Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk will discuss the results of this exercise.

  9. Benchmarking high performance computing architectures with CMS’ skeleton framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.

    Here, in 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high-throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel's Threading Building Blocks (TBB) library, based on the measured overheads in both memory and CPU on the different technologies benchmarked. In 2016 CMS will get access to high performance computing resources that use new many-core architectures; machines such as Cori Phase 1&2, Theta, Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk will discuss the results of this exercise.

  10. CMS results in the Combined Computing Readiness Challenge CCRC'08

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Bauerdick, L.; CMS Collaboration

    2009-12-01

    During February and May 2008, CMS participated in the Combined Computing Readiness Challenge (CCRC'08) together with all other LHC experiments. The purpose of this worldwide exercise was to check the readiness of the Computing infrastructure for LHC data taking. Another set of major CMS tests called Computing, Software and Analysis challenge (CSA'08) - as well as CMS cosmic runs - were also running at the same time: CCRC augmented the load on computing with additional tests to validate and stress-test all CMS computing workflows at full data taking scale, also extending this to the global WLCG community. CMS exercised most aspects of the CMS computing model, with very comprehensive tests. During May 2008, CMS moved more than 3.6 Petabytes among more than 300 links in the complex Grid topology. CMS demonstrated that it is able to safely move data out of CERN to the Tier-1 sites, sustaining more than 600 MB/s as a daily average for more than seven days in a row, with enough headroom and with hourly peaks of up to 1.7 GB/s. CMS ran hundreds of simultaneous jobs at each Tier-1 site, re-reconstructing and skimming hundreds of millions of events. After re-reconstruction the fresh AOD (Analysis Object Data) has to be synchronized between Tier-1 centers: CMS demonstrated that the required inter-Tier-1 transfers are achievable within a few days. CMS also showed that skimmed analysis data sets can be transferred to Tier-2 sites for analysis at a sufficient rate, regionally as well as inter-regionally, achieving all goals in about 90% of >200 links. Simultaneously, CMS also ran a large Tier-2 analysis exercise, where realistic analysis jobs were submitted to a large set of Tier-2 sites by a large number of people to produce a chaotic workload across the systems, and with more than 400 analysis users in May. Taken all together, CMS routinely achieved submissions of 100k jobs/day, with peaks up to 200k jobs/day. The achieved results in CCRC'08 - focusing on the distributed workflows - are presented and discussed.

  11. Carbon Monitoring System Applications Framework: Lessons Learned from Stakeholder Engagement Activities

    NASA Astrophysics Data System (ADS)

    Sepulveda Carlo, E.; Escobar, V. M.; Delgado Arias, S.; Forgotson, C.

    2017-12-01

    The NASA Carbon Monitoring System initiated by U.S. Congress in 2010 is developing products that characterize and quantify carbon sources and sinks in the United States and the global tropics. In 2013, an applications effort was selected to engage potential end users and gather feedback about their data needs. For the past four years the CMS applications effort has expanded and implemented a number of strategies to connect carbon scientists to decision-makers, contributing to the societal benefits of CMS data products. The applications effort uses crowdsourcing to collect feedback from stakeholders on challenges and lessons learned in the use of CMS data products. Some of the most common data needs from engaged organizations include above- and below-ground biomass and fluxes in forestlands and wetlands, and greenhouse gas (GHG) emissions across all land use/cover and land use changes. Stakeholder organizations' needs for CMS data products support national GHG inventories following the Paris Agreement, carbon markets, and sub-national natural resources management and policies. The lessons-learned report presents stakeholder-specific applications, challenges, and successes from using CMS data products. To date, the most common uses of CMS products include: conservation efforts, emissions inventory, forestry and land cover applications, and carbon offset projects. The most common challenges include: the need for familiar and consistent products over time, budget constraints, and concern with uncertainty of modeled results. Recurrent recommendations from stakeholders indicate that CMS should provide high-resolution (30 m) data products and frequent updates (annually). The applications efforts have also helped identify success stories from different CMS projects, including the development of the GHG emissions inventory for Providence, RI, the improvement of the U.S. GHG Inventory through the use of satellite data, and the use of high-resolution canopy cover maps for forestry, conservation, and ecosystem services applications in the tristate area of Maryland, Delaware and Pennsylvania. The presentation will discuss the applications framework methodology and strategy, as well as highlight some of the results and lessons learned from these applications efforts.

  12. Tracking at High Level Trigger in CMS

    NASA Astrophysics Data System (ADS)

    Tosi, M.

    2016-04-01

    The trigger systems of the LHC detectors play a crucial role in determining the physics capabilities of experiments. A reduction of several orders of magnitude of the event rate is needed to reach values compatible with detector readout, offline storage and analysis capability. The CMS experiment has been designed with a two-level trigger system: the Level-1 Trigger (L1T), implemented on custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a trade-off between the complexity of the algorithms, the sustainable output rate, and the selection efficiency. With the computing power available during the 2012 data taking the maximum reconstruction time at HLT was about 200 ms per event, at the nominal L1T rate of 100 kHz. Track reconstruction algorithms are widely used in the HLT, for the reconstruction of the physics objects as well as in the identification of b-jets and lepton isolation. Reconstructed tracks are also used to distinguish the primary vertex, which identifies the hard interaction process, from the pileup ones. This task is particularly important in the LHC environment given the large number of interactions per bunch crossing: on average 25 in 2012, and expected to be around 40 in Run II. We will present the performance of HLT tracking algorithms, discussing its impact on CMS physics program, as well as new developments done towards the next data taking in 2015.

  13. CMS users data management service integration and first experiences with its NoSQL data storage

    NASA Astrophysics Data System (ADS)

    Riahi, H.; Spiga, D.; Boccali, T.; Ciangottini, D.; Cinquilli, M.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Santocchia, A.

    2014-06-01

    The distributed data analysis workflow in CMS assumes that jobs run in a different location to where their results are finally stored. Typically the user outputs must be transferred from one site to another by a dedicated CMS service, AsyncStageOut. This new service was originally developed to address the inefficiency in using CMS computing resources when transferring analysis job outputs synchronously from the job execution node to the remote site. The AsyncStageOut is designed as a thin application relying only on the NoSQL database (CouchDB) as input and data storage. It has progressed from a limited prototype to a highly adaptable service which manages and monitors the whole user-file workflow, namely file transfer and publication. The AsyncStageOut is integrated with the Common CMS/Atlas Analysis Framework. It foresees the management of nearly 200k user files per day from close to 1000 individual users per month with minimal delays, while providing real-time monitoring and reports to users and service operators and remaining highly available. The associated data volume represents a new set of challenges in the areas of database scalability and service performance and efficiency. In this paper, we present an overview of the AsyncStageOut model and the integration strategy with the Common Analysis Framework. The motivations for using the NoSQL technology are also presented, as well as the data design and the techniques used for efficient indexing and monitoring of the data. We describe the deployment model for the high availability and scalability of the service. We also discuss the hardware requirements and the results achieved, as they were determined by testing with actual data and realistic loads during the commissioning and the initial production phase with the Common Analysis Framework.
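
    The document-centric pattern described above can be sketched against CouchDB's standard REST interface: a transfer request is stored as a JSON document whose state field is polled until a worker updates it. Host, database and field names below are illustrative, not the real AsyncStageOut schema.

        # Enqueue a transfer request as a CouchDB document and poll its state.
        import json, time, urllib.request

        BASE = "http://localhost:5984/asyncstageout"   # hypothetical database

        doc = {"_id": "transfer-0001", "user": "jdoe",
               "source": "T2_IT_Pisa", "dest": "T2_CH_CERN",
               "lfn": "/store/user/jdoe/out.root", "state": "new"}

        req = urllib.request.Request(f"{BASE}/{doc['_id']}",
                                     data=json.dumps(doc).encode(),
                                     headers={"Content-Type": "application/json"},
                                     method="PUT")
        urllib.request.urlopen(req)                    # create the document

        while True:                                    # a worker flips the state
            with urllib.request.urlopen(f"{BASE}/{doc['_id']}") as r:
                state = json.load(r)["state"]
            if state in ("done", "failed"):
                break
            time.sleep(30)
        print("final state:", state)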

  14. The CMS High-Level Trigger and Trigger Menus

    NASA Astrophysics Data System (ADS)

    Avetisyan, Aram

    2008-04-01

    The CMS experiment is one of the two general-purpose experiments due to start operation soon at the Large Hadron Collider (LHC). The LHC will collide protons at a centre-of-mass energy of 14 TeV, with a bunch-crossing rate of 40 MHz. The online event selection for the CMS experiment is carried out in two distinct stages. At Level-1 the trigger electronics reduces the 40 MHz collision rate to provide up to 100 kHz of interesting events, based on objects found using its calorimeter and muon subsystems. The High Level Trigger (HLT) that runs in the Filter Farm of the CMS experiment is a set of sophisticated software tools that run in a real-time environment to make a further selection and archive a few hundred Hz of interesting events. The coherent tuning of the HLT algorithms to accommodate multiple physics channels is a key issue for CMS, one that literally defines the reach of the experiment's physics program. In this presentation we will discuss the strategies and trigger configuration developed for the startup physics program of the CMS experiment, up to a luminosity of 10^31 cm^-2 s^-1. Emphasis will be given to the full trigger menus, including physics and calibration triggers.

  15. RooStatsCms: A tool for analysis modelling, combination and statistical studies

    NASA Astrophysics Data System (ADS)

    Piparo, D.; Schott, G.; Quast, G.

    2010-04-01

    RooStatsCms is an object oriented statistical framework based on the RooFit technology. Its scope is to allow the modelling, statistical analysis and combination of multiple search channels for new phenomena in High Energy Physics. It provides a variety of methods described in literature implemented as classes, whose design is oriented to the execution of multiple CPU intensive jobs on batch systems or on the Grid.
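
    As a flavour of the RooFit modelling style such a framework builds on, the sketch below assembles a one-channel signal-plus-background model with the workspace factory in PyROOT. This is a generic RooFit example with invented parameter values, not the RooStatsCms API itself.

        # Generic RooFit workspace: Gaussian signal over exponential background.
        import ROOT

        w = ROOT.RooWorkspace("w")
        w.factory("Gaussian::sig(x[0,100], mean[50], width[5])")
        w.factory("Exponential::bkg(x, slope[-0.02])")
        w.factory("SUM::model(nsig[20,0,200]*sig, nbkg[300,0,2000]*bkg)")

        x = w.var("x")
        data = w.pdf("model").generate(ROOT.RooArgSet(x), 320)  # toy dataset
        w.pdf("model").fitTo(data)                              # extended ML fit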

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwan, Simon; Lei, CM; Menasce, Dario

    An all silicon pixel telescope has been assembled and used at the Fermilab Test Beam Facility (FTBF) since 2009 to provide precise tracking information for different test beam experiments with a wide range of Detectors Under Test (DUTs) requiring high resolution measurement of the track impact point. The telescope is based on CMS pixel modules left over from the CMS forward pixel production. Eight planes are arranged to achieve a resolution of less than 8 μm on the 120 GeV proton beam transverse coordinate at the DUT position. In order to achieve such resolution with 100 × 150 μm² pixel cells, the planes were tilted to 25 degrees to maximize charge sharing between pixels. Crucial for obtaining this performance is the alignment software, called Monicelli, specifically designed and optimized for this system. This paper will describe the telescope hardware, the data acquisition system and the alignment software constituting this particle tracking system for test beam users.

  17. Association Between Medicare Summary Star Ratings for Patient Experience and Clinical Outcomes in US Hospitals.

    PubMed

    Trzeciak, Stephen; Gaughan, John P; Bosire, Joshua; Mazzarelli, Anthony J

    2016-03-01

    In 2015, the Centers for Medicare and Medicaid Services (CMS) released new summary star ratings for US hospitals based on patient experience. We aimed to test the association between CMS patient experience star ratings and clinical outcomes. We analyzed risk-adjusted data for more than 3000 US hospitals from CMS Hospital Compare using linear regression. We found that better patient experience was associated with favorable clinical outcomes. Specifically, a higher number of stars for patient experience had a statistically significant association with lower rates of many in-hospital complications. A higher patient experience star rating also had a statistically significant association with lower rates of unplanned readmissions to the hospital within 30 days. Better patient experience according to the CMS star ratings is associated with favorable clinical outcomes. These results support the inclusion of patient experience data in the framework of how hospitals are paid for services.

  18. Towards a centralized Grid Speedometer

    NASA Astrophysics Data System (ADS)

    Dzhunov, I.; Andreeva, J.; Fajardo, E.; Gutsche, O.; Luyckx, S.; Saiz, P.

    2014-06-01

    Given the distributed nature of the Worldwide LHC Computing Grid and the way CPU resources are pledged and shared around the globe, Virtual Organizations (VOs) face the challenge of monitoring the use of these resources. For CMS and the operation of centralized workflows, the monitoring of how many production jobs are running and pending in the Glidein WMS production pools is very important. The Dashboard Site Status Board (SSB) provides a very flexible framework to collect, aggregate and visualize data. The CMS production monitoring team uses the SSB to define the metrics that have to be monitored and the alarms that have to be raised. During the integration of CMS production monitoring into the SSB, several enhancements to the core functionality of the SSB were required; they were implemented in a generic way, so that other VOs using the SSB can exploit them. Alongside these enhancements, there were a number of changes to the core of the SSB framework. This paper presents the details of the implementation and the advantages for current and future usage of the new features in SSB.

  19. HTTP as a Data Access Protocol: Trials with XrootD in CMS’s AAA Project

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B. P.; Kcira, D.; Newman, H.; Vlimant, J.; Hendricks, T. W.; CMS Collaboration

    2017-10-01

    The main goal of the project is to demonstrate the ability to use HTTP data federations in a manner analogous to the existing AAA infrastructure of the CMS experiment. An initial testbed at Caltech has been built and changes in the CMS software (CMSSW) are being implemented in order to improve HTTP support. The testbed consists of a set of machines at the Caltech Tier2 that improve the support infrastructure for data federations at CMS. As a first step, we are building systems that produce and ingest network data transfers up to 80 Gbps. In collaboration with AAA, HTTP support is enabled at the US redirector and the Caltech testbed. A plugin for CMSSW is being developed for HTTP access based on the DaviX software. It will replace the present fork/exec or curl for HTTP access. In addition, extensions to the XRootD HTTP implementation are being developed to add functionality to it, such as client-based monitoring identifiers. In the future, patches will be developed to better integrate HTTP-over-XRootD with the Open Science Grid (OSG) distribution. First results of the transfer tests using HTTP are presented in this paper together with details about the initial setup.
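
    At its simplest, the remote-read pattern under test rests on HTTP byte-range requests. The sketch below issues one ranged read with Python's standard library (the CMSSW plugin itself is based on DaviX, and the URL here is a placeholder).

        # Read the first kilobyte of a remote file over HTTP.
        import urllib.request

        url = "https://xrootd-http.example.org/store/file.root"  # placeholder

        req = urllib.request.Request(url, headers={"Range": "bytes=0-1023"})
        with urllib.request.urlopen(req) as resp:
            first_kb = resp.read()
            # Status 206 (Partial Content) means the byte range was honoured.
            print(resp.status, len(first_kb))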

  20. Comparing 2 methods of assessing 30-day readmissions: what is the impact on hospital profiling in the veterans health administration?

    PubMed

    Mull, Hillary J; Chen, Qi; O'Brien, William J; Shwartz, Michael; Borzecki, Ann M; Hanchate, Amresh; Rosen, Amy K

    2013-07-01

    The Centers for Medicare and Medicaid Services' (CMS) all-cause readmission measure and the 3M Health Information System Division Potentially Preventable Readmissions (PPR) measure are both used for public reporting. These 2 methods have not been directly compared in terms of how they identify high-performing and low-performing hospitals. To examine how consistently the CMS and PPR methods identify performance outliers, and explore how the PPR preventability component impacts hospital readmission rates, public reporting on CMS' Hospital Compare website, and pay-for-performance under CMS' Hospital Readmission Reduction Program for 3 conditions (acute myocardial infarction, heart failure, and pneumonia). We applied the CMS all-cause model and the PPR software to VA administrative data to calculate 30-day observed FY08-10 VA hospital readmission rates and hospital profiles. We then tested the effect of preventability on hospital readmission rates and outlier identification for reporting and pay-for-performance by replacing the dependent variable in the CMS all-cause model (Yes/No readmission) with the dichotomous PPR outcome (Yes/No preventable readmission). The CMS and PPR methods had moderate correlations in readmission rates for each condition. After controlling for all methodological differences but preventability, correlations increased to >90%. The assessment of preventability yielded different outlier results for public reporting in 7% of hospitals; for 30% of hospitals there would be an impact on Hospital Readmission Reduction Program reimbursement rates. Despite uncertainty over which readmission measure is superior in evaluating hospital performance, we confirmed that there are differences in CMS-generated and PPR-generated hospital profiles for reporting and pay-for-performance, because of methodological differences and the PPR's preventability component.

  1. Comparative meta-analysis and experimental kinetic investigation of column and batch bottle microcosm treatability studies informing in situ groundwater remedial design.

    PubMed

    Driver, Erin M; Roberts, Jeff; Dollar, Peter; Charles, Maurissa; Hurst, Paul; Halden, Rolf U

    2017-02-05

    A systematic comparison was performed between batch bottle and continuous-flow column microcosms (BMs and CMs, respectively) commonly used for in situ groundwater remedial design. Review of recent literature (2000-2014) showed a preference for reporting batch kinetics, even when corresponding column data were available. Additionally, CMs produced higher observed rate constants, exceeding those of BMs by a factor of 6.1±1.1 standard error. In a subsequent laboratory investigation, 12 equivalent microcosm pairs were constructed from fractured bedrock and perchloroethylene (PCE) impacted groundwater. First-order PCE transformation kinetics of CMs were 8.0±4.8 times faster than BMs (rates: 1.23±0.87 vs. 0.16±0.05 d^-1, respectively). Additionally, CMs transformed 16.1±8.0 times more mass than BMs owing to continuous-feed operation. CMs are concluded to yield more reliable kinetic estimates because of much higher data density stemming from long-term, steady-state conditions. Since information from BMs and CMs is valuable and complementary, treatability studies should report kinetic data from both when available. This first systematic investigation of BMs and CMs highlights the need for a more unified framework for data use and reporting in treatability studies informing decision-making for field-scale groundwater remediation. Copyright © 2016 Elsevier B.V. All rights reserved.
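
    The first-order rate constants quoted above follow from the standard linearisation C(t) = C0*exp(-kt), so ln C against t is a straight line of slope -k. A minimal fit on invented concentration data:

        # Estimate a first-order rate constant from log-linear decay.
        import numpy as np

        t = np.array([0, 1, 2, 4, 7, 10])                      # days
        c = np.array([10.0, 2.9, 0.85, 0.075, 0.002, 5e-5])    # PCE, arb. units

        k = -np.polyfit(t, np.log(c), 1)[0]                    # slope of ln C vs t
        print(f"fitted rate constant k = {k:.2f} per day")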

  2. Exploiting volatile opportunistic computing resources with Lobster

    NASA Astrophysics Data System (ADS)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools have been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.
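
    The task-farm pattern Lobster builds on can be sketched with the Work Queue Python bindings from cctools. Treat this as a hedged sketch: the command, file names and port are placeholders, and the exact binding API varies between cctools versions.

        # Submit tasks to a Work Queue master; workers attach from anywhere.
        from work_queue import WorkQueue, Task

        q = WorkQueue(port=9123)                     # workers connect here

        for i in range(100):
            t = Task(f"./analyze input_{i}.root out_{i}.root")  # placeholder cmd
            t.specify_input_file(f"input_{i}.root")  # shipped to the worker
            t.specify_output_file(f"out_{i}.root")   # fetched back when done
            q.submit(t)

        while not q.empty():                         # harvest completed tasks
            t = q.wait(60)
            if t:
                print(f"task {t.id} exited with {t.return_status}")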

  3. “Exploring High Energy Interactions with CMS at the LHC”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sulak, Lawrence R.

    2016-08-01

    This High Energy Physics research project achieved its goal of exploring high-energy interactions with 7, 8 and 13 TeV data accumulated by CMS at the Energy Frontier. For the original hadron calorimeter (HCAL) and for its upgrade during Long Shutdown 1 (LS1), the PI helped propose and implement the upgrade of the phototubes, new electronics, and fast timing of the hadronic forward (HF) and hadronic outer (HO) calorimeters of CMS, projects which he had forcefully advocated since the inception of CMS. The PI and his colleagues Prof. J. Rohlf and chief electronics engineer E. Hazen, his post-docs A. Heister and S. Girgis, and his graduate students (P. Lawson and D. Arcaro) contributed software tools used in perfecting the μTCA and Advanced Mezzanine Card (AMC13) electronics, the PC board that provides clock, timing and DAQ service for HCAL (and now many other subdetectors and central systems in the upgraded CMS detector). This Task reaped the benefits of these hardware contributions 1) to hermeticity for missing-energy searches, and 2) to forward jet tagging for Vector Boson Fusion processes, by analyzing and publishing early data, including that for the Higgs discovery and for exotic and supersymmetric searches.

  4. 78 FR 23690 - Airworthiness Directives; The Boeing Company

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-22

    ... management system (CMS) configuration database; and installing new operational program software (OPS) for the CSCP, zone management unit (ZMU), passenger address controller, cabin interphone controller, cabin area... on the Internet at http://www.regulations.gov ; or in person at the Docket Management Facility...

  5. Performance of the CMS muon detector and muon reconstruction with proton-proton collisions at $\sqrt{s}=13$ TeV

    DOE PAGES

    Sirunyan, Albert M; et al.

    2018-06-19

    The CMS muon detector system, muon reconstruction software, and high-level trigger underwent significant changes in 2013-2014 in preparation for running at higher LHC collision energy and instantaneous luminosity. The performance of the modified system is studied using proton-proton collision data at center-of-mass energy $\sqrt{s}=13$ TeV, collected at the LHC in 2015 and 2016. The measured performance parameters, including spatial resolution, efficiency, and timing, are found to meet all design specifications and are well reproduced by simulation. Despite the more challenging running conditions, the modified muon system is found to perform as well as, and in many aspects better than, previously. We dedicate this paper to the memory of Prof. Alberto Benvenuti, whose work was fundamental for the CMS muon detector.

  6. Performance of the CMS muon detector and muon reconstruction with proton-proton collisions at $\sqrt{s}=13$ TeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sirunyan, Albert M; et al.

    The CMS muon detector system, muon reconstruction software, and high-level trigger underwent significant changes in 2013-2014 in preparation for running at higher LHC collision energy and instantaneous luminosity. The performance of the modified system is studied using proton-proton collision data at center-of-mass energy $\sqrt{s}=13$ TeV, collected at the LHC in 2015 and 2016. The measured performance parameters, including spatial resolution, efficiency, and timing, are found to meet all design specifications and are well reproduced by simulation. Despite the more challenging running conditions, the modified muon system is found to perform as well as, and in many aspects better than, previously. We dedicate this paper to the memory of Prof. Alberto Benvenuti, whose work was fundamental for the CMS muon detector.

  7. Toward a public analysis database for LHC new physics searches using MadAnalysis 5

    NASA Astrophysics Data System (ADS)

    Dumont, B.; Fuks, B.; Kraml, S.; Bein, S.; Chalons, G.; Conte, E.; Kulkarni, S.; Sengupta, D.; Wymant, C.

    2015-02-01

    We present the implementation, in the MadAnalysis 5 framework, of several ATLAS and CMS searches for supersymmetry in data recorded during the first run of the LHC. We provide extensive details on the validation of our implementations and propose to create a public analysis database within this framework.

  8. CMS-dependent prognostic impact of KRAS and BRAFV600E mutations in primary colorectal cancer.

    PubMed

    Smeby, J; Sveen, A; Merok, M A; Danielsen, S A; Eilertsen, I A; Guren, M G; Dienstmann, R; Nesbakken, A; Lothe, R A

    2018-05-01

    The prognostic impact of KRAS and BRAFV600E mutations in primary colorectal cancer (CRC) varies with microsatellite instability (MSI) status. The gene expression-based consensus molecular subtypes (CMSs) of CRC define molecularly and clinically distinct subgroups, and represent a novel stratification framework in biomarker analysis. We investigated the prognostic value of these mutations within the CMS groups. In total, 1197 primary tumors from a Norwegian series of CRC stage I-IV were analyzed for MSI and mutation status in hotspots in KRAS (codons 12, 13 and 61) and BRAF (codon 600). A subset was analyzed for gene expression, and confident CMS classification was obtained for 317 samples. This cohort was expanded with clinical and molecular data, including CMS classification, from 514 patients in the publicly available dataset GSE39582. Gene expression signatures associated with KRAS and BRAFV600E mutations were used to evaluate the differential impact of mutations on gene expression among the CMS groups. BRAFV600E and KRAS mutations were both associated with inferior 5-year overall survival (OS) exclusively in MSS tumors (BRAFV600E mutation versus KRAS/BRAF wild-type: hazard ratio (HR) 2.85, P < 0.001; KRAS mutation versus KRAS/BRAF wild-type: HR 1.30, P = 0.013). BRAFV600E-mutated MSS tumors were strongly enriched and associated with metastatic disease in CMS1, leading to a negative prognostic impact in this subtype (OS: BRAFV600E mutation versus wild-type: HR 7.73, P = 0.001). In contrast, the poor prognosis of KRAS mutations was limited to MSS tumors with CMS2/CMS3 epithelial-like gene expression profiles (OS: KRAS mutation versus wild-type: HR 1.51, P = 0.011). The subtype-specific prognostic associations were substantiated by differential effects of BRAFV600E and KRAS mutations on gene expression signatures according to MSI status and CMS group. BRAFV600E mutations are enriched and associated with metastatic disease in CMS1 MSS tumors, leading to poor prognosis in this subtype. KRAS mutations are associated with adverse outcome in epithelial (CMS2/CMS3) MSS tumors.
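
    The hazard ratios above come from Cox proportional-hazards fits; as a hedged sketch of that kind of analysis, the toy example below fits a single binary mutation covariate with the lifelines package (the DataFrame, column names, and numbers are invented, not the study's data).

        import pandas as pd
        from lifelines import CoxPHFitter

        # Toy survival data: one binary covariate (mutation present or not).
        df = pd.DataFrame({
            "os_months": [12, 20, 35, 60, 8, 50],
            "event":     [1, 1, 1, 0, 1, 0],   # 1 = death observed, 0 = censored
            "kras_mut":  [1, 0, 1, 0, 1, 0],   # e.g., restricted to MSS CMS2/CMS3
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="os_months", event_col="event")
        print(cph.summary[["exp(coef)", "p"]])  # exp(coef) is the hazard ratio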

  9. Evolution of CMS workload management towards multicore job support

    NASA Astrophysics Data System (ADS)

    Pérez-Calero Yzquierdo, A.; Hernández, J. M.; Khan, F. A.; Letts, J.; Majewski, K.; Rodrigues, A. M.; McCrea, A.; Vaandering, E.

    2015-12-01

    The successful exploitation of multicore processor architectures is a key element of the LHC distributed computing system in the coming era of LHC Run 2. High-pileup, complex collision events represent a challenge for traditional sequential programming in terms of memory and processing-time budgets. The CMS data production and processing framework is introducing the parallel execution of the reconstruction and simulation algorithms to overcome these limitations. CMS plans to execute multicore jobs while still supporting single-core processing for other tasks difficult to parallelize, such as user analysis. The CMS strategy for job management thus aims at integrating single- and multicore job scheduling across the Grid. This is accomplished by employing multicore pilots with internal dynamic partitioning of the allocated resources, capable of running payloads of various core counts simultaneously. An extensive test programme has been conducted to enable multicore scheduling with the various local batch systems available at CMS sites, with the focus on the Tier-0 and Tier-1 sites, responsible in 2015 for the prompt data reconstruction. Scale tests have been run to analyse the performance of this scheduling strategy and ensure an efficient use of the distributed resources. This paper presents the evolution of the CMS job management and resource provisioning systems in order to support this hybrid scheduling model, as well as its deployment and performance tests, which will enable CMS to transition to a multicore production model for the second LHC run.
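
    To make the "internal dynamic partitioning" concrete, here is a toy sketch (not CMS's actual pilot code) of a pilot greedily packing queued payloads of various core counts into its allocated slot:

        def schedule(free_cores, queued_payloads):
            """Greedily start the widest payloads that still fit in the pilot."""
            running = []
            for cores in sorted(queued_payloads, reverse=True):
                if cores <= free_cores:
                    running.append(cores)
                    free_cores -= cores
            return running, free_cores

        # An 8-core pilot offered a mix of multicore and single-core payloads.
        running, idle = schedule(free_cores=8, queued_payloads=[4, 4, 1, 1, 1, 2])
        print("started payloads:", running, "idle cores:", idle)
        # started payloads: [4, 4] idle cores: 0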

  10. Evolution of CMS Workload Management Towards Multicore Job Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez-Calero Yzquierdo, A.; Hernández, J. M.; Khan, F. A.

    The successful exploitation of multicore processor architectures is a key element of the LHC distributed computing system in the coming era of LHC Run 2. High-pileup, complex collision events represent a challenge for traditional sequential programming in terms of memory and processing-time budgets. The CMS data production and processing framework is introducing the parallel execution of the reconstruction and simulation algorithms to overcome these limitations. CMS plans to execute multicore jobs while still supporting single-core processing for other tasks difficult to parallelize, such as user analysis. The CMS strategy for job management thus aims at integrating single- and multicore job scheduling across the Grid. This is accomplished by employing multicore pilots with internal dynamic partitioning of the allocated resources, capable of running payloads of various core counts simultaneously. An extensive test programme has been conducted to enable multicore scheduling with the various local batch systems available at CMS sites, with the focus on the Tier-0 and Tier-1 sites, responsible in 2015 for the prompt data reconstruction. Scale tests have been run to analyse the performance of this scheduling strategy and ensure an efficient use of the distributed resources. This paper presents the evolution of the CMS job management and resource provisioning systems in order to support this hybrid scheduling model, as well as its deployment and performance tests, which will enable CMS to transition to a multicore production model for the second LHC run.

  11. Giving pandas ROOT to chew on: experiences with the XENON1T Dark Matter experiment

    NASA Astrophysics Data System (ADS)

    Remenska, D.; Tunnell, C.; Aalbers, J.; Verhoeven, S.; Maassen, J.; Templon, J.

    2017-10-01

    In preparation for the XENON1T Dark Matter data acquisition, we have prototyped and implemented a new computing model. The XENON signal and data processing software is developed fully in Python 3, and makes extensive use of generic scientific data analysis libraries, such as the SciPy stack. A certain tension between modern “Big Data” solutions and existing HEP frameworks is typically experienced in smaller particle physics experiments. ROOT is still the “standard” data format in our field, defined by large experiments (ATLAS, CMS). To ease the transition, our computing model caters to both analysis paradigms, leaving the choice of using ROOT-specific C++ libraries, or alternatively Python and its data analytics tools, as a front-end choice when developing physics algorithms. We present our path to harmonizing these two ecosystems, which allowed us to use off-the-shelf software libraries (e.g., NumPy, SciPy, scikit-learn, matplotlib) and lower the cost of development and maintenance. To analyse the data, our software allows researchers to easily create “mini-trees”: small, tabular ROOT structures for Python analysis which can be read directly into pandas DataFrame structures. One of our goals was making ROOT available as a cross-platform binary for easy installation from the Anaconda Cloud (without going through the “dependency hell”). In addition to helping us discover dark matter interactions, lowering this barrier helps shift particle physics toward non-domain-specific code.
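
    For illustration, reading such a flat mini-tree into a pandas DataFrame can be done with the uproot package, a swapped-in generic tool rather than the XENON software itself; the file and branch names below are hypothetical.

        import uproot

        # Open a (hypothetical) mini-tree and pull selected branches into pandas.
        with uproot.open("minitree.root") as f:
            df = f["events"].arrays(["s1", "s2", "drift_time"], library="pd")

        print(df.describe())  # ordinary pandas analysis from here on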

  12. The pixel tracking telescope at the Fermilab Test Beam Facility

    DOE PAGES

    Kwan, Simon; Lei, CM; Menasce, Dario; ...

    2016-03-01

    An all-silicon pixel telescope has been assembled and used at the Fermilab Test Beam Facility (FTBF) since 2009 to provide precise tracking information for different test beam experiments with a wide range of Detectors Under Test (DUTs) requiring high-resolution measurement of the track impact point. The telescope is based on CMS pixel modules left over from the CMS forward pixel production. Eight planes are arranged to achieve a resolution of less than 8 μm on the 120 GeV proton beam transverse coordinate at the DUT position. In order to achieve such resolution with 100 × 150 μm² pixel cells, the planes were tilted by 25 degrees to maximize charge sharing between pixels. Crucial for obtaining this performance is the alignment software, called Monicelli, specifically designed and optimized for this system. This paper describes the telescope hardware, the data acquisition system and the alignment software constituting this particle tracking system for test beam users.
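
    A quick back-of-the-envelope check on those numbers: a single 100 μm pixel read out digitally would only give pitch/√12 resolution, so the sub-8 μm precision has to come from analog charge sharing across the tilted planes plus combining eight measurements (illustrative arithmetic only).

        from math import sqrt

        pitch = 100.0                  # um, short side of the pixel cell
        binary_res = pitch / sqrt(12)  # single-plane binary (digital) resolution
        print("binary resolution: %.1f um" % binary_res)  # ~28.9 um, vs. <8 um achieved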

  13. The CMS electron and photon trigger for the LHC Run 2

    NASA Astrophysics Data System (ADS)

    Dezoort, Gage; Xia, Fan

    2017-01-01

    The CMS experiment implements a sophisticated two-level triggering system composed of the Level-1 trigger, instrumented by custom-designed hardware boards, and a software High-Level Trigger. A new Level-1 trigger architecture with improved performance is now being used to maintain the Run I thresholds under the more challenging luminosity conditions experienced during Run II. The upgrades to the calorimetry trigger will be described along with performance data. The algorithms for the selection of final states with electrons and photons, both for precision measurements and for searches for new physics beyond the Standard Model, will be described in detail.

  14. Long-term Science Data Curation Using a Digital Object Model and Open-Source Frameworks

    NASA Astrophysics Data System (ADS)

    Pan, J.; Lenhardt, W.; Wilson, B. E.; Palanisamy, G.; Cook, R. B.

    2010-12-01

    Scientific digital content, including Earth Science observations and model output, has become more heterogeneous in format and more distributed across the Internet. In addition, data and metadata are becoming necessarily linked internally and externally on the Web. As a result, such content has become more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, it is increasingly difficult to deliver relevant metadata and data processing lineage information along with the actual content consistently. Readme files, data quality information, production provenance, and other descriptive metadata are often separated at the storage level as well as in the data search and retrieval interfaces available to a user. Critical archival metadata, such as auditing trails and integrity checks, are often even more difficult for users to access, if they exist at all. We investigate the use of several open-source software frameworks to address these challenges. We use the Fedora Commons Framework and its digital object abstraction as the repository, the Drupal CMS as the user interface, and the Islandora module as the connector from Drupal to the Fedora repository. With the digital object model, metadata describing data and data provenance can be associated with data content in a formal manner, as can external references and other auxiliary information. Changes to an object are formally audited, and digital contents are versioned and have checksums automatically computed. Further, relationships among objects are formally expressed with RDF triples. Data replication, recovery, and metadata export are supported with standard protocols, such as OAI-PMH. We provide a tentative comparative analysis of the chosen software stack against the Open Archival Information System (OAIS) reference model, along with our initial results with the existing terrestrial ecology data collections at NASA's ORNL Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC).
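
    As a hedged sketch of the "relationships expressed with RDF triples" idea, the snippet below builds one such triple with rdflib; the URIs and predicate are hypothetical placeholders, not Fedora's actual relationship vocabulary.

        from rdflib import Graph, URIRef

        g = Graph()
        dataset = URIRef("info:fedora/demo:dataset-123")   # hypothetical objects
        readme  = URIRef("info:fedora/demo:readme-123")

        # Assert a relationship between the two digital objects as an RDF triple.
        g.add((readme, URIRef("http://example.org/rel/isDocumentationFor"), dataset))
        print(g.serialize(format="turtle"))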

  15. Technical Communications in OSS Content Management Systems: An Academic Institutional Case Study

    ERIC Educational Resources Information Center

    Cripps, Michael J.

    2011-01-01

    Single sourcing through a content management system (CMS) is altering technical communication practices in many organizations, including institutions of higher education. Open source software (OSS) solutions are currently among the most popular content management platforms adopted by colleges and universities in the United States and abroad. The…

  16. 20 CFR 30.710 - How are payments for inpatient medical services determined?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... services determined? 30.710 Section 30.710 Employees' Benefits OFFICE OF WORKERS' COMPENSATION PROGRAMS... services determined? (a) OWCP will pay for inpatient medical services according to pre-determined... discharges will be classified according to the DRGs prescribed by CMS in the form of the DRG Grouper software...

  17. Further Developments of BEM for Micro and Macromechanical Analyses of Composites: Boundary Element Software Technology-Composite User's Manual

    NASA Technical Reports Server (NTRS)

    Banerjee, P. K.; Henry, D. P.; Hopkins, D. A.; Goldberg, R. K.

    1997-01-01

    BEST-CMS (Boundary Element Solution Technology - Composite Modeling System) is an advanced engineering system for the micro-analysis of fiber composite structures. BEST-CMS is based upon the boundary element program BEST3D which was developed for NASA by Pratt and Whitney Aircraft and the State University of New York at Buffalo under contract NAS3-23697. BEST-CMS presently has capabilities for elastostatic analysis, steady-state and transient heat transfer analysis, steady-state and transient concurrent thermoelastic analysis, and elastoplastic and creep analysis. The fibers are assumed to be perfectly bonded to the composite matrix, or in the case of static or steady-state analysis, the fibers may be assumed to have spring connections, thermal resistance, and/or frictional sliding between the fibers and the composite matrix. The primary objective of this User's Manual is to provide an overview of all BEST-CMS capabilities, along with detailed descriptions of the input data requirements. A brief review of the theoretical background is presented for each analysis category. Chapter 3 then discusses the key aspects of the numerical implementation, while Chapter 4 provides a tutorial for the beginning BEST-CMS user. The heart of the manual, however, is in Chapter 5, where a complete description of all data input items is provided. Within this chapter, the individual entries are grouped on a functional basis for a more coherent presentation. Chapter 6 includes sample problems and should be of considerable assistance to the novice. Chapter 7 includes capsules of a number of fiber-composite analysis problems that have been solved using BEST-CMS; this chapter is primarily descriptive in nature and is intended merely to illustrate the level of analysis that is possible within the present BEST-CMS system. Chapter 8 contains a detailed description of the BEST-CMS Neutral File, which is helpful in writing an interface between BEST-CMS and any graphic post-processor program. Finally, all pertinent references are listed in Chapter 9.

  18. Surface Pressure Dependencies in the GEOS-Chem-Adjoint System and the Impact of the GEOS-5 Surface Pressure on CO2 Model Forecast

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Weidner, Richard

    2016-01-01

    In the GEOS-Chem Adjoint (GCA) system, the total (wet) surface pressure of the GEOS meteorology is employed as dry surface pressure, ignoring the presence of water vapor. The Jet Propulsion Laboratory (JPL) Carbon Monitoring System (CMS) research team has been evaluating the impact of the above discrepancy on the CO2 model forecast and the CO2 flux inversion. The JPL CMS research utilizes a multi-mission assimilation framework developed by the Multi-Mission Observation Operator (M2O2) research team at JPL extending the GCA system. The GCA-M2O2 framework facilitates mission-generic 3D and 4D-variational assimilations streamlining the interfaces to the satellite data products and prior emission inventories. The GCA-M2O2 framework currently integrates the GCA system version 35h and provides a dry surface pressure setup to allow the CO2 model forecast to be performed with the GEOS-5 surface pressure directly or after converting it to dry surface pressure.
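
    The wet-to-dry conversion mentioned above amounts to subtracting the weight per unit area of the water-vapor column from the total surface pressure, p_dry = p_wet − g·TCWV; a quick numerical illustration (typical mid-latitude values, not GEOS data):

        g = 9.81           # m s^-2, gravitational acceleration
        p_wet = 1013.25e2  # Pa, total (wet) surface pressure
        tcwv = 25.0        # kg m^-2, total column water vapor (illustrative)

        p_dry = p_wet - g * tcwv
        print("dry surface pressure: %.2f hPa" % (p_dry / 100.0))  # ~1010.80 hPa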

  19. Surface Pressure Dependencies in the GEOS-Chem-Adjoint System and the Impact of the GEOS-5 Surface Pressure on CO2 Model Forecast

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Weidner, Richard

    2016-01-01

    In the GEOS-Chem Adjoint (GCA) system, the total (wet) surface pressure of the GEOS meteorology is employed as dry surface pressure, ignoring the presence of water vapor. The Jet Propulsion Laboratory (JPL) Carbon Monitoring System (CMS) research team has been evaluating the impact of the above discrepancy on the CO2 model forecast and the CO2 flux inversion. The JPL CMS research utilizes a multi-mission assimilation framework developed by the Multi-Mission Observation Operator (M2O2) research team at JPL extending the GCA system. The GCA-M2O2 framework facilitates mission-generic 3D and 4D-variational assimilations streamlining the interfaces to the satellite data products and prior emission inventories. The GCA-M2O2 framework currently integrates the GCA system version 35h and provides a dry surface pressure setup to allow the CO2 model forecast to be performed with the GEOS-5 surface pressure directly or after converting it to dry surface pressure.

  20. Whole-Genome Sequencing Uncovers the Genetic Basis of Chronic Mountain Sickness in Andean Highlanders

    PubMed Central

    Zhou, Dan; Udpa, Nitin; Ronen, Roy; Stobdan, Tsering; Liang, Junbin; Appenzeller, Otto; Zhao, Huiwen W.; Yin, Yi; Du, Yuanping; Guo, Lixia; Cao, Rui; Wang, Yu; Jin, Xin; Huang, Chen; Jia, Wenlong; Cao, Dandan; Guo, Guangwu; Gamboa, Jorge L.; Villafuerte, Francisco; Callacondo, David; Xue, Jin; Liu, Siqi; Frazer, Kelly A.; Li, Yingrui; Bafna, Vineet; Haddad, Gabriel G.

    2013-01-01

    The hypoxic conditions at high altitudes present a challenge for survival, causing pressure for adaptation. Interestingly, many high-altitude denizens (particularly in the Andes) are maladapted, with a condition known as chronic mountain sickness (CMS) or Monge disease. To decode the genetic basis of this disease, we sequenced and compared the whole genomes of 20 Andean subjects (10 with CMS and 10 without). We discovered 11 regions genome-wide with significant differences in haplotype frequencies consistent with selective sweeps. In these regions, two genes (an erythropoiesis regulator, SENP1, and an oncogene, ANP32D) had a higher transcriptional response to hypoxia in individuals with CMS relative to those without. We further found that downregulating the orthologs of these genes in flies dramatically enhanced survival rates under hypoxia, demonstrating that suppression of SENP1 and ANP32D plays an essential role in hypoxia tolerance. Our study provides an unbiased framework to identify and validate the genetic basis of adaptation to high altitudes and identifies potentially targetable mechanisms for CMS treatment. PMID:23954164

  1. The CMS trigger system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khachatryan, Vardan

    This paper describes the CMS trigger system and its performance during Run 1 of the LHC. The trigger system consists of two levels designed to select events of potential physics interest from a GHz (MHz) interaction rate of proton-proton (heavy ion) collisions. The first level of the trigger is implemented in hardware, and selects events containing detector signals consistent with an electron, photon, muon, tau lepton, jet, or missing transverse energy. A programmable menu of up to 128 object-based algorithms is used to select events for subsequent processing. The trigger thresholds are adjusted to the LHC instantaneous luminosity during data taking in order to restrict the output rate to 100 kHz, the upper limit imposed by the CMS readout electronics. The second level, implemented in software, further refines the purity of the output stream, selecting an average rate of 400 Hz for offline event storage. The objectives, strategy and performance of the trigger system during the LHC Run 1 are described.
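
    The quoted rates imply the rejection factor each trigger level must deliver; taking the nominal 40 MHz LHC bunch-crossing rate for the proton-proton case, the arithmetic is simply:

        bunch_crossing = 40e6  # Hz, nominal LHC pp bunch-crossing rate
        l1_out = 100e3         # Hz, Level-1 cap set by the readout electronics
        hlt_out = 400.0        # Hz, average rate kept for offline storage

        print("L1 rejection:  %.0f x" % (bunch_crossing / l1_out))  # 400 x
        print("HLT rejection: %.0f x" % (l1_out / hlt_out))         # 250 x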

  2. The CMS trigger system

    NASA Astrophysics Data System (ADS)

    Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.; Adam, W.; Asilar, E.; Bergauer, T.; et al.

    2017-01-01

    This paper describes the CMS trigger system and its performance during Run 1 of the LHC. The trigger system consists of two levels designed to select events of potential physics interest from a GHz (MHz) interaction rate of proton-proton (heavy ion) collisions. The first level of the trigger is implemented in hardware, and selects events containing detector signals consistent with an electron, photon, muon, τ lepton, jet, or missing transverse energy. A programmable menu of up to 128 object-based algorithms is used to select events for subsequent processing. The trigger thresholds are adjusted to the LHC instantaneous luminosity during data taking in order to restrict the output rate to 100 kHz, the upper limit imposed by the CMS readout electronics. The second level, implemented in software, further refines the purity of the output stream, selecting an average rate of 400 Hz for offline event storage. The objectives, strategy and performance of the trigger system during the LHC Run 1 are described.

  3. The CMS trigger system

    DOE PAGES

    Khachatryan, Vardan

    2017-01-24

    This paper describes the CMS trigger system and its performance during Run 1 of the LHC. The trigger system consists of two levels designed to select events of potential physics interest from a GHz (MHz) interaction rate of proton-proton (heavy ion) collisions. The first level of the trigger is implemented in hardware, and selects events containing detector signals consistent with an electron, photon, muon, tau lepton, jet, or missing transverse energy. A programmable menu of up to 128 object-based algorithms is used to select events for subsequent processing. The trigger thresholds are adjusted to the LHC instantaneous luminosity during data taking in order to restrict the output rate to 100 kHz, the upper limit imposed by the CMS readout electronics. The second level, implemented in software, further refines the purity of the output stream, selecting an average rate of 400 Hz for offline event storage. The objectives, strategy and performance of the trigger system during the LHC Run 1 are described.

  4. Electrons and photons at High Level Trigger in CMS for Run II

    NASA Astrophysics Data System (ADS)

    Anuar, Afiq A.

    2015-12-01

    The CMS experiment has been designed with a 2-level trigger system. The first level is implemented using custom-designed electronics. The second level is the so-called High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. For Run II of the Large Hadron Collider, the increase in center-of-mass energy and luminosity will raise the event rate to a level challenging for the HLT algorithms. New approaches have been studied to keep the HLT output rate manageable while maintaining thresholds low enough to cover physics analyses. The strategy mainly relies on porting online the ingredients that have been successfully applied in the offline reconstruction, thus allowing the HLT selection to move closer to the offline cuts. Improvements in the HLT electron and photon definitions will be presented, focusing in particular on the updated clustering algorithm and energy calibration procedure, the new Particle-Flow-based isolation approach and pileup mitigation techniques, and the electron-dedicated track fitting algorithm based on the Gaussian Sum Filter.
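
    As a rough illustration of the Particle-Flow-based isolation with pileup mitigation mentioned above, the sketch below applies the widely used effective-area correction, subtracting the event pileup energy density rho times an effective area from the neutral components. The numerical values are placeholders, not official CMS working points.

      def pf_rel_isolation(pt, charged_had, neutral_had, photon, rho, eff_area=0.15):
          # Relative isolation with the neutral components corrected for the
          # pileup energy density rho times an eta-dependent effective area.
          neutral_corrected = max(0.0, neutral_had + photon - rho * eff_area)
          return (charged_had + neutral_corrected) / pt

      # A candidate might be kept if its corrected relative isolation falls
      # below some working-point threshold.
      print(pf_rel_isolation(pt=35.0, charged_had=1.2, neutral_had=0.8,
                             photon=0.5, rho=12.0))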

  5. Monitoring techniques and alarm procedures for CMS services and sites in WLCG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molina-Perez, J.; Bonacorsi, D.; Gutsche, O.

    2012-01-01

    The CMS offline computing system is composed of roughly 80 sites (including the most experienced T3s) and a number of central services to distribute, process and analyze data worldwide. A high level of stability and reliability is required from the underlying infrastructure and services, partially covered by local or automated monitoring and alarming systems such as Lemon and SLS: the former collects metrics from sensors installed on computing nodes and triggers alarms when values are out of range, while the latter measures the quality of service and warns managers when service is affected. CMS has established computing shift procedures with personnel operating worldwide from remote Computing Centers, under the supervision of the Computing Run Coordinator at CERN. These dedicated 24/7 computing shifters help to detect and react in a timely manner to any unexpected error, and hence ensure that CMS workflows are carried out efficiently and in a sustained manner. Synergy among all the involved actors is exploited to ensure the 24/7 monitoring, alarming and troubleshooting of the CMS computing sites and services. We review the deployment of the monitoring and alarming procedures, and report on the experience gained throughout the first two years of LHC operation. We describe the efficiency of the communication tools employed, the coherent monitoring framework, the proactive alarming systems and the proficient troubleshooting procedures that helped the CMS Computing facilities and infrastructure to operate at high reliability levels.
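
    A minimal sketch of the out-of-range alarming pattern described above (Lemon-style metric checks plus an SLS-style quality value); the metric names and limits are invented for illustration.

      # Configured acceptable ranges per metric (illustrative values).
      THRESHOLDS = {
          "cpu_load": (0.0, 25.0),
          "disk_used_pct": (0.0, 90.0),
          "service_quality": (0.8, 1.0),   # SLS-style quality in [0, 1]
      }

      def check_metrics(node, samples):
          # Raise an alarm line for every sampled value outside its range.
          alarms = []
          for metric, value in samples.items():
              low, high = THRESHOLDS[metric]
              if not (low <= value <= high):
                  alarms.append(f"ALARM {node}: {metric}={value} outside [{low}, {high}]")
          return alarms

      for line in check_metrics("t1-worker-042", {"cpu_load": 31.4,
                                                  "disk_used_pct": 95.0,
                                                  "service_quality": 0.97}):
          print(line)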

  6. The “Common Solutions” Strategy of the Experiment Support group at CERN for the LHC Experiments

    NASA Astrophysics Data System (ADS)

    Girone, M.; Andreeva, J.; Barreiro Megino, F. H.; Campana, S.; Cinquilli, M.; Di Girolamo, A.; Dimou, M.; Giordano, D.; Karavakis, E.; Kenyon, M. J.; Kokozkiewicz, L.; Lanciotti, E.; Litmaath, M.; Magini, N.; Negri, G.; Roiser, S.; Saiz, P.; Saiz Santos, M. D.; Schovancova, J.; Sciabà, A.; Spiga, D.; Trentadue, R.; Tuckett, D.; Valassi, A.; Van der Ster, D. C.; Shiers, J. D.

    2012-12-01

    After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments’ computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management monitoring, File Transfer monitoring) and the Site Status Board. This talk focuses primarily on the strategic aspects of providing such common solutions and how this relates to the overall goals of long-term sustainability and the relationship to the various WLCG Technical Evolution Groups. The success of the service components has given us confidence in the process, and has developed the trust of the stakeholders. We are now attempting to expand the development of common solutions into the more critical workflows. The first is a feasibility study of common analysis workflow execution elements between ATLAS and CMS. We look forward to additional common development in the future.

  7. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FPP) is described. The FPP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  8. Framework for Development of Object-Oriented Software

    NASA Technical Reports Server (NTRS)

    Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan

    2004-01-01

    The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.

  9. Demyelinating evidences in CMS rat model of depression: a DTI study at 7 T.

    PubMed

    Hemanth Kumar, B S; Mishra, S K; Trivedi, R; Singh, S; Rana, P; Khushu, S

    2014-09-05

    Depression is among the most debilitating diseases worldwide. Long-term exposure to stressors plays a major role in the development of human depression. Chronic mild stress (CMS) seems to be a valid animal model for depression. Diffusion tensor imaging (DTI) is capable of inferring microstructural abnormalities of the white matter and has been shown to serve as a non-invasive marker of specific pathology. We developed a CMS rat model of depression and validated it with behavioral experiments. We measured the diffusion indices (mean diffusivity (MD), fractional anisotropy (FA), axial (λ∥) and radial (λ⊥) diffusivity) to investigate the changes in the CMS rat brain during depression onset. Diffusion indices have been shown to be useful for discriminating myelin damage from axon loss. DTI was performed in both control and CMS rats (n=10 in each group) and maps of FA, MD, λ∥ and λ⊥ values were generated using software built in-house. The diffusion indices were calculated by region of interest (ROI) analysis in different brain regions such as the frontal cortex, hippocampus, hypothalamus, cingulum, thalamus, caudate putamen, corpus callosum, cerebral peduncle and sensory motor cortex. The results showed signs of demyelination, reflected by increased MD, decreased FA and increased λ⊥. The results also suggest a possible role of edema or inflammation concerning the brain morphology in CMS rats. The overall finding using DTI suggests a major role for loss of the myelin sheath, leading to disrupted connectivity between the limbic area and the prefrontal cortex during the onset of depression. Our findings indicate that interpretation of these indices may provide crucial information about the type and severity of mood disorders. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.
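
    For reference, the four diffusion indices named above follow directly from the eigenvalues of the diffusion tensor. The short sketch below computes them with the standard definitions; the example eigenvalues are illustrative only.

      import math

      def dti_indices(l1, l2, l3):
          # Eigenvalues ordered l1 >= l2 >= l3.
          md = (l1 + l2 + l3) / 3.0                     # mean diffusivity
          num = math.sqrt((l1 - md)**2 + (l2 - md)**2 + (l3 - md)**2)
          den = math.sqrt(l1**2 + l2**2 + l3**2)
          fa = math.sqrt(1.5) * num / den               # fractional anisotropy
          axial = l1                                    # lambda parallel
          radial = (l2 + l3) / 2.0                      # lambda perpendicular
          return md, fa, axial, radial

      # Eigenvalues in 10^-3 mm^2/s; purely illustrative numbers.
      print(dti_indices(1.6, 0.5, 0.3))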

  10. Influence of flow velocity on motor behavior of sea cucumber Apostichopus japonicus.

    PubMed

    Pan, Yang; Zhang, Libin; Lin, Chenggang; Sun, Jiamin; Kan, Rentao; Yang, Hongsheng

    2015-05-15

    The influence of flow velocity on the motor behavior of the sea cucumber Apostichopus japonicus was investigated in the laboratory. Cameras were used to record sea cucumber movements, and behavior analysis software was used to measure the distance traveled, time spent upstream or downstream of the start position, and the speed of movements. In general, the mean velocity of A. japonicus was below 0.7 mm s⁻¹. The maximum velocity recorded for all the sea cucumbers tested was for a large individual (89.25 ± 17.11 g), at a flow rate of 4.6 ± 0.5 cm s⁻¹. Medium sized (19.68 ± 5.53 g) and large individuals moved significantly faster than small individuals (2.65 ± 1.24 g) at the same flow rate. A. japonicus moved significantly faster when there was a moderate current (4.6 ± 0.5 cm s⁻¹ and 14.7 ± 0.3 cm s⁻¹), compared with the fast flow rate (29.3 ± 3.7 cm s⁻¹) and when there was no flow (0 cm s⁻¹). Sea cucumbers did not show positive rheotaxis in general, but did move in a downstream direction at faster current speeds. Large, medium and small sized individuals moved downstream at the fastest current speed tested, 29.3 ± 3.7 cm s⁻¹. When there was no water flow, sea cucumbers tended to move in an irregular pattern. The movement patterns show that the sea cucumber A. japonicus can move across the direction of flow, and can move both upstream and downstream along the direction of flow. Copyright © 2015. Published by Elsevier Inc.

  11. Building Real World Domain-Specific Social Network Websites as a Capstone Project

    ERIC Educational Resources Information Center

    Yue, Kwok-Bun; De Silva, Dilhar; Kim, Dan; Aktepe, Mirac; Nagle, Stewart; Boerger, Chris; Jain, Anubha; Verma, Sunny

    2009-01-01

    This paper describes our experience of using Content Management Software (CMS), specifically Joomla, to build a real world domain-specific social network site (SNS) as a capstone project for graduate information systems and computer science students. As Web 2.0 technologies become increasingly important in driving business application development,…

  12. Collaborative Concept Mapping Activities in a Classroom Scenario

    ERIC Educational Resources Information Center

    Elorriaga, J. A.; Arruarte, A.; Calvo, I.; Larrañaga, M.; Rueda, U.; Herrán, E.

    2013-01-01

    The aim of this study is to test collaborative concept mapping activities using computers in a classroom scenario and to evaluate the possibilities that Elkar-CM offers for collaboratively learning non-technical topics. Elkar-CM is a multi-lingual and multi-media software program designed for drawing concept maps (CMs) collaboratively. Concept…

  13. The CMS Tier0 goes cloud and grid for LHC Run 2

    DOE PAGES

    Hufnagel, Dirk

    2015-12-23

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. Furthermore, this contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  14. Understanding user needs for carbon monitoring information

    NASA Astrophysics Data System (ADS)

    Duren, R. M.; Macauley, M.; Gurney, K. R.; Saatchi, S. S.; Woodall, C. W.; Larsen, K.; Reidmiller, D.; Hockstad, L.; Weitz, M.; Croes, B.; Down, A.; West, T.; Mercury, M.

    2015-12-01

    The objectives of the Understanding User Needs project for NASA's Carbon Monitoring System (CMS) program are to: 1) engage the user community and identify needs for policy-relevant carbon monitoring information, 2) evaluate current and planned CMS data products with regard to their value for decision making, and 3) explore alternative methods for visualizing and communicating carbon monitoring information and associated uncertainties to decision makers and other stakeholders. To meet these objectives and help create a sustained link between science and decision-making, we have established a multi-disciplinary team that combines expertise in carbon-cycle science, engineering, economics, and carbon management and policy. We will present preliminary findings regarding emerging themes and needs for carbon information that may warrant increased attention by the science community. We will also demonstrate a new web-based tool that offers a common framework for facilitating user evaluation of carbon data products from multiple CMS projects.

  15. The CMS TierO goes Cloud and Grid for LHC Run 2

    NASA Astrophysics Data System (ADS)

    Hufnagel, Dirk

    2015-12-01

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. This contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  16. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 1: Concepts and activity descriptions

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into development of advanced technologies for Computer Aided Software Engineering (CASE).

  17. Software manual for operating particle displacement tracking data acquisition and reduction system

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1991-01-01

    The software manual is presented. The necessary steps required to record, analyze, and reduce Particle Image Velocimetry (PIV) data using the Particle Displacement Tracking (PDT) technique are described. The new PDT system is an all-electronic technique employing a CCD video camera and a large memory buffer frame-grabber board to record low velocity (less than or equal to 20 cm/s) flows. Using a simple encoding scheme, a time sequence of single-exposure images is time-coded into a single image and then processed to track particle displacements and determine 2-D velocity vectors. All the PDT data acquisition, analysis, and data reduction software is written to run on an 80386 PC.
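
    A toy sketch of the displacement-tracking step described above, assuming particle centroids have already been extracted from two exposures separated by dt; the nearest-neighbour pairing is a simplified stand-in for the actual PDT matching.

      import math

      def track(frame_a, frame_b, dt):
          # Pair each particle in frame_a with its nearest neighbour in
          # frame_b and convert the displacement to a 2-D velocity vector.
          vectors = []
          for (xa, ya) in frame_a:
              xb, yb = min(frame_b, key=lambda p: math.hypot(p[0] - xa, p[1] - ya))
              vectors.append(((xa, ya), ((xb - xa) / dt, (yb - ya) / dt)))
          return vectors

      # Positions in cm, dt in seconds; flow below 20 cm/s as in the PDT system.
      for pos, (vx, vy) in track([(1.0, 1.0), (2.0, 3.0)],
                                 [(1.1, 1.0), (2.0, 3.2)], dt=0.05):
          print(pos, "->", (round(vx, 2), round(vy, 2)), "cm/s")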

  18. Performance of the CMS muon detector and muon reconstruction with proton-proton collisions at √s = 13 TeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sirunyan, Albert M; et al.

    The CMS muon detector system, muon reconstruction software, and high-level trigger underwent significant changes in 2013-2014 in preparation for running at higher LHC collision energy and instantaneous luminosity. The performance of the modified system is studied using proton-proton collision data at center-of-mass energy √s = 13 TeV, collected at the LHC in 2015 and 2016. The measured performance parameters, including spatial resolution, efficiency, and timing, are found to meet all design specifications and are well reproduced by simulation. Despite the more challenging running conditions, the modified muon system is found to perform as well as, and in many aspects better than, it did previously.

  19. Impact of detector simulation in particle physics collider experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elvira, V. Daniel

    Through the last three decades, precise simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the accuracy of the physics results and publication turnaround, from data-taking to submission. It also presents the economic impact and cost of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data, taxing heavily the performance of simulation and reconstruction software for increasingly complex detectors. Consequently, it becomes urgent to find solutions to speed up simulation software in order to cope with the increased demand in a time of flat budgets. The study ends with a short discussion on the potential solutions that are being explored, by leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering of HEP code for concurrency and parallel computing.

  20. Impact of detector simulation in particle physics collider experiments

    DOE PAGES

    Elvira, V. Daniel

    2017-06-01

    Through the last three decades, precise simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the accuracy of the physics results and publication turnaround, from data-taking to submission. It also presents the economic impact and cost of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data, taxing heavily the performance of simulation and reconstruction software for increasingly complex detectors. Consequently, it becomes urgent to find solutions to speed up simulation software in order to cope with the increased demand in a time of flat budgets. The study ends with a short discussion on the potential solutions that are being explored, by leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering of HEP code for concurrency and parallel computing.

  1. Impact of detector simulation in particle physics collider experiments

    NASA Astrophysics Data System (ADS)

    Daniel Elvira, V.

    2017-06-01

    Through the last three decades, accurate simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics (HEP) experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the precision of the physics results and publication turnaround, from data-taking to submission. It also presents estimates of the cost and economic impact of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data with increasingly complex detectors, taxing heavily the performance of simulation and reconstruction software. Consequently, exploring solutions to speed up simulation and reconstruction software to satisfy the growing demand of computing resources in a time of flat budgets is a matter that deserves immediate attention. The article ends with a short discussion on the potential solutions that are being considered, based on leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering HEP code for concurrency and parallel computing.

  2. Evaluation of the Next-Gen Exercise Software Interface in the NEEMO Analog

    NASA Technical Reports Server (NTRS)

    Hanson, Andrea; Kalogera, Kent; Sandor, Aniko; Hardy, Marc; Frank, Andrew; English, Kirk; Williams, Thomas; Perera, Jeevan; Amonette, William

    2017-01-01

    NSBRI (National Space Biomedical Research Institute) funded research grant to develop the 'NextGen' exercise software for the NEEMO (NASA Extreme Environment Mission Operations) analog. Develop a software architecture to integrate instructional, motivational and socialization techniques into a common portal to enhance exercise countermeasures in remote environments. Increase user efficiency and satisfaction, and institute commonality across multiple exercise systems. Utilized GUI (Graphical User Interface) design principles focused on intuitive ease of use to minimize training time and realize early user efficiency. Project requirement to test the software in an analog environment. Top Level Project Aims: 1) Improve the usability of crew interface software to exercise CMS (Crew Management System) through common app-like interfaces. 2) Introduce virtual instructional motion training. 3) Use virtual environment to provide remote socialization with family and friends, improve exercise technique, adherence, motivation and ultimately performance outcomes.

  3. Strategic opportunities in the oversight of the U.S. hospital accreditation system.

    PubMed

    Moffett, Maurice L; Morgan, Robert O; Ashton, Carol M

    2005-12-01

    Hospital accreditation and state certification are the means that the Centers for Medicare & Medicaid Services (CMS) employs to meet quality of care requirements for medical care reimbursement. Hospitals can choose to use either a national accrediting agency or a state certification inspection in order to receive Medicare payments. Approximately 80% of hospitals choose the Joint Commission on Accreditation of Healthcare Organizations (JCAHO). The purpose of this paper is to analyze and discuss improvements to the structure of the accreditation process in a Principal-Agent-Supervisor framework, with a special emphasis on the oversight by the principal (CMS) of the supervisor (JCAHO).

  4. Software Agents to Assist in Distance Learning Environments

    ERIC Educational Resources Information Center

    Choy, Sheung-On; Ng, Sin-Chun; Tsang, Yiu-Chung

    2005-01-01

    The Open University of Hong Kong (OUHK) is a distance education university with about 22,500 students. In fulfilling its mission, the university has adopted various Web-based and electronic means to support distance learning. For instance, OUHK uses a Web-based course management system (CMS) to provide students with a flexible way to obtain course…

  5. Crossing lines: a multidisciplinary framework for assessing connectivity of hammerhead sharks across jurisdictional boundaries

    NASA Astrophysics Data System (ADS)

    Chin, A.; Simpfendorfer, C. A.; White, W. T.; Johnson, G. J.; McAuley, R. B.; Heupel, M. R.

    2017-04-01

    Conservation and management of migratory species can be complex and challenging. International agreements such as the Convention on Migratory Species (CMS) provide policy frameworks, but assessments and management can be hampered by lack of data and tractable mechanisms to integrate disparate datasets. An assessment of scalloped (Sphyrna lewini) and great (Sphyrna mokarran) hammerhead population structure and connectivity across northern Australia, Indonesia and Papua New Guinea (PNG) was conducted to inform management responses to CMS and Convention on International Trade in Endangered Species listings of these species. An Integrated Assessment Framework (IAF) was devised to systematically incorporate data across jurisdictions and create a regional synopsis, and amalgamated a suite of data from the Australasian region. Scalloped hammerhead populations are segregated by sex and size, with Australian populations dominated by juveniles and small adult males, while Indonesian and PNG populations included large adult females. The IAF process introduced genetic and tagging data to produce conceptual models of stock structure and movement. Several hypotheses were produced to explain stock structure and movement patterns, but more data are needed to identify the most likely hypothesis. This study demonstrates a process for assessing migratory species connectivity and highlights priority areas for hammerhead management and research.

  6. Crossing lines: a multidisciplinary framework for assessing connectivity of hammerhead sharks across jurisdictional boundaries.

    PubMed

    Chin, A; Simpfendorfer, C A; White, W T; Johnson, G J; McAuley, R B; Heupel, M R

    2017-04-21

    Conservation and management of migratory species can be complex and challenging. International agreements such as the Convention on Migratory Species (CMS) provide policy frameworks, but assessments and management can be hampered by lack of data and tractable mechanisms to integrate disparate datasets. An assessment of scalloped (Sphyrna lewini) and great (Sphyrna mokarran) hammerhead population structure and connectivity across northern Australia, Indonesia and Papua New Guinea (PNG) was conducted to inform management responses to CMS and Convention on International Trade in Endangered Species listings of these species. An Integrated Assessment Framework (IAF) was devised to systematically incorporate data across jurisdictions and create a regional synopsis, and amalgamated a suite of data from the Australasian region. Scalloped hammerhead populations are segregated by sex and size, with Australian populations dominated by juveniles and small adult males, while Indonesian and PNG populations included large adult females. The IAF process introduced genetic and tagging data to produce conceptual models of stock structure and movement. Several hypotheses were produced to explain stock structure and movement patterns, but more data are needed to identify the most likely hypothesis. This study demonstrates a process for assessing migratory species connectivity and highlights priority areas for hammerhead management and research.

  7. GERICOS: A Generic Framework for the Development of On-Board Software

    NASA Astrophysics Data System (ADS)

    Plasson, P.; Cuomo, C.; Gabriel, G.; Gauthier, N.; Gueguen, L.; Malac-Allain, L.

    2016-08-01

    This paper presents an overview of the GERICOS framework (GEneRIC Onboard Software), its architecture, its various layers and its future evolution. The GERICOS framework, developed and qualified by LESIA, offers a set of generic, reusable and customizable software components for the rapid development of payload flight software. The GERICOS framework has a layered structure. The first layer (GERICOS::CORE) implements the concept of active objects and forms an abstraction layer over the top of real-time kernels. The second layer (GERICOS::BLOCKS) offers a set of reusable software components for building flight software based on generic solutions to recurrent functionalities. The third layer (GERICOS::DRIVERS) implements software drivers for several COTS IP cores of the LEON processor ecosystem.
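
    A sketch of the active-object idea on which the GERICOS::CORE layer is built, with Python threads standing in for a real-time kernel; the class names are illustrative, not GERICOS APIs.

      import queue, threading, time

      class ActiveObject:
          # Each active object owns a thread and a mailbox; method requests
          # are executed asynchronously on the object's own thread.
          def __init__(self):
              self._mailbox = queue.Queue()
              threading.Thread(target=self._run, daemon=True).start()

          def _run(self):
              while True:
                  call, args = self._mailbox.get()
                  call(*args)

          def send(self, call, *args):
              self._mailbox.put((call, args))

      class Housekeeper(ActiveObject):
          def report(self, temp):
              print(f"housekeeping: temperature={temp}")

      hk = Housekeeper()
      hk.send(hk.report, 42.0)
      time.sleep(0.1)   # let the object's thread drain its mailbox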

  8. File-based data flow in the CMS Filter Farm

    NASA Astrophysics Data System (ADS)

    Andre, J.-M.; Andronidis, A.; Bawej, T.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.
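
    A minimal sketch of the JSON “document” bookkeeping described above; the field and file names are invented for illustration and do not reflect the actual CMS DAQ schema.

      import json, time

      def make_rate_document(run, lumisection, accepted, processed):
          # One small bookkeeping document emitted by a service in the HLT flow.
          return {
              "run": run,
              "ls": lumisection,
              "events_processed": processed,
              "events_accepted": accepted,
              "timestamp": time.time(),
          }

      doc = make_rate_document(run=251244, lumisection=87,
                               accepted=412, processed=101533)
      # The document can stay memory-resident or be written out for
      # aggregation by another part of the system.
      with open(f"run{doc['run']}_ls{doc['ls']:04d}_rates.jsn", "w") as f:
          json.dump(doc, f)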

  9. A genetic algorithm for a bi-objective mathematical model for dynamic virtual cell formation problem

    NASA Astrophysics Data System (ADS)

    Moradgholi, Mostafa; Paydar, Mohammad Mahdi; Mahdavi, Iraj; Jouzdani, Javid

    2016-09-01

    Nowadays, with the increasing pressure of the competitive business environment and the demand for diverse products, manufacturers are forced to seek solutions that reduce production costs and raise product quality. The cellular manufacturing system (CMS), as a means to this end, has been a point of attraction to both researchers and practitioners. Limitations of the cell formation problem (CFP), one of the important topics in CMS, have led to the introduction of the virtual CMS (VCMS). This research addresses a bi-objective dynamic virtual cell formation problem (DVCFP) with the objective of finding the optimal formation of cells, considering the material handling costs, fixed machine installation costs and variable production costs of machines and workforce. Furthermore, we consider different skills on different machines in workforce assignment in a multi-period planning horizon. The bi-objective model is transformed into a single-objective fuzzy goal programming model and, to show its performance, numerical examples are solved using the LINGO software. In addition, a genetic algorithm (GA) is customized to tackle large-scale instances of the problem to show the performance of the solution method.
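
    A heavily simplified GA sketch in the spirit of the paper's solution method: a chromosome assigns machines to virtual cells and the fitness penalizes inter-cell part movements. The data, encoding, and operators are illustrative, not the authors' actual formulation.

      import random

      PARTS = [(0, 1), (1, 2), (0, 3), (2, 3)]   # machine pairs each part visits
      N_MACHINES, N_CELLS = 4, 2

      def fitness(chrom):
          # Fewer inter-cell movements -> higher fitness.
          return -sum(1 for a, b in PARTS if chrom[a] != chrom[b])

      def evolve(pop_size=30, gens=50):
          pop = [[random.randrange(N_CELLS) for _ in range(N_MACHINES)]
                 for _ in range(pop_size)]
          for _ in range(gens):
              pop.sort(key=fitness, reverse=True)
              parents = pop[:pop_size // 2]
              children = []
              for _ in range(pop_size - len(parents)):
                  a, b = random.sample(parents, 2)
                  cut = random.randrange(1, N_MACHINES)   # one-point crossover
                  child = a[:cut] + b[cut:]
                  if random.random() < 0.1:               # mutation
                      child[random.randrange(N_MACHINES)] = random.randrange(N_CELLS)
                  children.append(child)
              pop = parents + children
          return max(pop, key=fitness)

      print(evolve())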

  10. Key Influencing Factors behind Moodle Adoption in Irish Small to Medium Sized Higher Education Colleges

    ERIC Educational Resources Information Center

    Walker, David; Livadas, Lelia; Miles, Gail

    2011-01-01

    This research investigated Irish Small to Medium Sized Educational Institutions (SMSEs) involved in Higher Education (HE) that adopted Moodle, the OSS (Open Source Software) course management system (CMS). As Moodle has only been adopted in the Irish HE sector in the last 5-7 years, this research crucially studied the attitudes of the SMSEs that…

  11. CMS Software: Installation Guide and User Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straut, Christine

    A Chemical Inventory Management System (CIMS) is a system or program that is used to track chemicals at a facility or institution. An effective CIMS begins tracking these chemicals at the point of procurement and continues through use and disposal. The management of chemicals throughout the life cycle (procurement to disposal) is a key concept for the secure management of chemicals at any institution.

  12. A gLite FTS based solution for managing user output in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cinquilli, M.; Riahi, H.; Spiga, D.

    2012-01-01

    The CMS distributed data analysis workflow assumes that jobs run in a different location from where their results are finally stored. Typically the user output must be transferred across the network from one site to another, possibly on a different continent or over links not necessarily validated for high bandwidth/high reliability transfer. This step is named stage-out and in CMS was originally implemented as a synchronous step of the analysis job execution. However, our experience showed the weakness of this approach both in terms of low total job execution efficiency and failure rates, wasting precious CPU resources. The nature of analysis data makes it inappropriate to use PhEDEx, the core data placement system for CMS. As part of the new generation of CMS Workload Management tools, the Asynchronous Stage-Out system (AsyncStageOut) has been developed to enable third party copy of the user output. The AsyncStageOut component manages gLite FTS transfers of data from the temporary store at the site where the job ran to the final location of the data on behalf of that data owner. The tool uses python daemons, built using the WMCore framework, and CouchDB, to manage the queue of work and FTS transfers. CouchDB also provides the platform for a dedicated operations monitoring system. In this paper, we present the motivations of the asynchronous stage-out system. We give an insight into the design and the implementation of key features, describing how it is coupled with the CMS workload management system. Finally, we show the results and the commissioning experience.
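
    A much-reduced sketch of the asynchronous stage-out idea: the job enqueues a transfer request instead of copying synchronously, and a separate daemon drains the queue. The real system drives gLite FTS and keeps its queue in CouchDB; this stand-in uses an in-process queue, and all names are illustrative.

      import queue, threading, time

      transfers = queue.Queue()

      def stage_out_daemon():
          # Drain queued transfer requests on behalf of the data owner.
          while True:
              req = transfers.get()
              if req is None:
                  break
              print(f"FTS-like copy {req['source']} -> {req['dest']}")
              time.sleep(0.01)   # stand-in for the actual third-party copy
              transfers.task_done()

      threading.Thread(target=stage_out_daemon, daemon=True).start()
      # The analysis job only enqueues a request at job end and exits.
      transfers.put({"source": "T2_IT_Pisa:/store/tmp/out.root",
                     "dest": "T2_US_UCSD:/store/user/out.root"})
      transfers.join()
      transfers.put(None)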

  13. Framework programmable platform for the advanced software development workstation. Integration mechanism design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated.

  14. Professional Ethics of Software Engineers: An Ethical Framework.

    PubMed

    Lurie, Yotam; Mark, Shlomo

    2016-04-01

    The purpose of this article is to propose an ethical framework for software engineers that connects software developers' ethical responsibilities directly to their professional standards. The implementation of such an ethical framework can overcome the traditional dichotomy between professional skills and ethical skills, which plagues the engineering professions, by proposing an approach to the fundamental tasks of the practitioner, i.e., software development, in which the professional standards are intrinsically connected to the ethical responsibilities. In so doing, the ethical framework improves the practitioner's professionalism and ethics. We call this approach Ethical-Driven Software Development (EDSD), as an approach to software development. EDSD manifests the advantages of an ethical framework as an alternative to the all too familiar approach in professional ethics that advocates "stand-alone codes of ethics". We believe that one outcome of this synergy between professional and ethical skills is simply better engineers. Moreover, since there are often different software solutions, which the engineer can provide to an issue at stake, the ethical framework provides a guiding principle, within the process of software development, that helps the engineer evaluate the advantages and disadvantages of different software solutions. It does not and cannot affect the end-product in and of itself. However, it can and should make the software engineer more conscious and aware of the ethical ramifications of certain engineering decisions within the process.

  15. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  16. CMS changes in reimbursement for HAIs: setting a research agenda.

    PubMed

    Stone, Patricia W; Glied, Sherry A; McNair, Peter D; Matthes, Nikolas; Cohen, Bevin; Landers, Timothy F; Larson, Elaine L

    2010-05-01

    The Centers for Medicare and Medicaid Services (CMS) promulgated regulations commencing October 1, 2008, which deny payment for selected conditions occurring during the hospital stay that are not present on admission. Three of the 10 hospital-acquired conditions covered by the new CMS policy involve healthcare-associated infections, which are a common, expensive, and often preventable cause of inpatient morbidity and mortality. The aim is to outline a research agenda on the impact of CMS's payment policy on the healthcare system and the prevention of healthcare-associated infections. An invitational day-long conference was convened in April 2009; including the planning committee and speakers, there were 41 conference participants, all national experts and senior researchers. Building upon a behavioral model, organizational theory, and management research, a conceptual framework was applied to organize the wide range of issues that arose. A broad array of research topics was identified: thirty-two research agenda items were organized in the areas of incentives, environmental factors, organizational factors, clinical outcomes, staff outcomes, and financial outcomes. Methodological challenges are also discussed. This policy is a first significant step to move output-based inpatient funding to outcome-based funding, and this agenda is applicable to all hospital-acquired conditions. Studies beginning soon will have the best hope of capturing data for the years preceding the policy change, a key element in non-experimental research. The CMS payment policy offers an excellent opportunity to understand and influence the use of financial incentives for improving patient safety.

  17. Muons in the CMS High Level Trigger System

    NASA Astrophysics Data System (ADS)

    Verwilligen, Piet; CMS Collaboration

    2016-04-01

    The trigger systems of LHC detectors play a fundamental role in defining the physics capabilities of the experiments. A reduction of several orders of magnitude in the rate of collected events, with respect to the proton-proton bunch crossing rate generated by the LHC, is mandatory to cope with the limits imposed by the readout and storage system. An accurate and efficient online selection mechanism is thus required to fulfill the task keeping maximal the acceptance to physics signals. The CMS experiment operates using a two-level trigger system. Firstly a Level-1 Trigger (L1T) system, implemented using custom-designed electronics, is designed to reduce the event rate to a limit compatible to the CMS Data Acquisition (DAQ) capabilities. A High Level Trigger System (HLT) follows, aimed at further reducing the rate of collected events finally stored for analysis purposes. The latter consists of a streamlined version of the CMS offline reconstruction software and operates on a computer farm. It runs algorithms optimized to make a trade-off between computational complexity, rate reduction and high selection efficiency. With the computing power available in 2012 the maximum reconstruction time at HLT was about 200 ms per event, at the nominal L1T rate of 100 kHz. An efficient selection of muons at HLT, as well as an accurate measurement of their properties, such as transverse momentum and isolation, is fundamental for the CMS physics programme. The performance of the muon HLT for single and double muon triggers achieved in Run I will be presented. Results from new developments, aimed at improving the performance of the algorithms for the harsher scenarios of collisions per event (pile-up) and luminosity expected for Run II will also be discussed.
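
    A back-of-envelope check of the CPU budget implied by the numbers above: at the nominal L1T output rate and per-event HLT reconstruction time, the farm must keep roughly rate times time events in flight at once. The calculation below just restates the abstract's own figures.

      # Nominal numbers quoted in the abstract above.
      l1_rate_hz = 100_000          # L1T output rate
      time_per_event_s = 0.200      # maximum HLT reconstruction time in 2012
      cores_busy = l1_rate_hz * time_per_event_s
      print(f"~{cores_busy:.0f} cores busy on average")   # ~20000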

  18. Crossing lines: a multidisciplinary framework for assessing connectivity of hammerhead sharks across jurisdictional boundaries

    PubMed Central

    Chin, A.; Simpfendorfer, C. A.; White, W. T.; Johnson, G. J.; McAuley, R. B.; Heupel, M. R.

    2017-01-01

    Conservation and management of migratory species can be complex and challenging. International agreements such as the Convention on Migratory Species (CMS) provide policy frameworks, but assessments and management can be hampered by lack of data and tractable mechanisms to integrate disparate datasets. An assessment of scalloped (Sphyrna lewini) and great (Sphyrna mokarran) hammerhead population structure and connectivity across northern Australia, Indonesia and Papua New Guinea (PNG) was conducted to inform management responses to CMS and Convention on International Trade in Endangered Species listings of these species. An Integrated Assessment Framework (IAF) was devised to systematically incorporate data across jurisdictions and create a regional synopsis, and amalgamated a suite of data from the Australasian region. Scalloped hammerhead populations are segregated by sex and size, with Australian populations dominated by juveniles and small adult males, while Indonesian and PNG populations included large adult females. The IAF process introduced genetic and tagging data to produce conceptual models of stock structure and movement. Several hypotheses were produced to explain stock structure and movement patterns, but more data are needed to identify the most likely hypothesis. This study demonstrates a process for assessing migratory species connectivity and highlights priority areas for hammerhead management and research. PMID:28429742

  19. Data-Driven Software Framework for Web-Based ISS Telescience

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.

    2005-01-01

    Software that enables authorized users to monitor and control scientific payloads aboard the International Space Station (ISS) from diverse terrestrial locations equipped with Internet connections is undergoing development. This software reflects a data-driven approach to distributed operations. A Web-based software framework leverages prior developments in Java and Extensible Markup Language (XML) to create portable code and portable data, to which one can gain access via Web-browser software on almost any common computer. Open-source software is used extensively to minimize cost; the framework also accommodates enterprise-class server software to satisfy needs for high performance and security. To accommodate the diversity of ISS experiments and users, the framework emphasizes openness and extensibility. Users can take advantage of available viewer software to create their own client programs according to their particular preferences, and can upload these programs for custom processing of data, generation of views, and planning of experiments. The same software system, possibly augmented with a subset of data and additional software tools, could be used for public outreach by enabling public users to replay telescience experiments, conduct their experiments with simulated payloads, and create their own client programs and other custom software.

  20. HEP Data Grid Applications in Korea

    NASA Astrophysics Data System (ADS)

    Cho, Kihyeon; Oh, Youngdo; Son, Dongchul; Kim, Bockjoo; Lee, Sangsan

    2003-04-01

    We will introduce the national HEP Data Grid applications in Korea. Through a five-year HEP Data Grid project (2002-2006) for the CMS, AMS, CDF, PHENIX, K2K and Belle experiments in Korea, the Center for High Energy Physics, Kyungpook National University in Korea will construct a 1,000-PC cluster and a related storage system, such as a 1,200-TByte RAID disk system. This project includes a master plan to construct an Asia Regional Data Center by 2006 for the CMS and AMS experiments and a DCAF (DeCentralized Analysis Farm) for the CDF experiment. During the first year of the project, we have constructed a cluster of around 200 CPUs with 50 TBytes of storage. We will present our first year's experience of the software and hardware applications for the HEP Data Grid on EDG and SAM Grid testbeds.

  1. A Python object-oriented framework for the CMS alignment and calibration data

    NASA Astrophysics Data System (ADS)

    Dawes, Joshua H.; CMS Collaboration

    2017-10-01

    The Alignment, Calibrations and Databases group at the CMS Experiment delivers Alignment and Calibration Conditions Data to a large set of workflows which process recorded event data and produce simulated events. The current infrastructure for releasing and consuming Conditions Data was designed in the two years of the first LHC long shutdown to respond to use cases from the preceding data-taking period. During the second run of the LHC, new use cases were defined. For the consumption of Conditions Metadata, no common interface existed for the detector experts to use in Python-based custom scripts, resulting in many different querying and transaction management patterns. A new framework has been built to address such use cases: a simple object-oriented tool that detector experts can use to read and write Conditions Metadata when using Oracle and SQLite databases, that provides a homogeneous method of querying across all services. The tool provides mechanisms for segmenting large sets of conditions while releasing them to the production database, allows for uniform error reporting to the client-side from the server-side and optimizes the data transfer to the server. The architecture of the new service has been developed exploiting many of the features made available by the metadata consumption framework to implement the required improvements. This paper presents the details of the design and implementation of the new metadata consumption and data upload framework, as well as analyses of the new upload service’s performance as the server-side state varies.
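
    A sketch of the kind of uniform, object-oriented metadata querying the framework described above provides, backed here by SQLite from the standard library; the table and column names are invented, and the real CMS schema and Oracle connections differ.

      import sqlite3

      class ConditionsSession:
          # Thin wrapper giving a homogeneous querying method over a
          # conditions-metadata database (SQLite here; Oracle analogous).
          def __init__(self, connection_string):
              self.conn = sqlite3.connect(connection_string)
              self.conn.row_factory = sqlite3.Row

          def tags(self, name_like):
              cur = self.conn.execute(
                  "SELECT name, time_type FROM tag WHERE name LIKE ?",
                  (name_like,))
              return [dict(row) for row in cur.fetchall()]

      session = ConditionsSession(":memory:")
      session.conn.execute("CREATE TABLE tag (name TEXT, time_type TEXT)")
      session.conn.execute("INSERT INTO tag VALUES ('BeamSpot_v1', 'Run')")
      print(session.tags("BeamSpot%"))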

  2. Competence Management System Design in International Multicultural Environment: Registration, Transfer, Recognition and Transparency

    ERIC Educational Resources Information Center

    Starcic, Andreja Istenic

    2012-01-01

    A competence management system (CMS) was devised to assist the registration of competencies in the textile and clothing sector, starting in the four EU countries of Portugal, Slovenia, the UK and Denmark, further leading to the European network. This paper presents the design and development framework assisting international multicultural…

  3. File-Based Data Flow in the CMS Filter Farm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andre, J.M.; et al.

    2015-12-23

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file- based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes aremore » also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.« less

  4. Managing the CMS Data and Monte Carlo Processing during LHC Run 2

    NASA Astrophysics Data System (ADS)

    Wissing, C.; CMS Collaboration

    2017-10-01

    In order to cope with the challenges expected during LHC Run 2, CMS introduced a number of enhancements into the main software packages and the tools used for centrally managed processing. In the presentation we will highlight these improvements that allow CMS to deal with the increased trigger output rate, the increased pileup and the evolution in computing technology. The overall system aims at high operational flexibility and largely automated procedures. The tight coupling of workflow classes to types of sites has been drastically relaxed. Reliable and high-performing networking between most of the computing sites and the successful deployment of a data federation allow the execution of workflows using remote data access. That required the development of a largely automated system to assign workflows and to handle the necessary pre-staging of data. Another step towards flexibility has been the introduction of one large global HTCondor pool for all types of processing workflows and analysis jobs. Besides classical Grid resources, some opportunistic resources as well as Cloud resources have also been integrated into that pool, which gives access to more than 200k CPU cores.
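
    A toy sketch of the relaxed workflow-to-site assignment described above: any site in the global pool may run a workflow, preferring sites hosting the input data and falling back to remote reads via the data federation. The site names and selection logic are purely illustrative.

      SITES = {"T1_DE_KIT": {"slots": 8000, "datasets": {"/A/RAW"}},
               "T2_US_MIT": {"slots": 3000, "datasets": {"/B/AOD"}}}

      def assign(workflow):
          # Prefer sites that host the input dataset locally.
          local = [s for s, info in SITES.items()
                   if workflow["dataset"] in info["datasets"]]
          candidates = local or list(SITES)     # fall back to remote read
          site = max(candidates, key=lambda s: SITES[s]["slots"])
          mode = "local" if site in local else "remote-read"
          return site, mode

      print(assign({"name": "ReRecoRun2015", "dataset": "/A/RAW"}))
      print(assign({"name": "MCGenSim", "dataset": "/C/GEN"}))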

  5. Framework Programmable Platform for the Advanced Software Development Workstation: Preliminary system design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.

  6. Earthquake Early Warning: Real-time Testing of an On-site Method Using Waveform Data from the Southern California Seismic Network

    NASA Astrophysics Data System (ADS)

    Solanki, K.; Hauksson, E.; Kanamori, H.; Wu, Y.; Heaton, T.; Boese, M.

    2007-12-01

    We have implemented an on-site early warning algorithm using the infrastructure of the Caltech/USGS Southern California Seismic Network (SCSN). We are evaluating the real-time performance of the software system and the algorithm for rapid assessment of earthquakes. In addition, we are interested in understanding what parts of the SCSN need to be improved to make early warning practical. Our EEW processing system is composed of many independent programs that process waveforms in real-time. The codes were generated by using a software framework. The Pd (maximum displacement amplitude of P wave during the first 3 sec) and Tau-c (a period parameter during the first 3 sec) values determined during the EEW processing are being forwarded to the California Integrated Seismic Network (CISN) web page for independent evaluation of the results. The on-site algorithm measures the amplitude of the P-wave (Pd) and the frequency content of the P-wave during the first three seconds (Tau-c). The Pd and the Tau-c values make it possible to discriminate between a variety of events such as large distant events, nearby small events, and potentially damaging nearby events. The Pd can be used to infer the expected maximum ground shaking. The method relies on data from a single station, although it becomes more reliable if readings from several stations are associated. To eliminate false triggers from stations with a high background noise level, we have created a per-station Pd threshold configuration for the Pd/Tau-c algorithm, with values derived from the information in the EEW logs. We have operated our EEW test system for about a year and recorded numerous earthquakes in the magnitude range from M3 to M5. Two recent examples are a M4.5 earthquake near Chatsworth and a M4.7 earthquake near Elsinore. In both cases, the Pd and Tau-c parameters were determined successfully within 10 to 20 sec of the arrival of the P-wave at the station. The Tau-c values predicted the magnitude within 0.1, and the predicted average peak ground motions were 0.7 cm/s and 0.6 cm/s. The delays in the system are caused mostly by the packetizing delay, because our software system is based on processing miniSEED packets. Most recently we have begun reducing the data latency using new qmaserv2 software for the Q330 Quanterra datalogger. We implemented qmaserv2-based multicast receiver software to receive the native 1 sec packets from the dataloggers. The receiver reads multicast packets from the network and writes them into a shared-memory area. This new software will fully take advantage of the capabilities of the Q330 datalogger and significantly reduce data latency for the EEW system. We have also implemented a new EEW sub-system that complements the currently running EEW system by associating Pd and Tau-c values from multiple stations. So far, we have implemented a new trigger generation algorithm for real-time processing for the sub-system, and are able to routinely locate events and determine magnitudes using the Pd and Tau-c values.
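
    A sketch of the two on-site parameters, assuming a displacement record sampled at interval dt over the first 3 s after the P arrival: Pd is the peak absolute displacement, and Tau-c follows Kanamori's definition tau_c = 2*pi*sqrt(integral(u^2) / integral(u'^2)). The synthetic record below is for illustration only.

      import math

      def pd_and_tau_c(u, dt):
          # Pd: peak displacement; Tau-c: characteristic period of the record.
          pd = max(abs(x) for x in u)
          udot = [(u[i + 1] - u[i]) / dt for i in range(len(u) - 1)]
          num = sum(x * x for x in u) * dt
          den = sum(v * v for v in udot) * dt
          return pd, 2.0 * math.pi * math.sqrt(num / den)

      # Synthetic 3 s record at 100 samples/s; a 1.5 Hz sine gives
      # tau_c close to 1/1.5 s.
      dt = 0.01
      u = [0.2 * math.sin(2 * math.pi * 1.5 * i * dt) for i in range(300)]
      print(pd_and_tau_c(u, dt))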

  7. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT, and if the model type changes, all functions of a search technique must be reimplemented, even when the same search technique is applied, because the model types differ. Implementing the same algorithm over and over again takes too much time and effort. We propose a model-independent software framework for SBST, which can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time through common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of a model. PMID:25302314
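
    A common way to achieve this kind of model independence is to hide every model-specific operation behind a small interface, so the search technique itself never changes when the model type does. A minimal Python sketch of that idea (class and method names are illustrative, not the paper's API):

        import random
        from abc import ABC, abstractmethod

        class ModelAdapter(ABC):
            """Everything model-specific the search needs, in one interface."""
            @abstractmethod
            def random_candidate(self): ...
            @abstractmethod
            def neighbors(self, candidate): ...
            @abstractmethod
            def fitness(self, candidate): ...  # higher is better

        def hill_climb(model, iterations=100):
            """Model-independent search: works with any ModelAdapter."""
            best = model.random_candidate()
            for _ in range(iterations):
                cand = max(model.neighbors(best), key=model.fitness)
                if model.fitness(cand) <= model.fitness(best):
                    break                      # local optimum reached
                best = cand
            return best

        class BitStringModel(ModelAdapter):
            """Toy concrete model: maximise the number of 1-bits."""
            def random_candidate(self):
                return [random.randint(0, 1) for _ in range(8)]
            def neighbors(self, c):
                return [c[:i] + [1 - c[i]] + c[i + 1:] for i in range(len(c))]
            def fitness(self, c):
                return sum(c)

        print(hill_climb(BitStringModel()))    # converges to [1, 1, ..., 1]

    Swapping BitStringModel for, say, a state-machine adapter leaves hill_climb untouched, which is the reuse the paper measures.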

  8. An Interoperability Framework and Capability Profiling for Manufacturing Software

    NASA Astrophysics Data System (ADS)

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

    ISO/TC184/SC5/WG4 is working on ISO 16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
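
    As an illustration only, a capability profile reduces to a small structured record; the field names below are assumptions inspired by the abstract, not the normative ISO 16100 templates.

        from dataclasses import dataclass, field

        @dataclass
        class CapabilityProfile:
            """Illustrative capability profile of a manufacturing software unit."""
            unit_name: str
            manufacturing_functions: list = field(default_factory=list)
            class_properties: dict = field(default_factory=dict)

        profile = CapabilityProfile(
            unit_name="SchedulerUnit",
            manufacturing_functions=["job_sequencing", "resource_allocation"],
            class_properties={"vendor": "ExampleCorp", "version": "1.0"},
        )
        print(profile)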

  9. Commissioning of the CMS Hadron Forward Calorimeters Phase I Upgrade

    NASA Astrophysics Data System (ADS)

    Bilki, B.; Onel, Y.

    2018-03-01

    The final phase of the CMS Hadron Forward Calorimeters Phase I Upgrade was performed during the Extended Year-End Technical Stop of 2016-2017. In the framework of the upgrade, the PMT boxes were reworked to implement two-channel readout in order to exploit the benefits of the multi-anode PMTs in background tagging and signal recovery. The front-end electronics were also upgraded to QIE10-based electronics, which provide a larger dynamic range and a 6-bit TDC. Following this major upgrade, the Hadron Forward Calorimeters were commissioned for operation readiness in 2017. Here we describe the details and the components of the upgrade, and discuss the operational experience and results obtained during the upgrade and commissioning.

  10. Fostering Multirepresentational Levels of Chemical Concepts: A Framework to Develop Educational Software

    ERIC Educational Resources Information Center

    Marson, Guilherme A.; Torres, Bayardo B.

    2011-01-01

    This work presents a convenient framework for developing interactive chemical education software to facilitate the integration of macroscopic, microscopic, and symbolic dimensions of chemical concepts--specifically, via the development of software for gel permeation chromatography. The instructional role of the software was evaluated in a study…

  11. Software Engineering Frameworks: Textbooks vs. Student Perceptions

    ERIC Educational Resources Information Center

    McMaster, Kirby; Hadfield, Steven; Wolthuis, Stuart; Sambasivam, Samuel

    2012-01-01

    This research examines the frameworks used by Computer Science and Information Systems students at the conclusion of their first semester of study of Software Engineering. A questionnaire listing 64 Software Engineering concepts was given to students upon completion of their first Software Engineering course. This survey was given to samples of…

  12. Facilitating professional liaison in collaborative care for depression in UK primary care; a qualitative study utilising normalisation process theory.

    PubMed

    Coupe, Nia; Anderson, Emma; Gask, Linda; Sykes, Paul; Richards, David A; Chew-Graham, Carolyn

    2014-05-01

    Collaborative care (CC) is an organisational framework which facilitates the delivery of a mental health intervention to patients by case managers in collaboration with more senior health professionals (supervisors and GPs), and is effective for the management of depression in primary care. However, there remains limited evidence on how to successfully implement this collaborative approach in UK primary care. This study aimed to explore to what extent CC impacts on professional working relationships, and whether CC for depression could be implemented as routine in the primary care setting. This qualitative study explored the perspectives of the 6 case managers (CMs), 5 supervisors (trial research team members) and 15 general practitioners (GPs) from practices participating in a randomised controlled trial of CC for depression. Interviews were transcribed verbatim and the data were analysed in two steps: an initial thematic analysis, followed by a secondary analysis using the Normalisation Process Theory concepts of coherence, cognitive participation, collective action and reflexive monitoring with respect to the implementation of CC in primary care. Supervisors and CMs demonstrated coherence in their understanding of CC, and consequently reported good levels of cognitive participation and collective action regarding delivering and supervising the intervention. GPs interviewed showed limited understanding of the CC framework, and reported limited collaboration with CMs; barriers to collaboration were identified. All participants identified the potential or experienced benefits of a collaborative approach to depression management and were able to discuss ways in which collaboration can be facilitated. Primary care professionals in this study valued the potential for collaboration, but GPs' understanding of CC and organisational barriers hindered opportunities for communication. Further work is needed to address these organisational barriers in order to facilitate collaboration around individual patients with depression, including shared IT systems, facilitating opportunities for informal discussion and building formal collaboration into the CC framework. Trial registration: ISRCTN32829227, 30/9/2008.

  13. Software framework for the upcoming MMT Observatory primary mirror re-aluminization

    NASA Astrophysics Data System (ADS)

    Gibson, J. Duane; Clark, Dusty; Porter, Dallan

    2014-07-01

    Details of the software framework for the upcoming in-situ re-aluminization of the 6.5 m MMT Observatory (MMTO) primary mirror are presented. This framework includes: 1) a centralized key-value store and data structure server for data exchange between software modules, 2) a newly developed hardware-software interface for faster data sampling and better hardware control, 3) automated control algorithms that are based upon empirical testing, modeling, and simulation of the aluminization process, 4) re-engineered graphical user interfaces (GUIs) that use state-of-the-art web technologies, and 5) redundant relational databases for data logging. Redesign of the software framework has several objectives: 1) automated process control to provide more consistent and uniform mirror coatings, 2) optional manual control of the aluminization process, 3) modular design to allow flexibility in process control and software implementation, 4) faster data sampling and logging rates to better characterize the approximately 100-second aluminization event, and 5) synchronized "real-time" web application GUIs to provide all users with exactly the same data. The framework has been implemented as four modules interconnected by a data store/server. The four modules are integrated into two Linux system services that start automatically at boot-time and remain running at all times. Performance of the software framework is assessed through extensive testing within 2.0 meter and smaller coating chambers at the Sunnyside Test Facility. The redesigned software framework helps ensure that a better performing and longer lasting coating will be achieved during the re-aluminization of the MMTO primary mirror.
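
    The hub of the framework is the centralized key-value store and data structure server through which the four modules exchange data. A minimal sketch of that exchange pattern, assuming a Redis-compatible server on localhost and invented key and channel names:

        import json
        import redis  # assumes a Redis-compatible data structure server

        r = redis.Redis(decode_responses=True)

        # A GUI module subscribes for "real-time" updates (channel name invented).
        sub = r.pubsub()
        sub.subscribe("aluminization:updates")

        # A data-producing module stores the latest sample and notifies subscribers.
        sample = {"t": 12.5, "chamber_pressure_torr": 2.1e-5, "filament_amps": 410.0}
        r.set("aluminization:latest", json.dumps(sample))
        r.publish("aluminization:updates", json.dumps(sample))

        for msg in sub.listen():           # first event is the subscribe confirmation
            if msg["type"] == "message":
                print("update:", json.loads(msg["data"]))
                break

        print("latest:", json.loads(r.get("aluminization:latest")))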

  14. Reliable prediction of heat transfer coefficient in three-phase bubble column reactor via adaptive neuro-fuzzy inference system and regularization network

    NASA Astrophysics Data System (ADS)

    Garmroodi Asil, A.; Nakhaei Pour, A.; Mirzaei, Sh.

    2018-04-01

    In the present article, the generalization performances of a regularization network (RN) and an optimized adaptive neuro-fuzzy inference system (ANFIS) are compared with conventional software for the prediction of the heat transfer coefficient (HTC) as a function of superficial gas velocity (5-25 cm/s) and solid fraction (0-40 wt%) at different axial and radial locations. The networks were trained by resorting to several sets of experimental data collected from a specific system of air/hydrocarbon liquid phase/silica particles in a slurry bubble column reactor (SBCR). A special convection HTC measurement probe was manufactured and positioned at axial distances of 40 and 130 cm above the sparger, at the center and near the wall of the SBCR. The simulation results show that both the in-house RN and the optimized ANFIS, owing to powerful noise-filtering capabilities, provide superior performance compared to the conventional MATLAB ANFIS and ANN toolboxes. For axial distances of 40 and 130 cm from the center of the sparger, at a constant superficial gas velocity of 25 cm/s, adding 40 wt% silica particles to the liquid phase leads to increases in HTC of about 66% and 69%, respectively. The HTC in the column center for all the cases studied is about 9-14% larger than near the wall region.

  15. The Particle Physics Playground website: tutorials and activities using real experimental data

    NASA Astrophysics Data System (ADS)

    Bellis, Matthew; CMS Collaboration

    2016-03-01

    The CERN Open Data Portal provides access to data from the LHC experiments to anyone with the time and inclination to learn the analysis procedures. The CMS experiment has made a significant amount of data available in basically the same format the collaboration itself uses, along with software tools and a virtual environment in which to run those tools. These same data have also been mined for educational exercises that range from very simple .csv files that can be analyzed in a spreadsheet to more sophisticated formats that use ROOT, a dominant software package in experimental particle physics but not used as much in the general computing community. This talk will present the Particle Physics Playground website (http://particle-physics-playground.github.io/), a project that uses data from the CMS experiment, as well as the older CLEO experiment, in tutorials and exercises aimed at high school and undergraduate students and other science enthusiasts. The data are stored as text files and the users are provided with starter Python/Jupyter notebook programs and accessor functions which can be modified to perform fairly high-level analyses. The status of the project, success stories, and future plans for the website will be presented. This work was supported in part by NSF Grant PHY-1307562.
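
    The starter programs operate at roughly this level. As a hedged illustration (the four-vector values are invented and the layout is not the Playground's actual format), reconstructing an invariant mass from text-style four-vectors takes only a few lines of Python:

        import math

        def invariant_mass(e1, p1, e2, p2):
            """Invariant mass of a two-particle system from (E, px, py, pz)."""
            E = e1 + e2
            px, py, pz = (a + b for a, b in zip(p1, p2))
            return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

        # hypothetical starter data: one muon per row, (E, px, py, pz) in GeV
        muons = [
            (48.3, 12.1, -35.0, 30.9),
            (41.8, -10.5, 33.2, -22.7),
        ]
        (e1, *p1), (e2, *p2) = muons
        print(f"M = {invariant_mass(e1, p1, e2, p2):.1f} GeV")  # ~90 GeV, Z-like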

  16. Framework Support For Knowledge-Based Software Development

    NASA Astrophysics Data System (ADS)

    Huseth, Steve

    1988-03-01

    The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.

  17. A general observatory control software framework design for existing small and mid-size telescopes

    NASA Astrophysics Data System (ADS)

    Ge, Liang; Lu, Xiao-Meng; Jiang, Xiao-Jun

    2015-07-01

    A general framework for observatory control software would help to improve the efficiency of observation and operation of telescopes, and would also be advantageous for remote and joint observations. We describe a general framework for observatory control software, which applies principles of flexibility and inheritance to meet the expectations of observers and technical personnel. This framework includes observation scheduling, device control and data storage. The design is based on a finite state machine that controls the whole process.
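
    Since the design centres on a finite state machine driving the whole observing process, a compact sketch conveys the idea; the states and events below are illustrative, not the paper's actual transition table.

        # Minimal finite-state-machine sketch for an observation sequence.
        TRANSITIONS = {
            ("idle",       "start_night"):  "scheduling",
            ("scheduling", "target_ready"): "slewing",
            ("slewing",    "on_target"):    "exposing",
            ("exposing",   "readout_done"): "storing",
            ("storing",    "stored"):       "scheduling",
            ("scheduling", "end_night"):    "idle",
        }

        class ObservatoryFSM:
            def __init__(self):
                self.state = "idle"

            def handle(self, event):
                key = (self.state, event)
                if key not in TRANSITIONS:
                    raise RuntimeError(f"illegal event {event!r} in state {self.state!r}")
                self.state = TRANSITIONS[key]
                return self.state

        fsm = ObservatoryFSM()
        for ev in ["start_night", "target_ready", "on_target", "readout_done", "stored"]:
            print(ev, "->", fsm.handle(ev))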

  18. THE EPA MULTIMEDIA INTEGRATED MODELING SYSTEM SOFTWARE SUITE

    EPA Science Inventory

    The U.S. EPA is developing a Multimedia Integrated Modeling System (MIMS) framework that will provide a software infrastructure or environment to support constructing, composing, executing, and evaluating complex modeling studies. The framework will include (1) common software ...

  19. HCI∧2 framework: a software framework for multimodal human-computer interaction systems.

    PubMed

    Shen, Jie; Pantic, Maja

    2013-12-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a shared-memory-based data transport protocol for message delivery and a TCP-based system management protocol. The latter ensures that the integrity of system structure is maintained at runtime. With the inclusion of bridging modules, the HCI∧2 Framework is interoperable with other software frameworks including Psyclone and ActiveMQ. In addition to the core communication middleware, we also present the integrated development environment (IDE) of the HCI∧2 Framework. It provides a complete graphical environment to support every step in a typical MHCI system development process, including module development, debugging, packaging, and management, as well as the whole system management and testing. The quantitative evaluation indicates that our framework outperforms other similar tools in terms of average message latency and maximum data throughput under a typical single PC scenario. To demonstrate HCI∧2 Framework's capabilities in integrating heterogeneous modules, we present several example modules working with a variety of hardware and software. We also present an example of a full system developed using the proposed HCI∧2 Framework, which is called the CamGame system and represents a computer game based on hand-held marker(s) and low-cost camera(s).
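
    The publish/subscribe core can be conveyed with a toy in-process broker. The real HCI∧2 Framework uses a shared-memory transport between processes plus a TCP-based management protocol, so the Python below (with invented names) mirrors only the messaging pattern:

        from collections import defaultdict
        from queue import Queue
        from threading import Thread

        class Broker:
            """Toy publish/subscribe broker for illustration only."""
            def __init__(self):
                self.topics = defaultdict(list)

            def subscribe(self, topic):
                q = Queue()
                self.topics[topic].append(q)
                return q

            def publish(self, topic, message):
                for q in self.topics[topic]:
                    q.put(message)

        broker = Broker()
        inbox = broker.subscribe("gaze")
        Thread(target=lambda: broker.publish("gaze", {"x": 0.42, "y": 0.17})).start()
        print(inbox.get())   # each module consumes messages at its own rate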

  20. General Aviation Data Framework

    NASA Technical Reports Server (NTRS)

    Blount, Elaine M.; Chung, Victoria I.

    2006-01-01

    The Flight Research Services Directorate at the NASA Langley Research Center (LaRC) provides development and operations services associated with three general aviation (GA) aircraft used for research experiments. The GA aircraft include a Cessna 206X Stationair, a Lancair Columbia 300X, and a Cirrus SR22X. Since 2004, the GA Data Framework software was designed and implemented to gather data from a varying set of hardware and software sources as well as enable transfer of the data to other computers or devices. The key requirements for the GA Data Framework software include platform independence, the ability to reuse the framework for different projects without changing the framework code, graphics display capabilities, and the ability to vary the interfaces and their performance. Data received from the various devices is stored in shared memory. This paper concentrates on the object oriented software design patterns within the General Aviation Data Framework, and how they enable the construction of project specific software without changing the base classes. The issues of platform independence and multi-threading which enable interfaces to run at different frame rates are also discussed in this paper.
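
    The pattern described, project-specific interfaces built on unchanging framework base classes, each running at its own frame rate against shared memory, can be sketched as follows; the class names and the fake device are invented, not the NASA code.

        import threading
        import time
        from abc import ABC, abstractmethod

        class DeviceInterface(ABC):
            """Framework base class: projects subclass this without
            modifying framework code."""
            def __init__(self, name, rate_hz, shared):
                self.name, self.period, self.shared = name, 1.0 / rate_hz, shared

            @abstractmethod
            def poll(self):
                """Return the latest sample from the device."""

            def run(self, stop):
                while not stop.is_set():
                    self.shared[self.name] = self.poll()  # publish to shared memory
                    time.sleep(self.period)               # enforce this interface's rate

        class FakeGPS(DeviceInterface):
            def poll(self):
                return {"lat": 37.1, "lon": -76.3}

        shared, stop = {}, threading.Event()
        gps = FakeGPS("gps", rate_hz=5, shared=shared)
        threading.Thread(target=gps.run, args=(stop,)).start()
        time.sleep(0.5); stop.set()
        print(shared["gps"])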

  1. A New Event Builder for CMS Run II

    NASA Astrophysics Data System (ADS)

    Albertsson, K.; Andre, J.-M.; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbps Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder. This paper discusses the software design, protocols, and optimizations for exploiting the hardware capabilities. We present performance measurements from small-scale prototypes and from the full-scale production system.
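
    Stripped of the networking, event building is an aggregation step: collect one fragment per readout source for each event number and hand the completed set onward. A toy Python sketch (the fragment tuple layout and the source count are assumptions, not the CMS DAQ protocol):

        from collections import defaultdict

        N_SOURCES = 4   # illustrative; the real system reads out hundreds of sources

        def build_events(fragments):
            """Group fragments by event number and emit complete events."""
            pending = defaultdict(dict)
            for event_id, source_id, payload in fragments:
                pending[event_id][source_id] = payload
                if len(pending[event_id]) == N_SOURCES:
                    yield event_id, pending.pop(event_id)

        stream = [(1, s, b"payload") for s in range(N_SOURCES)] + [(2, 0, b"payload")]
        for event_id, frags in build_events(stream):
            print(f"event {event_id} complete with {len(frags)} fragments")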

  2. A new event builder for CMS Run II

    DOE PAGES

    Albertsson, K.; Andre, J-M; Andronidis, A.; ...

    2015-12-23

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbps Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder. This paper discusses the software design, protocols, and optimizations for exploiting the hardware capabilities. In conclusion, we present performance measurements from small-scale prototypes and from the full-scale production system.

  3. System Software Framework for System of Systems Avionics

    NASA Technical Reports Server (NTRS)

    Ferguson, Roscoe C.; Peterson, Benjamin L; Thompson, Hiram C.

    2005-01-01

    Project Constellation implements NASA's vision for space exploration to expand human presence in our solar system. The engineering focus of this project is developing a system of systems architecture. This architecture allows for the incremental development of the overall program. Systems can be built and connected in a "Lego style" manner to generate configurations supporting various mission objectives. The development of the avionics or control systems of such a massive project will result in concurrent engineering. Also, each system will have software and the need to communicate with other (possibly heterogeneous) systems. Fortunately, this design problem has already been solved during the creation and evolution of systems such as the Internet and the Department of Defense's successful effort to standardize distributed simulation (now IEEE 1516). The solution relies on the use of a standard layered software framework and a communication protocol. A standard framework and communication protocol is suggested for the development and maintenance of Project Constellation systems. The ARINC 653 standard is a great start for such a common software framework. This paper proposes a common system software framework that uses the Real Time Publish/Subscribe protocol for framework-to-framework communication to extend ARINC 653. It is highly recommended that such a framework be established before development. This is important for the success of concurrent engineering. The framework provides an infrastructure for general system services and is designed for flexibility to support a spiral development effort.

  4. Experimentation in software engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Selby, R. W.; Hutchens, D. H.

    1986-01-01

    Experimentation in software engineering supports the advancement of the field through an iterative learning process. In this paper, a framework for analyzing most of the experimental work performed in software engineering over the past several years is presented. A variety of experiments in the framework is described and their contribution to the software engineering discipline is discussed. Some useful recommendations for the application of the experimental process in software engineering are included.

  5. Testing the 2-TeV resonance with trileptons

    DOE PAGES

    Das, Arindam; Nagata, Natsumi; Okada, Nobuchika

    2016-03-09

    The CMS collaboration has reported a 2.8σ excess in the search for SU(2)_R gauge bosons decaying through right-handed neutrinos into the two-electron plus two-jet ($eejj$) final state. This can be explained if the SU(2)_R charged gauge bosons $W_R^{\pm}$ have a mass of around 2 TeV and a right-handed neutrino with a mass of O(1) TeV decays mainly to electrons. Indeed, recent results from several other experiments, especially the ATLAS diboson resonance search, also indicate signatures of such a 2 TeV gauge boson. However, a lack of same-sign electron events in the CMS $eejj$ search challenges the interpretation of the right-handed neutrino as a Majorana fermion. Taking this situation into account, in this paper we consider the possibility of explaining the CMS $eejj$ excess based on the SU(2)_L × SU(2)_R × U(1)_{B-L} gauge theory with pseudo-Dirac neutrinos. We find that both the CMS excess events and the ATLAS diboson anomaly can be explained in this framework without conflicting with the current experimental bounds. This setup in general allows sizable left-right mixing in both the charged gauge boson and neutrino sectors, which enables us to probe this model through the trilepton plus missing energy search at the LHC. It turns out that the number of events in this channel predicted by our model is in good agreement with that observed by the CMS collaboration. We also discuss prospects for testing this model at the LHC Run-II experiments.

  6. CMS: A Web-Based System for Visualization and Analysis of Genome-Wide Methylation Data of Human Cancers

    PubMed Central

    Huang, Yi-Wen; Roa, Juan C.; Goodfellow, Paul J.; Kizer, E. Lynette; Huang, Tim H. M.; Chen, Yidong

    2013-01-01

    Background: DNA methylation of promoter CpG islands is associated with gene suppression, and its unique genome-wide profiles have been linked to tumor progression. Coupled with high-throughput sequencing technologies, genome-wide methylation profiles in cancer cells can now be determined efficiently. Experimental and computational technologies also make it possible to find the functional relationship between cancer-specific methylation patterns and their clinicopathological parameters. Methodology/Principal Findings: The cancer methylome system (CMS) is a web-based database application designed for the visualization, comparison and statistical analysis of human cancer-specific DNA methylation. Methylation intensities were obtained from MBDCap-sequencing, pre-processed and stored in the database. 191 patient samples (169 tumor and 22 normal specimens) and 41 breast cancer cell lines are deposited in the database, comprising about 6.6 billion uniquely mapped sequence reads. This provides the most comprehensive genome-wide epigenetic portraits of human breast cancer and endometrial cancer to date. Two views are proposed for users to better understand methylation structure at the genomic level or systemic methylation alteration at the gene level. In addition, a variety of annotation tracks are provided to cover genomic information. CMS includes important analytic functions for interpretation of methylation data, such as the detection of differentially methylated regions, statistical calculation of global methylation intensities, multiple gene sets of biologically significant categories, and interactivity with UCSC via custom-track data. We also present examples of discoveries utilizing the framework. Conclusions/Significance: CMS provides visualization and analytic functions for cancer methylome datasets. A comprehensive collection of datasets, a variety of embedded analytic functions and extensive applications with biological and translational significance make this system powerful and unique in cancer methylation research. CMS is freely accessible at: http://cbbiweb.uthscsa.edu/KMethylomes/. PMID:23630576

  7. CMS: a web-based system for visualization and analysis of genome-wide methylation data of human cancers.

    PubMed

    Gu, Fei; Doderer, Mark S; Huang, Yi-Wen; Roa, Juan C; Goodfellow, Paul J; Kizer, E Lynette; Huang, Tim H M; Chen, Yidong

    2013-01-01

    DNA methylation of promoter CpG islands is associated with gene suppression, and its unique genome-wide profiles have been linked to tumor progression. Coupled with high-throughput sequencing technologies, genome-wide methylation profiles in cancer cells can now be determined efficiently. Experimental and computational technologies also make it possible to find the functional relationship between cancer-specific methylation patterns and their clinicopathological parameters. The cancer methylome system (CMS) is a web-based database application designed for the visualization, comparison and statistical analysis of human cancer-specific DNA methylation. Methylation intensities were obtained from MBDCap-sequencing, pre-processed and stored in the database. 191 patient samples (169 tumor and 22 normal specimens) and 41 breast cancer cell lines are deposited in the database, comprising about 6.6 billion uniquely mapped sequence reads. This provides the most comprehensive genome-wide epigenetic portraits of human breast cancer and endometrial cancer to date. Two views are proposed for users to better understand methylation structure at the genomic level or systemic methylation alteration at the gene level. In addition, a variety of annotation tracks are provided to cover genomic information. CMS includes important analytic functions for interpretation of methylation data, such as the detection of differentially methylated regions, statistical calculation of global methylation intensities, multiple gene sets of biologically significant categories, and interactivity with UCSC via custom-track data. We also present examples of discoveries utilizing the framework. CMS provides visualization and analytic functions for cancer methylome datasets. A comprehensive collection of datasets, a variety of embedded analytic functions and extensive applications with biological and translational significance make this system powerful and unique in cancer methylation research. CMS is freely accessible at: http://cbbiweb.uthscsa.edu/KMethylomes/.

  8. A Prototype for the Support of Integrated Software Process Development and Improvement

    NASA Astrophysics Data System (ADS)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process contribute to this efficiency. This paper hence proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  9. Dynamics of low velocity collisions of ice particle, coated with frost

    NASA Technical Reports Server (NTRS)

    Bridges, F.; Lin, D.; Boone, L.; Darknell, D.

    1991-01-01

    We continued our investigations of low velocity collisions of ice particles for velocities in the range 10^-3 to 2 cm/s. The work focused on two effects: (1) the sticking forces for ice particles coated with CO2 frost, and (2) the completion of a 2-D pendulum system for glancing collisions. New computer software was also developed to control and monitor the position of the 2-D pendulum.

  10. Framework Based Guidance Navigation and Control Flight Software Development

    NASA Technical Reports Server (NTRS)

    McComas, David

    2007-01-01

    This viewgraph presentation describes NASA's guidance navigation and control flight software development background. The contents include: 1) NASA/Goddard Guidance Navigation and Control (GN&C) Flight Software (FSW) Development Background; 2) GN&C FSW Development Improvement Concepts; and 3) GN&C FSW Application Framework.

  11. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: An Earth Modeling System Software Framework Strawman Design that Integrates Cactus and UCLA/UCB Distributed Data Broker

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and the UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community, as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling, as well as climate modeling issues in terms of object-oriented design.

  12. A Validation Framework for the Long Term Preservation of High Energy Physics Data

    NASA Astrophysics Data System (ADS)

    Ozerov, Dmitri; South, David M.

    2014-06-01

    The study group on data preservation in high energy physics, DPHEP, is moving to a new collaboration structure, which will focus on the implementation of preservation projects, such as those described in the group's large-scale report published in 2012. One such project is the development of a validation framework, which checks the compatibility of evolving computing environments and technologies with the experiments' software for as long as possible, with the aim of substantially extending the lifetime of the analysis software, and hence the usability of the data. The framework is designed to automatically test and validate the software and data of an experiment against changes and upgrades to the computing environment, as well as changes to the experiment software itself. Technically, this is realised using a framework capable of hosting a number of virtual machine images, built with different configurations of operating systems and the relevant software, including any necessary external dependencies.

  13. CMS Analysis and Data Reduction with Apache Spark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutsche, Oliver; Canali, Luca; Cremer, Illia

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies, have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover, these new tools are typically actively developed by large communities, often profiting from industry resources, and under open source licensing. These factors result in a boost for adoption and maturity of the tools and for the communities supporting them, at the same time helping to reduce the cost of ownership for the end-users. In this talk, we are presenting studies of using Apache Spark for end user data analysis. We are studying the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets and the end-analysis up to the publication plot. Studying the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility. The goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We are presenting the progress of this 2-year project with first results of scaling up Spark-based HEP analysis. Studying the second thrust, we are presenting studies on using Apache Spark for a CMS Dark Matter physics search, comparing Spark's feasibility, usability and performance to the ROOT-based analysis.
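
    In Spark terms, the reduction thrust is a filter-and-project job over a columnar copy of the data. A hedged PySpark sketch (paths and column names are invented; the actual CMS input format differs):

        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.appName("cms-reduction-sketch").getOrCreate()

        # Hypothetical columnar copy of an event collection.
        events = spark.read.parquet("hdfs:///cms/official/dataset")

        ntuple = (events
                  .filter(F.col("nMuon") >= 2)                 # skim: keep dimuon events
                  .select("run", "lumi", "event",
                          "Muon_pt", "Muon_eta", "Muon_phi"))  # thin: keep few branches

        ntuple.write.mode("overwrite").parquet("hdfs:///user/me/dimuon_ntuple")
        spark.stop()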

  14. A Modular Framework for Transforming Structured Data into HTML with Machine-Readable Annotations

    NASA Astrophysics Data System (ADS)

    Patton, E. W.; West, P.; Rozell, E.; Zheng, J.

    2010-12-01

    There is a plethora of web-based Content Management Systems (CMS) available for maintaining projects and data, among other content. However, each system varies in its capabilities, and often content is stored separately and accessed via non-uniform web interfaces. Moving from one CMS to another (e.g., MediaWiki to Drupal) can be cumbersome, especially if a large quantity of data must be adapted to the new system. To standardize the creation, display, management, and sharing of project information, we have assembled a framework that uses existing web technologies to transform data provided by any service that supports SPARQL Protocol and RDF Query Language (SPARQL) queries into HTML fragments, allowing it to be embedded in any existing website. The framework utilizes a two-tier XML Stylesheet Transformation (XSLT) that uses existing ontologies (e.g., Friend-of-a-Friend, Dublin Core) to interpret query results and render them as HTML documents. These ontologies can be used in conjunction with custom ontologies suited to individual needs (e.g., domain-specific ontologies for describing data records). Furthermore, this transformation process encodes machine-readable annotations, namely the Resource Description Framework in attributes (RDFa), into the resulting HTML, so that capable parsers and search engines can extract the relationships between entities (e.g., people, organizations, datasets). To facilitate editing of content, the framework provides a web-based form system, mapping each query to a dynamically generated form that can be used to modify and create entities, while keeping the native data store up to date. This open framework makes it easy to duplicate data across many different sites, allowing researchers to distribute their data in many different online forums. In this presentation we outline the structure of the queries and the stylesheets used to transform them, followed by a brief walkthrough that follows the data from storage to a human- and machine-accessible web page. We conclude with a discussion of content caching and steps toward performing queries across multiple domains.
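
    The first tier of the pipeline is just a SPARQL query whose bindings are rendered into RDFa-annotated HTML. A condensed Python sketch of the same flow (the endpoint, query, and string templating are placeholders; the paper's framework does the rendering with two XSLT stages instead):

        from SPARQLWrapper import SPARQLWrapper, JSON

        sparql = SPARQLWrapper("http://example.org/sparql")  # placeholder endpoint
        sparql.setQuery("""
            PREFIX foaf: <http://xmlns.com/foaf/0.1/>
            SELECT ?person ?name WHERE { ?person foaf:name ?name } LIMIT 10
        """)
        sparql.setReturnFormat(JSON)
        rows = sparql.query().convert()["results"]["bindings"]

        # Emit HTML fragments carrying RDFa so parsers can recover the triples.
        for row in rows:
            print(f'<div about="{row["person"]["value"]}" typeof="foaf:Person">'
                  f'<span property="foaf:name">{row["name"]["value"]}</span></div>')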

  15. Distributed Computing Framework for Synthetic Radar Application

    NASA Technical Reports Server (NTRS)

    Gurrola, Eric M.; Rosen, Paul A.; Aivazis, Michael

    2006-01-01

    We are developing an extensible software framework in response to Air Force and NASA needs for distributed computing facilities for a variety of radar applications. The objective of this work is to develop a Python-based software framework, that is, the framework elements of the middleware that allow developers to control processing flow on a grid in a distributed computing environment. Framework architectures to date allow developers to connect processing functions together as interchangeable objects, thereby allowing a data flow graph to be devised for a specific problem to be solved. The Pyre framework, developed at the California Institute of Technology (Caltech), and now being used as the basis for next-generation radar processing at JPL, is a Python-based software framework. We have extended the Pyre framework to include new facilities to deploy processing components as services, including components that monitor and assess the state of the distributed network for eventual real-time control of grid resources.
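
    The flavour of connecting interchangeable processing objects into a data-flow graph can be shown in a few lines of Python; this is a toy re-creation of the pattern, not Pyre's actual component API.

        class Component:
            """Toy processing component with connectable outputs."""
            def __init__(self, func):
                self.func, self.downstream = func, []

            def connect(self, other):
                self.downstream.append(other)
                return other                    # allow chained wiring

            def feed(self, data):
                out = self.func(data)
                for node in self.downstream:
                    node.feed(out)

        # Assemble a radar-like flow graph: ingest -> detrend -> report.
        ingest  = Component(lambda d: d)
        detrend = Component(lambda d: [x - sum(d) / len(d) for x in d])
        report  = Component(lambda d: print("processed:", d))
        ingest.connect(detrend).connect(report)
        ingest.feed([1.0, 2.0, 3.0])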

  16. Development of a change management system

    NASA Technical Reports Server (NTRS)

    Parks, Cathy Bonifas

    1993-01-01

    The complexity and interdependence of software on a computer system can create a situation where a solution to one problem causes failures in dependent software. In the computer industry, software problems arise and are often solved with 'quick and dirty' solutions. But in implementing these solutions, documentation about the solution or user notification of changes is often overlooked, and new problems are frequently introduced because of insufficient review or testing. These problems increase when numerous heterogeneous systems are involved. Because of this situation, a change management system plays an integral part in the maintenance of any multisystem computing environment. At the NASA Ames Advanced Computational Facility (ACF), the Online Change Management System (OCMS) was designed and developed to manage the changes being applied to its multivendor computing environment. This paper documents the research, design, and modifications that went into the development of this change management system (CMS).

  17. A software framework for real-time multi-modal detection of microsleeps.

    PubMed

    Knopp, Simon J; Bones, Philip J; Weddell, Stephen J; Jones, Richard D

    2017-09-01

    A software framework is described which was designed to process EEG, video of one eye, and head movement in real time, towards achieving early detection of microsleeps for prevention of fatal accidents, particularly in transport sectors. The framework is based around a pipeline structure with user-replaceable signal processing modules. This structure can encapsulate a wide variety of feature extraction and classification techniques and can be applied to detecting a variety of aspects of cognitive state. Users of the framework can implement signal processing plugins in C++ or Python. The framework also provides a graphical user interface and the ability to save and load data to and from arbitrary file formats. Two small studies are reported which demonstrate the capabilities of the framework in typical applications: monitoring eye closure and detecting simulated microsleeps. While specifically designed for microsleep detection/prediction, the software framework can be just as appropriately applied to (i) other measures of cognitive state and (ii) development of biomedical instruments for multi-modal real-time physiological monitoring and event detection in intensive care, anaesthesiology, cardiology, neurosurgery, etc. The software framework has been made freely available for researchers to use and modify under an open source licence.
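
    The pipeline-of-replaceable-modules structure is easy to sketch. The Python below imitates it with invented stage names and a toy classification rule; the published framework's actual plugin API (C++/Python) differs.

        from abc import ABC, abstractmethod

        class Module(ABC):
            """User-replaceable signal-processing stage."""
            @abstractmethod
            def process(self, sample: dict) -> dict: ...

        class BandPower(Module):
            def process(self, sample):
                eeg = sample["eeg"]
                sample["power"] = sum(x * x for x in eeg) / len(eeg)
                return sample

        class ThresholdClassifier(Module):
            def process(self, sample):
                sample["microsleep"] = sample["power"] < 0.05  # toy rule
                return sample

        def run_pipeline(stages, stream):
            for sample in stream:
                for stage in stages:
                    sample = stage.process(sample)
                yield sample

        stream = [{"eeg": [0.01, -0.02, 0.015]}, {"eeg": [0.4, -0.3, 0.5]}]
        for out in run_pipeline([BandPower(), ThresholdClassifier()], stream):
            print(out["microsleep"])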

  18. The NASA Carbon Monitoring System

    NASA Astrophysics Data System (ADS)

    Hurtt, G. C.

    2015-12-01

    Greenhouse gas emission inventories, forest carbon sequestration programs (e.g., Reducing Emissions from Deforestation and Forest Degradation, REDD and REDD+), cap-and-trade systems, self-reporting programs, and their associated monitoring, reporting and verification (MRV) frameworks depend upon data that are accurate, systematic, practical, and transparent. A sustained, observationally-driven carbon monitoring system using remote sensing data has the potential to significantly improve the relevant carbon cycle information base for the U.S. and world. Initiated in 2010, NASA's Carbon Monitoring System (CMS) project is prototyping and conducting pilot studies to evaluate technological approaches and methodologies to meet carbon monitoring and reporting requirements for multiple users and over multiple scales of interest. NASA's approach emphasizes exploitation of the satellite remote sensing resources, computational capabilities, scientific knowledge, airborne science capabilities, and end-to-end system expertise that are major strengths of the NASA Earth Science program. Through user engagement activities, the NASA CMS project is taking specific actions to be responsive to the needs of stakeholders working to improve carbon MRV frameworks. The first phase of NASA CMS projects focused on developing products for U.S. biomass/carbon stocks and global carbon fluxes, and on scoping studies to identify stakeholders and explore other potential carbon products. The second phase built upon these initial efforts, with a large expansion in prototyping activities across a diversity of systems, scales, and regions, including research focused on prototype MRV systems and utilization of COTS technologies. Priorities for the future include: 1) utilizing future satellite sensors, 2) prototyping with commercial off-the-shelf technology, 3) expanding the range of prototyping activities, 4) rigorous evaluation, uncertainty quantification, and error characterization, 5) stakeholder engagement, 6) partnerships with other U.S. agencies and international partners, and 7) modeling and data assimilation.

  19. Genomic Conflicts that Cause Pollen Mortality and Raise Reproductive Barriers in Arabidopsis thaliana

    PubMed Central

    Simon, Matthieu; Durand, Stéphanie; Pluta, Natacha; Gobron, Nicolas; Botran, Lucy; Ricou, Anthony; Camilleri, Christine; Budar, Françoise

    2016-01-01

    Species differentiation and the underlying genetics of reproductive isolation are central topics in evolutionary biology. Hybrid sterility is one kind of reproductive barrier that can lead to differentiation between species. Here, we analyze the complex genetic basis of the intraspecific hybrid male sterility that occurs in the offspring of two distant natural strains of Arabidopsis thaliana, Shahdara and Mr-0, with Shahdara as the female parent. Using both classical and quantitative genetic approaches as well as cytological observation of pollen viability, we demonstrate that this particular hybrid sterility results from two causes of pollen mortality. First, the Shahdara cytoplasm induces gametophytic cytoplasmic male sterility (CMS) controlled by several nuclear loci. Second, several segregation distorters leading to allele-specific pollen abortion (pollen killers) operate in hybrids with either cytoplasm. The complete sterility of the hybrid with the Shahdara cytoplasm results from the genetic linkage of the two causes of pollen mortality, i.e., CMS nuclear determinants and pollen killers. Furthermore, natural variation at these loci in A. thaliana is associated with different male-sterility phenotypes in intraspecific hybrids. Our results suggest that the genomic conflicts that underlie segregation distorters and CMS can concurrently lead to reproductive barriers between distant strains within a species. This study provides a new framework for identifying molecular mechanisms and the evolutionary history of loci that contribute to reproductive isolation, and possibly to speciation. It also suggests that two types of genomic conflicts, CMS and segregation distorters, may coevolve in natural populations. PMID:27182945

  20. The Five 'R's' for Developing Trusted Software Frameworks to increase confidence in, and maximise reuse of, Open Source Software.

    NASA Astrophysics Data System (ADS)

    Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens

    2015-04-01

    Recent investments in HPC, cloud and petascale data stores have dramatically increased the scale and resolution at which earth science challenges can now be tackled. These new infrastructures are highly parallelised, and to fully utilise them and access the large volumes of earth science data now available, a new approach to software stack engineering needs to be developed. The size, complexity and cost of the new infrastructures mean any software deployed has to be reliable, trusted and reusable. Increasingly, software is available via open source repositories, but these usually only enable code to be discovered and downloaded. It is hard for a scientist to judge the suitability and quality of individual codes: rarely is there information on how and where codes can be run, what the critical dependencies are, and in particular, on the version requirements and licensing of the underlying software stack. A trusted software framework is proposed to enable reliable software to be discovered, accessed and then deployed on multiple hardware environments. More specifically, this framework will enable those who generate the software, and those who fund its development, to gain credit for the effort, IP, time and dollars spent, and will facilitate quantification of the impact of individual codes. For scientific users, the framework delivers reviewed and benchmarked scientific software with mechanisms to reproduce results. The trusted framework has five separate, but connected, components: Register, Review, Reference, Run, and Repeat. 1) The Register component will facilitate discovery of relevant software from multiple open source code repositories. Registration of a code should include information about licensing and the hardware environments it can be run on, define appropriate validation (testing) procedures, and list the critical dependencies. 2) The Review component targets verification of the software, typically against a set of benchmark cases. This will be achieved by linking the code in the software framework to peer review forums such as Mozilla Science or appropriate journals (e.g., the Geoscientific Model Development journal) to help users know which codes to trust. 3) Referencing will be accomplished by linking the software framework to groups such as Figshare or ImpactStory that help disseminate and measure the impact of scientific research, including program code. 4) The Run component will draw on the information supplied at registration, the benchmark cases described in the review, and other relevant information to instantiate the scientific code on the selected environment. 5) The Repeat component will tap into existing provenance workflow engines that automatically capture information relating to a particular run of the software, including identification of all input and output artefacts, and all elements and transactions within that workflow. The proposed trusted software framework will enable users to rapidly discover and access reliable code, reduce the time to deploy it, and greatly facilitate sharing, reuse and reinstallation of code. Properly designed, it could scale out to massively parallel systems and be accessed nationally and internationally for multiple use cases, including supercomputer centres, cloud facilities, and local computers.
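
    To make the Register component concrete, the metadata captured for each code might look like the record below; the field names and values are assumptions drawn from the abstract, not a published schema.

        # Sketch of a "Register" entry for a hypothetical code "mysolver".
        register_entry = {
            "name": "mysolver",
            "repository": "https://example.org/code/mysolver",
            "license": "Apache-2.0",
            "hardware_environments": ["x86_64-linux", "cray-xc40"],
            "critical_dependencies": {"python": ">=3.8", "mpi": "openmpi>=4.0"},
            "validation": ["unit_tests", "benchmark_suite_v2"],  # feeds Review
        }
        print(register_entry["name"], "targets", register_entry["hardware_environments"])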

  1. BioContainers: an open-source and community-driven framework for software standardization.

    PubMed

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). Availability: The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.
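
    Because the containers are plain Docker images, launching a tool takes only a few lines. A sketch using the Docker SDK for Python; the image tag below is illustrative of the quay.io/biocontainers naming convention and would need to be checked against the registry:

        import docker  # Docker SDK for Python (pip install docker)

        client = docker.from_env()

        # Illustrative image tag; BioContainers publishes images under the
        # quay.io/biocontainers namespace with tool- and build-specific tags.
        image = "quay.io/biocontainers/samtools:1.9--h8571acd_11"

        # Run the tool in an isolated container and capture its output.
        logs = client.containers.run(image, "samtools --version", remove=True)
        print(logs.decode())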

  2. BioContainers: an open-source and community-driven framework for software standardization

    PubMed Central

    da Veiga Leprevost, Felipe; Grüning, Björn A.; Alves Aflitos, Saulo; Röst, Hannes L.; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C.; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I.; Perez-Riverol, Yasset

    2017-01-01

    Motivation: BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). Availability and Implementation: The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk PMID:28379341

  3. Evidence-based and heuristic approaches for customization of care in cardiometabolic syndrome after spinal cord injury.

    PubMed

    Nash, Mark S; Cowan, Rachel E; Kressler, Jochen

    2012-09-01

    Component and coalesced health risks of the cardiometabolic syndrome (CMS) are commonly reported in persons with spinal cord injuries (SCIs). These CMS hazards are also co-morbid with physical deconditioning and elevated pro-atherogenic inflammatory cytokines, both of which are common after SCI and worsen the prognosis for all-cause cardiovascular disease. This article describes a systematic procedure for individualized CMS risk assessment after SCI, and emphasizes evidence-based and intuition-centered countermeasures to disease. A unified approach will propose therapeutic lifestyle intervention as a routine plan for aggressive primary prevention in this risk-susceptible population. Customization of dietary and exercise plans then follow, identifying shortfalls in diet and activity patterns, and ways in which these healthy lifestyles can be more substantially embraced by both stakeholders with SCI and their health care providers. In cases where lifestyle intervention utilizing diet and exercise is unsuccessful in countering risks, available pharmacotherapies and a preferred therapeutic agent are proposed according to authoritative standards. The over-arching purpose of the monograph is to create an operational framework in which existing evidence-based approaches or heuristic modeling becomes best practice. In this way persons with SCI can lead more active and healthy lives.

  4. KMR kt-factorization procedure for the description of the LHCb forward hadron-hadron Z0 production at √s = 13 TeV

    NASA Astrophysics Data System (ADS)

    Modarres, M.; Masouminia, M. R.; Aminzadeh Nik, R.; Hosseinkhani, H.; Olanj, N.

    2017-09-01

    Quite recently, two sets of new experimental data from the LHCb and CMS Collaborations have been published, concerning the production of the Z0 vector boson in hadron-hadron collisions at the center-of-mass energy E_CM = √s = 13 TeV. On the other hand, in our recent work we conducted a set of semi-NLO calculations for the production of the electro-weak gauge vector bosons, utilizing the unintegrated parton distribution functions (UPDF) in the frameworks of Kimber-Martin-Ryskin (KMR) and Martin-Ryskin-Watt (MRW) and the kt-factorization formalism, concluding that the results of the KMR scheme are arguably better at describing the existing experimental data from the D0, CDF, CMS and ATLAS Collaborations. In the present work, we follow the same semi-NLO formalism and calculate the rate of production of the Z0 vector boson, utilizing the UPDF of KMR within the kinematics of the recent data. It will be shown that our results are in good agreement with the new measurements of the LHCb and CMS Collaborations.

  5. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose: Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods: A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results: The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion: A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
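
    The heart of the design is decoupling: each controller and viewer runs its own loop at its own rate against a shared model. A stripped-down Python sketch of the pattern (rates and state are illustrative; the actual framework is a C++ system driving haptic hardware):

        import threading
        import time

        class Model:
            """Shared state: updated by controllers, rendered by viewers."""
            def __init__(self):
                self.lock = threading.Lock()
                self.tool_depth = 0.0

        def haptics_controller(model, stop, rate_hz=1000):
            while not stop.is_set():          # fast loop: physics/force updates
                with model.lock:
                    model.tool_depth += 0.001
                time.sleep(1.0 / rate_hz)     # sleep granularity is OS-limited

        def viewer(model, stop, rate_hz=30):
            while not stop.is_set():          # slower loop: graphics redraw
                with model.lock:
                    print(f"render depth={model.tool_depth:.3f}")
                time.sleep(1.0 / rate_hz)

        model, stop = Model(), threading.Event()
        for target, hz in [(haptics_controller, 1000), (viewer, 30)]:
            threading.Thread(target=target, args=(model, stop, hz), daemon=True).start()
        time.sleep(0.2)
        stop.set()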

  6. Surgical model-view-controller simulation software framework for local and collaborative applications.

    PubMed

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.

  7. HPC Software Stack Testing Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garvey, Cormac

    The HPC Software stack testing framework (hpcswtest) is used in the INL Scientific Computing Department to test the basic sanity and integrity of the HPC software stack (compilers, MPI, numerical libraries, and applications) and to quickly discover hard failures; as a by-product, it indirectly checks the HPC infrastructure (network, PBS, and licensing servers).
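
    The kind of check such a sanity suite performs can be illustrated with a minimal sketch: compile and run a trivial program under each configured compiler and flag hard failures. The compiler list and structure below are hypothetical, since the abstract does not describe hpcswtest's internals.

        import os, shutil, subprocess, tempfile

        HELLO_C = "int main(void) { return 0; }\n"
        COMPILERS = ["gcc", "icc", "clang"]  # hypothetical site configuration

        def smoke_test(compiler):
            """Compile and run a trivial program; any failure is 'hard'."""
            if shutil.which(compiler) is None:
                return compiler, "MISSING"
            with tempfile.TemporaryDirectory() as tmp:
                src = os.path.join(tmp, "hello.c")
                exe = os.path.join(tmp, "hello")
                with open(src, "w") as f:
                    f.write(HELLO_C)
                if subprocess.run([compiler, src, "-o", exe]).returncode != 0:
                    return compiler, "COMPILE FAILED"
                if subprocess.run([exe]).returncode != 0:
                    return compiler, "RUN FAILED"
            return compiler, "OK"

        for name, status in map(smoke_test, COMPILERS):
            print(f"{name:<8} {status}")

    An analogous probe per MPI stack, numerical library, and license server turns the same loop into the indirect infrastructure check the abstract mentions.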

  8. Exploiting multicore compute resources in the CMS experiment

    NASA Astrophysics Data System (ADS)

    Ramírez, J. E.; Pérez-Calero Yzquierdo, A.; Hernández, J. M.; CMS Collaboration

    2016-10-01

    CMS has developed a strategy to efficiently exploit the multicore architecture of the compute resources accessible to the experiment. A coherent use of the multiple cores available in a compute node yields substantial gains in terms of resource utilization. The implemented approach makes use of the multithreading support of the event processing framework and the multicore scheduling capabilities of the resource provisioning system. Multicore slots are acquired and provisioned by means of multicore pilot agents which internally schedule and execute single and multicore payloads. Multicore scheduling and multithreaded processing are currently used in production for online event selection and prompt data reconstruction. More workflows are being adapted to run in multicore mode. This paper presents a review of the experience gained in the deployment and operation of the multicore scheduling and processing system, the current status and future plans.
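
    The scheduling idea - a pilot that owns a fixed multicore slot and packs single-core and multi-threaded payloads into it - reduces to a first-fit loop; the class below is a toy illustration, not CMS production code.

        from collections import deque

        class MulticorePilot:
            """Toy pilot: owns n_cores and packs queued payloads into them."""
            def __init__(self, n_cores):
                self.free_cores = n_cores
                self.running = []

            def try_schedule(self, queue):
                # First fit: start any payload whose core request still fits.
                for payload in list(queue):
                    if payload["cores"] <= self.free_cores:
                        queue.remove(payload)
                        self.free_cores -= payload["cores"]
                        self.running.append(payload)

            def finish(self, payload):
                self.running.remove(payload)
                self.free_cores += payload["cores"]

        queue = deque([{"name": "reco", "cores": 4},
                       {"name": "gen", "cores": 1},
                       {"name": "digi", "cores": 2}])
        pilot = MulticorePilot(n_cores=8)
        pilot.try_schedule(queue)
        print([p["name"] for p in pilot.running],
              "free cores:", pilot.free_cores)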

  9. Composite quantum collision models

    NASA Astrophysics Data System (ADS)

    Lorenzo, Salvatore; Ciccarello, Francesco; Palma, G. Massimo

    2017-09-01

    A collision model (CM) is a framework to describe open quantum dynamics. In its memoryless version, it models the reservoir R as consisting of a large collection of elementary ancillas: the dynamics of the open system S results from successive collisions of S with the ancillas of R. Here, we present a general formulation of memoryless composite CMs, where S is partitioned into the open system under study, S₀, coupled to one or more auxiliary systems {Sᵢ}. Their composite dynamics occurs through internal S₀-{Sᵢ} collisions interspersed with external ones involving {Sᵢ} and the reservoir R. We show that important known instances of quantum non-Markovian dynamics of S₀—such as the emission of an atom into a reservoir featuring a Lorentzian, or multi-Lorentzian, spectral density, or a qubit subject to random telegraph noise—can be mapped onto such memoryless composite CMs.
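
    A single step of a memoryless CM is the map ρ_S → Tr_R[U (ρ_S ⊗ η) U†], iterated with a fresh ancilla each collision. The NumPy sketch below runs this for a qubit with partial-SWAP collisions; it illustrates the generic memoryless CM of the opening sentences, not the composite construction that is the paper's contribution.

        import numpy as np

        SWAP = np.array([[1, 0, 0, 0],
                         [0, 0, 1, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 1]], dtype=complex)

        def collision_unitary(theta):
            # Partial SWAP; unitary because SWAP squared is the identity.
            return np.cos(theta) * np.eye(4) + 1j * np.sin(theta) * SWAP

        def trace_out_ancilla(rho):
            # Partial trace over the second qubit of a two-qubit state.
            return np.einsum("ikjk->ij", rho.reshape(2, 2, 2, 2))

        U = collision_unitary(0.05)
        rho_s = np.array([[1, 0], [0, 0]], dtype=complex)  # system in |0><0|
        eta = np.array([[0, 0], [0, 1]], dtype=complex)    # ancillas in |1><1|

        for _ in range(200):  # successive collisions with fresh ancillas
            joint = np.kron(rho_s, eta)
            rho_s = trace_out_ancilla(U @ joint @ U.conj().T)

        print("excited-state population:", rho_s[1, 1].real)

    Because each ancilla is used once and discarded, the reduced dynamics here is Markovian; the paper's point is that routing collisions through auxiliary systems {Sᵢ} first makes the reduced dynamics of S₀ non-Markovian while retaining this simple memoryless structure.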

  10. A Health Science Process Framework for Comprehensive Clinical Functional Assessment

    DTIC Science & Technology

    2014-02-01

    Services (CMS), a Research, Measurement, Assessment, Design, and Analysis (RMADA) IDIQ with the primary task order targeting improving the disability … Annual report, February 2014, prepared for the U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland 21702-5012.

  11. The Impact of CMS Quality on the Outcomes of E-Learning Systems in Higher Education: An Empirical Study

    ERIC Educational Resources Information Center

    Kim, Kihyun; Trimi, Silvana; Park, Hyesung; Rhee, Shanggeun

    2012-01-01

    Course Management Systems (CMSs) in higher education have emerged as one of the most widely adopted e-learning platforms. This study examines the success of e-learning CMSs based on user satisfaction and benefits. Using DeLone and McLean's information system success model as a theoretical framework, we analyze the success of e-learning CMSs in…

  12. Flight Software Development for the CHEOPS Instrument with the CORDET Framework

    NASA Astrophysics Data System (ADS)

    Cechticky, V.; Ottensamer, R.; Pasetti, A.

    2015-09-01

    CHEOPS is an ESA S-class mission dedicated to the precise measurement of radii of already known exoplanets using ultra-high precision photometry. The instrument flight software controlling the instrument and handling the science data is developed by the University of Vienna using the CORDET Framework offered by P&P Software GmbH. The CORDET Framework provides a generic software infrastructure for PUS-based applications. This paper describes how the framework is used for the CHEOPS application software to provide a consistent solution to the communication and control services, event handling and FDIR procedures. This approach is innovative in four respects: (a) it is a true third-party reuse; (b) re-use is done at specification, validation and code level; (c) the re-usable assets and their qualification data package are entirely open-source; (d) re-use is based on call-back, with the application developer providing functions which are called by the reusable architecture.

  13. Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research

    PubMed Central

    Degenhart, Alan D.; Kelly, John W.; Ashmore, Robin C.; Collinger, Jennifer L.; Tyler-Kabara, Elizabeth C.; Weber, Douglas J.; Wang, Wei

    2011-01-01

    This paper presents “Craniux,” an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless work flow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development. PMID:21687575

  14. Craniux: a LabVIEW-based modular software framework for brain-machine interface research.

    PubMed

    Degenhart, Alan D; Kelly, John W; Ashmore, Robin C; Collinger, Jennifer L; Tyler-Kabara, Elizabeth C; Weber, Douglas J; Wang, Wei

    2011-01-01

    This paper presents "Craniux," an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless work flow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development.

  15. Problem Solving Frameworks for Mathematics and Software Development

    ERIC Educational Resources Information Center

    McMaster, Kirby; Sambasivam, Samuel; Blake, Ashley

    2012-01-01

    In this research, we examine how problem solving frameworks differ between Mathematics and Software Development. Our methodology is based on the assumption that the words used frequently in a book indicate the mental framework of the author. We compared word frequencies in a sample of 139 books that discuss problem solving. The books were grouped…
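
    The method amounts to comparing normalized term frequencies across two corpora; a minimal sketch (with tiny invented snippets standing in for the 139 books):

        import re
        from collections import Counter

        def term_freq(text):
            words = re.findall(r"[a-z']+", text.lower())
            total = len(words)
            return {w: n / total for w, n in Counter(words).items()}

        math_corpus = "proof theorem lemma proof solve equation proof"
        sweng_corpus = "design test debug requirement design iterate test"

        tf_math, tf_se = term_freq(math_corpus), term_freq(sweng_corpus)
        for word in sorted(set(tf_math) | set(tf_se)):
            print(f"{word:<12} math={tf_math.get(word, 0):.2f} "
                  f"se={tf_se.get(word, 0):.2f}")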

  16. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
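
    The simplest rule such a framework can be instantiated with is the classic non-circular assume-guarantee rule, usually written as follows (standard notation, not quoted from the article):

        \frac{\langle A \rangle\, M_1\, \langle P \rangle
              \qquad
              \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}
             {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}

    Read: if M_1 guarantees property P whenever its environment satisfies assumption A, and M_2 satisfies A unconditionally, then the composition M_1 ∥ M_2 satisfies P. The learning algorithm's job is to construct A automatically, refining it from counterexamples.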

  17. Parallels in Computer-Aided Design Framework and Software Development Environment Efforts.

    DTIC Science & Technology

    1992-05-01

    design kits, and tool and design management frameworks. Also, books about software engineering environments [Long 91] and electronic design… tool integration [Zarrella 90], and agreement upon a universal design automation framework, such as the CAD Framework Initiative (CFI) [Malasky 91]…ments: identification, control, status accounting, and audit and review. The paper by Dart extracts 15 CM concepts from existing SDEs and tools.

  18. Applying a Framework to Evaluate Assignment Marking Software: A Case Study on Lightwork

    ERIC Educational Resources Information Center

    Heinrich, Eva; Milne, John

    2012-01-01

    This article presents the findings of a qualitative evaluation on the effect of a specialised software tool on the efficiency and quality of assignment marking. The software, Lightwork, combines with the Moodle learning management system and provides support through marking rubrics and marker allocations. To enable the evaluation a framework has…

  19. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    temporary team is assigned to prepare layers and frameworks for future feature teams. (Slide figure: Presentation Layer, Domain Layer, Data Access Layer, Framework.)

  20. Testing Cobol Programs by Mutation. Volume II. CMS 1 System Documentation,

    DTIC Science & Technology

    1980-02-01

    … a solution was to encapsulate the machine-dependent code inside a special type of module called a device module. The software status register can be modified at will, while the contents of the buffer can be dealt with as required; as an additional benefit, Modula allows the contents of an address to be picked up by indirection. The requirements are stringent and usually the work is never completed …

  1. A Software Rejuvenation Framework for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Chau, Savio

    2009-01-01

    A performability-oriented conceptual framework for software rejuvenation has been constructed as a means of increasing levels of reliability and performance in distributed stateful computing. As used here, performability-oriented signifies that the construction of the framework is guided by the concept of analyzing the ability of a given computing system to deliver services with gracefully degradable performance. The framework is especially intended to support applications that involve stateful replicas of server computers.
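
    Rejuvenation in this sense is proactive: restart a replica before gradual degradation becomes failure, while its peers carry the load. A minimal sketch of such a control loop (the thresholds, aging model, and replica API are all invented for illustration):

        import random

        class Replica:
            """Toy stateful replica whose memory footprint slowly grows."""
            def __init__(self, name):
                self.name, self.rss_mb = name, 100.0
            def serve(self):
                self.rss_mb += random.uniform(0.5, 2.0)  # simulated aging
            def rejuvenate(self):
                print(f"{self.name}: state handed off to peers, restarting")
                self.rss_mb = 100.0

        replicas = [Replica("r1"), Replica("r2"), Replica("r3")]
        THRESHOLD_MB = 120.0  # invented policy knob

        for tick in range(40):
            for r in replicas:
                r.serve()
            # Restart at most one replica per tick, so service capacity
            # degrades gracefully instead of failing abruptly.
            worst = max(replicas, key=lambda r: r.rss_mb)
            if worst.rss_mb > THRESHOLD_MB:
                worst.rejuvenate()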

  2. A software framework for developing measurement applications under variable requirements.

    PubMed

    Arpaia, Pasquale; Buzio, Marco; Fiscarelli, Lucio; Inglese, Vitaliano

    2012-11-01

    A framework for easily developing software for measurement and test applications under highly and fast-varying requirements is proposed. The framework allows the software quality, in terms of flexibility, usability, and maintainability, to be maximized. Furthermore, the development effort is reduced and focused, relieving the test engineer of development details. The framework can be configured to satisfy a large set of measurement applications in a generic field, for an industrial test division, a test laboratory, or a research center. As an experimental case study, the design, implementation, and assessment of the framework in a magnet-testing measurement scenario at the European Organization for Nuclear Research (CERN) are reported.

  3. A Methodological Framework for Enterprise Information System Requirements Derivation

    NASA Astrophysics Data System (ADS)

    Caplinskas, Albertas; Paškevičiūtė, Lina

    Current information systems (IS) are enterprise-wide systems supporting strategic goals of the enterprise and meeting its operational business needs. They are supported by information and communication technologies (ICT) and other software that should be fully integrated. To develop software responding to real business needs, we need a requirements engineering (RE) methodology that ensures the alignment of requirements across all levels of the enterprise system. The main contribution of this chapter is a requirement-oriented methodological framework that allows business requirements to be transformed, level by level, into software requirements. The structure of the proposed framework reflects the structure of Zachman's framework; however, it has a different intent, as it is meant to support requirements engineering rather than design.

  4. Solving a mathematical model integrating unequal-area facilities layout and part scheduling in a cellular manufacturing system by a genetic algorithm.

    PubMed

    Ebrahimi, Ahmad; Kia, Reza; Komijan, Alireza Rashidi

    2016-01-01

    In this article, a novel integrated mixed-integer nonlinear programming model is presented for designing a cellular manufacturing system (CMS) considering machine layout and part scheduling problems simultaneously as interrelated decisions. The integrated CMS model is formulated to incorporate several design features including part due date, material handling time, operation sequence, processing time, an intra-cell layout of unequal-area facilities, and part scheduling. The objective function is to minimize makespan, tardiness penalties, and material handling costs of inter-cell and intra-cell movements. Two numerical examples are solved by the Lingo software to illustrate the results obtained by the incorporated features. In order to assess the effects and importance of integrating machine layout and part scheduling in designing a CMS, sequential and concurrent approaches are investigated, and the improvement resulting from the concurrent approach is revealed. Also, due to the NP-hardness of the integrated model, an efficient genetic algorithm (GA) is designed. Computational results of this study indicate that the best solutions found by the GA are better than those found by branch-and-bound (B&B) in much less time for both the sequential and concurrent approaches. Moreover, comparisons between the objective function values (OFVs) obtained by the sequential and concurrent approaches demonstrate that the OFV improvement averages around 17% with the GA and 14% with B&B.
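
    The paper's GA encodes layout and schedule jointly; the skeleton below shows only the generic loop (tournament selection, one-point crossover, mutation) with a placeholder fitness, since the real objective combines makespan, tardiness penalties, and handling costs.

        import random

        GENES, POP, GENS = 10, 30, 50

        def fitness(ind):
            # Placeholder: the real model scores makespan, tardiness
            # penalties, and inter-/intra-cell material handling costs.
            return -sum((g - i) ** 2 for i, g in enumerate(ind))

        def tournament(pop, k=3):
            return max(random.sample(pop, k), key=fitness)

        def crossover(a, b):
            cut = random.randrange(1, GENES)
            return a[:cut] + b[cut:]

        def mutate(ind, rate=0.1):
            return [random.randrange(GENES) if random.random() < rate else g
                    for g in ind]

        pop = [[random.randrange(GENES) for _ in range(GENES)]
               for _ in range(POP)]
        for _ in range(GENS):
            pop = [mutate(crossover(tournament(pop), tournament(pop)))
                   for _ in range(POP)]
        print("best individual:", max(pop, key=fitness))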

  5. A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints.

    PubMed

    Sundharam, Sakthivel Manikandan; Navet, Nicolas; Altmeyer, Sebastian; Havet, Lionel

    2018-02-20

    Model-Driven Engineering (MDE) is widely applied in industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects, or gives less weight to, the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on a timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation; software design, verified by a novel schedulability analysis; and run-time verification, by monitoring the execution of the models on the target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model interpretation, which enforces timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system.
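
    One standard ingredient of such a schedulability analysis is fixed-priority response-time analysis: iterate R_i = C_i + Σ_{j ∈ hp(i)} ⌈R_i / T_j⌉ C_j to a fixed point and compare against the deadline. The sketch below implements the textbook recurrence on an invented task set; the paper's own analysis is novel, so this stands in only for the general idea.

        import math

        # (C = WCET, T = period) per task, highest priority first;
        # deadlines are assumed equal to periods.  Invented example set.
        tasks = [(1, 4), (2, 6), (3, 12)]

        def response_time(i):
            C, T = tasks[i]
            R = C
            while True:
                nxt = C + sum(math.ceil(R / Tj) * Cj
                              for Cj, Tj in tasks[:i])
                if nxt == R:
                    return R
                if nxt > T:
                    return None  # deadline miss
                R = nxt

        for i, (C, T) in enumerate(tasks):
            R = response_time(i)
            verdict = "schedulable" if R is not None else "NOT schedulable"
            print(f"task {i}: R = {R}, deadline = {T} -> {verdict}")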

  6. A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints

    PubMed Central

    Navet, Nicolas; Havet, Lionel

    2018-01-01

    Model-Driven Engineering (MDE) is widely applied in industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects, or gives less weight to, the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on a timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation; software design, verified by a novel schedulability analysis; and run-time verification, by monitoring the execution of the models on the target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model interpretation, which enforces timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system. PMID:29461489

  7. A Buyer Behaviour Framework for the Development and Design of Software Agents in E-Commerce.

    ERIC Educational Resources Information Center

    Sproule, Susan; Archer, Norm

    2000-01-01

    Software agents are computer programs that run in the background and perform tasks autonomously as delegated by the user. This paper blends models from marketing research and findings from the field of decision support systems to build a framework for the design of software agents to support e-commerce buying applications. (Contains 35…

  8. Unintended Consequences in Cancer Care Delivery Created by the Medicare Part B Proposal: Is the Clinical Rationale for the Experiment Flawed?

    PubMed

    Gordan, Lucio; Grogg, Amy; Blazer, Marlo; Fortner, Barry

    2017-02-01

    Medicare currently enrolls ≥ 45 million adults, and by 2030 this is projected to increase to ≥ 80 million beneficiaries. With this growth, the Centers for Medicare & Medicaid Services (CMS) issued a proposal, the Medicare Part B Drug Payment Model, to shrink drug expenditures, a major contributor to overall health care costs. For this to not adversely affect patient outcomes, lower-cost alternative medications with equivalent efficacy and no increased toxicity must be available. This is often not true in the treatment of cancer. Herein, we examine the flaws in the rationale of the CMS and the potential unintended consequences of this experiment. We identified the top three oncology expenditures (rituximab, bevacizumab, and trastuzumab) and their vetted alternatives (per the National Comprehensive Cancer Network guidelines) to ascertain whether lower-cost equivalent alternatives are available. Drug cost was based on April 2016 average sale price. We explored both efficacy of the agents and, when applicable, toxicity to compare alternatives to these high-dollar medications. For the largest Medicare oncology drug expenditures, there is not a lower-cost option with equal efficacy for their primary indications. Without lower-cost alternatives, the unintended consequence of this CMS experiment may include curtailing access to care or an increase in patient/program costs. The CMS proposal, by simply lowering reimbursement for drugs, does not acknowledge the value of these agents and could unintentionally reduce quality of care. Alternative approaches to value-based care, such as the Oncology Care Model and similar frameworks, should be explored.

  9. Genomic Conflicts that Cause Pollen Mortality and Raise Reproductive Barriers in Arabidopsis thaliana.

    PubMed

    Simon, Matthieu; Durand, Stéphanie; Pluta, Natacha; Gobron, Nicolas; Botran, Lucy; Ricou, Anthony; Camilleri, Christine; Budar, Françoise

    2016-07-01

    Species differentiation and the underlying genetics of reproductive isolation are central topics in evolutionary biology. Hybrid sterility is one kind of reproductive barrier that can lead to differentiation between species. Here, we analyze the complex genetic basis of the intraspecific hybrid male sterility that occurs in the offspring of two distant natural strains of Arabidopsis thaliana, Shahdara and Mr-0, with Shahdara as the female parent. Using both classical and quantitative genetic approaches as well as cytological observation of pollen viability, we demonstrate that this particular hybrid sterility results from two causes of pollen mortality. First, the Shahdara cytoplasm induces gametophytic cytoplasmic male sterility (CMS) controlled by several nuclear loci. Second, several segregation distorters leading to allele-specific pollen abortion (pollen killers) operate in hybrids with either cytoplasm. The complete sterility of the hybrid with the Shahdara cytoplasm results from the genetic linkage of the two causes of pollen mortality, i.e., CMS nuclear determinants and pollen killers. Furthermore, natural variation at these loci in A. thaliana is associated with different male-sterility phenotypes in intraspecific hybrids. Our results suggest that the genomic conflicts that underlie segregation distorters and CMS can concurrently lead to reproductive barriers between distant strains within a species. This study provides a new framework for identifying molecular mechanisms and the evolutionary history of loci that contribute to reproductive isolation, and possibly to speciation. It also suggests that two types of genomic conflicts, CMS and segregation distorters, may coevolve in natural populations. Copyright © 2016 by the Genetics Society of America.

  10. Multi-core processing and scheduling performance in CMS

    NASA Astrophysics Data System (ADS)

    Hernández, J. M.; Evans, D.; Foulkes, S.

    2012-12-01

    Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to effectively utilize the multi-core architecture. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in a much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model of computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs to have control over a larger quantum of resources, since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g. I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues, compared to the standard single-core processing workflows.
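
    The memory argument can be made concrete with a back-of-the-envelope comparison (numbers invented for illustration). If a job needs M_shared = 1 GB of common data (libraries, geometry, conditions) plus m = 0.5 GB of private working memory per core, then on an 8-core node

        8\,(M_\text{shared} + m) = 8 \times 1.5\,\text{GB} = 12\,\text{GB}
        \quad\text{(8 single-core jobs)}
        \qquad\text{vs.}\qquad
        M_\text{shared} + 8\,m = 1\,\text{GB} + 4\,\text{GB} = 5\,\text{GB}
        \quad\text{(one 8-core job)},

    which is the "much lower memory usage" the abstract refers to.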

  11. ControlShell - A real-time software framework

    NASA Technical Reports Server (NTRS)

    Schneider, Stanley A.; Ullman, Marc A.; Chen, Vincent W.

    1991-01-01

    ControlShell is designed to enable modular design and implementation of real-time software. It is an object-oriented tool-set for real-time software system programming. It provides a series of execution and data interchange mechanisms that form a framework for building real-time applications. These mechanisms allow a component-based approach to real-time software generation and management. By defining a set of interface specifications for intermodule interaction, ControlShell provides a common platform that is the basis for real-time code development and exchange.

  12. Software Framework for Peer Data-Management Services

    NASA Technical Reports Server (NTRS)

    Hughes, John; Hardman, Sean; Crichton, Daniel; Hyon, Jason; Kelly, Sean; Tran, Thuy

    2007-01-01

    Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.

  13. Morphological and genetic characterization of a new cytoplasmic male sterility system (oxa CMS) in stem mustard (Brassica juncea).

    PubMed

    Heng, Shuangping; Liu, Sansan; Xia, Chunxiu; Tang, HongYu; Xie, Fei; Fu, Tingdong; Wan, Zhengjie

    2018-01-01

    KEY MESSAGE: oxa CMS is a new cytoplasmic male sterility type in Brassica juncea. oxa CMS is a cytoplasmic male sterility (CMS) line that has been widely used in the production and cultivation of stem mustard in the southwestern China. In this study, different CMS-type specific mitochondrial markers were used to confirm that oxa CMS is distinct from the pol CMS, ogu CMS, nap CMS, hau CMS, tour CMS, Moricandia arvensis CMS, orf220-type CMS, etc., that have been previously reported in Brassica crops. Pollen grains of the oxa CMS line are sterile with a self-fertility rate of almost 0% and the sterility strain rate and sterility degree of oxa CMS is 100% due to a specific flower structure and flowering habit. Scanning electron microscopy revealed that most pollen grains in mature anthers of the oxa CMS line are empty, flat and deflated. Semi-thin section further showed that the abortive stage of anther development in oxa CMS is initiated at the late uninucleate stage. Abnormally vacuolated microspores caused male sterility in the oxa CMS line. This cytological study combined with marker-assisted selection showed that oxa CMS is a novel CMS type in stem mustard (Brassica juncea). Interestingly, the abortive stage of oxa CMS is later than those in other CMS types reported in Brassica crops, and there is no negative effect on the oxa CMS line growth period. This study demonstrated that this novel oxa CMS has a unique flower structure with sterile pollen grains at the late uninucleate stage. Our results may help to uncover the mechanism of oxa CMS in Brassica juncea.

  14. Architecture of a framework for providing information services for public transport.

    PubMed

    García, Carmelo R; Pérez, Ricardo; Lorenzo, Alvaro; Quesada-Arencibia, Alexis; Alayón, Francisco; Padrón, Gabino

    2012-01-01

    This paper presents OnRoute, a framework for developing and running ubiquitous software that provides information services to passengers of public transportation, including payment systems and on-route guidance services. To achieve a high level of interoperability, accessibility and context awareness, OnRoute uses the ubiquitous computing paradigm. To guarantee the quality of the software produced, the reliable software principles used in critical contexts, such as automotive systems, are also considered by the framework. The main components of its architecture (run-time, system services, software components and development discipline) and how they are deployed in the transportation network (stations and vehicles) are described in this paper. Finally, to illustrate the use of OnRoute, the development of a guidance service for travellers is explained.

  15. A Software Framework for Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.

    2008-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has a long history in developing simulations of experimental fixed-wing aircraft from gliders to suborbital vehicles on platforms ranging from desktop simulators to pilot-in-the-loop/aircraft-in-the-loop simulators. Regardless of the aircraft or simulator hardware, much of the software framework is common to all NASA Dryden simulators. Some of this software has withstood the test of time, but in recent years the push toward high-fidelity user-friendly simulations has resulted in some significant changes. This report presents an overview of the current NASA Dryden simulation software framework and capabilities with an emphasis on the new features that have permitted NASA to develop more capable simulations while maintaining the same staffing levels.

  16. Property-Based Software Engineering Measurement

    NASA Technical Reports Server (NTRS)

    Briand, Lionel; Morasca, Sandro; Basili, Victor R.

    1995-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysis, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. This framework defines several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalism and properties we introduce are convenient and intuitive. In addition, we have reviewed the literature on this subject and compared it with our work. This framework contributes constructively to a firmer theoretical ground of software measurement.
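
    Two of the postulated properties can be stated compactly (paraphrased and simplified from this line of work, not quoted): size should be additive over modules that share no elements, while complexity should not decrease when elements or relationships are added:

        \text{Size}(m_1 \cup m_2) = \text{Size}(m_1) + \text{Size}(m_2)
        \quad \text{for disjoint } m_1, m_2,
        \qquad
        S \subseteq S' \;\Rightarrow\; \text{Complexity}(S) \le \text{Complexity}(S').

    Properties of this kind rule out, for example, calling a measure a "size" if merging two disjoint modules changes their total, which is how the framework arbitrates the terminological disputes described above.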

  17. Property-Based Software Engineering Measurement

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.

    1997-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.

  18. ESPC Common Model Architecture Earth System Modeling Framework (ESMF) Software and Application Development

    DTIC Science & Technology

    2015-09-30

    originate from NASA, NOAA, and community modeling efforts, and support for creation of the suite was shared by sponsors from other agencies. ESPS… Capability (NUOPC) was established between NOAA and the Navy to develop a common software architecture for easy and efficient interoperability. (Cecelia Deluca, NESII/CIRES/NOAA Earth System Research Laboratory.)

  19. Diagnosis and Prognosis of Weapon Systems

    NASA Technical Reports Server (NTRS)

    Nolan, Mary; Catania, Rebecca; deMare, Gregory

    2005-01-01

    The Prognostics Framework is a set of software tools with an open architecture that affords a capability to integrate various prognostic software mechanisms and to provide information for operational and battlefield decision-making and logistical planning pertaining to weapon systems. The Prognostics Framework is also a system-level health-management software system that (1) receives data from performance-monitoring and built-in-test sensors and from other prognostic software and (2) processes the received data to derive a diagnosis and a prognosis for a weapon system. This software relates the diagnostic and prognostic information to the overall health of the system, to the ability of the system to perform specific missions, and to needed maintenance actions and maintenance resources. In the development of the Prognostics Framework, effort was focused primarily on extending previously developed model-based diagnostic-reasoning software to add prognostic reasoning capabilities, including capabilities to perform statistical analyses and to utilize information pertaining to deterioration of parts, failure modes, time sensitivity of measured values, mission criticality, historical data, and trends in measurement data. As thus extended, the software offers an overall health-monitoring capability.

  20. CRAB3: Establishing a new generation of services for distributed analysis at CMS

    NASA Astrophysics Data System (ADS)

    Cinquilli, M.; Spiga, D.; Grandi, C.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Riahi, H.; Vaandering, E.

    2012-12-01

    In CMS Computing the highest priorities for analysis tools are the improvement of the end users' ability to produce and publish reliable samples and analysis results as well as a transition to a sustainable development and operations model. To achieve these goals CMS decided to incorporate analysis processing into the same framework as data and simulation processing. This strategy foresees that all workload tools (Tier0, Tier1, production, analysis) share a common core with long term maintainability as well as the standardization of the operator interfaces. The re-engineered analysis workload manager, called CRAB3, makes use of newer technologies, such as RESTful web services and NoSQL databases, aiming to increase the scalability and reliability of the system. As opposed to CRAB2, in CRAB3 all work is centrally injected and managed in a global queue. A pool of agents, which can be geographically distributed, consumes work from the central services serving the user tasks. The new architecture of CRAB substantially changes the deployment model and operations activities. In this paper we present the implementation of CRAB3, emphasizing how the new architecture improves the workflow automation and simplifies maintainability. In particular, we will highlight the impact of the new design on daily operations.

  1. Web application for detailed real-time database transaction monitoring for CMS condition data

    NASA Astrophysics Data System (ADS)

    de Gruttola, Michele; Di Guida, Salvatore; Innocente, Vincenzo; Pierro, Antonio

    2012-12-01

    In the upcoming LHC era, databases have become an essential part of the experiments collecting data from the LHC, in order to safely store, and consistently retrieve, the wide amount of data produced by different sources. In the CMS experiment at CERN, all this information is stored in ORACLE databases, allocated on several servers, both inside and outside the CERN network. In this scenario, the task of monitoring different databases is a crucial database administration issue, since different information may be required depending on different users' tasks, such as data transfer, inspection, planning and security issues. We present here a web application based on a Python web framework and Python modules for data mining purposes. To customize the GUI we record traces of user interactions that are used to build use case models. In addition, the application detects errors in database transactions (for example, identifying mistakes made by users, application failures, unexpected network shutdowns or Structured Query Language (SQL) statement errors) and provides warning messages from the different users' perspectives. Finally, in order to fulfill the requirements of the CMS experiment community, and to keep pace with new developments in Web client tools, our application was further developed and new features were deployed.
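
    The abstract names the ingredients (a Python web framework, transaction mining, per-role warnings) but not the specific framework; the sketch below uses Flask, with the record format and error taxonomy invented for illustration.

        from flask import Flask, jsonify

        app = Flask(__name__)

        # Invented stand-in for parsed database transaction records.
        TRANSACTIONS = [
            {"user": "dropbox", "sql": "SELECT ...", "status": "OK"},
            {"user": "popcon", "sql": "INSERT ...", "status": "ORA-00001"},
            {"user": "popcon", "sql": "UPDATE ...", "status": "NET_LOST"},
        ]

        def classify(status):
            if status == "OK":
                return None
            if status.startswith("ORA-"):
                return "SQL statement error"
            return "connection or application failure"

        @app.route("/monitor/errors")
        def errors():
            """Failed transactions grouped by user, as JSON for the GUI."""
            report = {}
            for t in TRANSACTIONS:
                kind = classify(t["status"])
                if kind is not None:
                    report.setdefault(t["user"], []).append(
                        {"sql": t["sql"], "error": kind})
            return jsonify(report)

        if __name__ == "__main__":
            app.run(port=5000)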

  2. Evidence-based and heuristic approaches for customization of care in cardiometabolic syndrome after spinal cord injury

    PubMed Central

    Nash, Mark S.; Cowan, Rachel E.; Kressler, Jochen

    2012-01-01

    Component and coalesced health risks of the cardiometabolic syndrome (CMS) are commonly reported in persons with spinal cord injuries (SCIs). These CMS hazards are also co-morbid with physical deconditioning and elevated pro-atherogenic inflammatory cytokines, both of which are common after SCI and worsen the prognosis for all-cause cardiovascular disease. This article describes a systematic procedure for individualized CMS risk assessment after SCI, and emphasizes evidence-based and intuition-centered countermeasures to disease. A unified approach will propose therapeutic lifestyle intervention as a routine plan for aggressive primary prevention in this risk-susceptible population. Customization of dietary and exercise plans then follow, identifying shortfalls in diet and activity patterns, and ways in which these healthy lifestyles can be more substantially embraced by both stakeholders with SCI and their health care providers. In cases where lifestyle intervention utilizing diet and exercise is unsuccessful in countering risks, available pharmacotherapies and a preferred therapeutic agent are proposed according to authoritative standards. The over-arching purpose of the monograph is to create an operational framework in which existing evidence-based approaches or heuristic modeling becomes best practice. In this way persons with SCI can lead more active and healthy lives. PMID:23031165

  3. Description and performance of track and primary-vertex reconstruction with the CMS tracker

    DOE PAGES

    Chatrchyan, Serguei

    2014-10-16

    A description is provided of the software algorithms developed for the CMS tracker both for reconstructing charged-particle trajectories in proton-proton interactions and for using the resulting tracks to estimate the positions of the LHC luminous region and individual primary-interaction vertices. Despite the very hostile environment at the LHC, the performance obtained with these algorithms is found to be excellent. For tt̄ events under typical 2011 pileup conditions, the average track-reconstruction efficiency for promptly-produced charged particles with transverse momenta of pT > 0.9 GeV is 94% for pseudorapidities of |η| < 0.9 and 85% for 0.9 < |η| < 2.5. The inefficiency is caused mainly by hadrons that undergo nuclear interactions in the tracker material. For isolated muons, the corresponding efficiencies are essentially 100%. For isolated muons of pT = 100 GeV emitted at |η| < 1.4, the resolutions are approximately 2.8% in pT, and, respectively, 10 μm and 30 μm in the transverse and longitudinal impact parameters. The position resolution achieved for reconstructed primary vertices that correspond to interesting pp collisions is 10-12 μm in each of the three spatial dimensions. The tracking and vertexing software is fast and flexible, and easily adaptable to other functions, such as fast tracking for the trigger, or dedicated tracking for electrons that takes into account bremsstrahlung.

  4. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like ones, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
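
    The pipeline is essentially a comment filter: rewrite MATLAB %-comments into C-style blocks so Doxygen can parse them. The original uses Perl; a functionally similar filter is sketched here in Python, since the abstract does not fix the exact markup conventions.

        import re

        def mfile_to_doxygen(src):
            """Turn %-comment blocks of an M-file into /** ... */ blocks."""
            out, in_block = [], False
            for line in src.splitlines():
                m = re.match(r"\s*%+\s?(.*)", line)
                if m:
                    if not in_block:
                        out.append("/**")
                        in_block = True
                    out.append(" * " + m.group(1))
                else:
                    if in_block:
                        out.append(" */")
                        in_block = False
                    out.append("// " + line)  # keep code visible but inert
            if in_block:
                out.append(" */")
            return "\n".join(out)

        example = ("% Simulate one AO loop step.\n"
                   "% Inputs: wavefront, gain\n"
                   "y = step(x);")
        print(mfile_to_doxygen(example))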

  5. Eighteen-Month Outcomes of Titanium Frameworks Using Computer-Aided Design and Computer-Aided Manufacturing Method.

    PubMed

    Turkyilmaz, Ilser; Asar, Neset Volkan

    2017-06-01

    The aim of this report is to introduce new software and a new scanner with a noncontact laser probe, and to present outcomes of computer-aided design and computer-aided manufacturing titanium frameworks produced using them. Seven patients received 40 implants placed using a 1-stage protocol. After all implants were planned using an implant planning software (NobelClinician), either 5 or 6 implants were placed in each edentulous arch. Each edentulous arch was treated with a fixed dental prosthesis using an implant-supported complete-arch milled-titanium framework produced using the software (NobelProcera) and the scanner. All patients were followed up for 18 ± 3 months. Implant survival, prosthesis survival, framework fit, marginal bone levels, and maintenance requirements were evaluated. One implant was lost during the follow-up period, giving an implant survival rate of 97.5%; 0.4 ± 0.2 mm of marginal bone loss was noted for all implants after 18 ± 3 months. None of the prostheses needed a replacement, indicating a prosthesis success rate of 100%. The results of this clinical study suggest that titanium frameworks fabricated using the software and scanner presented in this study fit accurately and may be a viable option to restore edentulous arches.

  6. [Construction of educational software about personality disorders].

    PubMed

    Botti, Nadja Cristiane Lappann; Carneiro, Ana Luíza Marques; Almeida, Camila Souza; Pereira, Cíntia Braga Silva

    2011-01-01

    The study describes the experience of building educational software in the area of mental health. The software was developed to enable nursing students to identify personality disorders. In this process, we applied the pedagogical framework of Vygotsky and the theoretical framework of the diagnostic criteria defined by the DSM-IV. Based on these references, characters with personality disorders were identified in stories and/or children's movies. The software's multimedia database was built from graphics, sound, and explanatory material. The software was developed as an educational game, with questions at increasing levels of difficulty, using Microsoft Office PowerPoint 2007. This strategy is believed to be a valid teaching-learning approach for mental health nursing.

  7. NASA Software Documentation Standard

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  8. Detection of the Diversity of Cytoplasmic Male Sterility Sources in Broccoli (Brassica Oleracea var. Italica) Using Mitochondrial Markers.

    PubMed

    Shu, Jinshuai; Liu, Yumei; Li, Zhansheng; Zhang, Lili; Fang, Zhiyuan; Yang, Limei; Zhuang, Mu; Zhang, Yangyong; Lv, Honghao

    2016-01-01

    Broccoli (Brassica oleracea var. italica) is an important commercial vegetable crop. As part of an efficient pollination system, cytoplasmic male sterility (CMS) has been widely used for broccoli hybrid production. Identifying the original sources of CMS in broccoli accessions has become an important part of broccoli breeding. In this study, the diversity of the CMS sources of 39 broccoli accessions, including 19 CMS lines and 20 hybrids, was analyzed using mitochondrial markers. All CMS accessions contained the ogu orf138-related DNA fragment and the key genes of nap CMS, pol CMS, and tour CMS were not detected. The 39 CMS accessions were divided into five groups using six orf138-related and two simple sequence repeat markers. We observed that ogu CMS R3 constituted 79.49% of the CMS sources. CMS6 and CMS26 were differentiated from the other accessions using a specific primer. CMS32 was distinguished from the other accessions based on a 78-nucleotide deletion at the same locus as the orf138-related sequence. When the coefficient was about 0.90, five CMS accessions (13CMS6, 13CMS23, 13CMS24, 13CMS37, and 13CMS39) exhibiting abnormal floral organs with poor seed setting were grouped together. The polymerase chain reaction amplification profiles for these five accessions differed from those of the other accessions. We identified eight useful molecular markers that can be used to detect CMS types during broccoli breeding. Our data also provide important information relevant to future studies on the possible origins and molecular mechanisms of CMS in broccoli.

  9. Detection of the Diversity of Cytoplasmic Male Sterility Sources in Broccoli (Brassica Oleracea var. Italica) Using Mitochondrial Markers

    PubMed Central

    Shu, Jinshuai; Liu, Yumei; Li, Zhansheng; Zhang, Lili; Fang, Zhiyuan; Yang, Limei; Zhuang, Mu; Zhang, Yangyong; Lv, Honghao

    2016-01-01

    Broccoli (Brassica oleracea var. italica) is an important commercial vegetable crop. As part of an efficient pollination system, cytoplasmic male sterility (CMS) has been widely used for broccoli hybrid production. Identifying the original sources of CMS in broccoli accessions has become an important part of broccoli breeding. In this study, the diversity of the CMS sources of 39 broccoli accessions, including 19 CMS lines and 20 hybrids, was analyzed using mitochondrial markers. All CMS accessions contained the ogu orf138-related DNA fragment and the key genes of nap CMS, pol CMS, and tour CMS were not detected. The 39 CMS accessions were divided into five groups using six orf138-related and two simple sequence repeat markers. We observed that ogu CMS R3 constituted 79.49% of the CMS sources. CMS6 and CMS26 were differentiated from the other accessions using a specific primer. CMS32 was distinguished from the other accessions based on a 78-nucleotide deletion at the same locus as the orf138-related sequence. When the coefficient was about 0.90, five CMS accessions (13CMS6, 13CMS23, 13CMS24, 13CMS37, and 13CMS39) exhibiting abnormal floral organs with poor seed setting were grouped together. The polymerase chain reaction amplification profiles for these five accessions differed from those of the other accessions. We identified eight useful molecular markers that can be used to detect CMS types during broccoli breeding. Our data also provide important information relevant to future studies on the possible origins and molecular mechanisms of CMS in broccoli. PMID:27446156

  10. Architecture of a Framework for Providing Information Services for Public Transport

    PubMed Central

    García, Carmelo R.; Pérez, Ricardo; Lorenzo, Álvaro; Quesada-Arencibia, Alexis; Alayón, Francisco; Padrón, Gabino

    2012-01-01

    This paper presents OnRoute, a framework for developing and running ubiquitous software that provides information services to passengers of public transportation, including payment systems and on-route guidance services. To achieve a high level of interoperability, accessibility and context awareness, OnRoute uses the ubiquitous computing paradigm. To guarantee the quality of the software produced, the reliable software principles used in critical contexts, such as automotive systems, are also considered by the framework. The main components of its architecture (run-time, system services, software components and development discipline) and how they are deployed in the transportation network (stations and vehicles) are described in this paper. Finally, to illustrate the use of OnRoute, the development of a guidance service for travellers is explained. PMID:22778585

  11. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  12. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  13. Generic Software Architecture for Prognostics (GSAP) User Guide

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher Allen; Daigle, Matthew John; Watkins, Jason; Sankararaman, Shankar; Goebel, Kai

    2016-01-01

    The Generic Software Architecture for Prognostics (GSAP) is a framework for applying prognostics. It makes applying prognostics easier by implementing many of the common elements across prognostic applications. The standard interface enables reuse of prognostic algorithms and models across systems using the GSAP framework.
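
    The abstract above credits GSAP's standard interface with making prognostic algorithms and models reusable across systems. Purely as a hedged illustration of that idea, the Python sketch below separates a model interface from a generic end-of-life predictor; all class and function names are hypothetical and are not GSAP's actual API.

        from abc import ABC, abstractmethod

        class PrognosticModel(ABC):
            """Hypothetical model interface: advance state, test end-of-life."""

            @abstractmethod
            def next_state(self, state, load, dt):
                ...

            @abstractmethod
            def threshold_reached(self, state):
                ...

        class BatteryModel(PrognosticModel):
            """Toy capacity-fade model; numbers are purely illustrative."""

            def next_state(self, state, load, dt):
                return {"capacity": state["capacity"] - 1e-4 * load * dt}

            def threshold_reached(self, state):
                return state["capacity"] <= 0.8  # end of life at 80% capacity

        def predict_eol(model, state, load, dt=1.0, max_steps=100_000):
            """Generic predictor: works with any PrognosticModel, which is the
            point of a standard interface -- the algorithm is reused."""
            t = 0.0
            for _ in range(max_steps):
                if model.threshold_reached(state):
                    return t
                state = model.next_state(state, load, dt)
                t += dt
            return None

        print(predict_eol(BatteryModel(), {"capacity": 1.0}, load=1.0))  # 2000.0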

  14. The NOvA software testing framework

    NASA Astrophysics Data System (ADS)

    Tamsett, M.; C Group

    2015-12-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. NOvA has already produced more than one million Monte Carlo and detector-generated files amounting to more than 1 PB in size. This data is divided between a number of parallel streams, such as far and near detector beam spills, cosmic ray backgrounds, a number of data-driven triggers and over 20 different Monte Carlo configurations. Each of these data streams must be processed through the appropriate steps of the rapidly evolving, multi-tiered, interdependent NOvA software framework. In total there are more than 12 individual software tiers, each of which performs a different function and can be configured differently depending on the input stream. In order to regularly test and validate that all of these software stages are working correctly, NOvA has designed a powerful, modular testing framework that enables detailed validation and benchmarking to be performed in a fast, efficient and accessible way with minimal expert knowledge. The core of this system is a novel series of python modules which wrap, monitor and handle the underlying C++ software framework and then report the results to a slick front-end web-based interface. This interface utilises modern, cross-platform visualisation libraries to render the test results in a meaningful way. These are fast and flexible, allowing for the easy addition of new tests and datasets. In total, upwards of 14 individual streams are regularly tested, amounting to over 70 individual software processes and producing over 25 GB of output files. The rigour enforced through this flexible testing framework enables NOvA to rapidly verify configurations, results and software, and thus ensure that data is available for physics analysis in a timely and robust manner.
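
    The core mechanism described here - Python modules that wrap, monitor and report on processing stages - can be sketched generically. The snippet below is a minimal, hypothetical stand-in (tier names and commands are invented, not NOvA code): it runs each tier as a subprocess, records exit status and wall time, and emits JSON that a web front-end could render.

        import json
        import subprocess
        import sys
        import time

        def run_tier(name, cmd):
            """Run one software tier as a subprocess; return a monitoring record."""
            start = time.time()
            proc = subprocess.run(cmd, capture_output=True, text=True)
            return {
                "tier": name,
                "returncode": proc.returncode,
                "wall_seconds": round(time.time() - start, 3),
                "passed": proc.returncode == 0,
            }

        # A toy two-tier chain standing in for e.g. reconstruction -> analysis.
        chain = [
            ("reco", [sys.executable, "-c", "print('reco ok')"]),
            ("analysis", [sys.executable, "-c", "print('analysis ok')"]),
        ]

        results = [run_tier(name, cmd) for name, cmd in chain]
        print(json.dumps(results, indent=2))  # what a web front-end would consume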

  15. A Framework of the Use of Information in Software Testing

    ERIC Educational Resources Information Center

    Kaveh, Payman

    2010-01-01

    With the increasing role that software systems play in our daily lives, software quality has become extremely important. Software quality is impacted by the efficiency of the software testing process. There are a growing number of software testing methodologies, models, and initiatives to satisfy the need to improve software quality. The main…

  16. ICW eHealth Framework.

    PubMed

    Klein, Karsten; Wolff, Astrid C; Ziebold, Oliver; Liebscher, Thomas

    2008-01-01

    The ICW eHealth Framework (eHF) is a powerful infrastructure and platform for the development of service-oriented solutions in the health care business. It is the culmination of many years of experience of ICW in the development and use of in-house health care solutions and represents the foundation of ICW product developments based on the Java Enterprise Edition (Java EE). The ICW eHealth Framework has been leveraged to allow development by external partners - enabling adopters a straightforward integration into ICW solutions. The ICW eHealth Framework consists of reusable software components, development tools, architectural guidelines and conventions defining a full software-development and product lifecycle. From the perspective of a partner, the framework provides services and infrastructure capabilities for integrating applications within an eHF-based solution. This article introduces the ICW eHealth Framework's basic architectural concepts and technologies. It provides an overview of its module and component model, describes the development platform that supports the complete software development lifecycle of health care applications and outlines technological aspects, mainly focusing on application development frameworks and open standards.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, K; Yu, Z; Chen, H

    Purpose: To implement VMAT in RayStation with the Elekta Synergy linac with the new Agility MLC, and to use the same vendor's software tools to determine the optimum Elekta VMAT machine parameters in RayStation for accurate modeling and robust delivery. Methods: iCOMCat is utilized to create various beam patterns with user-defined dose rate, gantry, MLC and jaw speed for each control point. The accuracy and stability of the output and beam profile are qualified for each isolated functional component of VMAT delivery using an ion chamber and Profiler2 with an isocentric mounting fixture. Service graphing on the linac console is used to verify the mechanical motion accuracy. The determined optimum Elekta VMAT machine parameters were configured in RayStation v4.5.1. To evaluate the overall system performance, TG-119 test cases and nine retrospective VMAT patients were planned in RayStation and validated using both ArcCHECK (with plug and ion chamber) and MapCHECK2. Results: Machine output and profile vary by <0.3% when the only variable is dose rate (35–600 MU/min). Output variation of <0.9% and profile variation of <0.3% are observed with additional gantry motion (0.53–5.8 deg/s, both directions). The output and profile variation are still <1% with additional slow leaf motion (<1.5 cm/s, both directions). However, the profile becomes less symmetric, and >1.5% output and 7% profile deviations are seen with >2.5 cm/s leaf motion. All clinical cases achieved plan quality comparable to the treated IMRT plans. The gamma passing rate is 99.5±0.5% on ArcCHECK (<3% isocenter dose deviation) and 99.1±0.8% on MapCHECK2 using 3%/3mm gamma criteria (10% lower threshold). Mechanical motion accuracy in all VMAT deliveries is <1°/1 mm. Conclusion: Accurate RayStation modeling and robust VMAT delivery are achievable on the Elekta Agility for <2.5 cm/s leaf motion and the full range of dose rate and gantry speed, as determined with the same vendor's software tools. Our TG-119 and patient results have provided us with the confidence to use VMAT clinically.
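
    The 3%/3mm gamma passing rates quoted above are computed by commercial tools; purely to make the criterion concrete, here is a simplified one-dimensional gamma-index sketch. Real implementations are 2D/3D with interpolation and the low-dose threshold mentioned in the abstract; the profiles below are invented.

        import numpy as np

        def gamma_1d(dose_eval, dose_ref, x, dd=0.03, dta=3.0):
            """Per-point gamma: dd is the dose tolerance (fraction of the max
            reference dose), dta the distance-to-agreement in units of x (mm)."""
            dd_abs = dd * dose_ref.max()
            gammas = np.empty_like(dose_ref)
            for i, (xi, dref) in enumerate(zip(x, dose_ref)):
                dist2 = ((x - xi) / dta) ** 2          # spatial term vs. all points
                dose2 = ((dose_eval - dref) / dd_abs) ** 2  # dose-difference term
                gammas[i] = np.sqrt((dist2 + dose2).min())  # best combined match
            return gammas

        x = np.linspace(0, 100, 201)                   # positions in mm
        ref = np.exp(-((x - 50) / 20) ** 2)            # toy reference profile
        ev = np.exp(-((x - 50.5) / 20) ** 2) * 1.01    # slightly shifted/scaled
        g = gamma_1d(ev, ref, x)
        print(f"gamma passing rate: {100 * np.mean(g <= 1):.1f}%")  # pass if g <= 1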

  18. Evolution of the ATLAS Software Framework towards Concurrency

    NASA Astrophysics Data System (ADS)

    Jones, R. W. L.; Stewart, G. A.; Leggett, C.; Wynne, B. M.

    2015-05-01

    The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from early 2000, and the software and the physics code have been written using a single-threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. Maximising performance per watt will be a key metric, so all of these cores must be used as efficiently as possible. In order to address the deficiencies of the current framework, ATLAS has embarked upon two projects: first, a practical demonstration of the use of multi-threading in our reconstruction software, using the GaudiHive framework; second, an exercise to gather requirements for an updated framework, going back to first principles of how event processing occurs. In this paper we report on both these aspects of our work. For the hive-based demonstrators, we discuss what changes were necessary in order to allow the serially designed ATLAS code to run, both to the framework and to the tools and algorithms used. We report on the general lessons learned about the code patterns that had been employed in the software and which patterns were identified as particularly problematic for multi-threading. These lessons were fed into our considerations of a new framework, and we present preliminary conclusions on this work. In particular, we identify areas where the framework can be simplified in order to aid the implementation of a concurrent event processing scheme. Finally, we discuss the practical difficulties involved in migrating a large established code base to a multi-threaded framework and how this can be achieved for LHC Run 3.
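
    The abstract refers to code patterns that proved problematic for multi-threading without naming them; a classic one in event-processing frameworks is per-event state cached in a shared algorithm object. The toy Python below (illustrative only, not ATLAS code) contrasts such a stateful algorithm with a reentrant rewrite; the sleep call merely widens the race window so the problem shows up reliably.

        import threading
        import time

        class StatefulAlgo:
            """Caches per-event data in a member: two threads processing
            different events can overwrite each other's cache (a data race)."""
            def __init__(self):
                self.cache = None
            def execute(self, event):
                self.cache = event * 2      # shared mutable state
                time.sleep(0.001)           # widen the race window for the demo
                return self.cache + 1       # may read another thread's cache

        class ReentrantAlgo:
            """All per-event data stays local: safe to call concurrently."""
            def execute(self, event):
                local = event * 2           # local, no shared state
                time.sleep(0.001)
                return local + 1

        def count_wrong(algo):
            wrong = []
            def work(ev):
                if algo.execute(ev) != ev * 2 + 1:
                    wrong.append(ev)
            threads = [threading.Thread(target=work, args=(i,)) for i in range(50)]
            for t in threads: t.start()
            for t in threads: t.join()
            return len(wrong)

        print("stateful wrong results:", count_wrong(StatefulAlgo()))   # usually > 0
        print("reentrant wrong results:", count_wrong(ReentrantAlgo())) # always 0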

  19. Comparative Gene Expression Analyses Reveal Distinct Molecular Signatures between Differentially Reprogrammed Cardiomyocytes.

    PubMed

    Zhou, Yang; Wang, Li; Liu, Ziqing; Alimohamadi, Sahar; Yin, Chaoying; Liu, Jiandong; Qian, Li

    2017-09-26

    Cardiomyocytes derived from induced pluripotent stem cells (iPSC-CMs) or directly reprogrammed from non-myocytes (induced cardiomyocytes [iCMs]) are promising sources for heart regeneration or disease modeling. However, the similarities and differences between iPSC-CMs and iCMs are still unknown. Here, we performed transcriptome analyses of beating iPSC-CMs and iCMs generated from cardiac fibroblasts (CFs) of the same origin. Although both iPSC-CMs and iCMs establish CM-like molecular features globally, iPSC-CMs exhibit a relatively hyperdynamic epigenetic status, whereas iCMs exhibit a maturation status that more closely resembles that of adult CMs. Based on gene expression of metabolic enzymes, iPSC-CMs primarily employ glycolysis, whereas iCMs utilize fatty acid oxidation as the main pathway. Importantly, iPSC-CMs and iCMs exhibit different cell-cycle status, alteration of which influenced their maturation. Therefore, our study provides a foundation for understanding the pros and cons of different reprogramming approaches. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  20. Towards Archetypes-Based Software Development

    NASA Astrophysics Data System (ADS)

    Piho, Gunnar; Roost, Mart; Perkins, David; Tepandi, Jaak

    We present a framework for the archetypes-based engineering of domains, requirements and software (Archetypes-Based Software Development, ABD). An archetype is defined as a primordial object that occurs consistently and universally in business domains and in business software systems. An archetype pattern is a collaboration of archetypes. Archetypes and archetype patterns are used to capture conceptual information into domain-specific models that are utilized by ABD. The focus of ABD is on software factories - family-based development artefacts (domain-specific languages, patterns, frameworks, tools, micro processes, and others) that can be used to build the family members. We demonstrate the usage of ABD for developing laboratory information management system (LIMS) software for the Clinical and Biomedical Proteomics Group at the Leeds Institute of Molecular Medicine, University of Leeds.
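
    As a rough illustration of what an archetype and an archetype pattern look like in code, the sketch below models the widely cited Party/PartyRole collaboration; the names follow the archetype-pattern literature generally and are not taken from the paper's LIMS models.

        from dataclasses import dataclass, field

        @dataclass
        class PartyRole:
            """Archetype: a role a party can play in some context."""
            name: str                      # e.g. "Customer", "Sample Submitter"

        @dataclass
        class Party:
            """Archetype: any identifiable person or organization."""
            identifier: str
            roles: list = field(default_factory=list)

            def assign(self, role: PartyRole):
                # The Party/PartyRole pair is an archetype pattern: a
                # collaboration of archetypes that recurs across business systems.
                self.roles.append(role)

        lab = Party("Clinical and Biomedical Proteomics Group")
        lab.assign(PartyRole("Sample Submitter"))
        print([r.name for r in lab.roles])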

  1. Software Reviews Since Acquisition Reform - The Artifact Perspective

    DTIC Science & Technology

    2004-01-01

    Fragmentary briefing-slide excerpts (Peter Hantos, Acquisition of Software Intensive Systems, 2004): risk management, old vs. new; single, basic software paradigm; single processor; ... software risk mitigation related trade-offs must be done together; integral software engineering activities; process maturity and quality frameworks; quality

  2. A Framework for Testing Scientific Software: A Case Study of Testing Amsterdam Discrete Dipole Approximation Software

    NASA Astrophysics Data System (ADS)

    Shao, Hongbing

    Software testing of scientific software systems often suffers from the test oracle problem, i.e., a lack of test oracles. The Amsterdam discrete dipole approximation code (ADDA) is a scientific software system that can be used to simulate light scattering by scatterers of various types, and testing of ADDA suffers from the test oracle problem. In this thesis work, I established a testing framework for scientific software systems and evaluated this framework using ADDA as a case study. To test ADDA, I first used the CMMIE code as a pseudo-oracle to test ADDA in simulating light scattering by a homogeneous sphere scatterer. Comparable results were obtained between ADDA and the CMMIE code, validating ADDA for use with homogeneous sphere scatterers. I then used an experimental result obtained for light scattering by a homogeneous sphere for further validation; ADDA produced a light scattering simulation comparable to the experimentally measured result, further validating the use of ADDA for simulating light scattering by sphere scatterers. I then used metamorphic testing to generate test cases covering scatterers of various geometries, orientations, and homogeneity or non-homogeneity. ADDA was tested under each of these test cases and all tests passed. The use of statistical analysis together with metamorphic testing is discussed as a future direction. In short, using ADDA as a case study, I established a testing framework, including the use of pseudo-oracles, experimental results and metamorphic testing techniques, to test scientific software systems that suffer from test oracle problems. Each of these techniques is necessary and contributes to the testing of the software under test.
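
    The metamorphic-testing step can be illustrated without ADDA itself. In the sketch below, scattering_cross_section is a hypothetical stand-in for a simulator run; the structure is what matters: follow-up test cases are derived from relations the physics guarantees (orientation invariance for a sphere, monotonicity in size) rather than from a ground-truth oracle.

        import math

        def scattering_cross_section(radius, orientation_deg):
            """Stand-in for a simulator run. For a homogeneous sphere the
            physics says the result cannot depend on orientation -- that is
            our metamorphic relation."""
            return math.pi * radius ** 2 * 0.9  # toy geometric cross-section

        def test_orientation_invariance(radius):
            """Metamorphic test: rotating a sphere must not change the output."""
            base = scattering_cross_section(radius, orientation_deg=0.0)
            for angle in (30.0, 90.0, 180.0):
                follow_up = scattering_cross_section(radius, orientation_deg=angle)
                assert math.isclose(base, follow_up, rel_tol=1e-9), (
                    f"relation violated at {angle} deg")

        def test_monotone_in_radius():
            """Second relation: a larger sphere must scatter at least as much."""
            results = [scattering_cross_section(r, 0.0) for r in (0.5, 1.0, 2.0)]
            assert results == sorted(results), "relation violated"

        test_orientation_invariance(1.0)
        test_monotone_in_radius()
        print("metamorphic relations hold")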

  3. 45 CFR 150.203 - Circumstances requiring CMS enforcement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    45 CFR 150.203 (Title 45, Public Welfare; Requirements Relating to Health Care Access - CMS Enforcement in Group and Individual Insurance Markets, CMS Enforcement Processes; 2010-10-01): Circumstances requiring CMS enforcement. CMS enforces HIPAA requirements to the extent warranted (as determined by CMS) in...

  4. Coupled dam safety analysis using WinDAM

    USDA-ARS?s Scientific Manuscript database

    Windows® Dam Analysis Modules (WinDAM) is a set of modular software components that can be used to analyze overtopping and internal erosion of embankment dams. Dakota is an extensive software framework for design exploration and simulation. These tools can be coupled to create a powerful framework...

  5. Multi-core processing and scheduling performance in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, J. M.; Evans, D.; Foulkes, S.

    2012-01-01

    Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to effectively utilize the multi-core architecture. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model for computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs control over a larger quantum of resources, since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g. I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues, compared to the standard single-core processing workflows.
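
    The memory-sharing effect described here comes from loading the large read-only data once and then forking workers, so the operating system shares those pages copy-on-write. A minimal Python sketch of the pattern (payload and sizes invented; requires a platform with fork, e.g. Linux):

        import multiprocessing as mp

        CONDITIONS = None  # large read-only data, loaded once before forking

        def process_events(event_range):
            # Children read the parent's CONDITIONS without copying it
            # (copy-on-write pages shared across all workers).
            return sum(CONDITIONS[i % len(CONDITIONS)] for i in event_range)

        if __name__ == "__main__":
            CONDITIONS = list(range(1_000_000))      # stand-in for geometry data
            mp.set_start_method("fork")              # fork is what enables sharing
            ranges = [range(i * 1000, (i + 1) * 1000) for i in range(4)]
            with mp.Pool(processes=4) as pool:       # 4 cores of one "whole node"
                print(pool.map(process_events, ranges))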

  6. Development and use of mathematical models and software frameworks for integrated analysis of agricultural systems and associated water use impacts

    USGS Publications Warehouse

    Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.

    2016-01-01

    The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
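
    The coupled tool's core loop - an optimizer repeatedly invoking a simulation and scoring competing stakeholder objectives - can be sketched generically. Below, farm_simulation and all coefficients are invented stand-ins (the real system drives MF-OWHM runs from DAKOTA); the sketch only shows the weighted-objective, penalty-constrained shape of such a loop.

        import numpy as np
        from scipy.optimize import minimize

        def farm_simulation(acres):
            """Toy stand-in for a hydrologic/agronomic model run: returns
            (profit, water_use) for acres of [crop_a, crop_b]."""
            profit = 300 * acres[0] + 500 * acres[1] - 2 * acres[1] ** 2
            water_use = 1.0 * acres[0] + 3.0 * acres[1]
            return profit, water_use

        def objective(acres, w_profit=1.0, w_water=50.0, total_land=200.0):
            """Weighted sum of competing objectives, plus a land-budget penalty."""
            profit, water = farm_simulation(acres)
            penalty = 1e4 * max(0.0, acres.sum() - total_land) \
                    + 1e4 * np.sum(np.maximum(0.0, -acres))  # no negative plantings
            return -w_profit * profit + w_water * water + penalty

        result = minimize(objective, x0=np.array([100.0, 50.0]), method="Nelder-Mead")
        print("acres per crop:", result.x.round(1))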

  7. System administrator`s guide to CDPS. Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Didier, B.T.; Portwood, M.H.

    The System Administrator's Guide to CDPS is intended for those responsible for setting up and maintaining the hardware and software of a Common Mapping Standard (CMS) Data Production System (CDPS) installation. This guide assists the system administrator in performing typical administrative functions. It is not intended to replace the Ultrix Documentation Set that should be available for a CDPS installation; the Ultrix Documentation Set will be required for details on referenced Ultrix commands as well as procedures for performing Ultrix maintenance functions. There are six major sections in this guide. Section 1 introduces the system administrator to CDPS and describes the assumptions that are made by this guide. Section 2 describes the CDPS platform configuration. Section 3 describes the platform preparation that is required to install the CDPS software. Section 4 describes the CPS software and its installation procedures. Section 5 describes the CDS software and its installation procedures. Section 6 describes various operation and maintenance procedures. Four appendices are also provided. Appendix A contains a list of the acronyms used. Appendix B provides a terse description of common Ultrix commands that are used in administrative functions. Appendix C provides sample CPS and CDS configuration files. Appendix D provides a required list and a recommended list of Ultrix software subsets for installation on a CDPS platform.

  8. Science on Drupal: An evaluation of CMS Technologies

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Gonzalez, A.; Pinto, A.; Pascuzzi, F.; Gerard, A.

    2011-12-01

    We conducted an extensive evaluation of various Content Management System (CMS) technologies for implementing different websites supporting interdisciplinary science data and information. We chose two products, Drupal and Bluenog/Hippo CMS, to meet our specific needs and requirements. Drupal is an open source product that is quick and easy to set up and use. It is a very mature, stable, and widely used product. It has rich functionality supported by a large and active user base and developer community. There are many plugins available that provide additional features for managing citations, map galleries, semantic search, digital repositories (Fedora), scientific workflows, collaborative authoring, social networking, and other functions. All of these work very well within the Drupal framework if minimal customization is needed. We have successfully implemented Drupal for multiple projects such as: 1) the Haiti Regeneration Initiative (http://haitiregeneration.org/); 2) the Consortium on Climate Risk in the Urban Northeast (http://beta.ccrun.org/); and 3) the Africa Soils Information Service (http://africasoils.net/). We are also developing two other websites, the Côte Sud Initiative (CSI) and Emerging Infectious Diseases, using Drupal. We are testing the Drupal multi-site install for managing different websites with one install to streamline the maintenance. In addition, paid support and consultancy for Drupal website development are available at affordable prices. All of these features make Drupal very attractive for implementing state-of-the-art scientific websites that do not have complex requirements. One of our major websites, the NASA Socioeconomic Data and Applications Center (SEDAC), has a very complex set of requirements. It has to easily re-purpose content across multiple web pages and sites with different presentations. It has to serve the content via REST or similar standard interfaces so that external client applications can access content in the CMS repository. This means the content repository and structure should be completely separated from the content presentation and site structure. In addition to the CMS repository, the front-end website has to be able to consume, integrate, and display diverse content flexibly from multiple back-end systems, including custom and legacy systems, such as Oracle, Geoserver, Flickr, Fedora, and other web services. We needed the ability to customize the workflow to author, edit, approve, and publish content based on different content types and project requirements. In addition, we required the ability to use the existing Active Directory for user management, with support for roles, groups, and permissions using an Access Control List (ACL) model. The ability to version and lock content was also important. We determined that most of these capabilities are difficult to implement with Drupal and would need significant customization. The Bluenog eCMS (enterprise CMS) product satisfied most of these requirements. Bluenog eCMS is based on an open source product called Hippo with customizations and support provided by the vendor Bluenog. Our newly redesigned and recently released SEDAC website, http://sedac.ciesin.columbia.edu, is implemented using Bluenog eCMS. Other products we evaluated include WebLogic portal, Magnolia, Liferay portal, and Alfresco.

  9. Towards a comprehensive framework for reuse: A reuse-enabling software evolution environment

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Rombach, H. D.

    1988-01-01

    Reuse of products, processes and knowledge will be the key to enable the software industry to achieve the dramatic improvement in productivity and quality required to satisfy the anticipated growing demand. Although experience shows that certain kinds of reuse can be successful, general success has been elusive. A software life-cycle technology which allows broad and extensive reuse could provide the means to achieving the desired order-of-magnitude improvements. The scope of a comprehensive framework for understanding, planning, evaluating and motivating reuse practices and the necessary research activities is outlined. As a first step towards such a framework, a reuse-enabling software evolution environment model is introduced which provides a basis for the effective recording of experience, the generalization and tailoring of experience, the formalization of experience, and the (re-)use of experience.

  10. A Framework for Teaching Software Development Methods

    ERIC Educational Resources Information Center

    Dubinsky, Yael; Hazzan, Orit

    2005-01-01

    This article presents a study that aims at constructing a teaching framework for software development methods in higher education. The research field is a capstone project-based course, offered by the Technion's Department of Computer Science, in which Extreme Programming is introduced. The research paradigm is an Action Research that involves…

  11. Frameworks Coordinate Scientific Data Management

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Jet Propulsion Laboratory computer scientists developed a unique software framework to help NASA manage its massive amounts of science data. Through a partnership with the Apache Software Foundation of Forest Hill, Maryland, the technology is now available as an open-source solution and is in use by cancer researchers and pediatric hospitals.

  12. Dose-ranging pharmacokinetics of colistin methanesulphonate (CMS) and colistin in rats following single intravenous CMS doses.

    PubMed

    Marchand, Sandrine; Lamarche, Isabelle; Gobin, Patrice; Couet, William

    2010-08-01

    The aim of this study was to evaluate the effect of colistin methanesulphonate (CMS) dose on CMS and colistin pharmacokinetics in rats. Three rats per group received an intravenous bolus of CMS at a dose of 5, 15, 30, 60 or 120 mg/kg. Arterial blood samples were drawn at 0, 5, 15, 30, 60, 90, 120, 150 and 180 min. CMS and colistin plasma concentrations were determined by liquid chromatography-tandem mass spectrometry (LC-MS/MS). The pharmacokinetic parameters of CMS and colistin were calculated by non-compartmental analysis. Linear relationships were observed between CMS and colistin AUCs to infinity and CMS doses, as well as between CMS and colistin Cmax and CMS doses. CMS and colistin pharmacokinetics were linear for a range of colistin concentrations covering the range of values encountered and recommended in patients even during treatment with higher doses.
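
    The dose-linearity conclusion rests on non-compartmental AUC-to-infinity estimates. A minimal sketch of that calculation, with invented concentration values at the sampling times listed above: trapezoidal AUC over the observed window plus the standard C_last/lambda_z extrapolation, with lambda_z taken from a log-linear fit to the terminal points.

        import numpy as np

        t = np.array([5, 15, 30, 60, 90, 120, 150, 180]) / 60.0  # sampling times, h
        c = np.array([40.0, 30.0, 20.0, 9.0, 4.0, 1.8, 0.8, 0.36])  # mg/L, invented

        # Linear trapezoidal rule over the observed window.
        auc_0_t = np.sum(np.diff(t) * (c[:-1] + c[1:]) / 2)

        # Terminal slope lambda_z from a log-linear fit to the last four points,
        # then extrapolate the tail as C_last / lambda_z.
        lam_z = -np.polyfit(t[-4:], np.log(c[-4:]), 1)[0]
        auc_inf = auc_0_t + c[-1] / lam_z

        print(f"AUC0-t={auc_0_t:.1f}, lambda_z={lam_z:.2f} 1/h, "
              f"AUCinf={auc_inf:.1f} mg*h/L")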

  13. Development of a software framework for data assimilation and its applications for streamflow forecasting in Japan

    NASA Astrophysics Data System (ADS)

    Noh, S. J.; Tachikawa, Y.; Shiiba, M.; Yorozu, K.; Kim, S.

    2012-04-01

    Data assimilation methods have received increased attention for uncertainty assessment and for enhancing forecasting capability in various areas. Despite this potential, software frameworks applicable to probabilistic approaches and data assimilation are still limited, because most hydrologic modeling software is based on a deterministic approach. In this study, we developed a hydrological modeling framework for sequential data assimilation, called MPI-OHyMoS. MPI-OHyMoS allows users to develop their own element models and to easily build a total simulation system model for hydrological simulations. Unlike a process-based modeling framework, this software framework benefits from its object-oriented design, flexibly representing hydrological processes without any change to the main library. Sequential data assimilation based on particle filters is available for any hydrologic model built on MPI-OHyMoS, considering various sources of uncertainty originating from input forcing, parameters and observations. The particle filters are a Bayesian learning process in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles, without any assumptions about the nature of the distributions. In MPI-OHyMoS, ensemble simulations are parallelized, which can take advantage of high performance computing (HPC) systems. We applied this software framework to short-term streamflow forecasting of several catchments in Japan using a distributed hydrologic model. Uncertainty of model parameters and of remotely sensed rainfall data, such as X-band or C-band radar, is estimated and mitigated in the sequential data assimilation.
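
    The particle-filter update described here (propagate particles with noise, weight by the observation likelihood, resample) can be shown on a toy one-state model. The sketch below is a generic bootstrap (SIR) filter with invented model and noise terms, not MPI-OHyMoS code:

        import numpy as np

        rng = np.random.default_rng(0)
        n_particles, n_steps = 500, 30

        def model_step(state, rain):
            """Toy rainfall-runoff step: storage decays and collects rainfall."""
            return 0.9 * state + rain

        truth = 10.0
        particles = rng.normal(10.0, 2.0, n_particles)  # uncertain initial state
        for t in range(n_steps):
            rain = rng.uniform(0.0, 2.0)
            truth = model_step(truth, rain)
            obs = truth + rng.normal(0.0, 0.5)          # noisy streamflow obs
            # Propagate with process noise (forcing/parameter uncertainty).
            particles = model_step(particles, rain) \
                + rng.normal(0.0, 0.3, n_particles)
            # Weight by the observation likelihood, then resample
            # (the Bayesian update; the floor guards against zero weights).
            weights = np.exp(-0.5 * ((obs - particles) / 0.5) ** 2) + 1e-300
            weights /= weights.sum()
            particles = rng.choice(particles, size=n_particles, p=weights)

        print(f"truth {truth:.2f}  filtered mean {particles.mean():.2f}")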

  14. Phenomenological MSSM interpretation of CMS searches in pp collisions at √s = 7 and 8 TeV

    DOE PAGES

    Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.; ...

    2016-10-24

    Searches for new physics by the CMS collaboration are interpreted in the framework of the phenomenological minimal supersymmetric standard model (pMSSM). The data samples used in this study were collected at √s = 7 and 8 TeV and have integrated luminosities of 5.0 fb⁻¹ and 19.5 fb⁻¹, respectively. A global Bayesian analysis is performed, incorporating results from a broad range of CMS supersymmetry searches, as well as constraints from other experiments. Because the pMSSM incorporates several well-motivated assumptions that reduce the 120 parameters of the MSSM to just 19 parameters defined at the electroweak scale, it is possible to assess the results of the study in a relatively straightforward way. Approximately half of the model points in a potentially accessible subspace of the pMSSM are excluded, including all pMSSM model points with a gluino mass below 500 GeV, as well as models with a squark mass less than 300 GeV. Models with chargino and neutralino masses below 200 GeV are disfavored, but no mass range of model points can be ruled out based on the analyses considered. Lastly, the non-excluded regions in the pMSSM parameter space are characterized in terms of physical processes and key observables, and implications for future searches are discussed.

  15. Phenomenological MSSM interpretation of CMS searches in pp collisions at √s = 7 and 8 TeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.

    Searches for new physics by the CMS collaboration are interpreted in the framework of the phenomenological minimal supersymmetric standard model (pMSSM). The data samples used in this study were collected at √s = 7 and 8 TeV and have integrated luminosities of 5.0 fb⁻¹ and 19.5 fb⁻¹, respectively. A global Bayesian analysis is performed, incorporating results from a broad range of CMS supersymmetry searches, as well as constraints from other experiments. Because the pMSSM incorporates several well-motivated assumptions that reduce the 120 parameters of the MSSM to just 19 parameters defined at the electroweak scale, it is possible to assess the results of the study in a relatively straightforward way. Approximately half of the model points in a potentially accessible subspace of the pMSSM are excluded, including all pMSSM model points with a gluino mass below 500 GeV, as well as models with a squark mass less than 300 GeV. Models with chargino and neutralino masses below 200 GeV are disfavored, but no mass range of model points can be ruled out based on the analyses considered. Lastly, the non-excluded regions in the pMSSM parameter space are characterized in terms of physical processes and key observables, and implications for future searches are discussed.

  16. Phenomenological MSSM interpretation of CMS searches in pp collisions at √s = 7 and 8 TeV

    NASA Astrophysics Data System (ADS)

    Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.; Adam, W.; Asilar, E.; Bergauer, T.; Brandstetter, J.; Brondolin, E.; Dragicevic, M.; Erö, J.; Flechl, M.; Friedl, M.; Frühwirth, R.; Ghete, V. M.; Hartl, C.; Hörmann, N.; Hrubec, J.; Jeitler, M.; König, A.; Krammer, M.; Krätschmer, I.; Liko, D.; Matsushita, T.; Mikulec, I.; Rabady, D.; Rad, N.; Rahbaran, B.; Rohringer, H.; Schieck, J.; Schöfbeck, R.; Strauss, J.; Treberer-Treberspurg, W.; Waltenberger, W.; Wulz, C.-E.; Mossolov, V.; Shumeiko, N.; Suarez Gonzalez, J.; Alderweireldt, S.; Cornelis, T.; de Wolf, E. A.; Janssen, X.; Knutsson, A.; Lauwers, J.; Luyckx, S.; van de Klundert, M.; van Haevermaet, H.; van Mechelen, P.; van Remortel, N.; van Spilbeeck, A.; Abu Zeid, S.; Blekman, F.; D'Hondt, J.; Daci, N.; de Bruyn, I.; Deroover, K.; Heracleous, N.; Keaveney, J.; Lowette, S.; Moortgat, S.; Moreels, L.; Olbrechts, A.; Python, Q.; Strom, D.; Tavernier, S.; van Doninck, W.; van Mulders, P.; van Onsem, G. P.; van Parijs, I.; Brun, H.; Caillol, C.; Clerbaux, B.; de Lentdecker, G.; Fasanella, G.; Favart, L.; Goldouzian, R.; Grebenyuk, A.; Karapostoli, G.; Lenzi, T.; Léonard, A.; Maerschalk, T.; Marinov, A.; Perniè, L.; Randle-Conde, A.; Seva, T.; Vander Velde, C.; Vanlaer, P.; Yonamine, R.; Zenoni, F.; Zhang, F.; Beernaert, K.; Benucci, L.; Cimmino, A.; Crucy, S.; Dobur, D.; Fagot, A.; Garcia, G.; Gul, M.; McCartin, J.; Ocampo Rios, A. A.; Poyraz, D.; Ryckbosch, D.; Salva, S.; Sigamani, M.; Tytgat, M.; van Driessche, W.; Yazgan, E.; Zaganidis, N.; Basegmez, S.; Beluffi, C.; Bondu, O.; Brochet, S.; Bruno, G.; Caudron, A.; Ceard, L.; de Visscher, S.; Delaere, C.; Delcourt, M.; Favart, D.; Forthomme, L.; Giammanco, A.; Jafari, A.; Jez, P.; Komm, M.; Lemaitre, V.; Mertens, A.; Musich, M.; Nuttens, C.; Perrini, L.; Piotrzkowski, K.; Quertenmont, L.; Selvaggi, M.; Vidal Marono, M.; Beliy, N.; Hammad, G. H.; Aldá Júnior, W. L.; Alves, F. L.; Alves, G. A.; Brito, L.; Correa Martins Junior, M.; Hamer, M.; Hensel, C.; Moraes, A.; Pol, M. E.; Rebello Teles, P.; Belchior Batista Das Chagas, E.; Carvalho, W.; Chinellato, J.; Custódio, A.; da Costa, E. M.; de Jesus Damiao, D.; de Oliveira Martins, C.; Fonseca de Souza, S.; Huertas Guativa, L. M.; Malbouisson, H.; Matos Figueiredo, D.; Mora Herrera, C.; Mundim, L.; Nogima, H.; Prado da Silva, W. L.; Santoro, A.; Sznajder, A.; Tonelli Manganote, E. J.; Vilela Pereira, A.; Ahuja, S.; Bernardes, C. A.; de Souza Santos, A.; Dogra, S.; Fernandez Perez Tomei, T. R.; Gregores, E. M.; Mercadante, P. G.; Moon, C. S.; Novaes, S. F.; Padula, Sandra S.; Romero Abad, D.; Ruiz Vargas, J. C.; Aleksandrov, A.; Hadjiiska, R.; Iaydjiev, P.; Rodozov, M.; Stoykova, S.; Sultanov, G.; Vutova, M.; Dimitrov, A.; Glushkov, I.; Litov, L.; Pavlov, B.; Petkov, P.; Fang, W.; Ahmad, M.; Bian, J. G.; Chen, G. M.; Chen, H. S.; Chen, M.; Cheng, T.; Du, R.; Jiang, C. H.; Leggat, D.; Plestina, R.; Romeo, F.; Shaheen, S. M.; Spiezia, A.; Tao, J.; Wang, C.; Wang, Z.; Zhang, H.; Asawatangtrakuldee, C.; Ban, Y.; Li, Q.; Liu, S.; Mao, Y.; Qian, S. J.; Wang, D.; Xu, Z.; Avila, C.; Cabrera, A.; Chaparro Sierra, L. F.; Florez, C.; Gomez, J. P.; Gomez Moreno, B.; Sanabria, J. C.; Godinovic, N.; Lelas, D.; Puljak, I.; Ribeiro Cipriano, P. M.; Antunovic, Z.; Kovac, M.; Brigljevic, V.; Kadija, K.; Luetic, J.; Micanovic, S.; Sudic, L.; Attikis, A.; Mavromanolakis, G.; Mousa, J.; Nicolaou, C.; Ptochos, F.; Razis, P. A.; Rykaczewski, H.; Finger, M.; Finger, M.; Elkafrawy, T.; Mahmoud, M. 
A.; Mohammed, Y.; Calpas, B.; Kadastik, M.; Murumaa, M.; Raidal, M.; Tiko, A.; Veelken, C.; Eerola, P.; Pekkanen, J.; Voutilainen, M.; Härkönen, J.; Karimäki, V.; Kinnunen, R.; Lampén, T.; Lassila-Perini, K.; Lehti, S.; Lindén, T.; Luukka, P.; Peltola, T.; Tuominiemi, J.; Tuovinen, E.; Wendland, L.; Talvitie, J.; Tuuva, T.; Besancon, M.; Couderc, F.; Dejardin, M.; Denegri, D.; Fabbro, B.; Faure, J. L.; Favaro, C.; Ferri, F.; Ganjour, S.; Givernaud, A.; Gras, P.; Hamel de Monchenault, G.; Jarry, P.; Locci, E.; Machet, M.; Malcles, J.; Rander, J.; Rosowsky, A.; Titov, M.; Zghiche, A.; Abdulsalam, A.; Antropov, I.; Baffioni, S.; Beaudette, F.; Busson, P.; Cadamuro, L.; Chapon, E.; Charlot, C.; Davignon, O.; Filipovic, N.; Granier de Cassagnac, R.; Jo, M.; Kraml, S.; Lisniak, S.; Miné, P.; Naranjo, I. N.; Nguyen, M.; Ochando, C.; Ortona, G.; Paganini, P.; Pigard, P.; Regnard, S.; Salerno, R.; Sirois, Y.; Strebler, T.; Yilmaz, Y.; Zabi, A.; Agram, J.-L.; Andrea, J.; Aubin, A.; Bloch, D.; Brom, J.-M.; Buttignol, M.; Chabert, E. C.; Chanon, N.; Collard, C.; Conte, E.; Coubez, X.; Fontaine, J.-C.; Gelé, D.; Goerlach, U.; Goetzmann, C.; Le Bihan, A.-C.; Merlin, J. A.; Skovpen, K.; van Hove, P.; Gadrat, S.; Beauceron, S.; Bernet, C.; Boudoul, G.; Bouvier, E.; Carrillo Montoya, C. A.; Chierici, R.; Contardo, D.; Courbon, B.; Depasse, P.; El Mamouni, H.; Fan, J.; Fay, J.; Gascon, S.; Gouzevitch, M.; Ille, B.; Lagarde, F.; Laktineh, I. B.; Lethuillier, M.; Mirabito, L.; Pequegnot, A. L.; Perries, S.; Popov, A.; Ruiz Alvarez, J. D.; Sabes, D.; Sordini, V.; Vander Donckt, M.; Verdier, P.; Viret, S.; Toriashvili, T.; Tsamalaidze, Z.; Autermann, C.; Beranek, S.; Feld, L.; Heister, A.; Kiesel, M. K.; Klein, K.; Lipinski, M.; Ostapchuk, A.; Preuten, M.; Raupach, F.; Schael, S.; Schulte, J. F.; Verlage, T.; Weber, H.; Zhukov, V.; Ata, M.; Brodski, M.; Dietz-Laursonn, E.; Duchardt, D.; Endres, M.; Erdmann, M.; Erdweg, S.; Esch, T.; Fischer, R.; Güth, A.; Hebbeker, T.; Heidemann, C.; Hoepfner, K.; Knutzen, S.; Merschmeyer, M.; Meyer, A.; Millet, P.; Mukherjee, S.; Olschewski, M.; Padeken, K.; Papacz, P.; Pook, T.; Radziej, M.; Reithler, H.; Rieger, M.; Scheuch, F.; Sonnenschein, L.; Teyssier, D.; Thüer, S.; Cherepanov, V.; Erdogan, Y.; Flügge, G.; Geenen, H.; Geisler, M.; Hoehle, F.; Kargoll, B.; Kress, T.; Künsken, A.; Lingemann, J.; Nehrkorn, A.; Nowack, A.; Nugent, I. M.; Pistone, C.; Pooth, O.; Stahl, A.; Aldaya Martin, M.; Asin, I.; Bartosik, N.; Behnke, O.; Behrens, U.; Borras, K.; Burgmeier, A.; Campbell, A.; Contreras-Campana, C.; Costanza, F.; Diez Pardos, C.; Dolinska, G.; Dooling, S.; Dorland, T.; Eckerlin, G.; Eckstein, D.; Eichhorn, T.; Flucke, G.; Gallo, E.; Garay Garcia, J.; Geiser, A.; Gizhko, A.; Gunnellini, P.; Hauk, J.; Hempel, M.; Jung, H.; Kalogeropoulos, A.; Karacheban, O.; Kasemann, M.; Katsas, P.; Kieseler, J.; Kleinwort, C.; Korol, I.; Lange, W.; Leonard, J.; Lipka, K.; Lobanov, A.; Lohmann, W.; Mankel, R.; Melzer-Pellmann, I.-A.; Meyer, A. B.; Mittag, G.; Mnich, J.; Mussgiller, A.; Naumann-Emme, S.; Nayak, A.; Ntomari, E.; Perrey, H.; Pitzl, D.; Placakyte, R.; Raspereza, A.; Roland, B.; Sahin, M. Ö.; Saxena, P.; Schoerner-Sadenius, T.; Seitz, C.; Spannagel, S.; Stefaniuk, N.; Trippkewitz, K. D.; Walsh, R.; Wissing, C.; Blobel, V.; Centis Vignali, M.; Draeger, A. R.; Dreyer, T.; Erfle, J.; Garutti, E.; Goebel, K.; Gonzalez, D.; Görner, M.; Haller, J.; Hoffmann, M.; Höing, R. 
S.; Junkes, A.; Klanner, R.; Kogler, R.; Kovalchuk, N.; Lapsien, T.; Lenz, T.; Marchesini, I.; Marconi, D.; Meyer, M.; Niedziela, M.; Nowatschin, D.; Ott, J.; Pantaleo, F.; Peiffer, T.; Perieanu, A.; Pietsch, N.; Poehlsen, J.; Sander, C.; Scharf, C.; Schleper, P.; Schlieckau, E.; Schmidt, A.; Schumann, S.; Schwandt, J.; Sola, V.; Stadie, H.; Steinbrück, G.; Stober, F. M.; Tholen, H.; Troendle, D.; Usai, E.; Vanelderen, L.; Vanhoefer, A.; Vormwald, B.; Barth, C.; Baus, C.; Berger, J.; Böser, C.; Butz, E.; Chwalek, T.; Colombo, F.; de Boer, W.; Descroix, A.; Dierlamm, A.; Fink, S.; Frensch, F.; Friese, R.; Giffels, M.; Gilbert, A.; Haitz, D.; Hartmann, F.; Heindl, S. M.; Husemann, U.; Katkov, I.; Kornmayer, A.; Lobelle Pardo, P.; Maier, B.; Mildner, H.; Mozer, M. U.; Müller, T.; Müller, Th.; Plagge, M.; Quast, G.; Rabbertz, K.; Röcker, S.; Roscher, F.; Schröder, M.; Sieber, G.; Simonis, H. J.; Ulrich, R.; Wagner-Kuhr, J.; Wayand, S.; Weber, M.; Weiler, T.; Williamson, S.; Wöhrmann, C.; Wolf, R.; Anagnostou, G.; Daskalakis, G.; Geralis, T.; Giakoumopoulou, V. A.; Kyriakis, A.; Loukas, D.; Psallidas, A.; Topsis-Giotis, I.; Agapitos, A.; Kesisoglou, S.; Panagiotou, A.; Saoulidou, N.; Tziaferi, E.; Evangelou, I.; Flouris, G.; Foudas, C.; Kokkas, P.; Loukas, N.; Manthos, N.; Papadopoulos, I.; Paradas, E.; Strologas, J.; Bencze, G.; Hajdu, C.; Hidas, P.; Horvath, D.; Sikler, F.; Veszpremi, V.; Vesztergombi, G.; Zsigmond, A. J.; Beni, N.; Czellar, S.; Karancsi, J.; Molnar, J.; Szillasi, Z.; Bartók, M.; Makovec, A.; Raics, P.; Trocsanyi, Z. L.; Ujvari, B.; Choudhury, S.; Mal, P.; Mandal, K.; Sahoo, D. K.; Sahoo, N.; Swain, S. K.; Bansal, S.; Beri, S. B.; Bhatnagar, V.; Chawla, R.; Gupta, R.; Bhawandeep, U.; Kalsi, A. K.; Kaur, A.; Kaur, M.; Kumar, R.; Mehta, A.; Mittal, M.; Singh, J. B.; Walia, G.; Kumar, Ashok; Bhardwaj, A.; Choudhary, B. C.; Garg, R. B.; Keshri, S.; Kumar, A.; Malhotra, S.; Naimuddin, M.; Nishu, N.; Ranjan, K.; Sharma, R.; Sharma, V.; Bhattacharya, R.; Bhattacharya, S.; Chatterjee, K.; Dey, S.; Dutta, S.; Ghosh, S.; Majumdar, N.; Modak, A.; Mondal, K.; Mukhopadhyay, S.; Nandan, S.; Purohit, A.; Roy, A.; Roy, D.; Roy Chowdhury, S.; Sarkar, S.; Sharan, M.; Chudasama, R.; Dutta, D.; Jha, V.; Kumar, V.; Mohanty, A. K.; Pant, L. M.; Shukla, P.; Topkar, A.; Aziz, T.; Banerjee, S.; Bhowmik, S.; Chatterjee, R. M.; Dewanjee, R. K.; Dugad, S.; Ganguly, S.; Ghosh, S.; Guchait, M.; Gurtu, A.; Jain, Sa.; Kole, G.; Kumar, S.; Mahakud, B.; Maity, M.; Majumder, G.; Mazumdar, K.; Mitra, S.; Mohanty, G. B.; Parida, B.; Sarkar, T.; Sur, N.; Sutar, B.; Wickramage, N.; Chauhan, S.; Dube, S.; Kapoor, A.; Kothekar, K.; Rane, A.; Sharma, S.; Bakhshiansohi, H.; Behnamian, H.; Etesami, S. M.; Fahim, A.; Khakzad, M.; Mohammadi Najafabadi, M.; Naseri, M.; Paktinat Mehdiabadi, S.; Rezaei Hosseinabadi, F.; Safarzadeh, B.; Zeinali, M.; Felcini, M.; Grunewald, M.; Abbrescia, M.; Calabria, C.; Caputo, C.; Colaleo, A.; Creanza, D.; Cristella, L.; de Filippis, N.; de Palma, M.; Fiore, L.; Iaselli, G.; Maggi, G.; Maggi, M.; Miniello, G.; My, S.; Nuzzo, S.; Pompili, A.; Pugliese, G.; Radogna, R.; Ranieri, A.; Selvaggi, G.; Silvestris, L.; Venditti, R.; Abbiendi, G.; Battilana, C.; Bonacorsi, D.; Braibant-Giacomelli, S.; Brigliadori, L.; Campanini, R.; Capiluppi, P.; Castro, A.; Cavallo, F. R.; Chhibra, S. S.; Codispoti, G.; Cuffiani, M.; Dallavalle, G. M.; Fabbri, F.; Fanfani, A.; Fasanella, D.; Giacomelli, P.; Grandi, C.; Guiducci, L.; Marcellini, S.; Masetti, G.; Montanari, A.; Navarria, F. 
L.; Perrotta, A.; Rossi, A. M.; Rovelli, T.; Siroli, G. P.; Tosi, N.; Cappello, G.; Chiorboli, M.; Costa, S.; di Mattia, A.; Giordano, F.; Potenza, R.; Tricomi, A.; Tuve, C.; Barbagli, G.; Ciulli, V.; Civinini, C.; D'Alessandro, R.; Focardi, E.; Gori, V.; Lenzi, P.; Meschini, M.; Paoletti, S.; Sguazzoni, G.; Viliani, L.; Benussi, L.; Bianco, S.; Fabbri, F.; Piccolo, D.; Primavera, F.; Calvelli, V.; Ferro, F.; Lo Vetere, M.; Monge, M. R.; Robutti, E.; Tosi, S.; Brianza, L.; Dinardo, M. E.; Fiorendi, S.; Gennai, S.; Gerosa, R.; Ghezzi, A.; Govoni, P.; Malvezzi, S.; Manzoni, R. A.; Marzocchi, B.; Menasce, D.; Moroni, L.; Paganoni, M.; Pedrini, D.; Pigazzini, S.; Ragazzi, S.; Redaelli, N.; Tabarelli de Fatis, T.; Buontempo, S.; Cavallo, N.; di Guida, S.; Esposito, M.; Fabozzi, F.; Iorio, A. O. M.; Lanza, G.; Lista, L.; Meola, S.; Merola, M.; Paolucci, P.; Sciacca, C.; Thyssen, F.; Azzi, P.; Bacchetta, N.; Benato, L.; Bisello, D.; Boletti, A.; Branca, A.; Carlin, R.; Checchia, P.; Dall'Osso, M.; Dorigo, T.; Dosselli, U.; Gasparini, F.; Gasparini, U.; Gozzelino, A.; Kanishchev, K.; Lacaprara, S.; Margoni, M.; Meneguzzo, A. T.; Pazzini, J.; Pozzobon, N.; Ronchese, P.; Simonetto, F.; Torassa, E.; Tosi, M.; Ventura, S.; Zanetti, M.; Zotto, P.; Zucchetta, A.; Zumerle, G.; Braghieri, A.; Magnani, A.; Montagna, P.; Ratti, S. P.; Re, V.; Riccardi, C.; Salvini, P.; Vai, I.; Vitulo, P.; Alunni Solestizi, L.; Bilei, G. M.; Ciangottini, D.; Fanò, L.; Lariccia, P.; Mantovani, G.; Menichelli, M.; Saha, A.; Santocchia, A.; Androsov, K.; Azzurri, P.; Bagliesi, G.; Bernardini, J.; Boccali, T.; Castaldi, R.; Ciocci, M. A.; Dell'Orso, R.; Donato, S.; Fedi, G.; Foà, L.; Giassi, A.; Grippo, M. T.; Ligabue, F.; Lomtadze, T.; Martini, L.; Messineo, A.; Palla, F.; Rizzi, A.; Savoy-Navarro, A.; Spagnolo, P.; Tenchini, R.; Tonelli, G.; Venturi, A.; Verdini, P. G.; Barone, L.; Cavallari, F.; D'Imperio, G.; Del Re, D.; Diemoz, M.; Gelli, S.; Jorda, C.; Longo, E.; Margaroli, F.; Meridiani, P.; Organtini, G.; Paramatti, R.; Preiato, F.; Rahatlou, S.; Rovelli, C.; Santanastasio, F.; Amapane, N.; Arcidiacono, R.; Argiro, S.; Arneodo, M.; Bellan, R.; Biino, C.; Cartiglia, N.; Costa, M.; Covarelli, R.; Degano, A.; Demaria, N.; Finco, L.; Kiani, B.; Mariotti, C.; Maselli, S.; Migliore, E.; Monaco, V.; Monteil, E.; Obertino, M. M.; Pacher, L.; Pastrone, N.; Pelliccioni, M.; Pinna Angioni, G. L.; Ravera, F.; Romero, A.; Ruspa, M.; Sacchi, R.; Solano, A.; Staiano, A.; Belforte, S.; Candelise, V.; Casarsa, M.; Cossutti, F.; Della Ricca, G.; Gobbo, B.; La Licata, C.; Schizzi, A.; Zanetti, A.; Kropivnitskaya, A.; Nam, S. K.; Kim, D. H.; Kim, G. N.; Kim, M. S.; Kong, D. J.; Lee, S.; Lee, S. W.; Oh, Y. D.; Sakharov, A.; Sekmen, S.; Son, D. C.; Brochero Cifuentes, J. A.; Kim, H.; Kim, T. J.; Song, S.; Cho, S.; Choi, S.; Go, Y.; Gyun, D.; Hong, B.; Kim, H.; Kim, Y.; Lee, B.; Lee, K.; Lee, K. S.; Lee, S.; Lim, J.; Park, S. K.; Roh, Y.; Yoo, H. D.; Choi, M.; Kim, H.; Kim, J. H.; Lee, J. S. H.; Park, I. C.; Ryu, G.; Ryu, M. S.; Choi, Y.; Goh, J.; Kim, D.; Kwon, E.; Lee, J.; Yu, I.; Dudenas, V.; Juodagalvis, A.; Vaitkus, J.; Ahmed, I.; Ibrahim, Z. A.; Komaragiri, J. R.; Md Ali, M. A. B.; Mohamad Idris, F.; Wan Abdullah, W. A. T.; Yusli, M. N.; Zolkapli, Z.; Casimiro Linares, E.; Castilla-Valdez, H.; de La Cruz-Burelo, E.; Heredia-de La Cruz, I.; Hernandez-Almada, A.; Lopez-Fernandez, R.; Mejia Guisao, J.; Sanchez-Hernandez, A.; Carrillo Moreno, S.; Vazquez Valencia, F.; Pedraza, I.; Salazar Ibarguen, H. 
A.; Morelos Pineda, A.; Krofcheck, D.; Butler, P. H.; Ahmad, A.; Ahmad, M.; Hassan, Q.; Hoorani, H. R.; Khan, W. A.; Khurshid, T.; Shoaib, M.; Waqas, M.; Bialkowska, H.; Bluj, M.; Boimska, B.; Frueboes, T.; Górski, M.; Kazana, M.; Nawrocki, K.; Romanowska-Rybinska, K.; Szleper, M.; Traczyk, P.; Zalewski, P.; Brona, G.; Bunkowski, K.; Byszuk, A.; Doroba, K.; Kalinowski, A.; Konecki, M.; Krolikowski, J.; Misiura, M.; Olszewski, M.; Walczak, M.; Bargassa, P.; Beirão da Cruz E Silva, C.; di Francesco, A.; Faccioli, P.; Ferreira Parracho, P. G.; Gallinaro, M.; Hollar, J.; Leonardo, N.; Lloret Iglesias, L.; Nemallapudi, M. V.; Nguyen, F.; Rodrigues Antunes, J.; Seixas, J.; Toldaiev, O.; Vadruccio, D.; Varela, J.; Vischia, P.; Bunin, P.; Gavrilenko, M.; Golutvin, I.; Gorbunov, I.; Kamenev, A.; Karjavin, V.; Lanev, A.; Malakhov, A.; Matveev, V.; Moisenz, P.; Palichik, V.; Perelygin, V.; Savina, M.; Shmatov, S.; Shulha, S.; Skatchkov, N.; Smirnov, V.; Voytishin, N.; Zarubin, A.; Golovtsov, V.; Ivanov, Y.; Kim, V.; Kuznetsova, E.; Levchenko, P.; Murzin, V.; Oreshkin, V.; Smirnov, I.; Sulimov, V.; Uvarov, L.; Vavilov, S.; Vorobyev, A.; Andreev, Yu.; Dermenev, A.; Gninenko, S.; Golubev, N.; Karneyeu, A.; Kirsanov, M.; Krasnikov, N.; Pashenkov, A.; Tlisov, D.; Toropin, A.; Epshteyn, V.; Gavrilov, V.; Lychkovskaya, N.; Popov, V.; Pozdnyakov, I.; Safronov, G.; Spiridonov, A.; Vlasov, E.; Zhokin, A.; Chadeeva, M.; Danilov, M.; Markin, O.; Rusinov, V.; Tarkovskii, E.; Andreev, V.; Azarkin, M.; Dremin, I.; Kirakosyan, M.; Leonidov, A.; Mesyats, G.; Rusakov, S. V.; Baskakov, A.; Belyaev, A.; Boos, E.; Dubinin, M.; Dudko, L.; Ershov, A.; Gribushin, A.; Klyukhin, V.; Kodolova, O.; Lokhtin, I.; Miagkov, I.; Obraztsov, S.; Petrushanko, S.; Savrin, V.; Snigirev, A.; Azhgirey, I.; Bayshev, I.; Bitioukov, S.; Kachanov, V.; Kalinin, A.; Konstantinov, D.; Krychkine, V.; Petrov, V.; Ryutin, R.; Sobol, A.; Tourtchanovitch, L.; Troshin, S.; Tyurin, N.; Uzunian, A.; Volkov, A.; Adzic, P.; Cirkovic, P.; Devetak, D.; Milosevic, J.; Rekovic, V.; Alcaraz Maestre, J.; Calvo, E.; Cerrada, M.; Chamizo Llatas, M.; Colino, N.; de La Cruz, B.; Delgado Peris, A.; Escalante Del Valle, A.; Fernandez Bedoya, C.; Fernández Ramos, J. P.; Flix, J.; Fouz, M. C.; Garcia-Abia, P.; Gonzalez Lopez, O.; Goy Lopez, S.; Hernandez, J. M.; Josa, M. I.; Navarro de Martino, E.; Pérez-Calero Yzquierdo, A.; Puerta Pelayo, J.; Quintario Olmeda, A.; Redondo, I.; Romero, L.; Soares, M. S.; de Trocóniz, J. F.; Missiroli, M.; Moran, D.; Cuevas, J.; Fernandez Menendez, J.; Folgueras, S.; Gonzalez Caballero, I.; Palencia Cortezon, E.; Vizan Garcia, J. M.; Cabrillo, I. J.; Calderon, A.; Castiñeiras de Saa, J. R.; Curras, E.; de Castro Manzano, P.; Fernandez, M.; Garcia-Ferrero, J.; Gomez, G.; Lopez Virto, A.; Marco, J.; Marco, R.; Martinez Rivero, C.; Matorras, F.; Piedra Gomez, J.; Rodrigo, T.; Rodríguez-Marrero, A. Y.; Ruiz-Jimeno, A.; Scodellaro, L.; Trevisani, N.; Vila, I.; Vilar Cortabitarte, R.; Abbaneo, D.; Auffray, E.; Auzinger, G.; Bachtis, M.; Baillon, P.; Ball, A. H.; Barney, D.; Benaglia, A.; Benhabib, L.; Berruti, G. 
M.; Bloch, P.; Bocci, A.; Bonato, A.; Botta, C.; Breuker, H.; Camporesi, T.; Castello, R.; Cepeda, M.; Cerminara, G.; D'Alfonso, M.; D'Enterria, D.; Dabrowski, A.; Daponte, V.; David, A.; de Gruttola, M.; de Guio, F.; de Roeck, A.; di Marco, E.; Dobson, M.; Dordevic, M.; Dorney, B.; Du Pree, T.; Duggan, D.; Dünser, M.; Dupont, N.; Elliott-Peisert, A.; Franzoni, G.; Fulcher, J.; Funk, W.; Gigi, D.; Gill, K.; Girone, M.; Glege, F.; Guida, R.; Gundacker, S.; Guthoff, M.; Hammer, J.; Harris, P.; Hegeman, J.; Innocente, V.; Janot, P.; Kirschenmann, H.; Knünz, V.; Kortelainen, M. J.; Kousouris, K.; Lecoq, P.; Lourenço, C.; Lucchini, M. T.; Magini, N.; Malgeri, L.; Mannelli, M.; Martelli, A.; Masetti, L.; Meijers, F.; Mersi, S.; Meschi, E.; Moortgat, F.; Morovic, S.; Mulders, M.; Neugebauer, H.; Orfanelli, S.; Orsini, L.; Pape, L.; Perez, E.; Peruzzi, M.; Petrilli, A.; Petrucciani, G.; Pfeiffer, A.; Pierini, M.; Piparo, D.; Racz, A.; Reis, T.; Rolandi, G.; Rovere, M.; Ruan, M.; Sakulin, H.; Sauvan, J. B.; Schäfer, C.; Schwick, C.; Seidel, M.; Sharma, A.; Silva, P.; Simon, M.; Sphicas, P.; Steggemann, J.; Stoye, M.; Takahashi, Y.; Treille, D.; Triossi, A.; Tsirou, A.; Veres, G. I.; Wardle, N.; Wöhri, H. K.; Zagozdzinska, A.; Zeuner, W. D.; Bertl, W.; Deiters, K.; Erdmann, W.; Horisberger, R.; Ingram, Q.; Kaestli, H. C.; Kotlinski, D.; Langenegger, U.; Rohe, T.; Bachmair, F.; Bäni, L.; Bianchini, L.; Casal, B.; Dissertori, G.; Dittmar, M.; Donegà, M.; Eller, P.; Grab, C.; Heidegger, C.; Hits, D.; Hoss, J.; Kasieczka, G.; Lecomte, P.; Lustermann, W.; Mangano, B.; Marionneau, M.; Martinez Ruiz Del Arbol, P.; Masciovecchio, M.; Meinhard, M. T.; Meister, D.; Micheli, F.; Musella, P.; Nessi-Tedaldi, F.; Pandolfi, F.; Pata, J.; Pauss, F.; Perrin, G.; Perrozzi, L.; Quittnat, M.; Rossini, M.; Schönenberger, M.; Starodumov, A.; Takahashi, M.; Tavolaro, V. R.; Theofilatos, K.; Wallny, R.; Aarrestad, T. K.; Amsler, C.; Caminada, L.; Canelli, M. F.; Chiochia, V.; de Cosa, A.; Galloni, C.; Hinzmann, A.; Hreus, T.; Kilminster, B.; Lange, C.; Ngadiuba, J.; Pinna, D.; Rauco, G.; Robmann, P.; Salerno, D.; Yang, Y.; Chen, K. H.; Doan, T. H.; Jain, Sh.; Khurana, R.; Konyushikhin, M.; Kuo, C. M.; Lin, W.; Lu, Y. J.; Pozdnyakov, A.; Yu, S. S.; Kumar, Arun; Chang, P.; Chang, Y. H.; Chang, Y. W.; Chao, Y.; Chen, K. F.; Chen, P. H.; Dietz, C.; Fiori, F.; Grundler, U.; Hou, W.-S.; Hsiung, Y.; Liu, Y. F.; Lu, R.-S.; Miñano Moya, M.; Petrakou, E.; Tsai, J. F.; Tzeng, Y. M.; Asavapibhop, B.; Kovitanggoon, K.; Singh, G.; Srimanobhas, N.; Suwonjandee, N.; Adiguzel, A.; Cerci, S.; Damarseckin, S.; Demiroglu, Z. S.; Dozen, C.; Dumanoglu, I.; Girgis, S.; Gokbulut, G.; Guler, Y.; Gurpinar, E.; Hos, I.; Kangal, E. E.; Kayis Topaksu, A.; Onengut, G.; Ozdemir, K.; Ozturk, S.; Tali, B.; Topakli, H.; Zorbilmez, C.; Bilin, B.; Bilmis, S.; Isildak, B.; Karapinar, G.; Yalvac, M.; Zeyrek, M.; Gülmez, E.; Kaya, M.; Kaya, O.; Yetkin, E. A.; Yetkin, T.; Cakir, A.; Cankocak, K.; Sen, S.; Vardarlı, F. I.; Grynyov, B.; Levchuk, L.; Sorokin, P.; Aggleton, R.; Ball, F.; Beck, L.; Brooke, J. J.; Burns, D.; Clement, E.; Cussans, D.; Flacher, H.; Goldstein, J.; Grimes, M.; Heath, G. P.; Heath, H. F.; Jacob, J.; Kreczko, L.; Lucas, C.; Meng, Z.; Newbold, D. M.; Paramesvaran, S.; Poll, A.; Sakuma, T.; Seif El Nasr-Storey, S.; Senkin, S.; Smith, D.; Smith, V. J.; Bell, K. W.; Belyaev, A.; Brew, C.; Brown, R. M.; Calligaris, L.; Cieri, D.; Cockerill, D. J. A.; Coughlan, J. A.; Harder, K.; Harper, S.; Olaiya, E.; Petyt, D.; Shepherd-Themistocleous, C. 
H.; Thea, A.; Tomalin, I. R.; Williams, T.; Worm, S. D.; Baber, M.; Bainbridge, R.; Buchmuller, O.; Bundock, A.; Burton, D.; Casasso, S.; Citron, M.; Colling, D.; Corpe, L.; Dauncey, P.; Davies, G.; de Wit, A.; Della Negra, M.; Dunne, P.; Elwood, A.; Futyan, D.; Hall, G.; Iles, G.; Lane, R.; Lucas, R.; Lyons, L.; Magnan, A.-M.; Malik, S.; Mastrolorenzo, L.; Nash, J.; Nikitenko, A.; Pela, J.; Penning, B.; Pesaresi, M.; Raymond, D. M.; Richards, A.; Rose, A.; Seez, C.; Tapper, A.; Uchida, K.; Vazquez Acosta, M.; Virdee, T.; Zenz, S. C.; Cole, J. E.; Hobson, P. R.; Khan, A.; Kyberd, P.; Leslie, D.; Reid, I. D.; Symonds, P.; Teodorescu, L.; Turner, M.; Borzou, A.; Call, K.; Dittmann, J.; Hatakeyama, K.; Liu, H.; Pastika, N.; Charaf, O.; Cooper, S. I.; Henderson, C.; Rumerio, P.; Arcaro, D.; Avetisyan, A.; Bose, T.; Gastler, D.; Rankin, D.; Richardson, C.; Rohlf, J.; Sulak, L.; Zou, D.; Alimena, J.; Benelli, G.; Berry, E.; Cutts, D.; Ferapontov, A.; Garabedian, A.; Hakala, J.; Heintz, U.; Jesus, O.; Laird, E.; Landsberg, G.; Mao, Z.; Narain, M.; Piperov, S.; Sagir, S.; Syarif, R.; Breedon, R.; Breto, G.; Calderon de La Barca Sanchez, M.; Chauhan, S.; Chertok, M.; Conway, J.; Conway, R.; Cox, P. T.; Erbacher, R.; Funk, G.; Gardner, M.; Gunion, J.; Ko, W.; Lander, R.; McLean, C.; Mulhearn, M.; Pellett, D.; Pilot, J.; Ricci-Tam, F.; Shalhout, S.; Smith, J.; Squires, M.; Stolp, D.; Tripathi, M.; Wilbur, S.; Yohay, R.; Cousins, R.; Everaerts, P.; Florent, A.; Hauser, J.; Ignatenko, M.; Saltzberg, D.; Takasugi, E.; Valuev, V.; Weber, M.; Burt, K.; Clare, R.; Ellison, J.; Gary, J. W.; Hanson, G.; Heilman, J.; Ivova Paneva, M.; Jandir, P.; Kennedy, E.; Lacroix, F.; Long, O. R.; Malberti, M.; Olmedo Negrete, M.; Shrinivas, A.; Wei, H.; Wimpenny, S.; Yates, B. R.; Branson, J. G.; Cerati, G. B.; Cittolin, S.; D'Agnolo, R. T.; Derdzinski, M.; Holzner, A.; Kelley, R.; Klein, D.; Letts, J.; MacNeill, I.; Olivito, D.; Padhi, S.; Pieri, M.; Sani, M.; Sharma, V.; Simon, S.; Tadel, M.; Vartak, A.; Wasserbaech, S.; Welke, C.; Würthwein, F.; Yagil, A.; Zevi Della Porta, G.; Bradmiller-Feld, J.; Campagnari, C.; Dishaw, A.; Dutta, V.; Flowers, K.; Franco Sevilla, M.; Geffert, P.; George, C.; Golf, F.; Gouskos, L.; Gran, J.; Incandela, J.; McColl, N.; Mullin, S. D.; Richman, J.; Stuart, D.; Suarez, I.; West, C.; Yoo, J.; Anderson, D.; Apresyan, A.; Bendavid, J.; Bornheim, A.; Bunn, J.; Chen, Y.; Duarte, J.; Mott, A.; Newman, H. B.; Pena, C.; Spiropulu, M.; Vlimant, J. R.; Xie, S.; Zhu, R. Y.; Andrews, M. B.; Azzolini, V.; Calamba, A.; Carlson, B.; Ferguson, T.; Paulini, M.; Russ, J.; Sun, M.; Vogel, H.; Vorobiev, I.; Cumalat, J. P.; Ford, W. T.; Gaz, A.; Jensen, F.; Johnson, A.; Krohn, M.; Mulholland, T.; Nauenberg, U.; Stenson, K.; Wagner, S. R.; Alexander, J.; Chatterjee, A.; Chaves, J.; Chu, J.; Dittmer, S.; Eggert, N.; Mirman, N.; Nicolas Kaufman, G.; Patterson, J. R.; Rinkevicius, A.; Ryd, A.; Skinnari, L.; Soffi, L.; Sun, W.; Tan, S. M.; Teo, W. D.; Thom, J.; Thompson, J.; Tucker, J.; Weng, Y.; Wittich, P.; Abdullin, S.; Albrow, M.; Apollinari, G.; Banerjee, S.; Bauerdick, L. A. T.; Beretvas, A.; Berryhill, J.; Bhat, P. C.; Bolla, G.; Burkett, K.; Butler, J. N.; Cheung, H. W. K.; Chlebana, F.; Cihangir, S.; Elvira, V. D.; Fisk, I.; Freeman, J.; Gottschalk, E.; Gray, L.; Green, D.; Grünendahl, S.; Gutsche, O.; Hanlon, J.; Hare, D.; Harris, R. 
M.; Hasegawa, S.; Hirschauer, J.; Hu, Z.; Jayatilaka, B.; Jindariani, S.; Johnson, M.; Joshi, U.; Klima, B.; Kreis, B.; Lammel, S.; Lewis, J.; Linacre, J.; Lincoln, D.; Lipton, R.; Liu, T.; Lopes de Sá, R.; Lykken, J.; Maeshima, K.; Marraffino, J. M.; Maruyama, S.; Mason, D.; McBride, P.; Merkel, P.; Mrenna, S.; Nahn, S.; Newman-Holmes, C.; O'Dell, V.; Pedro, K.; Prokofyev, O.; Rakness, G.; Sexton-Kennedy, E.; Soha, A.; Spalding, W. J.; Spiegel, L.; Stoynev, S.; Strobbe, N.; Taylor, L.; Tkaczyk, S.; Tran, N. V.; Uplegger, L.; Vaandering, E. W.; Vernieri, C.; Verzocchi, M.; Vidal, R.; Wang, M.; Weber, H. A.; Whitbeck, A.; Acosta, D.; Avery, P.; Bortignon, P.; Bourilkov, D.; Brinkerhoff, A.; Carnes, A.; Carver, M.; Curry, D.; Das, S.; Field, R. D.; Furic, I. K.; Konigsberg, J.; Korytov, A.; Kotov, K.; Ma, P.; Matchev, K.; Mei, H.; Milenovic, P.; Mitselmakher, G.; Rank, D.; Rossin, R.; Shchutska, L.; Snowball, M.; Sperka, D.; Terentyev, N.; Thomas, L.; Wang, J.; Wang, S.; Yelton, J.; Linn, S.; Markowitz, P.; Martinez, G.; Rodriguez, J. L.; Ackert, A.; Adams, J. R.; Adams, T.; Askew, A.; Bein, S.; Bochenek, J.; Diamond, B.; Haas, J.; Hagopian, S.; Hagopian, V.; Johnson, K. F.; Khatiwada, A.; Prosper, H.; Weinberg, M.; Baarmand, M. M.; Bhopatkar, V.; Colafranceschi, S.; Hohlmann, M.; Kalakhety, H.; Noonan, D.; Roy, T.; Yumiceva, F.; Adams, M. R.; Apanasevich, L.; Berry, D.; Betts, R. R.; Bucinskaite, I.; Cavanaugh, R.; Evdokimov, O.; Gauthier, L.; Gerber, C. E.; Hofman, D. J.; Kurt, P.; O'Brien, C.; Sandoval Gonzalez, I. D.; Turner, P.; Varelas, N.; Wu, Z.; Zakaria, M.; Zhang, J.; Bilki, B.; Clarida, W.; Dilsiz, K.; Durgut, S.; Gandrajula, R. P.; Haytmyradov, M.; Khristenko, V.; Merlo, J.-P.; Mermerkaya, H.; Mestvirishvili, A.; Moeller, A.; Nachtman, J.; Ogul, H.; Onel, Y.; Ozok, F.; Penzo, A.; Snyder, C.; Tiras, E.; Wetzel, J.; Yi, K.; Anderson, I.; Barnett, B. A.; Blumenfeld, B.; Cocoros, A.; Eminizer, N.; Fehling, D.; Feng, L.; Gritsan, A. V.; Maksimovic, P.; Osherson, M.; Roskes, J.; Sarica, U.; Swartz, M.; Xiao, M.; Xin, Y.; You, C.; Baringer, P.; Bean, A.; Bruner, C.; Kenny, R. P.; Majumder, D.; Malek, M.; McBrayer, W.; Murray, M.; Sanders, S.; Stringer, R.; Wang, Q.; Ivanov, A.; Kaadze, K.; Khalil, S.; Makouski, M.; Maravin, Y.; Mohammadi, A.; Saini, L. K.; Skhirtladze, N.; Toda, S.; Lange, D.; Rebassoo, F.; Wright, D.; Anelli, C.; Baden, A.; Baron, O.; Belloni, A.; Calvert, B.; Eno, S. C.; Ferraioli, C.; Gomez, J. A.; Hadley, N. J.; Jabeen, S.; Kellogg, R. G.; Kolberg, T.; Kunkle, J.; Lu, Y.; Mignerey, A. C.; Shin, Y. H.; Skuja, A.; Tonjes, M. B.; Tonwar, S. C.; Apyan, A.; Barbieri, R.; Baty, A.; Bi, R.; Bierwagen, K.; Brandt, S.; Busza, W.; Cali, I. A.; Demiragli, Z.; Di Matteo, L.; Gomez Ceballos, G.; Goncharov, M.; Gulhan, D.; Iiyama, Y.; Innocenti, G. M.; Klute, M.; Kovalskyi, D.; Krajczar, K.; Lai, Y. S.; Lee, Y.-J.; Levin, A.; Luckey, P. D.; Marini, A. C.; McGinn, C.; Mironov, C.; Narayanan, S.; Niu, X.; Paus, C.; Roland, C.; Roland, G.; Salfeld-Nebgen, J.; Stephans, G. S. F.; Sumorok, K.; Tatar, K.; Varma, M.; Velicanu, D.; Veverka, J.; Wang, J.; Wang, T. W.; Wyslouch, B.; Yang, M.; Zhukova, V.; Benvenuti, A. C.; Dahmes, B.; Evans, A.; Finkel, A.; Gude, A.; Hansen, P.; Kalafut, S.; Kao, S. C.; Klapoetke, K.; Kubota, Y.; Lesko, Z.; Mans, J.; Nourbakhsh, S.; Ruckstuhl, N.; Rusack, R.; Tambe, N.; Turkewitz, J.; Acosta, J. G.; Oliveros, S.; Avdeeva, E.; Bartek, R.; Bloom, K.; Bose, S.; Claes, D. 
R.; Dominguez, A.; Fangmeier, C.; Gonzalez Suarez, R.; Kamalieddin, R.; Knowlton, D.; Kravchenko, I.; Meier, F.; Monroy, J.; Ratnikov, F.; Siado, J. E.; Snow, G. R.; Stieger, B.; Alyari, M.; Dolen, J.; George, J.; Godshalk, A.; Harrington, C.; Iashvili, I.; Kaisen, J.; Kharchilava, A.; Kumar, A.; Rappoccio, S.; Roozbahani, B.; Alverson, G.; Barberis, E.; Baumgartel, D.; Chasco, M.; Hortiangtham, A.; Massironi, A.; Morse, D. M.; Nash, D.; Orimoto, T.; Teixeira de Lima, R.; Trocino, D.; Wang, R.-J.; Wood, D.; Zhang, J.; Bhattacharya, S.; Hahn, K. A.; Kubik, A.; Low, J. F.; Mucia, N.; Odell, N.; Pollack, B.; Schmitt, M. H.; Sung, K.; Trovato, M.; Velasco, M.; Dev, N.; Hildreth, M.; Jessop, C.; Karmgard, D. J.; Kellams, N.; Lannon, K.; Marinelli, N.; Meng, F.; Mueller, C.; Musienko, Y.; Planer, M.; Reinsvold, A.; Ruchti, R.; Rupprecht, N.; Smith, G.; Taroni, S.; Valls, N.; Wayne, M.; Wolf, M.; Woodard, A.; Antonelli, L.; Brinson, J.; Bylsma, B.; Durkin, L. S.; Flowers, S.; Hart, A.; Hill, C.; Hughes, R.; Ji, W.; Ling, T. Y.; Liu, B.; Luo, W.; Puigh, D.; Rodenburg, M.; Winer, B. L.; Wulsin, H. W.; Driga, O.; Elmer, P.; Hardenbrook, J.; Hebda, P.; Koay, S. A.; Lujan, P.; Marlow, D.; Medvedeva, T.; Mooney, M.; Olsen, J.; Palmer, C.; Piroué, P.; Stickland, D.; Tully, C.; Zuranski, A.; Malik, S.; Barker, A.; Barnes, V. E.; Benedetti, D.; Bortoletto, D.; Gutay, L.; Jha, M. K.; Jones, M.; Jung, A. W.; Jung, K.; Miller, D. H.; Neumeister, N.; Radburn-Smith, B. C.; Shi, X.; Shipsey, I.; Silvers, D.; Sun, J.; Svyatkovskiy, A.; Wang, F.; Xie, W.; Xu, L.; Parashar, N.; Stupak, J.; Adair, A.; Akgun, B.; Chen, Z.; Ecklund, K. M.; Geurts, F. J. M.; Guilbaud, M.; Li, W.; Michlin, B.; Northup, M.; Padley, B. P.; Redjimi, R.; Roberts, J.; Rorie, J.; Tu, Z.; Zabel, J.; Betchart, B.; Bodek, A.; de Barbaro, P.; Demina, R.; Eshaq, Y.; Ferbel, T.; Galanti, M.; Garcia-Bellido, A.; Han, J.; Hindrichs, O.; Khukhunaishvili, A.; Lo, K. H.; Tan, P.; Verzetti, M.; Chou, J. P.; Contreras-Campana, E.; Ferencek, D.; Gershtein, Y.; Halkiadakis, E.; Heindl, M.; Hidas, D.; Hughes, E.; Kaplan, S.; Kunnawalkam Elayavalli, R.; Lath, A.; Nash, K.; Saka, H.; Salur, S.; Schnetzer, S.; Sheffield, D.; Somalwar, S.; Stone, R.; Thomas, S.; Thomassen, P.; Walker, M.; Foerster, M.; Riley, G.; Rose, K.; Spanier, S.; Thapa, K.; Bouhali, O.; Castaneda Hernandez, A.; Celik, A.; Dalchenko, M.; de Mattia, M.; Delgado, A.; Dildick, S.; Eusebi, R.; Gilmore, J.; Huang, T.; Kamon, T.; Krutelyov, V.; Mueller, R.; Osipenkov, I.; Pakhotin, Y.; Patel, R.; Perloff, A.; Rathjens, D.; Rose, A.; Safonov, A.; Tatarinov, A.; Ulmer, K. A.; Akchurin, N.; Cowden, C.; Damgov, J.; Dragoiu, C.; Dudero, P. R.; Faulkner, J.; Kunori, S.; Lamichhane, K.; Lee, S. W.; Libeiro, T.; Undleeb, S.; Volobouev, I.; Appelt, E.; Delannoy, A. G.; Greene, S.; Gurrola, A.; Janjam, R.; Johns, W.; Maguire, C.; Mao, Y.; Melo, A.; Ni, H.; Sheldon, P.; Tuo, S.; Velkovska, J.; Xu, Q.; Arenton, M. W.; Barria, P.; Cox, B.; Francis, B.; Goodell, J.; Hirosky, R.; Ledovskoy, A.; Li, H.; Neu, C.; Sinthuprasith, T.; Sun, X.; Wang, Y.; Wolfe, E.; Wood, J.; Xia, F.; Clarke, C.; Harr, R.; Karchin, P. E.; Kottachchi Kankanamge Don, C.; Lamichhane, P.; Sturdy, J.; Belknap, D. A.; Carlsmith, D.; Dasu, S.; Dodd, L.; Duric, S.; Gomber, B.; Grothe, M.; Herndon, M.; Hervé, A.; Klabbers, P.; Lanaro, A.; Levine, A.; Long, K.; Loveless, R.; Mohapatra, A.; Ojalvo, I.; Perry, T.; Pierro, G. A.; Polese, G.; Ruggles, T.; Sarangi, T.; Savin, A.; Sharma, A.; Smith, N.; Smith, W. 
H.; Taylor, D.; Verwilligen, P.; Woods, N.

    2016-10-01

    Searches for new physics by the CMS collaboration are interpreted in the framework of the phenomenological minimal supersymmetric standard model (pMSSM). The data samples used in this study were collected at √{s}=7 and 8 TeV and have integrated luminosities of 5.0 fb-1 and 19.5 fb-1, respectively. A global Bayesian analysis is performed, incorporating results from a broad range of CMS supersymmetry searches, as well as constraints from other experiments. Because the pMSSM incorporates several well-motivated assumptions that reduce the 120 parameters of the MSSM to just 19 parameters defined at the electroweak scale, it is possible to assess the results of the study in a relatively straightforward way. Approximately half of the model points in a potentially accessible subspace of the pMSSM are excluded, including all pMSSM model points with a gluino mass below 500 GeV, as well as models with a squark mass less than 300 GeV. Models with chargino and neutralino masses below 200 GeV are disfavored, but no mass range of model points can be ruled out based on the analyses considered. The nonexcluded regions in the pMSSM parameter space are characterized in terms of physical processes and key observables, and implications for future searches are discussed.

  17. Matrigel Mattress: A Method for the Generation of Single Contracting Human-Induced Pluripotent Stem Cell-Derived Cardiomyocytes.

    PubMed

    Feaster, Tromondae K; Cadar, Adrian G; Wang, Lili; Williams, Charles H; Chun, Young Wook; Hempel, Jonathan E; Bloodworth, Nathaniel; Merryman, W David; Lim, Chee Chew; Wu, Joseph C; Knollmann, Björn C; Hong, Charles C

    2015-12-04

    The lack of measurable single-cell contractility of human-induced pluripotent stem cell-derived cardiac myocytes (hiPSC-CMs) currently limits the utility of hiPSC-CMs for evaluating contractile performance for both basic research and drug discovery. Our objective was to develop a culture method that rapidly generates contracting single hiPSC-CMs and allows quantification of cell shortening with standard equipment used for studying adult CMs. Single hiPSC-CMs were cultured for 5 to 7 days on a 0.4- to 0.8-mm thick mattress of undiluted Matrigel (mattress hiPSC-CMs) and compared with hiPSC-CMs maintained on a control substrate (<0.1-mm thick 1:60 diluted Matrigel, control hiPSC-CMs). Compared with control hiPSC-CMs, mattress hiPSC-CMs had more rod-shaped morphology and significantly increased sarcomere length. Contractile parameters of mattress hiPSC-CMs measured with video-based edge detection were comparable with those of freshly isolated adult rabbit ventricular CMs. Morphological and contractile properties of mattress hiPSC-CMs were consistent across cryopreserved hiPSC-CMs generated independently at another institution. Unlike control hiPSC-CMs, mattress hiPSC-CMs display robust contractile responses to positive inotropic agents, such as myofilament calcium sensitizers. Mattress hiPSC-CMs exhibit molecular changes that include increased expression of the maturation marker cardiac troponin I and significantly increased action potential upstroke velocity because of a 2-fold increase in sodium current (INa). The Matrigel mattress method enables the rapid generation of robustly contracting hiPSC-CMs and enhances maturation. This new method allows quantification of contractile performance at the single-cell level, which should be valuable for disease modeling, drug discovery, and preclinical cardiotoxicity testing.

  18. Structured feedback on students' concept maps: the proverbial path to learning?

    PubMed

    Joseph, Conran; Conradsson, David; Nilsson Wikmar, Lena; Rowe, Michael

    2017-05-25

    Good conceptual knowledge is an essential requirement for health professions students, in that they are required to apply concepts learned in the classroom to a variety of different contexts. However, the use of traditional methods of assessment limits the educator's ability to correct students' conceptual knowledge prior to altering the educational context. Concept mapping (CM) is an educational tool for evaluating conceptual knowledge, but little is known about its use in facilitating the development of richer knowledge frameworks. In addition, structured feedback has the potential to develop good conceptual knowledge. The purpose of this study was to use Kinchin's criteria to assess the impact of structured feedback on the graphical complexity of CMs by observing the development of richer knowledge frameworks. Fifty-eight physiotherapy students created CMs targeting the integration of two knowledge domains within a case-based teaching paradigm. Each student received one round of structured feedback that addressed correction, reinforcement, forensic diagnosis, benchmarking, and longitudinal development on their CMs prior to the final submission. The concept maps were categorized according to Kinchin's criteria as either Spoke, Chain, or Net representations, and then evaluated against defined traits of meaningful learning. The inter-rater reliability of categorizing CMs was good. Pre-feedback CMs were predominantly Chain structures (57%), with Net structures appearing least often. There was a significant reduction of the basic Spoke-structured CMs (P = 0.002) and a significant increase of Net-structured maps (P < 0.001) at the final evaluation (post-feedback). Changes in structural complexity of CMs appeared to be indicative of broader knowledge frameworks as assessed against the meaningful learning traits. Feedback on CMs seemed to have contributed towards improving conceptual knowledge and correcting naive conceptions of related knowledge. Educators in medical education could therefore consider using CMs to target individual student development.
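
    The classification step lends itself to a small illustration. The sketch below encodes one plausible reading of Kinchin's Spoke/Chain/Net categories as simple degree checks on an undirected graph; the rules and the example maps are simplifying assumptions made here, not the instrument used in the study.

```python
from collections import defaultdict

def classify_concept_map(edges):
    """Classify a concept map as 'spoke', 'chain', or 'net' in the spirit of
    Kinchin's criteria (hypothetical rules: a spoke has one hub that every
    edge touches, a chain is a simple path, anything richer is a net)."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    nodes = list(degree)
    # Spoke: a single hub participates in every edge.
    if any(all(hub in edge for edge in edges) for hub in nodes):
        return "spoke"
    # Chain: exactly two endpoints of degree 1, all other nodes of degree 2.
    ends = sum(1 for n in nodes if degree[n] == 1)
    if ends == 2 and all(degree[n] <= 2 for n in nodes):
        return "chain"
    return "net"

print(classify_concept_map([("core", "a"), ("core", "b"), ("core", "c")]))     # spoke
print(classify_concept_map([("a", "b"), ("b", "c"), ("c", "d")]))              # chain
print(classify_concept_map([("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]))  # net
```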

  19. Software design and implementation concepts for an interoperable medical communication framework.

    PubMed

    Besting, Andreas; Bürger, Sebastian; Kasparick, Martin; Strathen, Benjamin; Portheine, Frank

    2018-02-23

    The new IEEE 11073 service-oriented device connectivity (SDC) standard proposals for networked point-of-care and surgical devices constitute the basis for improved interoperability due to their independence of vendors. To accelerate the distribution of the standard, a reference implementation is indispensable. However, the implementation of such a framework has to overcome several non-trivial challenges. First, the high level of complexity of the underlying standard must be reflected in the software design. An efficient implementation has to consider the limited resources of the underlying hardware. Moreover, the framework's purpose of realizing a distributed system demands a high degree of reliability of the framework itself and its internal mechanisms. Additionally, a framework must provide an easy-to-use and fail-safe application programming interface (API). In this work, we address these challenges by discussing suitable software engineering principles and practical coding guidelines. A descriptive model is developed that identifies key strategies. General feasibility is shown by outlining environments in which our implementation has been utilized.

  20. Continuous integration for concurrent MOOSE framework and application development on GitHub

    DOE PAGES

    Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.; ...

    2015-11-20

    For the past several years, Idaho National Laboratory’s MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repositories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors on the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project’s development roadmap. Several specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.
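
    As a concrete illustration of the GitHub API synergy mentioned above, the hedged sketch below shows how one stage of a cascading build could report its result as a commit status through GitHub's REST "statuses" endpoint; the repository, context names, and token are placeholders, and this is not the MOOSE team's actual CI code.

```python
import requests

def post_commit_status(owner, repo, sha, state, context, token):
    """Report one stage of a cascading build (e.g. framework build, then
    application regression tests) as a commit status via the GitHub REST API."""
    url = f"https://api.github.com/repos/{owner}/{repo}/statuses/{sha}"
    payload = {
        "state": state,        # "pending", "success", "error" or "failure"
        "context": context,    # e.g. "ci/framework-build" (hypothetical name)
        "description": f"{context}: {state}",
    }
    resp = requests.post(url, json=payload,
                         headers={"Authorization": f"token {token}"})
    resp.raise_for_status()
    return resp.json()

# Hypothetical usage: mark the framework stage green so that the
# application-level regression stage may start.
# post_commit_status("idaholab", "moose", "<commit-sha>", "success",
#                    "ci/framework-build", token="<api-token>")
```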

  2. An Ontology and a Software Framework for Competency Modeling and Management

    ERIC Educational Resources Information Center

    Paquette, Gilbert

    2007-01-01

    The importance given to competency management is well justified. Acquiring new competencies is the central goal of any education or knowledge management process. Thus, it must be embedded in any software framework as an instructional engineering tool, to inform the runtime environment of the knowledge that is processed by actors, and their…

  3. ALFA: The new ALICE-FAIR software framework

    NASA Astrophysics Data System (ADS)

    Al-Turany, M.; Buncic, P.; Hristov, P.; Kollegger, T.; Kouzinopoulos, C.; Lebedev, A.; Lindenstruth, V.; Manafov, A.; Richter, M.; Rybalchenko, A.; Vande Vyvre, P.; Winckler, N.

    2015-12-01

    The commonalities between the ALICE and FAIR experiments and their computing requirements led to the development of large parts of a common software framework in an experiment-independent way. The FairRoot project has already shown the feasibility of such an approach for the FAIR experiments and extending it beyond FAIR to experiments at other facilities [1, 2]. The ALFA framework is a joint development between the ALICE Online-Offline (O2) and FairRoot teams. ALFA is designed as a flexible, elastic system, which balances reliability and ease of development with performance using multi-processing and multithreading. A message-based approach has been adopted; such an approach will support the use of the software on different hardware platforms, including heterogeneous systems. Each process in ALFA assumes limited communication and reliance on other processes. Such a design will add horizontal scaling (multiple processes) to vertical scaling provided by multiple threads to meet computing and throughput demands. ALFA does not dictate any application protocols. Potentially, any content-based processor or any source can change the application protocol. The framework supports different serialization standards for data exchange between different hardware and software languages.
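
    A minimal Python analogue of this message-based, multi-process design is sketched below (ALFA itself is written in C++ and is transport-agnostic); it only illustrates how horizontally scaled processes can cooperate while sharing nothing but message channels.

```python
import multiprocessing as mp

def worker(inbox: mp.Queue, outbox: mp.Queue):
    """A device-like process: consumes messages, does work, forwards results.
    Processes assume nothing about each other beyond the message channel."""
    while True:
        msg = inbox.get()
        if msg is None:   # sentinel: shut down
            break
        outbox.put({"result": msg["payload"] * 2,
                    "from": mp.current_process().name})

if __name__ == "__main__":
    tasks, results = mp.Queue(), mp.Queue()
    # Horizontal scaling: add processes instead of (or on top of) threads.
    workers = [mp.Process(target=worker, args=(tasks, results)) for _ in range(4)]
    for w in workers:
        w.start()
    for i in range(8):
        tasks.put({"payload": i})
    for _ in workers:
        tasks.put(None)
    for _ in range(8):
        print(results.get())
    for w in workers:
        w.join()
```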

  4. Software cost/resource modeling: Software quality tradeoff measurement

    NASA Technical Reports Server (NTRS)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  5. Pinning down the large-x gluon with NNLO top-quark pair differential distributions

    NASA Astrophysics Data System (ADS)

    Czakon, Michał; Hartland, Nathan P.; Mitov, Alexander; Nocera, Emanuele R.; Rojo, Juan

    2017-04-01

    Top-quark pair production at the LHC is directly sensitive to the gluon PDF at large x. While total cross-section data is already included in several PDF determinations, differential distributions are not, because the corresponding NNLO calculations have become available only recently. In this work we study the impact on the large-x gluon of top-quark pair differential distributions measured by ATLAS and CMS at √{s}=8 TeV. Our analysis, performed in the NNPDF3.0 framework at NNLO accuracy, allows us to identify the optimal combination of LHC top-quark pair measurements that maximize the constraints on the gluon, as well as to assess the compatibility between ATLAS and CMS data. We find that differential distributions from top-quark pair production provide significant constraints on the large-x gluon, comparable to those obtained from inclusive jet production data, and thus should become an important ingredient for the next generation of global PDF fits.
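
    The statistical machinery behind such impact studies can be illustrated with a covariance-weighted chi-square comparing a measured distribution to a theory prediction; the sketch below uses synthetic numbers and is not the NNPDF code.

```python
import numpy as np

def chi2(data, theory, cov):
    """Covariance-weighted chi-square, chi2 = r^T C^{-1} r,
    between measured bins and theory predictions."""
    r = data - theory
    return float(r @ np.linalg.solve(cov, r))

# Synthetic example: 5 bins of a top-quark pair rapidity distribution.
rng = np.random.default_rng(0)
theory = np.array([10.0, 18.0, 22.0, 17.0, 9.0])
cov = np.diag((0.05 * theory) ** 2)          # 5% uncorrelated uncertainties
data = theory + rng.multivariate_normal(np.zeros(5), cov)
print(chi2(data, theory, cov) / len(data))   # chi2 per data point, O(1) if compatible
```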

  6. Health Care Quality: Measuring Obesity in Performance Frameworks.

    PubMed

    Zvenyach, Tracy; Pickering, Matthew K

    2017-08-01

    Obesity affects over one-third of Americans and leads to several chronic and costly comorbid conditions. The national movement toward value-based care calls for a refocusing of efforts to address the US obesity epidemic. To help set the stage, the current landscape of obesity-specific quality measures was evaluated. Seven quality measure databases and nine professional societies were searched. Inclusion and exclusion criteria were applied. Measures were then classified by domain and by implementation in national public programs. Eleven obesity-specific quality measures in adults were identified (nine process and two outcome). Three measures received National Quality Forum (NQF) endorsement. Two measures were actively used within Centers for Medicare and Medicaid Services (CMS) programs. Only one measure was both NQF-endorsed and used by CMS. Limitations exist with respect to obesity-specific quality metrics. Such gaps provide opportunities for obesity care specialists to engage and offer valuable insights and pragmatic approaches toward quality measurement.

  7. Person-Centeredness in Home- and Community-Based Services and Supports: Domains, Attributes, and Assisted Living Indicators.

    PubMed

    Zimmerman, Sheryl; Love, Karen; Cohen, Lauren W; Pinkowitz, Jackie; Nyrop, Kirsten A

    2014-01-01

    As a result of the Centers for Medicare & Medicaid Services' (CMS) interest in creating a unifying definition of "community living" for its Medicaid Home and Community Based Services and Support (HCBS) programs, CMS needed clarifying descriptors of person-centered (PC) practices in assisted living to distinguish them from institutional ones. Additionally, CMS's proposed language defining "community living" had the unintended potential to exclude many assisted living communities and disadvantage residents who receive Medicaid. This manuscript describes the consensus process through which clarifying language for "community living" and a framework for HCBS PC domains, attributes, and indicators specific to assisted living were developed. It examines the validity of those domains based on literature review, surveys, and stakeholder focus groups, and identifies nine domains and 43 indicators that provide a foundation for defining and measuring PC practice in assisted living. Ongoing efforts using community-based participatory research methods are further refining and testing PC indicators for assisted living to advance knowledge, operational policies, practices, and quality outcomes.

  8. Flexible and Low-Cost Measurements for Space Software Development- The Measurements Exploration Framework

    NASA Astrophysics Data System (ADS)

    Marculescu, Bogdan; Feldt, Robert; Torkar, Richard; Green, Lars-Goran; Liljegren, Thomas; Hult, Erika

    2011-08-01

    Verification and validation is an important part of software development and accounts for a significant share of the costs associated with such a project. For developers of life- or mission-critical systems, such as software being developed for space applications, a balance must be reached between ensuring the quality of the system by extensive and rigorous testing and reducing costs so that the company remains able to compete. Ensuring the quality of any system starts with a quality development process. To evaluate both the software development process and the product itself, measurements are needed. A balance must then be struck between ensuring the best possible quality of both process and product on the one hand, and reducing the cost of performing those measurements on the other. A number of measurements have already been defined and are being used. For some of these, data collection can be automated as well, further lowering the costs associated with implementing them. In practice, however, there may be situations where existing measurements are unsuitable for a variety of reasons. This paper describes a framework for creating low-cost, flexible measurements in areas where initial information is scarce. The framework, called the Measurements Exploration Framework, is aimed in particular at the space software development industry and was developed in such an environment.

  9. 77 FR 31618 - Medicaid Program; Announcement of Requirements and Registration for CMS Provider Screening...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-29

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare & Medicaid Services (CMS) [CMS-2382-N... Challenge AGENCY: Centers for Medicare & Medicaid Services (CMS), HHS. ACTION: Notice. SUMMARY: The Centers for Medicare & Medicaid Services (CMS), is announcing the launch of the ``CMS Provider Screening...

  10. Architecture for autonomy

    NASA Astrophysics Data System (ADS)

    Broten, Gregory S.; Monckton, Simon P.; Collier, Jack; Giesbrecht, Jared

    2006-05-01

    In 2002 Defence R&D Canada changed research direction from pure tele-operated land vehicles to general autonomy for land, air, and sea craft. The unique constraints of the military environment coupled with the complexity of autonomous systems drove DRDC to carefully plan a research and development infrastructure that would provide state-of-the-art tools without restricting research scope. DRDC's long term objectives for its autonomy program address disparate unmanned ground vehicle (UGV), unattended ground sensor (UGS), air (UAV), and subsea and surface (UUV and USV) vehicles operating together with minimal human oversight. Individually, these systems will range in complexity from simple reconnaissance mini-UAVs streaming video to sophisticated autonomous combat UGVs exploiting embedded and remote sensing. Together, these systems can provide low risk, long endurance, battlefield services assuming they can communicate and cooperate with manned and unmanned systems. A key enabling technology for this new research is a software architecture capable of meeting both DRDC's current and future requirements. DRDC built upon recent advances in the computing science field while developing its software architecture known as the Architecture for Autonomy (AFA). Although a well established practice in computing science, frameworks have only recently entered common use by unmanned vehicles. For industry and government, the complexity, cost, and time to re-implement stable systems often exceeds the perceived benefits of adopting a modern software infrastructure. Thus, most persevere with legacy software, adapting and modifying software when and wherever possible or necessary -- adopting strategic software frameworks only when no justifiable legacy exists. Conversely, academic programs with short one or two year projects frequently exploit strategic software frameworks but with little enduring impact. The open-source movement radically changes this picture. Academic frameworks, open to public scrutiny and modification, now rival commercial frameworks in both quality and economic impact. Further, industry now realizes that open source frameworks can reduce cost and risk of systems engineering. This paper describes the Architecture for Autonomy implemented by DRDC and how this architecture meets DRDC's current needs. It also presents an argument for why this architecture should satisfy DRDC's future requirements as well.

  11. Development of software for computing forming information using a component based approach

    NASA Astrophysics Data System (ADS)

    Ko, Kwang Hee; Park, Jiing Seo; Kim, Jung; Kim, Young Bum; Shin, Jong Gye

    2009-12-01

    In the shipbuilding industry, manufacturing technology has advanced at an unprecedented pace over the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process, and accordingly productivity has increased drastically. Despite such improvement in manufacturing technology, however, development of an automatic system for fabricating a curved hull plate remains at the beginning stage, since hardware and software for the automation of the curved hull fabrication process must be developed differently depending on the dimensions of plates, forming methods, and manufacturing processes of each shipyard. To deal with this problem, it is necessary to create a "plug-in" framework, which can adopt various kinds of hardware and software to construct a fully automatic fabrication system. In this paper, a framework for automatic fabrication of curved hull plates is proposed, which consists of four components and related software. In particular, the software module for computing fabrication information is developed by using the ooCBD development methodology, which can interface with other hardware and software with minimum effort. Examples of the proposed framework applied to medium and large shipyards are presented.

  12. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  13. A distributed finite-element modeling and control approach for large flexible structures

    NASA Technical Reports Server (NTRS)

    Young, K. D.

    1989-01-01

    An unconventional framework is described for the design of decentralized controllers for large flexible structures. In contrast to conventional control system design practice, which begins with a model of the open-loop plant, the controlled plant is assembled from controlled components in which the modeling phase and the control design phase are integrated at the component level. The developed framework is called controlled component synthesis (CCS) to reflect that it is motivated by the well-developed Component Mode Synthesis (CMS) methods, which have been demonstrated to be effective for solving large, complex structural analysis problems for almost three decades. The design philosophy behind CCS is also closely related to that of the subsystem decomposition approach in decentralized control.

  14. Research and Design of the Three-tier Distributed Network Management System Based on COM / COM + and DNA

    NASA Astrophysics Data System (ADS)

    Liang, Likai; Bi, Yushen

    Considering the distributed network management system's demands for a high degree of distribution, extensibility, and reusability, a framework model of a three-tier distributed network management system based on COM/COM+ and DNA is proposed, which adopts software component technology and the N-tier application software framework design approach. We also give the concrete design plan for each layer of this model. Finally, we discuss the internal running process of each layer in the distributed network management system's framework model.

  15. CompHEP: developments and applications

    NASA Astrophysics Data System (ADS)

    Boos, E. E.; Bunichev, V. E.; Dubinin, M. N.; Ilyin, V. A.; Savrin, V. I.; CompHEP Collaboration

    2017-11-01

    New developments of the CompHEP package and its applications to top quark and Higgs boson physics at the LHC collider are reviewed. These developments were motivated mainly by the needs of the experimental searches of the D0 (Tevatron) and CMS (LHC) collaborations, where identification of the top quark and the Higgs boson in the framework of the Standard Model (SM) or possible extensions of the SM played an important role. New useful features of the CompHEP Graphical User Interface (GUI) are described.

  16. ActiveTutor: Towards More Adaptive Features in an E-Learning Framework

    ERIC Educational Resources Information Center

    Fournier, Jean-Pierre; Sansonnet, Jean-Paul

    2008-01-01

    Purpose: This paper aims to sketch the emerging notion of auto-adaptive software when applied to e-learning software. Design/methodology/approach: The study and the implementation of the auto-adaptive architecture are based on the operational framework "ActiveTutor" that is used for teaching the topic of computer science programming in first-grade…

  17. Developing a Pedagogical-Technical Framework to Improve Creative Writing

    ERIC Educational Resources Information Center

    Chong, Stefanie Xinyi; Lee, Chien-Sing

    2012-01-01

    There is much evidence of motivational and educational benefits from the use of learning software. However, there is a lack of study with regard to the teaching of creative writing. This paper aims to bridge the following gaps: first, the need for a proper framework for scaffolding creative writing through learning software; second, the lack of…

  18. Genome-wide network-based pathway analysis of CSF t-tau/Aβ1-42 ratio in the ADNI cohort.

    PubMed

    Cong, Wang; Meng, Xianglian; Li, Jin; Zhang, Qiushi; Chen, Feng; Liu, Wenjie; Wang, Ying; Cheng, Sipu; Yao, Xiaohui; Yan, Jingwen; Kim, Sungeun; Saykin, Andrew J; Liang, Hong; Shen, Li

    2017-05-30

    The cerebrospinal fluid (CSF) levels of total tau (t-tau) and Aβ1-42 are potential early diagnostic markers for probable Alzheimer's disease (AD). The influence of genetic variation on these CSF biomarkers has been investigated in candidate or genome-wide association studies (GWAS). However, the investigation of statistically modest associations in GWAS in the context of biological networks is still an under-explored topic in AD studies. The main objective of this study is to gain further biological insights via the integration of statistical gene associations in AD with physical protein interaction networks. The CSF and genotyping data of 843 study subjects (199 CN, 85 SMC, 239 EMCI, 207 LMCI, 113 AD) from the Alzheimer's Disease Neuroimaging Initiative (ADNI) were analyzed. PLINK was used to perform GWAS on the t-tau/Aβ1-42 ratio using quality-controlled genotype data, including 563,980 single nucleotide polymorphisms (SNPs), with age, sex and diagnosis as covariates. Gene-level p-values were obtained by VEGAS2. Genes with p-value ≤ 0.05 were mapped on to a protein-protein interaction (PPI) network (9,617 nodes, 39,240 edges, from the HPRD Database). We integrated a consensus model strategy into the iPINBPA network analysis framework, and named it CM-iPINBPA. Four consensus modules (CMs) were discovered by CM-iPINBPA, and were functionally annotated using the pathway analysis tool Enrichr. The intersection of the four CMs forms a common subnetwork of 29 genes, including those related to tau phosphorylation (GSK3B, SUMO1, AKAP5, CALM1 and DLG4), amyloid beta production (CASP8, PIK3R1, PPA1, PARP1, CSNK2A1, NGFR, and RHOA), and AD (BCL3, CFLAR, SMAD1, and HIF1A). This study coupled a consensus module (CM) strategy with the iPINBPA network analysis framework, and applied it to the GWAS of the CSF t-tau/Aβ1-42 ratio in an AD study. The genome-wide network analysis yielded four enriched CMs that share not only genes related to tau phosphorylation or amyloid beta production but also multiple genes enriching several KEGG pathways such as Alzheimer's disease, colorectal cancer, gliomas, renal cell carcinoma, Huntington's disease, and others. This study demonstrated that integration of gene-level associations with CMs could yield statistically significant findings to offer valuable biological insights (e.g., functional interaction among the protein products of these genes) and suggest high-confidence candidates for subsequent analyses.
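
    A hedged sketch of the general strategy (scoring and greedily growing network modules from gene-level p-values with a Stouffer-style aggregate z-score) is given below; the toy network, p-values, and greedy rule are illustrative assumptions, not the actual iPINBPA/CM-iPINBPA implementation.

```python
import networkx as nx
import numpy as np
from scipy.stats import norm

def module_score(genes, gene_p):
    """Stouffer-style aggregate z-score for a candidate module."""
    z = norm.isf([gene_p[g] for g in genes])   # p-value -> z-score
    return z.sum() / np.sqrt(len(z))

def greedy_module(ppi, seed, gene_p, max_size=10):
    """Grow a module from a seed gene, adding the neighbor that most
    improves the aggregate score (a toy stand-in for iPINBPA)."""
    module = {seed}
    while len(module) < max_size:
        frontier = {n for g in module for n in ppi.neighbors(g)} - module
        if not frontier:
            break
        best = max(frontier, key=lambda n: module_score(module | {n}, gene_p))
        if module_score(module | {best}, gene_p) <= module_score(module, gene_p):
            break
        module.add(best)
    return module

# Toy PPI network and gene-level p-values (hypothetical numbers).
ppi = nx.Graph([("GSK3B", "SUMO1"), ("GSK3B", "CALM1"), ("CALM1", "DLG4")])
gene_p = {"GSK3B": 1e-4, "SUMO1": 0.02, "CALM1": 0.3, "DLG4": 0.04}
print(greedy_module(ppi, "GSK3B", gene_p))
```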

  19. Design of an AdvancedTCA board management controller (IPMC)

    NASA Astrophysics Data System (ADS)

    Mendez, J.; Bobillier, V.; Haas, S.; Joos, M.; Mico, S.; Vasey, F.

    2017-03-01

    The AdvancedTCA (ATCA) standard has been selected as the hardware platform for the upgrade of the back-end electronics of the CMS and ATLAS experiments at the Large Hadron Collider (LHC). In this context, the electronic systems for experiments group at CERN is running a project to evaluate, specify, design and support xTCA equipment. As part of this project, an Intelligent Platform Management Controller (IPMC) for ATCA blades, based on a commercial solution, has been designed to be used on existing and future ATCA blades. This paper reports on the status of this project, presenting the hardware and software developments.

  20. 42 CFR 493.1773 - Standard: Basic inspection requirements for all laboratories issued a CLIA certificate and CLIA...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... issued a certificate of accreditation, must permit CMS or a CMS agent to conduct validation and complaint inspections. (b) General requirements. As part of the inspection process, CMS or a CMS agent may require the... testing process (preanalytic, analytic, and postanalytic). (4) Permit CMS or a CMS agent access to all...

  5. Advantages of Repeated Low Dose against Single High Dose of Kainate in C57BL/6J Mouse Model of Status Epilepticus: Behavioral and Electroencephalographic Studies

    PubMed Central

    Beamer, Edward; Sills, Graeme J.; Thippeswamy, Thimmasettappa

    2014-01-01

    A refined kainate (KA) C57BL/6J mouse model of status epilepticus (SE) using a repeated low dose (RLD) of KA (5 mg/kg, intraperitoneal; at 30 min intervals) was compared with the established single high dose (SHD) of KA (20 mg/kg, intraperitoneal) model. In the RLD group, increased duration of convulsive motor seizures (CMS, Racine scale stage ≥3) with a significant reduction in mortality from 21% to 6% and decreased variability in seizure severity between animals/batches were observed when compared to the SHD group. There was a significant increase in the percentage of animals that reached stage-5 seizures (65% versus 96%) in the RLD group. Integrated real-time video-EEG analysis of both groups, using NeuroScore software, revealed stage-specific spikes and power spectral density characteristics. When the seizures progressed from non-convulsive seizures (NCS, stage 1–2) to CMS (stage 3–5), the delta power decreased which was followed by an increase in gamma and beta power. A transient increase in alpha and sigma power marked the transition from NCS to CMS with characteristic ‘high frequency trigger’ spikes on the EEG, which had no behavioral expression. During SE the spike rate was higher in the RLD group than in the SHD group. Overall these results confirm that RLD of KA is a more robust and consistent mouse model of SE than the SHD of KA mouse model. PMID:24802808
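
    The band-power quantities behind such EEG analyses (delta, alpha, sigma, beta, gamma) can be estimated from a raw trace with Welch's method; the sketch below is a generic illustration with approximate band edges and synthetic data, not the NeuroScore pipeline.

```python
import numpy as np
from scipy.signal import welch

# Approximate band edges in Hz (conventions vary between studies).
BANDS = {"delta": (0.5, 4), "alpha": (8, 13), "sigma": (12, 16),
         "beta": (13, 30), "gamma": (30, 80)}

def band_powers(eeg, fs):
    """Absolute power per frequency band from Welch's PSD estimate."""
    f, psd = welch(eeg, fs=fs, nperseg=4 * fs)
    return {name: np.trapz(psd[(f >= lo) & (f < hi)], f[(f >= lo) & (f < hi)])
            for name, (lo, hi) in BANDS.items()}

# Synthetic 10 s trace at 500 Hz: 2 Hz (delta) plus 40 Hz (gamma) components.
fs = 500
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 2 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)
print(band_powers(eeg, fs))
```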

  6. PyPWA: A partial-wave/amplitude analysis software framework

    NASA Astrophysics Data System (ADS)

    Salgado, Carlos

    2016-05-01

    The PyPWA project aims to develop a software framework for partial-wave and amplitude analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: one general shell where amplitude parameters (or any parametric model) are estimated from the data. This branch also includes software to produce simulated data-sets using the fitted amplitudes. A second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar model extensions) to perform PWA with an interface into the computer resources at Jefferson Lab. We are currently implementing parallelism and vectorization using Intel's Xeon Phi family of coprocessors.
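
    The kind of estimation the general shell performs can be illustrated with a toy unbinned maximum-likelihood fit of amplitude parameters; the two-wave interference model below is invented for the example and is not a PyPWA interface.

```python
import numpy as np
from scipy.optimize import minimize

def intensity(theta, a, phi):
    """Toy two-wave interference model: |A1 + A2|^2 over an angle theta."""
    amp = 1.0 + a * np.exp(1j * phi) * np.cos(theta)
    return np.abs(amp) ** 2

def nll(params, theta):
    """Unbinned negative log-likelihood with numerical normalization."""
    a, phi = params
    grid = np.linspace(0, np.pi, 2001)
    norm = np.trapz(intensity(grid, a, phi), grid)
    return -np.sum(np.log(intensity(theta, a, phi) / norm))

rng = np.random.default_rng(1)
# Synthetic "data": accept-reject sample from the model with a=0.8, phi=0.5.
cand = rng.uniform(0, np.pi, 200000)
keep = rng.uniform(0, 4.0, cand.size) < intensity(cand, 0.8, 0.5)
theta = cand[keep][:5000]

fit = minimize(nll, x0=[0.5, 0.0], args=(theta,), method="Nelder-Mead")
print(fit.x)  # recovers a ~ 0.8 and |phi| ~ 0.5 (the phi sign is degenerate here)
```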

  7. EMMA: a new paradigm in configurable software

    DOE PAGES

    Nogiec, J. M.; Trombly-Freytag, K.

    2017-11-23

    EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. As a result, it provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.
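
    A minimal sketch of the loosely coupled, event-driven composition described here, assuming a simple publish/subscribe hub and invented component names (EMMA's actual API may differ):

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal publish/subscribe hub: components only know event names."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event: str, handler: Callable):
        self._handlers[event].append(handler)

    def publish(self, event: str, **payload):
        for handler in self._handlers[event]:
            handler(**payload)

# Two independent components composed only through events.
bus = EventBus()
bus.subscribe("measurement.ready", lambda value: print(f"logger: {value}"))
bus.subscribe("measurement.ready",
              lambda value: bus.publish("alarm.raised", value=value)
              if value > 10 else None)
bus.subscribe("alarm.raised", lambda value: print(f"alarm at {value}!"))
bus.publish("measurement.ready", value=12.5)
```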

  11. A Generic Metadata Editor Supporting System Using Drupal CMS

    NASA Astrophysics Data System (ADS)

    Pan, J.; Banks, N. G.; Leggott, M.

    2011-12-01

    Metadata handling is a key factor in preserving and reusing scientific data. In recent years, standardized structural metadata has become widely used in Geoscience communities. However, there exist many different standards in Geosciences, such as the current version of the Federal Geographic Data Committee's Content Standard for Digital Geospatial Metadata (FGDC CSDGM), the Ecological Markup Language (EML), the Geography Markup Language (GML), and the emerging ISO 19115 and related standards. In addition, there are many different subsets within the Geoscience subdomain, such as the Biological Profile of the FGDC (CSDGM), or for geopolitical regions, such as the European Profile or the North American Profile in the ISO standards. It is therefore desirable to have a software foundation to support metadata creation and editing for multiple standards and profiles, without reinventing the wheel. We have developed a software module as a generic, flexible software system to do just that: to facilitate the support for multiple metadata standards and profiles. The software consists of a set of modules for the Drupal Content Management System (CMS), with minimal inter-dependencies to other Drupal modules. There are two steps in using the system's metadata functions. First, an administrator can use the system to design a user form, based on an XML schema and its instances. The form definition is named and stored in the Drupal database as an XML blob content. Second, users in an editor role can then use the persisted XML definition to render an actual metadata entry form, for creating or editing a metadata record. Behind the scenes, the form definition XML is transformed into a PHP array, which is then rendered via the Drupal Form API. When the form is submitted, the posted values are used to modify a metadata record. Drupal hooks can be used to perform custom processing on the metadata record before and after submission. It is trivial to store the metadata record as an actual XML file or in a storage/archive system. We are working on adding many features to help editor users, such as auto-completion, pre-population of forms, partial saving, as well as automatic schema validation. In this presentation we will demonstrate a few sample editors, including an FGDC editor and a bare-bones editor for ISO 19115/19139. We will also demonstrate the use of templates during the definition phase, with the support of export and import functions. Form pre-population and input validation will also be covered. These modules are available as open-source software from the Islandora software foundation, as a component of a larger Drupal-based data archive system. They can be easily installed as a stand-alone system, or be plugged into other existing metadata platforms.
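
    The two-step flow described above can be mirrored in a short sketch: a persisted XML form definition is transformed into a renderable form array (Drupal does this in PHP via the Form API; Python is used here for consistency with the other sketches, and the element names are hypothetical).

```python
import xml.etree.ElementTree as ET

FORM_XML = """
<form name="fgdc-editor">
  <field name="title"    type="textfield" label="Title" required="true"/>
  <field name="abstract" type="textarea"  label="Abstract"/>
  <field name="standard" type="select"    label="Standard">
    <option>FGDC CSDGM</option><option>ISO 19115</option>
  </field>
</form>
"""

def form_definition_to_array(xml_text):
    """Mirror of the described step: persisted XML -> renderable form array
    (keys loosely follow Drupal Form API '#'-prefixed conventions)."""
    form = {}
    for field in ET.fromstring(xml_text).findall("field"):
        spec = {"#type": field.get("type"),
                "#title": field.get("label"),
                "#required": field.get("required") == "true"}
        options = [o.text for o in field.findall("option")]
        if options:
            spec["#options"] = options
        form[field.get("name")] = spec
    return form

print(form_definition_to_array(FORM_XML))
```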

  12. A Unified Framework for Periodic, On-Demand, and User-Specified Software Information

    NASA Technical Reports Server (NTRS)

    Kolano, Paul Z.

    2004-01-01

    Although grid computing can increase the number of resources available to a user, not all resources on the grid may have a software environment suitable for running a given application. To provide users with the necessary assistance for selecting resources with compatible software environments and/or for automatically establishing such environments, it is necessary to have an accurate source of information about the software installed across the grid. This paper presents a new OGSI-compliant software information service that has been implemented as part of NASA's Information Power Grid project. This service is built on top of a general framework for reconciling information from periodic, on-demand, and user-specified sources. Information is retrieved using standard XPath queries over a single unified namespace, independent of the information's source. Two consumers of the provided software information, the IPG Resource Broker and the IPG Neutralization Service, are briefly described.
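
    The query model is straightforward to illustrate: a unified XML namespace interrogated with standard XPath expressions, regardless of whether an entry originated from a periodic scan, an on-demand probe, or a user-specified record. The inventory schema below is invented for the example.

```python
import xml.etree.ElementTree as ET

INVENTORY = """
<grid>
  <resource name="node01">
    <software><name>gcc</name><version>4.9.2</version></software>
    <software><name>python</name><version>2.4</version></software>
  </resource>
  <resource name="node02">
    <software><name>gcc</name><version>3.3</version></software>
  </resource>
</grid>
"""

root = ET.fromstring(INVENTORY)
# XPath-style query, independent of whether the underlying entry came from
# a periodic scan, an on-demand probe, or a user-supplied record.
for res in root.findall(".//resource"):
    for sw in res.findall("software[name='gcc']"):
        print(res.get("name"), "gcc", sw.findtext("version"))
```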

  13. Dynamic Weather Routes Architecture Overview

    NASA Technical Reports Server (NTRS)

    Eslami, Hassan; Eshow, Michelle

    2014-01-01

    Dynamic Weather Routes Architecture Overview presents the high-level software architecture of DWR, based on the CTAS software framework and the Direct-To automation tool. The document also covers external and internal data flows, the required datasets, changes to the Direct-To software for DWR, collection of software statistics, and the code structure.

  14. Effects of intravenous bolus injection of nicorandil on renal artery flow velocity assessed by color Doppler ultrasound.

    PubMed

    Shimamoto, Yukiko; Kubo, Takashi; Tanabe, Kazumi; Emori, Hiroki; Katayama, Yosuke; Nishiguchi, Tsuyoshi; Taruya, Akira; Kameyama, Takeyoshi; Orii, Makoto; Yamano, Takashi; Kuroi, Akio; Yamaguchi, Tomoyuki; Takemoto, Kazushi; Matsuo, Yoshiki; Ino, Yasushi; Tanaka, Atsushi; Hozumi, Takeshi; Terada, Masaki; Akasaka, Takashi

    2017-01-01

    Previous animal studies have shown that a potassium channel opener, nicorandil, provokes vasodilation in renal microvasculature and increases renal blood flow. We conducted a clinical study that aimed to evaluate the effect of nicorandil on renal artery blood flow in comparison with nitroglycerin by using color Doppler ultrasound. The present study enrolled 40 patients with stable coronary artery disease who had no renal arterial stenosis and renal parenchymal disease. The patients received intravenous administration of nicorandil (n=20) or nitroglycerin (n=20). Before and after the administration, renal artery blood flow velocity was measured by color-guided pulsed-wave Doppler. The peak-systolic, end-diastolic, and mean renal artery blood flow velocities before the administration were not different between the nicorandil group and the nitroglycerin group. The peak-systolic (79±15 cm/s to 99±21 cm/s, p<0.001; and 78±19 cm/s to 85±19 cm/s, p=0.004), end-diastolic (22±5 cm/s to 28±8 cm/s, p<0.001; and 24±6 cm/s to 26±6 cm/s, p=0.005) and mean (41±6 cm/s to 49±9 cm/s, p<0.001; and 43±9 cm/s to 45±9 cm/s, p=0.009) renal artery flow velocities increased significantly in either group. The nominal changes in the peak-systolic (20±10 cm/s vs. 7±8 cm/s, p<0.001), end-diastolic (5±4 cm/s vs. 2±3 cm/s, p=0.001), and mean (8±5 cm/s vs. 2±2 cm/s, p<0.001) renal artery blood flow velocities were significantly greater in the nicorandil group compared with the nitroglycerin group. Intravenous nicorandil increased renal artery blood flow velocity in comparison with nitroglycerin. Nicorandil has a significant effect on renal hemodynamics.

  15. Altered iPSC-derived neurons’ sodium channel properties in subjects with Monge's disease

    PubMed Central

    Zhao, Huiwen W.; Gu, Xiang Q.; Chailangkarn, Thanathom; Perkins, Guy; Callacondo, David; Appenzeller, Otto; Poulsen, Orit; Zhou, Dan; Muotri, Alysson R.; Haddad, Gabriel G.

    2015-01-01

    Monge's disease, also known as chronic mountain sickness (CMS), is a disease that potentially threatens more than 140 million highlanders who live for extended periods at high altitude (over 2500 m). The prevalence of CMS in Andeans is about 15-20%, suggesting that the majority of highlanders (non-CMS) remain rather healthy at high altitude; however, CMS subjects experience severe hypoxemia, erythrocytosis and many neurologic manifestations including migraine, headache, mental fatigue, confusion, and memory loss. The underlying mechanisms of CMS neuropathology are not well understood, and no ideal treatment is available to prevent or cure CMS, except for phlebotomy. In the current study, we reprogrammed fibroblast cells from skin biopsies of both CMS and non-CMS subjects into induced pluripotent stem cells (iPSCs), differentiated them into neurons, and compared their neuronal properties. We discovered that CMS neurons were much less excitable (higher rheobase) than non-CMS neurons. This decreased excitability was not caused by differences in passive neuronal properties, but instead by a significantly lowered Na+ channel current density and by a shift of the voltage-conductance curve in the depolarization direction. Our findings provide, for the first time, evidence of a neuronal abnormality in CMS subjects as compared to non-CMS subjects; we hope that such studies can pave the way to a better understanding of the neuropathology in CMS. PMID:25559931

  16. DoD Application Store: Enabling C2 Agility?

    DTIC Science & Technology

    2014-06-01

    The envisioned DoD Marketplace, within the Ozone Widget Framework, will include automated delivery of software patches, web applications, widgets, and mobile application packages to serve current needs. DoD has started to make inroads within this environment with several Programs of Record (PoR) embracing widgets and other mobile applications.

  17. An application framework for computer-aided patient positioning in radiation therapy.

    PubMed

    Liebler, T; Hub, M; Sanner, C; Schlegel, W

    2003-09-01

    The importance of exact patient positioning in radiation therapy increases with the ongoing improvements in irradiation planning and treatment. Therefore, new ways to overcome precision limitations of current positioning methods in fractionated treatment have to be found. The Department of Medical Physics at the German Cancer Research Centre (DKFZ) follows different video-based approaches to increase repositioning precision. In this context, the modular software framework FIVE (Fast Integrated Video-based Environment) has been designed and implemented. It is both hardware- and platform-independent and supports merging position data by integrating various computer-aided patient positioning methods. A highly precise optical tracking system and several subtraction imaging techniques have been realized as modules to supply basic video-based repositioning techniques. This paper describes the common framework architecture, the main software modules and their interfaces. An object-oriented software engineering process has been applied using the UML, C++ and the Qt library. The significance of the current framework prototype for the application in patient positioning as well as the extension to further application areas will be discussed. Particularly in experimental research, where special system adjustments are often necessary, the open design of the software allows problem-oriented extensions and adaptations.

  18. GammaLib and ctools. A software framework for the analysis of astronomical gamma-ray data

    NASA Astrophysics Data System (ADS)

    Knödlseder, J.; Mayer, M.; Deil, C.; Cayrou, J.-B.; Owen, E.; Kelley-Hoskins, N.; Lu, C.-C.; Buehler, R.; Forest, F.; Louge, T.; Siejkowski, H.; Kosack, K.; Gerard, L.; Schulz, A.; Martin, P.; Sanchez, D.; Ohm, S.; Hassan, T.; Brau-Nogué, S.

    2016-08-01

    The field of gamma-ray astronomy has seen important progress during the last decade, yet to date no common software framework has been developed for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib was written in C++ and all functionality is available in Python through an extension module. Based on this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools were written in Python and C++, and can be either used from the command line via shell scripts or directly from Python. In this paper we present the GammaLib and ctools software versions 1.0 that were released at the end of 2015. GammaLib and ctools are ready for the science analysis of Imaging Air Cherenkov Telescope event data, and also support the analysis of Fermi-LAT data and the exploitation of the COMPTEL legacy data archive. We propose using ctools as the science tools software for the Cherenkov Telescope Array Observatory.
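
    A minimal usage sketch, assuming the documented ctools Python bindings (tool classes with parameter-dictionary access); the file names, calibration values, and selections below are placeholders rather than a working analysis.

```python
import ctools

# Simulate an event list for a source model, then fit the model back:
# tools are chained Python objects, in the spirit of the ftools model.
sim = ctools.ctobssim()
sim["inmodel"] = "crab.xml"       # model definition XML (placeholder)
sim["outevents"] = "events.fits"
sim["caldb"] = "prod2"            # instrument response (placeholder values)
sim["irf"] = "South_0.5h"
sim["ra"], sim["dec"], sim["rad"] = 83.63, 22.01, 5.0
sim["tmin"], sim["tmax"] = 0.0, 1800.0    # observation window (seconds)
sim["emin"], sim["emax"] = 0.1, 100.0     # energy range (TeV)
sim.execute()

# Maximum-likelihood fit of the same model to the simulated events.
fit = ctools.ctlike()
fit["inobs"] = "events.fits"
fit["inmodel"] = "crab.xml"
fit["outmodel"] = "crab_fitted.xml"
fit.execute()
```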

  19. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.
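
    The quantification idea, conditional software-response branches hanging off an initiating system context in event-tree style, can be shown with a toy example; all probabilities below are invented for illustration.

```python
# Toy event-tree quantification of one context-dependent software risk scenario:
# initiating system context -> software response branches -> end states.
P_CONTEXT = 1e-3            # probability the triggering system context occurs
BRANCHES = {
    # conditional probability of each software response, given the context
    "handles correctly": 0.990,
    "fails safe":        0.008,
    "fails with hazard": 0.002,
}

for outcome, p_cond in BRANCHES.items():
    print(f"P(context and {outcome}) = {P_CONTEXT * p_cond:.2e}")

# Scenario contribution to system risk: only the hazardous end state counts.
print("hazard contribution:", P_CONTEXT * BRANCHES["fails with hazard"])
```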

  20. Abstracted Workflow Framework with a Structure from Motion Application

    NASA Astrophysics Data System (ADS)

    Rossi, Adam J.

    In scientific and engineering disciplines, from academia to industry, there is an increasing need for the development of custom software to perform experiments, construct systems, and develop products. The natural initial mindset is to shortcut and bypass all overhead and process rigor in order to obtain an immediate result for the problem at hand, with the misconception that the software will simply be thrown away at the end. In the majority of cases, it turns out the software persists for many years, and likely ends up in production systems for which it was not initially intended. In the current study, a framework that can be used in both industry and academic applications mitigates the underlying problems associated with developing scientific and engineering software. This results in software that is much more maintainable, documented, and usable by others, specifically allowing new users to extend the capabilities of components already implemented in the framework. There is a multi-disciplinary need in the fields of imaging science, computer science, and software engineering for a unified implementation model, which motivates the development of an abstracted software framework. Structure from motion (SfM) has been identified as one use case where the abstracted workflow framework can improve research efficiency and eliminate implementation redundancies in scientific fields. The SfM process begins by obtaining 2D images of a scene from different perspectives. Features from the images are extracted and correspondences are established. This provides a sufficient amount of information to initialize the problem for fully automated processing. Transformations are established between views, and 3D points are established via triangulation algorithms. The parameters of the camera models for all views/images are solved through bundle adjustment, establishing a highly consistent point cloud. The initial sparse point cloud and camera matrices are used to generate a dense point cloud through patch-based techniques or densification algorithms such as Semi-Global Matching (SGM). The point cloud can be visualized or exploited by both humans and automated techniques. In some cases the point cloud is "draped" with the original imagery in order to enhance the 3D model for a human viewer. The SfM workflow can be implemented in the abstracted framework, making it easy for multiple users to leverage and extend. Like many processes in scientific and engineering domains, the workflow described for SfM is complex and requires many disparate components to form a functional system, often utilizing algorithms implemented by many users in different languages/environments and without knowledge of how the component fits into the larger system. In practice, this generally leads to issues interfacing the components, building the software for the desired platforms, understanding its concept of operations, and determining how it can be manipulated to fit the desired function for a particular application. In addition, other scientists and engineers instinctively wish to analyze the performance of the system, establish new algorithms, optimize existing processes, and establish new functionality based on current research. This requires a framework whereby new components can be easily plugged in without affecting the currently implemented functionality. The need for a universal programming environment establishes the motivation for the development of the abstracted workflow framework.
This software implementation, named Catena, provides base classes from which new components must derive in order to operate within the framework. The derivation mandates that certain requirements be satisfied in order to provide a complete implementation. Additionally, the developer must provide documentation of the component in terms of its overall function and inputs. The interface input and output values corresponding to the component must be defined in terms of their respective data types, and the implementation uses mechanisms within the framework to retrieve and send the values. This process requires the developer to componentize their algorithm rather than implement it monolithically. Although the demands on the developer are slightly greater, the benefits realized from using Catena far outweigh the overhead and result in extensible software. This thesis provides a basis for the abstracted workflow framework concept and the Catena software implementation. The benefits are also illustrated using a detailed examination of the SfM process as an example application.
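
    Catena's actual class library is not reproduced here; the following Python sketch merely illustrates the componentization pattern described (a base class that enforces declared, typed inputs and outputs), with all names invented for illustration.

      # Hypothetical sketch of the componentization pattern described for
      # Catena; class and method names are invented, not Catena's API.
      class Component:
          """Base class: subclasses declare typed inputs and outputs."""
          inputs = {}    # input name -> expected type
          outputs = {}   # output name -> produced type

          def run(self, **kwargs):
              # Enforce the declared interface before executing.
              for name, typ in self.inputs.items():
                  if not isinstance(kwargs.get(name), typ):
                      raise TypeError(f"input '{name}' must be {typ.__name__}")
              return self.execute(**kwargs)

          def execute(self, **kwargs):
              raise NotImplementedError

      class FeatureExtractor(Component):
          """Extracts 2D feature locations from an image (stub)."""
          inputs = {"image": bytes}
          outputs = {"keypoints": list}

          def execute(self, image):
              return {"keypoints": []}   # real algorithm omitted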

  1. NASA software documentation standard software engineering program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  2. Recent developments in the CCP-EM software suite.

    PubMed

    Burnley, Tom; Palmer, Colin M; Winn, Martyn

    2017-06-01

    As part of its remit to provide computational support to the cryo-EM community, the Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has produced a software framework which enables easy access to a range of programs and utilities. The resulting software suite incorporates contributions from different collaborators by encapsulating them in Python task wrappers, which are then made accessible via a user-friendly graphical user interface as well as a command-line interface suitable for scripting. The framework includes tools for project and data management. An overview of the design of the framework is given, together with a survey of the functionality at different levels. The current CCP-EM suite has particular strength in the building and refinement of atomic models into cryo-EM reconstructions, which is described in detail.
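
    The CCP-EM wrapper classes themselves are not shown in this record; the sketch below illustrates the general task-wrapper pattern described (encapsulating an external program behind a small Python interface), with program and option names invented for illustration.

      # Generic task-wrapper sketch; names are illustrative, not CCP-EM's API.
      import subprocess

      class TaskWrapper:
          program = None              # external executable to encapsulate

          def args(self, params):    # map a params dict to CLI arguments
              raise NotImplementedError

          def run(self, params):
              cmd = [self.program] + self.args(params)
              return subprocess.run(cmd, check=True, capture_output=True)

      class RefineTask(TaskWrapper):
          program = "refine_model"    # hypothetical refinement executable

          def args(self, params):
              return ["--map", params["map"], "--model", params["model"]]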

  3. Recent developments in the CCP-EM software suite

    PubMed Central

    Burnley, Tom

    2017-01-01

    As part of its remit to provide computational support to the cryo-EM community, the Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has produced a software framework which enables easy access to a range of programs and utilities. The resulting software suite incorporates contributions from different collaborators by encapsulating them in Python task wrappers, which are then made accessible via a user-friendly graphical user interface as well as a command-line interface suitable for scripting. The framework includes tools for project and data management. An overview of the design of the framework is given, together with a survey of the functionality at different levels. The current CCP-EM suite has particular strength in the building and refinement of atomic models into cryo-EM reconstructions, which is described in detail. PMID:28580908

  4. Fabrication of fibrillized collagen microspheres with the microstructure resembling an extracellular matrix.

    PubMed

    Matsuhashi, Aki; Nam, Kwangwoo; Kimura, Tsuyoshi; Kishida, Akio

    2015-04-14

    Microspheres made from artificial or natural materials have been widely applied in the fields of tissue engineering and drug delivery systems. Collagen is widely used for microspheres because of its abundance in the extracellular matrix (ECM) and its good biocompatibility. The purpose of this study is to establish appropriate conditions for preparing collagen microspheres (CMS) and fibrillized collagen microspheres (fCMS) using water-in-oil (W/O) emulsion. Collagen can be tailored to mimic the native cell environment, possessing a microstructure similar to that of the ECM, by conditioning the aqueous solution. We focused on the preparation of stable, injectable CMS and fCMS that would promote the healing response. By controlling the interfacial properties through the hydrophilic-lipophilic balance (HLB), we obtained CMS and fCMS of various sizes and morphologies. Microspheres prepared with wetting agents formed well, but too low or too high an HLB value caused low yield and an uncontrollable size distribution. Changes in the surfactant amount and the rotor speed also affected the formation of the CMS and fCMS: a low surfactant amount and a fast rotor speed produced smaller CMS and fCMS. In the case of fCMS, the presence of NaCl made it possible to prepare stable fCMS without using any cross-linker, due to fibrillogenesis and gelling of the collagen molecules. The microstructure of fCMS was similar to that of native tissue, indicating that fCMS would replicate its function in vivo.

  5. Distributed software framework and continuous integration in hydroinformatics systems

    NASA Astrophysics Data System (ADS)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When encountering multiple and complicated models, multisource structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process for hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on this framework, a GIS-based decision support system for the joint regulation of water quantity and water quality of a group of lakes in Wuhan, China, was established.

  6. Penetration, Completeness, and Representativeness of The Society of Thoracic Surgeons Adult Cardiac Surgery Database.

    PubMed

    Jacobs, Jeffrey P; Shahian, David M; He, Xia; O'Brien, Sean M; Badhwar, Vinay; Cleveland, Joseph C; Furnary, Anthony P; Magee, Mitchell J; Kurlansky, Paul A; Rankin, J Scott; Welke, Karl F; Filardo, Giovanni; Dokholyan, Rachel S; Peterson, Eric D; Brennan, J Matthew; Han, Jane M; McDonald, Donna; Schmitz, DeLaine; Edwards, Fred H; Prager, Richard L; Grover, Frederick L

    2016-01-01

    The Society of Thoracic Surgeons (STS) Adult Cardiac Surgery Database (ACSD) has been successfully linked to the Centers for Medicare & Medicaid Services (CMS) Medicare database, thereby facilitating comparative effectiveness research and providing information about long-term follow-up and cost. The present study uses this link to determine contemporary completeness, penetration, and representativeness of the STS ACSD. Using variables common to both STS and CMS databases, STS operations were linked to CMS data for all CMS coronary artery bypass graft (CABG) surgery hospitalizations discharged between 2000 and 2012, inclusive. For each CMS CABG hospitalization, it was determined whether a matching STS record existed. Center-level penetration (number of CMS sites with at least one matched STS participant divided by the total number of CMS CABG sites) increased from 45% in 2000 to 90% in 2012. In 2012, 973 of 1,081 CMS CABG sites (90%) were linked to an STS site. Patient-level penetration (number of CMS CABG hospitalizations done at STS sites divided by the total number of CMS CABG hospitalizations) increased from 51% in 2000 to 94% in 2012. In 2012, 71,634 of 76,072 CMS CABG hospitalizations (94%) occurred at an STS site. Completeness of case inclusion at STS sites (number of CMS CABG cases at STS sites linked to STS records divided by the total number of CMS CABG cases at STS sites) increased from 88% in 2000 to 98% in 2012. In 2012, 69,213 of 70,932 CMS CABG hospitalizations at STS sites (98%) were linked to an STS record. Linkage of STS and CMS databases demonstrates high and increasing penetration and completeness of the STS database. Linking STS and CMS data facilitates studying long-term outcomes and costs of cardiothoracic surgery. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  7. A unified software framework for deriving, visualizing, and exploring abstraction networks for ontologies

    PubMed Central

    Ochs, Christopher; Geller, James; Perl, Yehoshua; Musen, Mark A.

    2016-01-01

    Software tools play a critical role in the development and maintenance of biomedical ontologies. One important task that is difficult without software tools is ontology quality assurance. In previous work, we have introduced different kinds of abstraction networks to provide a theoretical foundation for ontology quality assurance tools. Abstraction networks summarize the structure and content of ontologies. One kind of abstraction network that we have used repeatedly to support ontology quality assurance is the partial-area taxonomy. It summarizes structurally and semantically similar concepts within an ontology. However, the use of partial-area taxonomies was ad hoc and not generalizable. In this paper, we describe the Ontology Abstraction Framework (OAF), a unified framework and software system for deriving, visualizing, and exploring partial-area taxonomy abstraction networks. The OAF includes support for various ontology representations (e.g., OWL and SNOMED CT's relational format). A Protégé plugin for deriving “live partial-area taxonomies” is demonstrated. PMID:27345947
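
    The grouping step at the heart of a partial-area taxonomy can be sketched compactly: concepts with identical sets of relationship types fall into the same "area". The sketch below is a simplification with invented concept names; real OAF derivation also computes partial-area roots and the hierarchy between areas.

      # Simplified first step of partial-area taxonomy derivation: bucket
      # concepts by their set of relationship types. Data is illustrative.
      from collections import defaultdict

      concepts = {
          "Myocarditis":  {"finding-site", "associated-morphology"},
          "Pericarditis": {"finding-site", "associated-morphology"},
          "Fracture":     {"finding-site"},
      }

      areas = defaultdict(list)
      for concept, rel_types in concepts.items():
          areas[frozenset(rel_types)].append(concept)

      for rel_types, members in areas.items():
          print(sorted(rel_types), "->", sorted(members))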

  8. The Diamond Beamline Controls and Data Acquisition Software Architecture

    NASA Astrophysics Data System (ADS)

    Rees, N.

    2010-06-01

    The software for the Diamond Light Source beamlines[1] is based on two complementary software frameworks: low-level control is provided by the Experimental Physics and Industrial Control System (EPICS) framework[2][3], and the high-level user interface is provided by the Java-based Generic Data Acquisition system, or GDA[4][5]. EPICS provides a widely used, robust, generic interface to a wide range of hardware; its user interfaces focus on giving engineers and beamline scientists detailed low-level views of all aspects of the beamline control systems. The GDA system provides a high-level system that combines an understanding of scientific concepts, such as reciprocal lattice coordinates, a flexible Python-syntax scripting interface through which scientific users control their data acquisition, and graphical user interfaces where necessary. This paper describes the beamline software architecture in more detail, highlighting how these complementary frameworks provide a flexible system that can accommodate a wide range of requirements.
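
    GDA user scripts are written in Python (Jython) syntax; the fragment below only illustrates what a scientist-facing scan script of that kind might look like, with the scan helper and device names invented rather than taken from GDA's actual API.

      # Illustrative GDA-style acquisition script (hypothetical names).
      def scan(motor, start, stop, step, detector, exposure):
          pos = start
          while pos <= stop:
              motor.moveTo(pos)                  # move the stage
              counts = detector.collect(exposure)
              print(pos, counts)
              pos += step

      # A user might then express a diffraction scan in scientific terms:
      #   scan(two_theta, 10.0, 60.0, 0.1, pilatus, exposure=0.5)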

  9. A unified software framework for deriving, visualizing, and exploring abstraction networks for ontologies.

    PubMed

    Ochs, Christopher; Geller, James; Perl, Yehoshua; Musen, Mark A

    2016-08-01

    Software tools play a critical role in the development and maintenance of biomedical ontologies. One important task that is difficult without software tools is ontology quality assurance. In previous work, we have introduced different kinds of abstraction networks to provide a theoretical foundation for ontology quality assurance tools. Abstraction networks summarize the structure and content of ontologies. One kind of abstraction network that we have used repeatedly to support ontology quality assurance is the partial-area taxonomy. It summarizes structurally and semantically similar concepts within an ontology. However, the use of partial-area taxonomies was ad hoc and not generalizable. In this paper, we describe the Ontology Abstraction Framework (OAF), a unified framework and software system for deriving, visualizing, and exploring partial-area taxonomy abstraction networks. The OAF includes support for various ontology representations (e.g., OWL and SNOMED CT's relational format). A Protégé plugin for deriving "live partial-area taxonomies" is demonstrated. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Education in the workplace for the physician: clinical management states as an organizing framework.

    PubMed

    Greenes, R A

    2000-01-01

    Medical educators are interested in approaches to making selected relevant knowledge available in the context of problem-based care. This is of value both during the process of care and as a means of organizing information for offline self-study. Four trends in health information technology are relevant to achieving the goal and can be expected to play a growing role in the future. First, health care enterprises are developing approaches for access to information resources related to the care of a patient, including clinical data and images but also communication tools, referral and other logistic tools, decision support, and educational materials. Second, information for patients and methods for patient-doctor interaction and decision making are becoming available. Third, computer-based methods for representation of practice guidelines are being developed to support applications that can incorporate their logic. Finally, considering patients as being in particular "clinical management states" (or CMSs) for specific problems, approaches are being developed to use guidelines as a kind of "predictive" framework to enable development of interfaces for problem-based clinical encounters. The guidelines for a CMS can be used to identify the kinds of resources specifically needed for clinical encounters of that type. As the above trends converge to produce problem-specific environments, professional specialty organizations and continuing medical education course designers will need to focus energies on organizing and updating medical knowledge to make it available in CMS-specific contexts.

  11. Operational Experience with the Frontier System in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumenfeld, Barry; Dykstra, Dave; Kreuzer, Peter

    2012-06-20

    The Frontier framework is used in the CMS experiment at the LHC to deliver conditions data to processing clients worldwide, including calibration, alignment, and configuration information. Each central server at CERN, called a Frontier Launchpad, uses tomcat as a servlet container to establish the communication between clients and the central Oracle database. HTTP-proxy Squid servers, located close to clients, cache the responses to queries in order to provide high performance data access and to reduce the load on the central Oracle database. Each Frontier Launchpad also has its own reverse-proxy Squid for caching. The three central servers have been delivering about 5 million responses every day since the LHC startup, containing about 40 GB data in total, to more than one hundred Squid servers located worldwide, with an average response time on the order of 10 milliseconds. The Squid caches deployed worldwide process many more requests per day, over 700 million, and deliver over 40 TB of data. Several monitoring tools of the tomcat log files, the accesses of the Squids on the central Launchpad servers, and the availability of remote Squids have been developed to guarantee the performance of the service and make the system easily maintainable. Following a brief introduction of the Frontier framework, we describe the performance of this highly reliable and stable system, detail monitoring concerns and their deployment, and discuss the overall operational experience from the first two years of LHC data-taking.

  12. Operational Experience with the Frontier System in CMS

    NASA Astrophysics Data System (ADS)

    Blumenfeld, Barry; Dykstra, Dave; Kreuzer, Peter; Du, Ran; Wang, Weizhen

    2012-12-01

    The Frontier framework is used in the CMS experiment at the LHC to deliver conditions data to processing clients worldwide, including calibration, alignment, and configuration information. Each central server at CERN, called a Frontier Launchpad, uses tomcat as a servlet container to establish the communication between clients and the central Oracle database. HTTP-proxy Squid servers, located close to clients, cache the responses to queries in order to provide high performance data access and to reduce the load on the central Oracle database. Each Frontier Launchpad also has its own reverse-proxy Squid for caching. The three central servers have been delivering about 5 million responses every day since the LHC startup, containing about 40 GB data in total, to more than one hundred Squid servers located worldwide, with an average response time on the order of 10 milliseconds. The Squid caches deployed worldwide process many more requests per day, over 700 million, and deliver over 40 TB of data. Several monitoring tools of the tomcat log files, the accesses of the Squids on the central Launchpad servers, and the availability of remote Squids have been developed to guarantee the performance of the service and make the system easily maintainable. Following a brief introduction of the Frontier framework, we describe the performance of this highly reliable and stable system, detail monitoring concerns and their deployment, and discuss the overall operational experience from the first two years of LHC data-taking.
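
    The client side of this access pattern reduces to HTTP requests routed through a nearby Squid cache, which can be sketched with a few lines of Python; the launchpad URL and query encoding below are placeholders, not the actual Frontier protocol.

      # Sketch of a conditions-data fetch through a site Squid proxy.
      import requests

      LAUNCHPAD = "http://frontier.example.org:8000/Frontier"   # placeholder
      SITE_SQUID = {"http": "http://squid.site.local:3128"}     # local cache

      resp = requests.get(
          LAUNCHPAD,
          params={"query": "encoded-conditions-query"},  # placeholder payload
          proxies=SITE_SQUID,
          timeout=10,
      )
      resp.raise_for_status()
      payload = resp.content   # Squid caches the response for repeat queries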

  13. Towards an Open, Distributed Software Architecture for UxS Operations

    NASA Technical Reports Server (NTRS)

    Cross, Charles D.; Motter, Mark A.; Neilan, James H.; Qualls, Garry D.; Rothhaar, Paul M.; Tran, Loc; Trujillo, Anna C.; Allen, B. Danette

    2015-01-01

    To address the growing need to evaluate, test, and certify an ever-expanding ecosystem of UxS platforms in preparation for cultural integration, NASA Langley Research Center's Autonomy Incubator (AI) has taken on the challenge of developing a software framework in which UxS platforms developed by third parties can be integrated into a single system which provides evaluation and testing, mission planning and operation, and out-of-the-box autonomy and data fusion capabilities. This software framework, named AEON (Autonomous Entity Operations Network), has two main goals. The first goal is the development of a cross-platform, extensible, onboard software system that provides autonomy at the mission execution and course-planning level, a highly configurable data fusion framework sensitive to the platform's available sensor hardware, and plug-and-play compatibility with a wide array of computer systems, sensors, software, and controls hardware. The second goal is the development of a ground control system that acts as a test-bed for integration of the proposed heterogeneous fleet, and allows for complex mission planning, tracking, and debugging capabilities. The ground control system should also be highly extensible and allow plug-and-play interoperability with third party software systems. In order to achieve these goals, this paper proposes an open, distributed software architecture which utilizes at its core the Data Distribution Service (DDS) standards, established by the Object Management Group (OMG), for inter-process communication and data flow. The design decisions proposed herein leverage the advantages of existing robotics software architectures and the DDS standards to develop software that is scalable, high-performance, fault tolerant, modular, and readily interoperable with external platforms and software.
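
    The topic-based publish/subscribe data flow that DDS standardizes can be illustrated in-process; the sketch below shows only the data-flow model, not an actual DDS binding (which would come from a vendor library with its own quality-of-service machinery).

      # In-process illustration of topic-based publish/subscribe.
      from collections import defaultdict

      class Bus:
          def __init__(self):
              self._subs = defaultdict(list)    # topic name -> callbacks

          def subscribe(self, topic, callback):
              self._subs[topic].append(callback)

          def publish(self, topic, sample):
              for cb in self._subs[topic]:
                  cb(sample)

      bus = Bus()
      bus.subscribe("vehicle/pose", lambda s: print("pose update:", s))
      bus.publish("vehicle/pose", {"x": 1.0, "y": 2.0, "yaw": 0.3})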

  14. Software And Systems Engineering Risk Management

    DTIC Science & Technology

    2010-04-01

    Extract from the presentation slides, listing a standards timeline: 2004 COSO Enterprise RSKM Framework; 2006 ISO/IEC 16085 Risk Management Process; 2008 ISO/IEC 12207 Software Lifecycle Processes; 2009 ISO/IEC... Presented by John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning, Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group, Systems and Software.

  15. The Software Architecture of the Upgraded ESA DRAMA Software Suite

    NASA Astrophysics Data System (ADS)

    Kebschull, Christopher; Flegel, Sven; Gelhaus, Johannes; Mockel, Marek; Braun, Vitali; Radtke, Jonas; Wiedemann, Carsten; Vorsmann, Peter; Sanchez-Ortiz, Noelia; Krag, Holger

    2013-08-01

    In the beginnings of man's space flight activities there was the belief that space is so big that everybody could use it without any repercussions. However, during the last six decades the increasing use of Earth's orbits has led to a rapid growth in the space debris environment, which has a large influence on current and future space missions. For this reason ESA issued the "Requirements on Space Debris Mitigation for ESA Projects" [1] in 2008, which apply to all ESA missions henceforth. The DRAMA (Debris Risk Assessment and Mitigation Analysis) software suite had been developed to support the planning of space missions to comply with these requirements. During the last year the DRAMA software suite has been upgraded under ESA contract by TUBS and DEIMOS to include additional tools and increase the performance of existing ones. This paper describes the overall software architecture of the ESA DRAMA software suite. Specifically, the new graphical user interface, which manages the five main tools ARES (Assessment of Risk Event Statistics), MIDAS (MASTER-based Impact Flux and Damage Assessment Software), OSCAR (Orbital Spacecraft Active Removal), CROC (Cross Section of Complex Bodies) and SARA (Re-entry Survival and Risk Analysis), is discussed. The advancements are highlighted, as well as the challenges that arise from the integration of the five tool interfaces. A framework had been developed at the ILR and was used for MASTER-2009 and PROOF-2009. The Java-based GUI framework enables cross-platform deployment, and its underlying model-view-presenter (MVP) software pattern meets the strict design requirements necessary to ensure a robust and reliable method of operation in an environment where the GUI is separated from the processing back-end. While the GUI framework evolved with each project, allowing an increasing degree of integration of services such as validators for input fields, it has also increased in complexity. The paper concludes with an outlook on the future development of the GUI framework, showing the potential for further advancements.
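
    The model-view-presenter separation mentioned above can be shown in miniature: the presenter mediates between a passive view and the processing back-end, so the GUI never touches the computation directly. The sketch below uses invented names and a stand-in computation.

      # Minimal MVP sketch (illustrative names, stand-in computation).
      class Model:
          def assess_risk(self, altitude_km):
              return 0.01 * altitude_km        # placeholder back-end result

      class View:
          def get_altitude(self):
              return 800.0                     # would come from an input field

          def show_result(self, value):
              print(f"risk estimate: {value}")

      class Presenter:
          def __init__(self, model, view):
              self.model, self.view = model, view

          def on_run_clicked(self):
              altitude = self.view.get_altitude()
              self.view.show_result(self.model.assess_risk(altitude))

      Presenter(Model(), View()).on_run_clicked()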

  16. 78 FR 38986 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-28

    ... that information was collected under Part B. The QIMS Account Registration and the ESRD Application..., CMS-1728-94, CMS-10174, CMS-10305 and CMS-10488] Agency Information Collection Activities: Proposed... comment on CMS' intention to collect information from the public. Under the Paperwork Reduction Act of...

  17. 42 CFR 482.74 - Condition of participation: Notification to CMS.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition of participation: Notification to CMS... participation: Notification to CMS. (a) A transplant center must notify CMS immediately of any significant... conditions of participation. Instances in which CMS should receive information for follow up, as appropriate...

  18. A multi-GPU real-time dose simulation software framework for lung radiotherapy.

    PubMed

    Santhanam, A P; Min, Y; Neelakkantan, H; Papp, N; Meeks, S L; Kupelian, P A

    2012-09-01

    Medical simulation frameworks facilitate both the preoperative and postoperative analysis of the patient's pathophysical condition. Of particular importance is the simulation of radiation dose delivery for real-time radiotherapy monitoring and retrospective analyses of the patient's treatment. In this paper, a software framework tailored for the development of simulation-based real-time radiation dose monitoring medical applications is discussed. A multi-GPU-based computational framework coupled with inter-process communication methods is introduced for simulating the radiation dose delivery on a deformable 3D volumetric lung model and its real-time visualization. The model deformation and the corresponding dose calculation are allocated among the GPUs by task and executed as a pipeline. Radiation dose calculations are computed on two different GPU hardware architectures. The integration of this computational framework with a front-end software layer and a back-end patient database repository is also discussed. Real-time simulation of the delivered dose is achieved once every 120 ms using the proposed framework. With a linear increase in the number of GPU cores, the computational time of the simulation decreased linearly. The inter-process communication time also improved with an increase in the hardware memory. Variations in the delivered dose and computational speedup for variations in the data dimensions are investigated using D70, D90 and gEUD as metrics for a set of 14 patients. Computational speed-up increased with an increase in the beam dimensions when compared with CPU-based commercial software, while the error in the dose calculation was <1%. Our analyses show that the framework applied to deformable lung model-based radiotherapy is an effective tool for performing both real-time and retrospective analyses.
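
    The task-specific allocation described above amounts to a two-stage pipeline with one worker per device; the sketch below mimics that structure with placeholder computations, standing in for CUDA kernels launched on each selected GPU.

      # Pipeline sketch: deformation on one GPU feeds dose calculation on
      # another via a queue. Computations are placeholders.
      import multiprocessing as mp

      def deform_worker(gpu_id, out_q):
          for step in range(3):                # would deform the lung model
              out_q.put((step, f"deformed-mesh-{step} (GPU {gpu_id})"))
          out_q.put(None)                      # end-of-stream marker

      def dose_worker(gpu_id, in_q):
          while (item := in_q.get()) is not None:
              step, mesh = item                # would compute dose on GPU
              print(f"GPU {gpu_id}: dose on {mesh} for step {step}")

      if __name__ == "__main__":
          q = mp.Queue()
          producer = mp.Process(target=deform_worker, args=(0, q))
          consumer = mp.Process(target=dose_worker, args=(1, q))
          producer.start(); consumer.start()
          producer.join(); consumer.join()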

  19. Perceptions, use and attitudes of pharmacy customers on complementary medicines and pharmacy practice.

    PubMed

    Braun, Lesley A; Tiralongo, Evelin; Wilkinson, Jenny M; Spitzer, Ondine; Bailey, Michael; Poole, Susan; Dooley, Michael

    2010-07-20

    Complementary medicines (CMs) are popular amongst Australians and community pharmacy is a major supplier of these products. This study explores pharmacy customer use, attitudes and perceptions of complementary medicines, and their expectations of pharmacists as they relate to these products. Pharmacy customers randomly selected from sixty large and small, metropolitan and rural pharmacies in three Australian states completed an anonymous, self administered questionnaire that had been pre-tested and validated. 1,121 customers participated (response rate 62%). 72% had used CMs within the previous 12 months, 61% used prescription medicines daily and 43% had used both concomitantly. Multivitamins, fish oils, vitamin C, glucosamine and probiotics were the five most popular CMs. 72% of people using CMs rated their products as 'very effective' or 'effective enough'. CMs were as frequently used by customers aged 60 years or older as younger customers (69% vs. 72%) although the pattern of use shifted with older age. Most customers (92%) thought pharmacists should provide safety information about CMs, 90% thought they should routinely check for interactions, 87% thought they should recommend effective CMs, 78% thought CMs should be recorded in customer's medication profile and 58% thought pharmacies stocking CMs should also employ a complementary medicine practitioner. Of those using CMs, 93% thought it important for pharmacists to be knowledgeable about CMs and 48% felt their pharmacist provides useful information about CMs. CMs are widely used by pharmacy customers of all ages who want pharmacists to be more involved in providing advice about these products.

  20. Recombination Events Involving the atp9 Gene Are Associated with Male Sterility of CMS PET2 in Sunflower.

    PubMed

    Reddemann, Antje; Horn, Renate

    2018-03-11

    Cytoplasmic male sterility (CMS) systems represent ideal mutants to study the role of mitochondria in pollen development. In sunflower, CMS PET2 also has the potential to become an alternative CMS source for commercial sunflower hybrid breeding. CMS PET2 originates from an interspecific cross of H. petiolaris and H. annuus, as does CMS PET1, but results in a different CMS mechanism. Southern analyses revealed differences for atp6, atp9 and cob between CMS PET2, CMS PET1 and the male-fertile line HA89. A second identical copy of atp6 was present on an additional CMS PET2-specific fragment. In addition, the atp9 gene was duplicated. However, this duplication was followed by an insertion of 271 bp of unknown origin in the 5' coding region of the atp9 gene in CMS PET2, which led to the creation of two unique open reading frames, orf288 and orf231. The first 53 bp of orf288 are identical to the 5' end of atp9. Apart from its first 3 bp, which are part of the 271-bp insertion, orf231 consists of the last 228 bp of atp9. These CMS PET2-specific orfs are co-transcribed. All 11 editing sites of the atp9 gene present in orf231 are fully edited. The anther-specific reduction of the co-transcript in fertility-restored hybrids supports their involvement in the male sterility of CMS PET2.

  1. Upper Secondary and Vocational Level Teachers at Social Software

    ERIC Educational Resources Information Center

    Valtonen, Teemu; Kontkanen, Sini; Dillon, Patrick; Kukkonen, Jari; Väisänen, Pertti

    2014-01-01

    This study focuses on upper secondary and vocational level teachers as users of social software, i.e. what software they use during their leisure and work, and for what purposes they use software in teaching. The study is theorised within a technological pedagogical content knowledge framework; the emphasis is especially on technological knowledge…

  2. Software Requirements Specification for Lunar IceCube

    NASA Astrophysics Data System (ADS)

    Glaser-Garbrick, Michael R.

    Lunar IceCube is a 6U satellite that will orbit the Moon to measure water volatiles as a function of position, altitude, and time, and in their various phases. Lunar IceCube is a collaboration between Morehead State University, Vermont Technical College, Busek, and NASA. The Software Requirements Specification will serve as a contract between the overall team and the developers of the flight software. It will provide a system overview of the software that will be developed for Lunar IceCube, detailing all of the interconnects and protocols for each subsystem that Lunar IceCube will utilize. The flight software will be written in SPARK to the fullest extent possible, due to SPARK's support for proving the absence of run-time errors. The LIC flight software does make use of a general purpose, reusable application framework called CubedOS. This framework imposes some structuring requirements on the architecture and design of the flight software, but it does not impose any high-level requirements. The document will also detail the tools to be used for Lunar IceCube, including why VxWorks will be utilized.

  3. GeoFramework: A Modeling Framework for Solid Earth Geophysics

    NASA Astrophysics Data System (ADS)

    Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.

    2003-12-01

    As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, link modeling codes to multiple datasets, and make modeling software available to non-specialists in modeling. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now achievable. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general-purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks which are not generally met by existing frameworks, and so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of existing codes together. The following codes are now being reengineered within the context of Pyre: Tecton, a 3-D FE visco-elastic code for lithospheric relaxation; CitComS, a code for spherical mantle convection; SpecFEM3D, a SEM code for global and regional seismic waves; eqsim, a FE code for dynamic earthquake rupture; SNAC, a developing 3-D code based on the FLAC method for visco-elastoplastic deformation; SNARK, a 3-D FE-PIC method for viscoplastic deformation; and gPLATES, an open source paleogeographic/plate tectonics modeling package. We will demonstrate how codes can be linked with one another, such as a regional and a global model of mantle convection, or a visco-elastoplastic representation of the crust within viscous mantle flow. Finally, we will describe how http://GeoFramework.org has become a distribution site for a suite of modeling software in geophysics.

  4. Onyx-Advanced Aeropropulsion Simulation Framework Created

    NASA Technical Reports Server (NTRS)

    Reed, John A.

    2001-01-01

    The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible, Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.
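
    The component-replacement idea behind Onyx can be shown in a few lines: simulation elements share an interface, so one implementation can be swapped for another at run time. Onyx itself is Java; the Python sketch below, with invented class names, only illustrates the pattern.

      # Dynamic component replacement behind a shared interface.
      class Compressor:
          def pressure_ratio(self, mass_flow):
              raise NotImplementedError

      class SimpleCompressor(Compressor):
          def pressure_ratio(self, mass_flow):
              return 1.5                        # fixed-ratio stand-in

      class MapBasedCompressor(Compressor):
          def pressure_ratio(self, mass_flow):
              return 1.2 + 0.01 * mass_flow     # stand-in performance map

      engine = {"compressor": SimpleCompressor()}
      engine["compressor"] = MapBasedCompressor()   # swapped at run time
      print(engine["compressor"].pressure_ratio(30.0))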

  5. Structure simulation with calculated NMR parameters - integrating COSMOS into the CCPN framework.

    PubMed

    Schneider, Olaf; Fogh, Rasmus H; Sternberg, Ulrich; Klenin, Konstantin; Kondov, Ivan

    2012-01-01

    The Collaborative Computing Project for NMR (CCPN) has built a software framework consisting of the CCPN data model (with APIs) for NMR-related data, the CcpNmr Analysis program, and additional tools like CcpNmr FormatConverter. The open architecture allows for the integration of external software to extend the abilities of the CCPN framework with additional calculation methods. Recently, we have carried out the first steps of integrating our software, Computer Simulation of Molecular Structures (COSMOS), into the CCPN framework. The COSMOS-NMR force field unites quantum chemical routines for the calculation of molecular properties with a molecular mechanics force field yielding the relative molecular energies. COSMOS-NMR allows NMR parameters to be introduced as constraints into molecular mechanics calculations. The resulting infrastructure will be made available to the NMR community. As a first application we have tested the evaluation of calculated protein structures using COSMOS-derived 13C Cα and Cβ chemical shifts. In this paper we give an overview of the methodology and a roadmap for future developments and applications.

  6. Semi-NLO production of Higgs bosons in the framework of kt-factorization using KMR unintegrated parton distributions

    NASA Astrophysics Data System (ADS)

    Modarres, M.; Masouminia, M. R.; Aminzadeh Nik, R.; Hosseinkhani, H.; Olanj, N.

    2018-01-01

    The cross-section for the production of the Standard Model Higgs boson has been calculated using a mixture of LO and NLO partonic diagrams and the unintegrated parton distribution functions (UPDF) of Kimber-Martin-Ryskin (KMR) in the kt-factorization framework. The UPDF are prepared using the phenomenological libraries of Martin-Motylinski-Harland-Lang-Thorne (MMHT 2014). The results are compared against the existing experimental data from the CMS and ATLAS collaborations and against available pQCD calculations. It is shown that the present calculation is in agreement with the experimental data and comparable with the pQCD results. It is also concluded that the K-factor approximation is comparable with the semi-NLO kt-factorization predictions.
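
    For orientation, the kt-factorization cross-section assumed in such calculations has the schematic form below; conventions (flux factors, scale arguments, off-shell matrix-element normalization) vary between papers, so this is indicative rather than the authors' exact expression.

      % Schematic kt-factorization master formula (conventions vary).
      \sigma(pp \to H + X) =
        \int \frac{dx_1}{x_1}\,\frac{dx_2}{x_2}\,
        d^2k_{1t}\, d^2k_{2t}\,
        f_g(x_1, k_{1t}^2, \mu^2)\, f_g(x_2, k_{2t}^2, \mu^2)\,
        \hat{\sigma}\left(g^* g^* \to H;\ k_{1t}, k_{2t}\right)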

  7. Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data

    NASA Astrophysics Data System (ADS)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.

    2011-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications including modeling, interpretation and forecast of climatic and ecosystem changes for various spatial and temporal scales. Due to inherent heterogeneity of environmental datasets as well as their size which might constitute up to tens terabytes for a single dataset at present studies in the area of climate and environmental change require a special software support. A dedicated software framework for rapid development of providing such support information-computational systems based on Web-GIS technologies has been created. The software framework consists of 3 basic parts: computational kernel developed using ITTVIS Interactive Data Language (IDL), a set of PHP-controllers run within specialized web portal, and JavaScript class library for development of typical components of web mapping application graphical user interface (GUI) based on AJAX technology. Computational kernel comprise of number of modules for datasets access, mathematical and statistical data analysis and visualization of results. Specialized web-portal consists of web-server Apache, complying OGC standards Geoserver software which is used as a base for presenting cartographical information over the Web, and a set of PHP-controllers implementing web-mapping application logic and governing computational kernel. JavaScript library aiming at graphical user interface development is based on GeoExt library combining ExtJS Framework and OpenLayers software. Based on the software framework an information-computational system for complex analysis of large georeferenced data archives was developed. Structured environmental datasets available for processing now include two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis, meteorological observational data for the territory of the former USSR for the 20th century, and others. Current version of the system is already involved into a scientific research process. Particularly, recently the system was successfully used for analysis of Siberia climate changes and its impact in the region. The software framework presented allows rapid development of Web-GIS systems for geophysical data analysis thus providing specialists involved into multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystems changes on global and regional scales. This work is partially supported by RFBR grants #10-07-00547, #11-05-01190, and SB RAS projects 4.31.1.5, 4.31.2.7, 4, 8, 9, 50 and 66.

  8. Model and Interoperability using Meta Data Annotations

    NASA Astrophysics Data System (ADS)

    David, O.

    2011-12-01

    Software frameworks and architectures need meta data to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing meta data, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to meta data representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as annotations or attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to their originating code. Since models and modeling components are not directly bound to the framework by the use of specific APIs and/or data types, they can more easily be reused both within the framework and outside of it. While providing all those capabilities, a significant reduction in the size of the model source code was achieved. To support the benefit of annotations for a modeler, studies were conducted to evaluate the effectiveness of an annotation-based framework approach against other modeling frameworks and libraries; a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A typical hydrological model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks.
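
    In Python the same idea can be mimicked with decorators that attach metadata to component methods; the decorator names below are invented for illustration (OMS3 itself uses Java annotations), but the principle of framework wiring driven purely by declarative metadata is the one described above.

      # Decorator-based analogue of annotation-driven components.
      def In(name, unit):
          def mark(func):
              func.oms_in = {"name": name, "unit": unit}
              return func
          return mark

      def Execute(func):
          func.oms_execute = True
          return func

      class Runoff:
          @In("precip", "mm/day")
          def set_precip(self, value):
              self.precip = value

          @Execute
          def run(self):
              self.runoff = 0.4 * self.precip   # stand-in process equation

      # A framework could scan these attributes to assemble the model,
      # document dependencies, and build the data-flow graph.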

  9. 42 CFR 423.2063 - Applicability of laws, regulations and CMS Rulings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Applicability of laws, regulations and CMS Rulings..., ALJ Hearings, MAC review, and Judicial Review § 423.2063 Applicability of laws, regulations and CMS... on ALJs and the MAC. (b) CMS Rulings are published under the authority of the CMS Administrator...

  10. 42 CFR 426.517 - CMS' statement regarding new evidence.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false CMS' statement regarding new evidence. 426.517... DETERMINATIONS Review of an NCD § 426.517 CMS' statement regarding new evidence. (a) CMS may review any new... experts; and (5) Presented during any hearing. (b) CMS may submit a statement regarding whether the new...

  11. 42 CFR 405.1063 - Applicability of laws, regulations and CMS Rulings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Applicability of laws, regulations and CMS Rulings... Medicare Coverage Policies § 405.1063 Applicability of laws, regulations and CMS Rulings. (a) All laws and... the MAC. (b) CMS Rulings are published under the authority of the Administrator, CMS. Consistent with...

  12. How to Spot Congenital Myasthenic Syndromes Resembling the Lambert-Eaton Myasthenic Syndrome? A Brief Review of Clinical, Electrophysiological, and Genetics Features.

    PubMed

    Lorenzoni, Paulo José; Scola, Rosana Herminia; Kay, Claudia Suemi Kamoi; Werneck, Lineu Cesar; Horvath, Rita; Lochmüller, Hanns

    2018-06-01

    Congenital myasthenic syndromes (CMS) are heterogeneous genetic diseases in which neuromuscular transmission is compromised. CMS resembling the Lambert-Eaton myasthenic syndrome (CMS-LEMS) are emerging as a rare group of distinct presynaptic CMS that share the same electrophysiological features. They have low compound muscle action potential amplitudes that increment after brief exercise (facilitation) or high-frequency repetitive nerve stimulation. Although clinical signs similar to LEMS can be present, the main hallmark is the electrophysiological findings, which are identical to those of autoimmune LEMS. CMS-LEMS occurs due to deficits in acetylcholine vesicle release caused by dysfunction of different components in its pathway. To date, the genes that have been associated with CMS-LEMS are AGRN, SYT2, MUNC13-1, VAMP1, and LAMA5. Clinicians should keep in mind these newest subtypes of CMS-LEMS to achieve the correct diagnosis and therapy. We believe that CMS-LEMS must be included, as an important clue prompting genetic investigation, in the diagnostic algorithms for CMS. We briefly review the main features of CMS-LEMS.

  13. Software Geometry in Simulations

    NASA Astrophysics Data System (ADS)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

    The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other liquid argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows multiple authors to collaborate easily. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called the General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett Viren is the author of the framework discussed here, the General Geometry Description (GGD).
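
    A geometry generator of this kind ultimately emits GDML, an XML dialect; the fragment below sketches programmatic emission of a heavily simplified GDML file and follows the public GDML schema only loosely, so treat it as illustrative.

      # Emit a (much simplified) GDML world volume.
      import xml.etree.ElementTree as ET

      gdml = ET.Element("gdml")
      solids = ET.SubElement(gdml, "solids")
      ET.SubElement(solids, "box", name="WorldBox",
                    x="1000", y="1000", z="1000", lunit="mm")

      structure = ET.SubElement(gdml, "structure")
      world = ET.SubElement(structure, "volume", name="World")
      ET.SubElement(world, "materialref", ref="G4_AIR")
      ET.SubElement(world, "solidref", ref="WorldBox")

      setup = ET.SubElement(gdml, "setup", name="Default", version="1.0")
      ET.SubElement(setup, "world", ref="World")

      ET.ElementTree(gdml).write("world.gdml")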

  14. Exploring the Use of a Test Automation Framework

    NASA Technical Reports Server (NTRS)

    Cervantes, Alex

    2009-01-01

    It is known that software testers, more often than not, lack the time needed to fully test the delivered software product within the time period allotted to them. When problems occur in the implementation phase of a development project, they normally cause the software delivery date to slide. As a result, testers either need to work longer hours, or supplementary resources need to be added to the test team in order to meet aggressive test deadlines. One solution to this problem is to provide testers with a test automation framework to facilitate the development of automated test solutions.
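
    The core idea of such a framework is that test cases become data and a generic runner executes them, so testers add coverage without writing new harness code. A minimal Python sketch of that pattern, with an invented function under test, follows.

      # Data-driven test harness sketch using the standard unittest module.
      import unittest

      def normalize(s):                  # example function under test
          return " ".join(s.split()).lower()

      CASES = [
          ("  Hello   World ", "hello world"),
          ("MiXeD\tCase", "mixed case"),
      ]

      class DataDrivenTests(unittest.TestCase):
          pass

      def _make_test(raw, expected):
          return lambda self: self.assertEqual(normalize(raw), expected)

      for i, (raw, expected) in enumerate(CASES):
          setattr(DataDrivenTests, f"test_case_{i}", _make_test(raw, expected))

      if __name__ == "__main__":
          unittest.main()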

  15. A novel photo-grafting of acrylamide onto carboxymethyl starch. 1. Utilization of CMS-g-PAAm in easy care finishing of cotton fabrics.

    PubMed

    El-Sheikh, Manal A

    2016-11-05

    The photosensitized grafting of vinyl monomers onto a range of polymeric substrates has been the subject of particular interest in the recent past. A carboxymethyl starch (CMS)-polyacrylamide (PAAm) graft copolymer (CMS-g-PAAm) with high graft yield was successfully prepared by grafting acrylamide onto CMS using UV irradiation in the presence of the water-soluble 4-(trimethylammoniummethyl) benzophenone chloride photoinitiator. CMS-g-PAAm with a nitrogen content of 8.3% and a grafting efficiency of up to 98.9% was obtained using 100% AAm, a material-to-liquor ratio of 1:14 and 1% photoinitiator at 30°C for 1 h of UV irradiation. The synthesis of CMS-g-PAAm was confirmed by FTIR and nitrogen content (%). The surface morphology of CMS and the surface morphological changes of CMS after grafting with AAm were studied using SEM. Thermal properties of both CMS and CMS-g-PAAm were studied using TGA and DSC. To impart easy-care finishing to cotton fabrics, aqueous formulations of CMS-g-PAAm, dimethylol dihydroxy ethylene urea (DMDHEU), a CMS-g-PAAm-DMDHEU mixture or methylolated CMS-g-PAAm were used. Cotton fabrics were padded in these formulations, squeezed to a wet pick-up of 100%, dried at 100°C for 5 min, cured at 150°C for 5 min, washed at 50°C for 10 min and air-dried. The CRA (crease recovery angle) of untreated fabrics and of fabrics finished with a mixture of 2% CMS-g-PAAm and 10% DMDHEU or with methylolated CMS-g-PAAm (10% formaldehyde) was 136°, 190° and 288°, respectively. Increasing the number of washing cycles up to five results in an insignificant decrease in the CRA and a significant decrease in the RF (releasable formaldehyde) of finished fabric samples. The morphologies of the finished and unfinished cotton fabrics were examined by SEM. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Methodology for Software Reliability Prediction. Volume 1.

    DTIC Science & Technology

    1987-11-01

    Application domains covered include manned and unmanned spacecraft, batch systems, airborne avionics, and real-time closed-loop control systems. A Software Reliability Measurement Framework was established which spans the life cycle of a software system and includes the specification, prediction, estimation, and assessment of software reliability. Data from 59 systems, representing over 5 million lines of code, were...

  17. Perceptions, use and attitudes of pharmacy customers on complementary medicines and pharmacy practice

    PubMed Central

    2010-01-01

    Background Complementary medicines (CMs) are popular amongst Australians and community pharmacy is a major supplier of these products. This study explores pharmacy customer use, attitudes and perceptions of complementary medicines, and their expectations of pharmacists as they relate to these products. Methods Pharmacy customers randomly selected from sixty large and small, metropolitan and rural pharmacies in three Australian states completed an anonymous, self administered questionnaire that had been pre-tested and validated. Results 1,121 customers participated (response rate 62%). 72% had used CMs within the previous 12 months, 61% used prescription medicines daily and 43% had used both concomitantly. Multivitamins, fish oils, vitamin C, glucosamine and probiotics were the five most popular CMs. 72% of people using CMs rated their products as 'very effective' or 'effective enough'. CMs were as frequently used by customers aged 60 years or older as younger customers (69% vs. 72%) although the pattern of use shifted with older age. Most customers (92%) thought pharmacists should provide safety information about CMs, 90% thought they should routinely check for interactions, 87% thought they should recommend effective CMs, 78% thought CMs should be recorded in customer's medication profile and 58% thought pharmacies stocking CMs should also employ a complementary medicine practitioner. Of those using CMs, 93% thought it important for pharmacists to be knowledgeable about CMs and 48% felt their pharmacist provides useful information about CMs. Conclusions CMs are widely used by pharmacy customers of all ages who want pharmacists to be more involved in providing advice about these products. PMID:20646290

  18. Recombination Events Involving the atp9 Gene Are Associated with Male Sterility of CMS PET2 in Sunflower

    PubMed Central

    Reddemann, Antje; Horn, Renate

    2018-01-01

    Cytoplasmic male sterility (CMS) systems represent ideal mutants for studying the role of mitochondria in pollen development. In sunflower, CMS PET2 also has the potential to become an alternative CMS source for commercial sunflower hybrid breeding. Like CMS PET1, CMS PET2 originates from an interspecific cross of H. petiolaris and H. annuus, but it results in a different CMS mechanism. Southern analyses revealed differences for atp6, atp9 and cob between CMS PET2, CMS PET1 and the male-fertile line HA89. A second, identical copy of atp6 was present on an additional CMS PET2-specific fragment. In addition, the atp9 gene was duplicated. This duplication was followed by an insertion of 271 bp of unknown origin in the 5′ coding region of the atp9 gene in CMS PET2, which led to the creation of two unique open reading frames, orf288 and orf231. The first 53 bp of orf288 are identical to the 5′ end of atp9. Apart from its first 3 bp, which belong to the 271-bp insertion, orf231 consists of the last 228 bp of atp9. These CMS PET2-specific orfs are co-transcribed. All 11 editing sites of the atp9 gene present in orf231 are fully edited. The anther-specific reduction of the co-transcript in fertility-restored hybrids supports the involvement of these orfs in the male sterility of CMS PET2. PMID:29534485

  19. Living Design Memory: Framework, Implementation, Lessons Learned.

    ERIC Educational Resources Information Center

    Terveen, Loren G.; And Others

    1995-01-01

    Discusses large-scale software development and describes the development of the Designer Assistant to improve software development effectiveness. Highlights include the knowledge management problem; related work, including artificial intelligence and expert systems, software process modeling research, and other approaches to organizational memory;…

  20. Hybrid male sterility in Mimulus (Phrymaceae) is associated with a geographically restricted mitochondrial rearrangement.

    PubMed

    Case, Andrea L; Willis, John H

    2008-05-01

    Cytoplasmic male sterility (CMS) and nuclear fertility restoration (Rf) involve intergenomic coevolution. Although male-sterile phenotypes are rarely expressed in natural populations of angiosperms, CMS genes are thought to be common. The evolutionary dynamics of CMS/Rf systems are poorly understood, leaving gaps in our understanding of the mechanisms and consequences of cytonuclear interactions. We characterized the molecular basis and geographic distribution of a CMS gene in Mimulus guttatus. We crossed outcrossing M. guttatus (with CMS and Rf) to self-fertilizing M. nasutus (lacking CMS and Rf) to generate hybrids segregating for CMS. Mitochondrial transcripts containing an essential gene (nad6) were perfectly associated with male sterility. The CMS mitotype was completely absent in M. nasutus and present in all genotypes collected from the original collection site, but in only two individuals from 34 other M. guttatus populations. This pattern suggests that the CMS likely originated at a single locality and spread to fixation within that population, but has not spread to other populations, indicating possible ecological or genetic constraints on dispersal of this CMS mitotype between populations. Extreme localization may be characteristic of CMS in hermaphroditic species, in contrast to the geographically widespread mitotypes commonly found in gynodioecious species, and could directly contribute to hybrid incompatibilities in nature.

  1. Angle-corrected imaging transcranial doppler sonography versus imaging and nonimaging transcranial doppler sonography in children with sickle cell disease.

    PubMed

    Krejza, J; Rudzinski, W; Pawlak, M A; Tomaszewski, M; Ichord, R; Kwiatkowski, J; Gor, D; Melhem, E R

    2007-09-01

    Nonimaging transcranial Doppler sonography (TCD) and imaging TCD (TCDI) are used for determination of the risk of stroke in children with sickle cell disease (SCD). The purpose was to compare angle-corrected TCDI, uncorrected TCDI, and TCD blood flow velocities in children with SCD. A total of 37 children (mean age, 7.8 ± 3.0 years) without intracranial arterial narrowing, as determined with MR angiography, were studied with use of TCD and TCDI at the same session. Depth of insonation and TCDI mean velocities with and without correction for the angle of insonation in the terminal internal carotid artery (ICA) and middle (MCA), anterior (ACA), and posterior (PCA) cerebral arteries were compared with TCD velocities with use of a paired t test. Two arteries were not found on TCDI compared with 15 not found on TCD. The average angle of insonation in the MCA, ACA, ICA, and PCA was 31°, 44°, 25°, and 29°, respectively. TCDI and TCD mean depths of insonation for all arteries did not differ significantly; however, individual differences varied substantially. TCDI velocities were significantly lower than TCD velocities, respectively, for the right and left sides (mean ± SD): MCA, 106 ± 22 cm/s and 111 ± 33 cm/s versus 130 ± 19 cm/s and 134 ± 26 cm/s; ICA, 90 ± 14 cm/s and 98 ± 27 cm/s versus 117 ± 18 cm/s and 119 ± 23 cm/s; ACA, 74 ± 24 cm/s and 88 ± 25 cm/s versus 105 ± 23 cm/s and 105 ± 31 cm/s; and PCA, 84 ± 27 cm/s and 82 ± 21 cm/s versus 95 ± 23 cm/s and 94 ± 20 cm/s. TCD and angle-corrected TCDI velocities were not statistically different, except for higher angle-corrected TCDI values in the left ACA and right PCA. TCD velocities are significantly higher than TCDI velocities but are not different from the angle-corrected TCDI velocities. TCDI identifies the major intracranial arteries more effectively than TCD.
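
    The angle dependence described above follows from the standard Doppler equation: the device measures only the velocity component along the ultrasound beam, so the true velocity is recovered as v_corrected = v_measured / cos(θ). A minimal sketch applying this correction to the study's reported mean angles and right-side uncorrected TCDI velocities; pairing the means this way is purely illustrative, not a reanalysis of the data.

    ```python
    import math

    # Standard Doppler angle correction: the measured velocity underestimates the
    # true flow velocity by cos(theta), so v_corrected = v_measured / cos(theta).
    # Angles and uncorrected TCDI velocities are the study's reported means
    # (right side); combining them like this is illustrative only.

    mean_values = {   # artery: (mean TCDI velocity in cm/s, mean insonation angle in deg)
        "MCA": (106, 31),
        "ACA": (74, 44),
        "ICA": (90, 25),
        "PCA": (84, 29),
    }

    for artery, (v_measured, angle_deg) in mean_values.items():
        v_corrected = v_measured / math.cos(math.radians(angle_deg))
        print(f"{artery}: {v_measured} cm/s at {angle_deg} deg -> "
              f"{v_corrected:.0f} cm/s angle-corrected")
    ```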

  2. 42 CFR 488.417 - Denial of payment for all new admissions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... to CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State (for non-State operated NFs against which CMS is imposing no remedies); and (2) CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State...

  3. 42 CFR 488.417 - Denial of payment for all new admissions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... to CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State (for non-State operated NFs against which CMS is imposing no remedies); and (2) CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State...

  4. 42 CFR 488.417 - Denial of payment for all new admissions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... to CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State (for non-State operated NFs against which CMS is imposing no remedies); and (2) CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State...

  5. 42 CFR 488.417 - Denial of payment for all new admissions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... to CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State (for non-State operated NFs against which CMS is imposing no remedies); and (2) CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State...

  6. 42 CFR 488.417 - Denial of payment for all new admissions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... to CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State (for non-State operated NFs against which CMS is imposing no remedies); and (2) CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State...

  7. 45 CFR 150.317 - Factors CMS uses to determine the amount of penalty.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Factors CMS uses to determine the amount of... RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement With Respect to Issuers and Non-Federal Governmental Plans-Civil Money Penalties § 150.317 Factors CMS...

  8. 40 CFR Table 1 to Subpart Qqqqq of... - Applicability of General Provisions to Subpart QQQQQ

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... § 63.8(e) CMS Performance Evaluation No Subpart QQQQQ does not require CMS performance evaluations... QQQQQ does not require performance tests or CMS performance evaluations. § 63.9(e) Notification of... CMS No Subpart QQQQQ does not require CMS performance evaluations. § 63.10(a), (b), (d)(1), (d)(4)-(5...

  9. The Need for V&V in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process, and the system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  10. Hybrid Optimization Parallel Search PACKage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2009-11-10

    HOPSPACK is open source software for solving optimization problems without derivatives. Application problems may have a fully nonlinear objective function, bound constraints, and linear and nonlinear constraints. Problem variables may be continuous, integer-valued, or a mixture of both. The software provides a framework that supports any derivative-free type of solver algorithm. Through the framework, solvers request parallel function evaluation, which may use MPI (multiple machines) or multithreading (multiple processors/cores on one machine). The framework provides a Cache and Pending Cache of saved evaluations that reduces execution time and facilitates restarts. Solvers can dynamically create other algorithms to solve subproblems, a useful technique for handling multiple start points and integer-valued variables. HOPSPACK ships with the Generating Set Search (GSS) algorithm, developed at Sandia as part of the APPSPACK open source software project.
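
    A minimal sketch of the two ideas central to that description, a generating-set-search poll loop and a cache of saved evaluations, written in Python for illustration. The function and variable names are ours, not the HOPSPACK C++ API, and the serial cache here stands in for HOPSPACK's parallel evaluation machinery.

    ```python
    # Minimal generating-set-search (GSS, compass search) with an evaluation
    # cache, the pattern the HOPSPACK abstract describes. Illustrative only;
    # this is not the HOPSPACK API.
    import itertools

    def gss_minimize(f, x0, step=1.0, tol=1e-6, max_iter=1000):
        cache = {}                       # maps point tuples to saved evaluations
        def eval_cached(x):
            key = tuple(round(v, 12) for v in x)
            if key not in cache:
                cache[key] = f(x)        # in HOPSPACK these evaluations run in parallel
            return cache[key]

        x, n = list(x0), len(x0)
        fx = eval_cached(x)
        for _ in range(max_iter):
            improved = False
            # Poll along the positive and negative coordinate directions.
            for i, sign in itertools.product(range(n), (+1, -1)):
                trial = list(x)
                trial[i] += sign * step
                if eval_cached(trial) < fx:
                    x, fx, improved = trial, eval_cached(trial), True
                    break
            if not improved:
                step *= 0.5              # contract when no poll point improves
                if step < tol:
                    break
        return x, fx

    best_x, best_f = gss_minimize(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0])
    print(best_x, best_f)
    ```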

  11. HelioScan: a software framework for controlling in vivo microscopy setups with high hardware flexibility, functional diversity and extendibility.

    PubMed

    Langer, Dominik; van 't Hoff, Marcel; Keller, Andreas J; Nagaraja, Chetan; Pfäffli, Oliver A; Göldi, Maurice; Kasper, Hansjörg; Helmchen, Fritjof

    2013-04-30

    Intravital microscopy such as in vivo imaging of brain dynamics is often performed with custom-built microscope setups controlled by custom-written software to meet specific requirements. Continuous technological advancement in the field has created a need for new control software that is flexible enough to support the biological researcher with innovative imaging techniques and provide the developer with a solid platform for quickly and easily implementing new extensions. Here, we introduce HelioScan, a software package written in LabVIEW, as a platform serving this dual role. HelioScan is designed as a collection of components that can be flexibly assembled into microscope control software tailored to the particular hardware and functionality requirements. Moreover, HelioScan provides a software framework, within which new functionality can be implemented in a quick and structured manner. A specific HelioScan application assembles at run-time from individual software components, based on user-definable configuration files. Due to its component-based architecture, HelioScan can exploit synergies of multiple developers working in parallel on different components in a community effort. We exemplify the capabilities and versatility of HelioScan by demonstrating several in vivo brain imaging modes, including camera-based intrinsic optical signal imaging for functional mapping of cortical areas, standard two-photon laser-scanning microscopy using galvanometric mirrors, and high-speed in vivo two-photon calcium imaging using either acousto-optic deflectors or a resonant scanner. We recommend HelioScan as a convenient software framework for the in vivo imaging community. Copyright © 2013 Elsevier B.V. All rights reserved.
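
    The run-time assembly HelioScan performs, building an application from components selected by user-definable configuration files, follows a common registry pattern. Below is a minimal Python sketch of that pattern with invented component names; HelioScan itself is written in LabVIEW, so this illustrates the architecture, not its code.

    ```python
    # Sketch of configuration-driven component assembly: components register
    # themselves under a name, and the application is built at run time from a
    # configuration mapping roles to component types. All names are invented.

    COMPONENT_REGISTRY = {}

    def register(name):
        def wrap(cls):
            COMPONENT_REGISTRY[name] = cls
            return cls
        return wrap

    @register("galvo_scanner")
    class GalvoScanner:
        def __init__(self, line_rate_hz=1000):
            self.line_rate_hz = line_rate_hz

    @register("resonant_scanner")
    class ResonantScanner:
        def __init__(self, line_rate_hz=8000):
            self.line_rate_hz = line_rate_hz

    def assemble(config):
        """Build the control application from a configuration dictionary."""
        return {role: COMPONENT_REGISTRY[spec["type"]](**spec.get("params", {}))
                for role, spec in config.items()}

    app = assemble({"scanner": {"type": "resonant_scanner",
                                "params": {"line_rate_hz": 7910}}})
    print(type(app["scanner"]).__name__, app["scanner"].line_rate_hz)
    ```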

  12. 42 CFR 422.510 - Termination of contract by CMS.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CONTINUED) MEDICARE PROGRAM MEDICARE ADVANTAGE PROGRAM Application Procedures and Contracts for Medicare Advantage Organizations § 422.510 Termination of contract by CMS. (a) Termination by CMS. CMS may at any...

  13. The diverse use of clouds by CMS

    DOE PAGES

    Andronis, Anastasios; Bauer, Daniela; Chaze, Olivier; ...

    2015-12-23

    The resources CMS is using are increasingly being offered as clouds. In Run 2 of the LHC the majority of CMS CERN resources, both in Meyrin and at the Wigner Computing Centre, will be presented as cloud resources on which CMS will have to build its own infrastructure. This infrastructure will need to run all of the CMS workflows, including Tier 0, production and user analysis. In addition, the CMS High Level Trigger will provide a compute resource comparable in scale to the total offered by the CMS Tier 1 sites when it is not running as part of the trigger system. During these periods a cloud infrastructure will be overlaid on this resource, making it accessible for general CMS use. CMS is also starting to utilise cloud resources offered by individual institutes and is gaining experience to facilitate the use of opportunistically available cloud resources. Finally, we present a snapshot of this infrastructure and its operation at the time of the CHEP2015 conference.

  14. The Status of the Cms Experiment

    NASA Astrophysics Data System (ADS)

    Green, Dan

    The CMS experiment was completely assembled in the fall of 2008 after a decade of design, construction and installation. During the last two years, cosmic ray data were taken on a regular basis. These data have enabled CMS to align the detector components, both spatially and temporally. Initial use of muons has also established the relative alignment of the CMS tracking and muon systems. In addition, the CMS calorimetry has been cross-checked with test beam data, thus providing an initial energy calibration of CMS calorimetry to about 5%. The CMS magnet has been powered and field-mapped. The trigger and data acquisition systems have been installed and run at full speed. The tiered data analysis system has been exercised at full design bandwidth for Tier0, Tier1 and Tier2 sites. Monte Carlo simulation of the CMS detector has been constructed at a detailed geometric level and has been tuned to test beam and other production data to provide a realistic model of the CMS detector prior to first collisions.

  15. Rapid Development of Custom Software Architecture Design Environments

    DTIC Science & Technology

    1999-08-01

    the tools themselves. This dissertation describes a new approach to capturing and using architectural design expertise in software architecture design environments...A language and tools are presented for capturing and encapsulating software architecture design expertise within a conceptual framework...of architectural styles and design rules. The design expertise thus captured is supported with an incrementally configurable software architecture

  16. Cardiotoxicity evaluation using human embryonic stem cells and induced pluripotent stem cell-derived cardiomyocytes.

    PubMed

    Zhao, Qi; Wang, Xijie; Wang, Shuyan; Song, Zheng; Wang, Jiaxian; Ma, Jing

    2017-03-09

    Cardiotoxicity remains an important concern in drug discovery. Human pluripotent stem cell-derived cardiomyocytes (hPSC-CMs) have become an attractive platform to evaluate cardiotoxicity. However, the consistency between human embryonic stem cell-derived cardiomyocytes (hESC-CMs) and human induced pluripotent stem cell-derived cardiomyocytes (hiPSC-CMs) in the prediction of cardiotoxicity has yet to be elucidated. Here we screened the toxicities of four representative drugs (E-4031, isoprenaline, quinidine, and haloperidol) using both hESC-CMs and hiPSC-CMs, combined with an impedance-based bioanalytical method. The screen showed that both hESC-CMs and hiPSC-CMs can recapitulate cardiotoxicity and identify the effects of well-characterized compounds. The combined platform of hPSC-CMs and an impedance-based bioanalytical method could improve preclinical cardiotoxicity screening, holding great potential for increasing drug development accuracy.

  17. Resident intruder paradigm-induced aggression relieves depressive-like behaviors in male rats subjected to chronic mild stress

    PubMed Central

    Wei, Sheng; Ji, Xiao-wei; Wu, Chun-ling; Li, Zi-fa; Sun, Peng; Wang, Jie-qiong; Zhao, Qi-tao; Gao, Jie; Guo, Ying-hui; Sun, Shi-guang; Qiao, Ming-qi

    2014-01-01

    Background Accumulating epidemiological evidence shows that life event stressors are major vulnerability factors for psychiatric diseases such as major depression. It is also well known that the resident intruder paradigm (RIP) results in aggressive behavior in male rats. However, it is not known how resident intruder paradigm-induced aggression affects depressive-like behavior in isolated male rats subjected to chronic mild stress (CMS), an animal model of depression. Material/Methods Male Wistar rats were divided into 3 groups: non-stressed controls, isolated rats subjected to the CMS protocol, and resident intruder paradigm-exposed rats subjected to the CMS protocol. Results In the sucrose intake test, ingestion of a 1% sucrose solution by rats in the CMS group was significantly lower than in control and CMS+RIP rats after 3 weeks of stress. In the open-field test, CMS rats had significantly lower open-field scores compared to control rats. Furthermore, the total scores of the CMS group were significantly lower than those of the CMS+RIP rats. In the forced swimming test (FST), the immobility times of CMS rats were significantly longer than those of the control or CMS+RIP rats. However, no differences were observed between controls and CMS+RIP rats. Conclusions Our data show that aggressive behavior evoked by the resident intruder paradigm can relieve broad-spectrum depressive-like behaviors in isolated adult male rats subjected to CMS. PMID:24911067

  18. Inclusive jet cross section and strong coupling constant measurements at CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerci, Salim, E-mail: Salim.Cerci@cern.ch

    2016-03-25

    Jets are probes that are abundantly produced in high-energy proton-proton (pp) collisions at the LHC. Events with jets can be described by Quantum Chromodynamics (QCD) in terms of parton-parton scattering. The inclusive jet cross section in pp collisions is a fundamental quantity that can be measured and predicted within the framework of perturbative QCD (pQCD). The strong coupling constant α_S, which can be determined empirically, is the single free parameter of QCD in the limit of massless quarks. Jet measurements can also be used to determine the strong coupling constant α_S and parton density functions (PDFs). Recent jet measurements performed with data collected by the CMS detector at different center-of-mass energies, down to very low transverse momentum p_T, are presented. The measurements are compared to Monte Carlo predictions and perturbative calculations up to next-to-next-to-leading order. These precision jet measurements give further insight into QCD dynamics.
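
    For reference, the pQCD prediction mentioned above rests on collinear factorization: the measurable cross section is a convolution of parton density functions with perturbatively calculable partonic cross sections. The expression below is the standard textbook form, not a formula quoted from the proceedings.

    ```latex
    % Collinear factorization for the inclusive jet cross section (textbook form):
    \frac{\mathrm{d}^2\sigma}{\mathrm{d}p_T\,\mathrm{d}y}
      = \sum_{a,b} \int \mathrm{d}x_1\,\mathrm{d}x_2\;
        f_a(x_1,\mu_F)\, f_b(x_2,\mu_F)\,
        \frac{\mathrm{d}^2\hat{\sigma}_{ab\to \mathrm{jet}+X}}
             {\mathrm{d}p_T\,\mathrm{d}y}\left(x_1,x_2,\alpha_S(\mu_R),\mu_R,\mu_F\right)
    ```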

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pachuilo, Andrew R; Ragan, Eric; Goodall, John R

    Visualization tools can take advantage of multiple coordinated views to support analysis of large, multidimensional data sets. Effective design of such views and layouts can be challenging, but understanding users' analysis strategies can inform design improvements. We outline an approach for intelligent design configuration of visualization tools with multiple coordinated views, and we discuss a proposed software framework to support the approach. The proposed software framework could capture and learn from user interaction data to automate new compositions of views and widgets. Such a framework could reduce the time needed for meta-analysis of visualization use and lead to more effective visualization design.

  20. Software For Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Bayer, Steve E.

    1992-01-01

    The SPLICER computer program is a genetic-algorithm software tool used to solve search and optimization problems. It provides the underlying framework and structure for building genetic-algorithm application programs. Written in Think C.
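
    The selection, crossover and mutation cycle that such a framework organizes can be sketched in a few lines. A toy Python illustration follows (SPLICER itself is written in Think C, and none of the names below come from it):

    ```python
    # Minimal genetic-algorithm loop: tournament selection, one-point crossover,
    # bit-flip mutation, maximizing a toy fitness (number of 1-bits).
    import random

    GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 20, 30, 50, 0.02
    fitness = lambda g: sum(g)                      # toy objective

    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        def tournament():
            return max(random.sample(pop, 3), key=fitness)
        nxt = []
        while len(nxt) < POP_SIZE:
            p1, p2 = tournament(), tournament()
            cut = random.randrange(1, GENOME_LEN)   # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - b if random.random() < MUT_RATE else b for b in child]
            nxt.append(child)
        pop = nxt

    print(max(map(fitness, pop)), "of", GENOME_LEN)
    ```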

  1. Web Proxy Auto Discovery for the WLCG

    NASA Astrophysics Data System (ADS)

    Dykstra, D.; Blomer, J.; Blumenfeld, B.; De Salvo, A.; Dewhurst, A.; Verguilov, V.

    2017-10-01

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home) which they direct to the nearest publicly accessible web proxy servers. The responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.
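
    The client side of the discovery scheme described above reduces to fetching a PAC file and choosing a proxy from it. A rough Python sketch, assuming the generic WPAD convention of a wpad.dat file served over http; real Frontier and CVMFS clients evaluate the PAC JavaScript properly rather than regex-scanning it, and the URL below is the WPAD default, not a confirmed WLCG endpoint.

    ```python
    # Sketch of the client side of WPAD: fetch a PAC file over http and pull out
    # the "PROXY host:port" directives. The URL is the generic WPAD convention
    # (not a confirmed WLCG endpoint) and the regex parse is deliberately naive.
    import re
    import urllib.request

    def discover_proxies(wpad_url="http://wpad/wpad.dat"):
        with urllib.request.urlopen(wpad_url, timeout=10) as resp:
            pac_source = resp.read().decode("utf-8", errors="replace")
        # A PAC file returns strings like "PROXY squid1.example.org:3128; DIRECT".
        return re.findall(r"PROXY\s+([\w.\-]+:\d+)", pac_source)

    if __name__ == "__main__":
        for proxy in discover_proxies():
            print("candidate web proxy:", proxy)
    ```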

  2. Web Proxy Auto Discovery for the WLCG

    DOE PAGES

    Dykstra, D.; Blomer, J.; Blumenfeld, B.; ...

    2017-11-23

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home), which they direct to the nearest publicly accessible web proxy servers. Furthermore, the responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.

  3. Web Proxy Auto Discovery for the WLCG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykstra, D.; Blomer, J.; Blumenfeld, B.

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home), which they direct to the nearest publicly accessible web proxy servers. Furthermore, the responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.

  4. 42 CFR 426.415 - CMS' role in the LCD review.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false CMS' role in the LCD review. 426.415 Section 426... Review of an LCD § 426.415 CMS' role in the LCD review. CMS may provide to the ALJ, and all parties to the LCD review, information identifying the person who represents the contractor or CMS, if necessary...

  5. CMS-Wave

    DTIC Science & Technology

    2015-10-30

    Coastal Inlets Research Program, CMS-Wave. CMS-Wave is a two-dimensional spectral wind-wave generation and transformation model that employs a forward...marching, finite-difference method to solve the wave action conservation equation. Capabilities of CMS-Wave include wave shoaling, refraction... CMS-Wave can be used in either a half- or full-plane mode, with primary waves propagating from the seaward boundary toward shore. It can
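
    For context, spectral models of this type balance wave action N = E/σ against source terms. A common steady-state form of the governing equation is shown below (a standard textbook statement, not quoted from the CMS-Wave documentation):

    ```latex
    % Steady-state wave-action balance solved by spectral wave models of this
    % type (N = E/\sigma is wave action; S collects the source/sink terms):
    \frac{\partial (c_x N)}{\partial x}
     + \frac{\partial (c_y N)}{\partial y}
     + \frac{\partial (c_\theta N)}{\partial \theta}
     = \frac{S}{\sigma}
    ```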

  6. A net-shaped multicellular formation facilitates the maturation of hPSC-derived cardiomyocytes through mechanical and electrophysiological stimuli

    PubMed Central

    Liu, Taoyan; Huang, Chengwu; Li, Hongxia; Wu, Fujian; Luo, Jianwen; Lu, Wenjing

    2018-01-01

    The use of human-induced pluripotent stem cell-derived cardiomyocytes (hiPSC-CMs) is limited in drug discovery and cardiac disease mechanism studies due to cell immaturity. Although many approaches have been reported to improve the maturation of hiPSC-CMs, the elucidation of the process of maturation is crucial. We applied a small-molecule-based differentiation method to generate cardiomyocytes (CMs) with multiple aggregation forms. The motion analysis revealed significant physical differences in the differently shaped CMs, and the net-shaped CMs had larger motion amplitudes and faster velocities than the sheet-shaped CMs. The net-shaped CMs displayed accelerated maturation at the transcriptional level and were more similar to CMs with a prolonged culture time (30 days) than to sheet-d15. Ion channel genes and gap junction proteins were up-regulated in net-shaped CMs, indicating that robust contraction was coupled with enhanced ion channel and connexin expression. The net-shaped CMs also displayed improved myofibril ultrastructure under transmission electron microscopy. In conclusion, different multicellular hPSC-CM structures, such as the net-shaped pattern, are formed using the conditioned induction method, providing a useful tool to improve cardiac maturation. PMID:29661985

  7. Integrating software architectures for distributed simulations and simulation analysis communities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael

    2005-10-01

    The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  8. Human renal adipose tissue induces the invasion and progression of renal cell carcinoma.

    PubMed

    Campo-Verde-Arbocco, Fiorella; López-Laur, José D; Romeo, Leonardo R; Giorlando, Noelia; Bruna, Flavia A; Contador, David E; López-Fontana, Gastón; Santiano, Flavia E; Sasso, Corina V; Zyla, Leila E; López-Fontana, Constanza M; Calvo, Juan C; Carón, Rubén W; Creydt, Virginia Pistone

    2017-11-07

    We evaluated the effects of conditioned media (CMs) of human adipose tissue from renal cell carcinoma, located either near the tumor (hRATnT) or farther away from the tumor (hRATfT), on the proliferation, adhesion and migration of tumor (786-O and ACHN) and non-tumor (HK-2) human renal epithelial cell lines. Human adipose tissues were obtained from patients with renal cell carcinoma (RCC), and CMs were obtained from hRATnT and hRATfT incubation. Proliferation, adhesion and migration were quantified in 786-O, ACHN and HK-2 cell lines incubated with hRATnT-, hRATfT- or control-CMs. We evaluated versican, adiponectin and leptin expression in CMs from hRATnT and hRATfT. We evaluated AdipoR1/2, ObR, pERK, pAkt and pPI3K expression in cell lines incubated with CMs. No differences in the proliferation of the cell lines were found after 24 h of treatment with CMs. All cell lines showed a significant decrease in cell adhesion and an increase in cell migration after incubation with hRATnT-CMs vs. hRATfT- or control-CMs. hRATnT-CMs showed increased levels of versican and leptin compared to hRATfT-CMs. AdipoR2 in 786-O and ACHN cells decreased significantly after incubation with hRATfT- and hRATnT-CMs vs. control-CMs. We observed a decrease in the expression of pAkt in HK-2, 786-O and ACHN cells incubated with hRATnT-CMs. This result could partially explain the observed changes in migration and cell adhesion. We conclude that hRATnT-released factors, such as leptin and versican, could enhance the invasive potential of renal epithelial cell lines and could modulate the progression of the disease.

  9. Down-regulation of Inwardly Rectifying K+ Currents in Astrocytes Derived from Patients with Monge's Disease.

    PubMed

    Wu, Wei; Yao, Hang; Zhao, Helen W; Wang, Juan; Haddad, Gabriel G

    2018-03-15

    Chronic mountain sickness (CMS), or Monge's disease, is a disease of highlanders. These patients have a variety of neurologic symptoms such as migraine, mental fatigue, confusion, dizziness, loss of appetite, memory loss and neuronal degeneration. The cellular and molecular mechanisms underlying CMS neuropathology are not understood. In a previous study, we demonstrated that neurons derived from CMS patients' fibroblasts have a decreased expression and altered gating properties of the voltage-gated sodium channel. In this study, we further characterize the electrophysiological properties of iPSC-derived astrocytes from CMS patients. We found that the current densities of the inwardly rectifying potassium (Kir) channels in CMS astrocytes (-5.7 ± 2.2 pA/pF at -140 mV) were significantly decreased as compared to non-CMS (-28.4 ± 3.4 pA/pF at -140 mV) and sea-level subjects (-28.3 ± 5.3 pA/pF at -140 mV). We further demonstrated that the reduced Kir current densities in CMS astrocytes were caused by their decreased protein expression of Kir4.1 and Kir2.3 channels, while the single-channel properties (i.e., open probability and conductance) of Kir channels in CMS astrocytes were not altered. In addition, we found no significant differences in outward potassium currents between CMS and non-CMS astrocytes. As compared to non-CMS and sea-level subjects, the K+ uptake ability of CMS astrocytes was significantly decreased. Taken together, our results suggest that down-regulation of Kir channels and the resulting decreased K+ uptake ability in astrocytes could be one of the major molecular mechanisms underlying the neurologic manifestations in CMS patients. Published by Elsevier Ltd.

  10. From Early Embryonic to Adult Stage: Comparative Study of Action Potentials of Native and Pluripotent Stem Cell-Derived Cardiomyocytes.

    PubMed

    Peinkofer, Gabriel; Burkert, Karsten; Urban, Katja; Krausgrill, Benjamin; Hescheler, Jürgen; Saric, Tomo; Halbach, Marcel

    2016-10-01

    Cardiomyocytes (CMs) derived from induced pluripotent stem cells (iPS-CMs) are promising candidates for cell therapy, drug screening, and developmental studies. It is known that iPS-CMs possess immature electrophysiological properties, but an exact characterization of their developmental stage and subtype differentiation is hampered by a lack of knowledge of electrophysiological properties of native CMs from different developmental stages and origins within the heart. Thus, we sought to systematically investigate action potential (AP) properties of native murine CMs and to establish a database that allows classification of stem cell-derived CMs. Hearts from 129S2PasCrl mice were harvested at days 9-10, 12-14, and 16-18 postcoitum, as well as 1 day, 3-4 days, 1-2 weeks, 3-4 weeks, and 6 weeks postpartum. AP recordings in left and right atria and at apical, medial, and basal left and right ventricles were performed with sharp glass microelectrodes. Measurements revealed significant changes in AP morphology during pre- and postnatal murine development and significant differences between atria and ventricles, enabling a classification of developmental stage and subtype differentiation of stem cell-derived CMs based on their AP properties. For iPS-CMs derived from cell line TiB7.4, a typical ventricular phenotype was demonstrated at later developmental stages, while there were electrophysiological differences from atrial as well as ventricular native CMs at earlier stages. This finding supports that iPS-CMs can develop AP properties similar to native CMs, but points to differences in the maturation process between iPS-CMs and native CMs, which may be explained by dissimilar conditions during in vitro differentiation and in vivo development.

  11. Pharmacokinetics of colistin methanesulfonate (CMS) in healthy Chinese subjects after single and multiple intravenous doses.

    PubMed

    Zhao, Miao; Wu, Xiao-Jie; Fan, Ya-Xin; Zhang, Ying-Yuan; Guo, Bei-Ning; Yu, Ji-Cheng; Cao, Guo-Ying; Chen, Yuan-Cheng; Wu, Ju-Fang; Shi, Yao-Guo; Li, Jian; Zhang, Jing

    2018-05-01

    The high prevalence of extensively drug-resistant Gram-negative pathogens has forced clinicians to use colistin as a last-line therapy. Knowledge of the pharmacokinetics of colistin methanesulfonate (CMS), an inactive prodrug, and colistin has increased substantially; however, the pharmacokinetics in the Chinese population is still unknown due to the lack of a CMS product in China. This study aimed to evaluate the pharmacokinetics of a new CMS product developed in China in order to optimise dosing regimens. A total of 24 healthy subjects (12 female, 12 male) were enrolled in single- and multiple-dose pharmacokinetic (PK) studies. Concentrations of CMS and formed colistin in plasma and urine were measured, and PK analysis was conducted using a non-compartmental approach. Following a single CMS dose [2.36 mg colistin base activity (CBA) per kg, 1 h infusion], peak concentrations (Cmax) of CMS and formed colistin were 18.0 mg/L and 0.661 mg/L, respectively. The estimated half-lives (t1/2) of CMS and colistin were 1.38 h and 4.49 h, respectively. Approximately 62.5% of the CMS dose was excreted via urine within 24 h after dosing, whilst only 1.28% was present in the form of colistin. Following multiple CMS doses, colistin reached steady state within 24 h; there was no accumulation of CMS, but colistin accumulated slightly (RAUC = 1.33). This study provides the first PK data in the Chinese population and is essential for designing CMS dosing regimens for use in Chinese hospitals. The urinary PK data strongly support the use of intravenous CMS for serious urinary tract infections. Copyright © 2018 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.
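
    The non-compartmental quantities reported above (Cmax, t1/2, accumulation ratio) come from standard calculations: the terminal rate constant λz is the negative slope of a log-linear fit to the final concentration-time points, t1/2 = ln 2/λz, and AUC is computed by the trapezoidal rule. A minimal sketch with invented concentration-time data, not the study's data:

    ```python
    # Standard non-compartmental estimates: terminal slope (lambda_z) from a
    # log-linear fit, t1/2 = ln(2)/lambda_z, and AUC by the trapezoidal rule.
    # The concentration-time points below are invented for illustration.
    import numpy as np

    t = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])        # h
    c = np.array([0.40, 0.66, 0.55, 0.40, 0.29, 0.21])  # mg/L, formed colistin

    auc = np.trapz(c, t)                                # mg*h/L, AUC over 0.5-8 h

    terminal = slice(2, None)             # assume the last 4 points are log-linear
    lambda_z = -np.polyfit(t[terminal], np.log(c[terminal]), 1)[0]
    t_half = np.log(2) / lambda_z

    print(f"AUC = {auc:.2f} mg*h/L, lambda_z = {lambda_z:.3f} 1/h, t1/2 = {t_half:.2f} h")
    ```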

  12. Differences in pharmacokinetics and pharmacodynamics of colistimethate sodium (CMS) and colistin between three different CMS dosage regimens in a critically ill patient infected by a multidrug-resistant Acinetobacter baumannii.

    PubMed

    Luque, Sònia; Grau, Santiago; Valle, Marta; Sorlí, Luisa; Horcajada, Juan Pablo; Segura, Concha; Alvarez-Lerma, Francisco

    2013-08-01

    Use of colistin has re-emerged for the treatment of infections caused by multidrug-resistant (MDR) Gram-negative bacteria, but information on its pharmacokinetics and pharmacodynamics is limited, especially in critically ill patients. Recent data from pharmacokinetic/pharmacodynamic (PK/PD) population studies have suggested that this population could benefit from administration of higher than standard doses of colistimethate sodium (CMS), but the relationship between administration of incremental doses of CMS and corresponding PK/PD parameters as well as its efficacy and toxicity have not yet been investigated in a clinical setting. The objective was to study the PK/PD differences of CMS and colistin between three different CMS dosage regimens in the same critically ill patient. A critically ill patient with nosocomial pneumonia caused by a MDR Acinetobacter baumannii received incremental doses of CMS. During administration of the different CMS dosage regimens, CMS and colistin plasma concentrations were determined and PK/PD indexes were calculated. With administration of the highest CMS dose once daily (720 mg every 24h), the peak plasma concentration of CMS and colistin increased to 40.51 mg/L and 1.81 mg/L, respectively, and the AUC0-24/MIC of colistin was 184.41. This dosage regimen was efficacious, and no nephrotoxicity or neurotoxicity was observed. In conclusion, a higher and extended-interval CMS dosage made it possible to increase the exposure of CMS and colistin in a critically ill patient infected by a MDR A. baumannii and allowed a clinical and microbiological optimal response to be achieved without evidence of toxicity. Copyright © 2013 Elsevier B.V. and the International Society of Chemotherapy. All rights reserved.

  13. DEVELOP MULTI-STRESSOR, OPEN ARCHITECTURE MODELING FRAMEWORK FOR ECOLOGICAL EXPOSURE FROM SITE TO WATERSHED SCALE

    EPA Science Inventory

    A number of multimedia modeling frameworks are currently being developed. The Multimedia Integrated Modeling System (MIMS) is one of these frameworks. A framework should be seen as more of a multimedia modeling infrastructure than a single software system. This infrastructure do...

  14. UAF: a generic OPC unified architecture framework

    NASA Astrophysics Data System (ADS)

    Pessemier, Wim; Deconinck, Geert; Raskin, Gert; Saey, Philippe; Van Winckel, Hans

    2012-09-01

    As an emerging Service Oriented Architecture (SOA) specifically designed for industrial automation and process control, the OPC Unified Architecture specification should be regarded as an attractive candidate for controlling scientific instrumentation. Even though an industry-backed standard such as OPC UA can offer substantial added value to these projects, its inherent complexity poses an important obstacle for adopting the technology. Building OPC UA applications requires considerable effort, even when taking advantage of a COTS Software Development Kit (SDK). The OPC Unified Architecture Framework (UAF) attempts to reduce this burden by introducing an abstraction layer between the SDK and the application code in order to achieve a better separation of the technical and the functional concerns. True to its industrial origin, the primary requirement of the framework is to maintain interoperability by staying close to the standard specifications, and by expecting the minimum compliance from other OPC UA servers and clients. UAF can therefore be regarded as a software framework to quickly and comfortably develop and deploy OPC UA-based applications, while remaining compatible with third party OPC UA-compliant toolkits, servers (such as PLCs) and clients (such as SCADA software). In the first phase, as covered by this paper, only the client-side of UAF has been tackled in order to transparently handle discovery, session management, subscriptions, monitored items, etc. We describe the design principles and internal architecture of our open-source software project, the first results of the framework running at the Mercator Telescope, and we give a preview of the planned server-side implementation.
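
    To give a feel for what such a client layer must wrap, the snippet below shows connect, read and subscribe with the independent python-opcua package; it is not the UAF API, and the endpoint URL and node identifier are placeholders.

    ```python
    # OPC UA client basics (connect, resolve a node, read, subscribe) using the
    # independent python-opcua package, not UAF. Endpoint and node id are
    # placeholders chosen for illustration.
    from opcua import Client

    class PrintHandler:
        def datachange_notification(self, node, val, data):
            print("data change:", node, "->", val)

    client = Client("opc.tcp://localhost:4840")   # placeholder endpoint
    client.connect()
    try:
        node = client.get_node("ns=2;s=Telescope.Dome.Position")  # placeholder id
        print("current value:", node.get_value())
        sub = client.create_subscription(500, PrintHandler())     # 500 ms interval
        sub.subscribe_data_change(node)
    finally:
        client.disconnect()
    ```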

  15. Automating Software Design Metrics.

    DTIC Science & Technology

    1984-02-01

    High quality software is of interest to both the software engineering community and its users. ...contributions of many other software engineering efforts, most notably [MCC 77] and [Boe 83b], which have defined and refined a framework for quantifying... Software metrics can be useful within the context of an integrated software engineering environment. The purpose of this

  16. Structural and Functional Maturation of Cardiomyocytes Derived from Human Pluripotent Stem Cells

    PubMed Central

    Lundy, Scott D.; Zhu, Wei-Zhong

    2013-01-01

    Despite preclinical studies demonstrating the functional benefit of transplanting human pluripotent stem cell-derived cardiomyocytes (PSC-CMs) into damaged myocardium, the ability of these immature cells to adopt a more adult-like cardiomyocyte (CM) phenotype remains uncertain. To address this issue, we tested the hypothesis that prolonged in vitro culture of human embryonic stem cell (hESC)- and human induced pluripotent stem cell (hiPSC)-derived CMs would result in the maturation of their structural and contractile properties to a more adult-like phenotype. Compared to their early-stage counterparts (PSC-CMs after 20–40 days of in vitro differentiation and culture), late-stage hESC-CMs and hiPSC-CMs (80–120 days) showed dramatic differences in morphology, including increased cell size and anisotropy, greater myofibril density and alignment, sarcomeres visible by bright-field microscopy, and a 10-fold increase in the fraction of multinucleated CMs. Ultrastructural analysis confirmed improvements in the myofibrillar density, alignment, and morphology. We measured the contractile performance of late-stage hESC-CMs and hiPSC-CMs and noted a doubling in shortening magnitude with slowed contraction kinetics compared to the early-stage cells. We then examined changes in the calcium-handling properties of these matured CMs and found an increase in calcium release and reuptake rates with no change in the maximum amplitude. Finally, we performed electrophysiological assessments in hESC-CMs and found that late-stage myocytes have hyperpolarized maximum diastolic potentials, increased action potential amplitudes, and faster upstroke velocities. To correlate these functional changes with gene expression, we performed qPCR and found a robust induction of the key cardiac structural markers, including β-myosin heavy chain and connexin-43, in late-stage hESC-CMs and hiPSC-CMs. These findings suggest that PSC-CMs are capable of slowly maturing to more closely resemble the phenotype of adult CMs and may eventually possess the potential to regenerate the lost myocardium with robust de novo force-producing tissue. PMID:23461462

  17. CMS Connect

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B.; Gardner, R., Jr.; Hurtado Anampa, K.; Jayatilaka, B.; Aftab Khan, F.; Lannon, K.; Larson, K.; Letts, J.; Marra Da Silva, J.; Mascheroni, M.; Mason, D.; Perez-Calero Yzquierdo, A.; Tiradani, A.

    2017-10-01

    The CMS experiment collects and analyzes large amounts of data coming from high energy particle collisions produced by the Large Hadron Collider (LHC) at CERN. This involves a huge amount of real and simulated data processing that needs to be handled in batch-oriented platforms. The CMS Global Pool of computing resources provides over 100K dedicated CPU cores and another 50K to 100K CPU cores from opportunistic resources for these kinds of tasks, and even though production and event-processing analysis workflows are already managed by existing tools, there is still a lack of support for submitting the final-stage, condor-like analysis jobs familiar to Tier-3 or local computing facility users into these distributed resources in an integrated (with other CMS services) and friendly way. CMS Connect is a set of computing tools and services designed to augment existing services in the CMS physics community, focusing on this kind of condor analysis job. It is based on the CI-Connect platform developed by the Open Science Grid and uses the CMS GlideInWMS infrastructure to transparently plug CMS global grid resources into a virtual pool accessed via a single submission machine. This paper describes the specific developments and deployment of CMS Connect beyond the CI-Connect platform in order to integrate the service with CMS-specific needs, including site-specific submission, accounting of jobs and automated reporting to standard CMS monitoring resources, in a way that is effortless for users.

  18. Ogura-CMS in Chinese cabbage (Brassica rapa ssp. pekinensis) causes delayed expression of many nuclear genes.

    PubMed

    Dong, Xiangshu; Kim, Wan Kyu; Lim, Yong-Pyo; Kim, Yeon-Ki; Hur, Yoonkang

    2013-02-01

    We investigated the mechanism regulating cytoplasmic male sterility (CMS) in Brassica rapa ssp. pekinensis using floral bud transcriptome analyses of Ogura-CMS Chinese cabbage and its maintainer line on B. rapa 300K oligomeric probe (Br300K) microarrays. Ogura-CMS Chinese cabbage produced few, infertile pollen grains on indehiscent anthers. Compared to the maintainer line, CMS plants showed shorter filaments, reduced plant growth, and delayed flowering and pollen development. In the microarray analysis, 4646 genes showed different expression, depending on floral bud size, between Ogura-CMS and its maintainer line. We found 108 and 62 genes specifically expressed in Ogura-CMS and its maintainer line, respectively. Ogura-CMS line-specific genes included stress-related, redox-related, and B. rapa novel genes. In the maintainer line, genes related to pollen coat and germination were specifically expressed in floral buds longer than 3 mm, suggesting that insufficient expression of these genes in Ogura-CMS is directly related to dysfunctional pollen. In addition, many nuclear genes associated with auxin response, ATP synthesis, pollen development and stress response showed delayed expression in Ogura-CMS plants compared to the maintainer line, which is consistent with the delay in growth and development of Ogura-CMS plants. Delayed expression may reduce pollen grain production and/or cause sterility, implying that mitochondrial retrograde signaling delays nuclear gene expression. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  19. Phospholipase A2 activity-dependent and -independent fusogenic activity of Naja nigricollis CMS-9 on zwitterionic and anionic phospholipid vesicles.

    PubMed

    Chiou, Yi-Ling; Chen, Ying-Jung; Lin, Shinne-Ren; Chang, Long-Sen

    2011-11-01

    CMS-9, a phospholipase A2 (PLA2) from Naja nigricollis venom, induced the death of human breast cancer MCF-7 cells, accompanied by the formation of cell clumps without clear boundaries between cells. Annexin V-FITC staining indicated that abundant phosphatidylserine appeared on the outer membrane of MCF-7 cell clumps, implying that CMS-9 may promote membrane fusion via anionic phospholipids. To validate this proposition, the fusogenic activity of CMS-9 on vesicles composed of a zwitterionic phospholipid alone or of a combination of zwitterionic and anionic phospholipids was examined. Although CMS-9-induced fusion of zwitterionic phospholipid vesicles depended on PLA2 activity, CMS-9-induced fusion of vesicles containing anionic phospholipids could occur without the involvement of PLA2 activity. The membrane-damaging activity of CMS-9 was associated with its fusogenicity. Moreover, CMS-9 differentially induced membrane leakage and membrane fusion in vesicles of different compositions. Membrane fluidity and binding capability with phospholipid vesicles were not related to the fusogenicity of CMS-9. However, the membrane-bound conformation and mode of CMS-9 depended on phospholipid composition. Collectively, our data suggest that the PLA2 activity-dependent and -independent fusogenicity of CMS-9 is closely related to its membrane-bound modes and targeted membrane compositions. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. 42 CFR 411.382 - CMS's right to rescind advisory opinions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... rescind advisory opinions. Any advice CMS gives in an opinion does not prejudice its right to reconsider... faith reliance upon CMS's advice under this part, provided— (a) The requestor presented to CMS a full...

  1. 42 CFR 411.382 - CMS's right to rescind advisory opinions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... rescind advisory opinions. Any advice CMS gives in an opinion does not prejudice its right to reconsider... faith reliance upon CMS's advice under this part, provided— (a) The requestor presented to CMS a full...

  2. PARTONS: PARtonic Tomography Of Nucleon Software. A computing framework for the phenomenology of Generalized Parton Distributions

    NASA Astrophysics Data System (ADS)

    Berthou, B.; Binosi, D.; Chouika, N.; Colaneri, L.; Guidal, M.; Mezrag, C.; Moutarde, H.; Rodríguez-Quintero, J.; Sabatié, F.; Sznajder, P.; Wagner, J.

    2018-06-01

    We describe the architecture and functionalities of a C++ software framework, coined PARTONS, dedicated to the phenomenology of Generalized Parton Distributions. These distributions describe the three-dimensional structure of hadrons in terms of quarks and gluons, and can be accessed in deeply exclusive lepto- or photo-production of mesons or photons. PARTONS provides a necessary bridge between models of Generalized Parton Distributions and experimental data collected in various exclusive production channels. We outline the specification of the PARTONS framework in terms of practical needs, physical content and numerical capacity. This framework will be useful for physicists - theorists or experimentalists - not only to develop new models, but also to interpret existing measurements and even design new experiments.
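
    The bridge PARTONS provides runs through convolutions of this kind: at leading order, a deeply virtual Compton scattering observable is driven by a Compton form factor obtained by convolving a GPD with a hard-scattering kernel. The expression below is the standard leading-order form from the GPD literature (up to sign and normalization conventions), not a formula quoted from the paper.

    ```latex
    % Leading-order Compton form factor as a convolution of a quark GPD H^q
    % with the hard-scattering kernel (textbook DVCS expression):
    \mathcal{H}(\xi,t) = \sum_q e_q^2 \int_{-1}^{1}\mathrm{d}x\,
      \left( \frac{1}{\xi - x - i\varepsilon} - \frac{1}{\xi + x - i\varepsilon} \right)
      H^q(x,\xi,t)
    ```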

  3. FRAMES-2.0 Software System: Frames 2.0 Pest Integration (F2PEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castleton, Karl J.; Meyer, Philip D.

    2009-06-17

    The implementation of the FRAMES 2.0 F2PEST module is described, including requirements, design, and specifications of the software. This module integrates the PEST parameter estimation software within the FRAMES 2.0 environmental modeling framework. A test case is presented.
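
    The job F2PEST hands to PEST is, at its core, weighted least-squares calibration: adjust model parameters until the weighted residuals between simulated and observed values are minimized. A minimal Python sketch of that loop with invented data and a toy first-order-decay model; this is not the PEST or FRAMES 2.0 interface.

    ```python
    # Core of parameter estimation as PEST performs it: minimize weighted
    # residuals between model output and observations. Model and data are
    # invented for illustration.
    import numpy as np
    from scipy.optimize import least_squares

    def model(params, t):
        c0, k = params
        return c0 * np.exp(-k * t)          # e.g., first-order contaminant decay

    t_obs = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
    c_obs = np.array([10.2, 7.3, 5.6, 3.1, 0.9])
    weights = np.ones_like(c_obs)           # PEST supports per-observation weights

    residuals = lambda p: weights * (model(p, t_obs) - c_obs)
    fit = least_squares(residuals, x0=[8.0, 0.1])
    print("estimated c0, k:", fit.x)
    ```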

  4. Framework for Risk Analysis in Multimedia Environmental Systems: Modeling Individual Steps of a Risk Assessment Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shah, Anuj; Castleton, Karl J.; Hoopes, Bonnie L.

    2004-06-01

    The study of the release and effects of chemicals in the environment and their associated risks to humans is central to public and private decision making. FRAMES 1.X, the Framework for Risk Analysis in Multimedia Environmental Systems, is a systems modeling software platform, developed by PNNL, Pacific Northwest National Laboratory, that helps scientists study the release and effects of chemicals on a source-to-outcome basis and create environmental models for similar risk assessment and management problems. The unique aspect of FRAMES is the ability to dynamically introduce software modules representing individual components of a risk assessment (e.g., source release of contaminants, fate and transport in various environmental media, exposure, etc.) within a software framework, manipulate their attributes and run simulations to obtain results. This paper outlines the fundamental constituents of FRAMES 2.X, an enhanced version of FRAMES 1.X, that greatly improve the ability of module developers to “plug” their self-developed software modules into the system. The basic design, the underlying principles and a discussion of the guidelines for module developers are presented.

  5. Integrated Systems Health Management (ISHM) Toolkit

    NASA Technical Reports Server (NTRS)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  6. A Generic Software Architecture For Prognostics

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher; Daigle, Matthew J.; Sankararaman, Shankar; Goebel, Kai; Watkins, Jason

    2017-01-01

    Prognostics is a systems engineering discipline focused on predicting the end of life of components and systems. Because it is a relatively new and emerging technology, there are few fielded implementations of prognostics, due in part to practitioners perceiving a large hurdle in developing the models, algorithms, architecture, and integration pieces. As a result, no open software frameworks for applying prognostics currently exist. This paper introduces the Generic Software Architecture for Prognostics (GSAP), an open-source, cross-platform, object-oriented software framework and support library for creating prognostics applications. GSAP was designed to make prognostics more accessible and to enable faster adoption and implementation by industry, by reducing the effort and investment required to develop, test, and deploy prognostics. This paper describes the requirements, design, and testing of GSAP. Additionally, a detailed case study involving battery prognostics demonstrates its use.
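
    As a flavor of what such a framework automates, here is a minimal Python sketch of the core prognostics step: extrapolating a damage trend to a failure threshold. This is not GSAP code, and the linear battery degradation model is a deliberately crude assumption.

        def remaining_useful_life(soh, fade_per_cycle, failure_threshold=0.7):
            """Cycles until state-of-health crosses the failure threshold."""
            if soh <= failure_threshold:
                return 0
            return int((soh - failure_threshold) / fade_per_cycle)

        history = [1.00, 0.99, 0.985, 0.978]                    # observed state-of-health
        fade = (history[0] - history[-1]) / (len(history) - 1)  # naive linear fit
        print(remaining_useful_life(history[-1], fade))         # predicted cycles left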

  7. Comparative analysis of mitochondrial genomes between the hau cytoplasmic male sterility (CMS) line and its iso-nuclear maintainer line in Brassica juncea to reveal the origin of the CMS-associated gene orf288.

    PubMed

    Heng, Shuangping; Wei, Chao; Jing, Bing; Wan, Zhengjie; Wen, Jing; Yi, Bin; Ma, Chaozhi; Tu, Jinxing; Fu, Tingdong; Shen, Jinxiong

    2014-04-30

    Cytoplasmic male sterility (CMS) is not only important for exploiting heterosis in crop plants, but also serves as a model for investigating nuclear-cytoplasmic interaction. CMS may be caused by mutations, rearrangement or recombination in the mitochondrial genome. Understanding the mitochondrial genome is often the first and key step in unraveling the molecular and genetic basis of CMS in plants. Comparative analysis of the mitochondrial genomes of the hau CMS line and its maintainer line in Brassica juncea may help reveal the origin of the CMS-associated gene orf288. Through next-generation sequencing, the B. juncea hau CMS mitochondrial genome was assembled into a single, circular-mapping molecule that is 247,903 bp in size and 45.08% in GC content. In addition to the CMS-associated gene orf288, the genome contains 35 protein-encoding genes, 3 rRNAs, 25 tRNA genes and 29 ORFs of unknown function. The mitochondrial genomes of the maintainer line and another normal-type line, "J163-4", are both 219,863 bp in size with a GC content of 45.23%. The maintainer line has 36 protein-coding genes, 3 rRNAs, 22 tRNA genes and 31 unidentified ORFs. Comparative analysis of the mitochondrial genomes of the hau CMS line and its maintainer line allowed us to develop specific markers to separate the two lines at the seedling stage. We also confirmed that different mitotypes coexist substoichiometrically in the hau CMS and maintainer lines of B. juncea. The number of repeats larger than 100 bp in the hau CMS line (16 repeats) is nearly twice that found in the maintainer line (9 repeats). Phylogenetic analysis of the CMS-associated gene orf288 and four other homologous sequences in Brassicaceae shows that orf288 is clearly different from orf263 in Brassica tournefortii despite strong similarity. The hau CMS mitochondrial genome is highly rearranged compared with its iso-nuclear maintainer line mitochondrial genome. This study may be useful for studying the mechanism of natural CMS in B. juncea, performing comparative analysis of sequenced mitochondrial genomes in Brassicas, and uncovering the origin of the hau CMS mitotype and the structural and evolutionary differences between different mitotypes.

  8. Composable Framework Support for Software-FMEA Through Model Execution

    NASA Astrophysics Data System (ADS)

    Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco

    2016-08-01

    Performing Failure Modes and Effect Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.
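
    The executable-model idea can be pictured with a small Python sketch that propagates a failure mode through a component graph and collects its effects; the architecture and failure mode below are invented for illustration.

        from collections import deque

        def effects(graph, failed_component):
            """Breadth-first propagation: which components a failure can reach."""
            affected, queue = {failed_component}, deque([failed_component])
            while queue:
                comp = queue.popleft()
                for downstream in graph.get(comp, []):
                    if downstream not in affected:
                        affected.add(downstream)
                        queue.append(downstream)
            return affected

        architecture = {
            "sensor": ["filter"],
            "filter": ["controller"],
            "controller": ["actuator", "logger"],
        }
        print(effects(architecture, "sensor"))  # every downstream component is affected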

  9. Layout Study and Application of Mobile App Recommendation Approach Based On Spark Streaming Framework

    NASA Astrophysics Data System (ADS)

    Wang, H. T.; Chen, T. T.; Yan, C.; Pan, H.

    2018-05-01

    In the domain of mobile app recommendation, we combine a weighted Slope One algorithm with item-based collaborative filtering to address the cold-start and data-sparsity problems of traditional collaborative filtering, parallelize the recommendation algorithm on the Spark platform, and introduce the Spark Streaming real-time computing framework to improve the timeliness of app recommendations.
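
    For reference, weighted Slope One itself is compact enough to sketch in a few lines of Python. This is the generic textbook formulation, not the paper's Spark implementation; the streaming and parallelization layers are omitted.

        from collections import defaultdict

        def build_deviations(ratings):
            """ratings: {user: {item: rating}} -> pairwise deviations and supports."""
            freq = defaultdict(lambda: defaultdict(int))
            dev = defaultdict(lambda: defaultdict(float))
            for user_ratings in ratings.values():
                for i, ri in user_ratings.items():
                    for j, rj in user_ratings.items():
                        if i != j:
                            freq[j][i] += 1
                            dev[j][i] += rj - ri
            for j in dev:
                for i in dev[j]:
                    dev[j][i] /= freq[j][i]      # average rating difference j - i
            return dev, freq

        def predict(user_ratings, target, dev, freq):
            """Support-weighted Slope One prediction of `target` for one user."""
            num = den = 0.0
            for i, ri in user_ratings.items():
                if i in dev.get(target, {}):
                    w = freq[target][i]
                    num += (dev[target][i] + ri) * w
                    den += w
            return num / den if den else None

        ratings = {"u1": {"a": 5, "b": 3}, "u2": {"a": 4, "b": 2, "c": 4}}
        dev, freq = build_deviations(ratings)
        print(predict(ratings["u1"], "c", dev, freq))  # 5.0

    Weighting each item-pair deviation by its support is what softens the sparsity problem: pairs co-rated by many users dominate the prediction.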

  10. A loosely coupled framework for terminology controlled distributed EHR search for patient cohort identification in clinical research.

    PubMed

    Zhao, Lei; Lim Choi Keung, Sarah N; Taweel, Adel; Tyler, Edward; Ogunsina, Ire; Rossiter, James; Delaney, Brendan C; Peterson, Kevin A; Hobbs, F D Richard; Arvanitis, Theodoros N

    2012-01-01

    Heterogeneous data models and coding schemes for electronic health records present challenges for automated search across distributed data sources. This paper describes a loosely coupled software framework based on the terminology controlled approach to enable the interoperation between the search interface and heterogeneous data sources. Software components interoperate via common terminology service and abstract criteria model so as to promote component reuse and incremental system evolution.
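
    The terminology-controlled pattern is easy to picture in a few lines of Python: a shared concept is translated into each source's local codes before the query is fanned out. The concept IDs, code systems and records below are made up for illustration.

        TERMINOLOGY_SERVICE = {
            ("diabetes", "source_a"): {"ICD10:E11"},
            ("diabetes", "source_b"): {"READ:C10F"},
        }

        SOURCES = {
            "source_a": [{"patient": 1, "code": "ICD10:E11"},
                         {"patient": 2, "code": "ICD10:I10"}],
            "source_b": [{"patient": 7, "code": "READ:C10F"}],
        }

        def cohort(concept):
            """Fan a concept-level query out to heterogeneously coded sources."""
            hits = []
            for name, records in SOURCES.items():
                local = TERMINOLOGY_SERVICE.get((concept, name), set())
                hits += [(name, r["patient"]) for r in records if r["code"] in local]
            return hits

        print(cohort("diabetes"))  # [('source_a', 1), ('source_b', 7)]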

  11. Integrating Visualization Applications, such as ParaView, into HEP Software Frameworks for In-situ Event Displays

    NASA Astrophysics Data System (ADS)

    Lyon, A. L.; Kowalkowski, J. B.; Jones, C. D.

    2017-10-01

    ParaView is a high performance visualization application not widely used in High Energy Physics (HEP). It is a long-standing open source project led by Kitware that involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories. Furthermore, it has been adopted by many DOE supercomputing centers and other sites. ParaView achieves unique speed and efficiency by using state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers, yielding a real-time event display. We describe how ParaView is connected to the Fermilab art framework and discuss the capabilities this brings.

  12. (Quickly) Testing the Tester via Path Coverage

    NASA Technical Reports Server (NTRS)

    Groce, Alex

    2009-01-01

    The configuration complexity and code size of an automated testing framework may grow to a point that the tester itself becomes a significant software artifact, prone to poor configuration and implementation errors. Unfortunately, testing the tester by using old versions of the software under test (SUT) may be impractical or impossible: test framework changes may have been motivated by interface changes in the tested system, or fault detection may become too expensive in terms of computing time to justify running until errors are detected on older versions of the software. We propose the use of path coverage measures as a "quick and dirty" method for detecting many faults in complex test frameworks. We also note the possibility of using techniques developed to diversify state-space searches in model checking to diversify test focus, and an associated classification of tester changes into focus-changing and non-focus-changing modifications.

  13. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
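
    As a toy illustration of the Horn-clause encoding (ours, not taken from the paper): for the loop `x = 0; while (x < n) x = x + 1;` with n >= 0, the verification conditions are initiation (x = 0 implies Inv(x, n)), consecution (Inv(x, n) and x < n imply Inv(x + 1, n)) and safety (Inv(x, n) and x >= n imply x = n). A Horn-clause solver of the kind SeaHorn interfaces with searches for Inv automatically; the Python sketch below merely sanity-checks the candidate Inv(x, n) := x <= n against all three clauses on sampled values.

        inv = lambda x, n: x <= n                   # candidate loop invariant

        for n in range(20):                         # n >= 0 assumed by the property
            assert inv(0, n)                        # initiation
            for x in range(25):
                if inv(x, n) and x < n:
                    assert inv(x + 1, n)            # consecution
                if inv(x, n) and x >= n:
                    assert x == n                   # safety
        print("Inv(x, n) := x <= n discharges all three clauses on sampled values")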

  14. A streamlined Python framework for AT-TPC data analysis

    NASA Astrophysics Data System (ADS)

    Taylor, J. Z.; Bradt, J.; Bazin, D.; Kuchera, M. P.

    2017-09-01

    User-friendly data analysis software has been developed for the Active-Target Time Projection Chamber (AT-TPC) experiment at the National Superconducting Cyclotron Laboratory at Michigan State University. The AT-TPC, commissioned in 2014, is a gas-filled detector that acts as both the detector and target for high-efficiency detection of low-intensity, exotic nuclear reactions. The pytpc framework is a Python package for analyzing AT-TPC data. The package was developed for the analysis of 46Ar(p, p) data. The existing software was used to analyze data produced by the 40Ar(p, p) experiment that ran in August, 2015. Usage of the package was documented in an analysis manual both to improve analysis steps and aid in the work of future AT-TPC users. Software features and analysis methods in the pytpc framework will be presented along with the 40Ar results.

  15. 42 CFR 405.1012 - When CMS or its contractors may be a party to a hearing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false When CMS or its contractors may be a party to a... Hearings § 405.1012 When CMS or its contractors may be a party to a hearing. (a) CMS and/or one or more of... unrepresented beneficiary. (b) CMS and/or the contractor(s) advises the ALJ, appellant, and all other parties...

  16. Development of Cytoplasmic Male Sterile IR24 and IR64 Using CW-CMS/Rf17 System.

    PubMed

    Toriyama, Kinya; Kazama, Tomohiko

    2016-12-01

    A wild-abortive-type (WA) cytoplasmic male sterility (CMS) has been almost exclusively used for breeding three-line hybrid rice. Many indica cultivars are known to carry restorer genes for WA-CMS lines and cannot be used as maintainer lines. In particular, the elite indica cultivars IR24 and IR64 are known to be restorer lines for WA-CMS lines and are used as male parents for hybrid seed production. If we develop CMS IR24 and CMS IR64, the combination of F1 pairs in hybrid rice breeding programs will be greatly broadened. For production of CMS lines and restorer lines of IR24 and IR64, we employed the Chinese wild rice (CW)-type CMS/Restorer of fertility 17 (Rf17) system, in which fertility is restored by a single nuclear gene, Rf17. Successive backcrossing and marker-assisted selection of Rf17 succeeded in producing completely male-sterile CMS lines and fully restored restorer lines of IR24 and IR64. CW-cytoplasm did not affect agronomic characteristics. Since IR64 is one of the most popular mega-varieties and is used for breeding of many modern varieties, the CW-CMS line of IR64 will be useful for hybrid rice breeding.

  17. Control software and electronics architecture design in the framework of the E-ELT instrumentation

    NASA Astrophysics Data System (ADS)

    Di Marcantonio, P.; Coretti, I.; Cirami, R.; Comari, M.; Santin, P.; Pucillo, M.

    2010-07-01

    During the last years the European Southern Observatory (ESO), in collaboration with other European astronomical institutes, has started several feasibility studies for the E-ELT (European Extremely Large Telescope) instrumentation and post-focal adaptive optics. The goal is to create a flexible suite of instruments to deal with the wide variety of scientific questions astronomers would like to see solved in the coming decades. In this framework the INAF-Astronomical Observatory of Trieste (INAF-AOTs) is currently responsible for carrying out the analysis and preliminary study of the electronics and control software architecture of three instruments: CODEX (control software and electronics) and OPTIMOS-EVE/OPTIMOS-DIORAMAS (control software). To cope with the increased complexity and the new requirements for stability, precision, real-time latency and communication among sub-systems imposed by these instruments, new solutions have been investigated by our group. In this paper we present the proposed software and electronics architecture, based on a distributed common framework centered on the Component/Container model, which uses OPC Unified Architecture as a standard layer to communicate with COTS components from three different vendors. We describe three working prototypes that have been set up in our laboratory and discuss their performance, integration complexity and ease of deployment.
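
    The Component/Container split can be sketched in a few lines of Python: the container owns deployment and message routing, while components implement only device logic. Names are illustrative; in the real architecture the command would become an OPC UA write to a vendor controller, which is mocked here.

        class Component:
            def __init__(self, name):
                self.name = name

            def initialise(self):
                pass  # lifecycle hook called by the container

            def execute(self, command, **args):
                raise NotImplementedError

        class FilterWheel(Component):
            def execute(self, command, **args):
                if command == "move":
                    # Stand-in for an OPC UA write to the motion controller.
                    return f"{self.name}: moved to position {args['position']}"
                raise ValueError(f"unknown command {command!r}")

        class Container:
            """Hosts components and routes commands to them by name."""
            def __init__(self):
                self._components = {}

            def deploy(self, component):
                component.initialise()
                self._components[component.name] = component

            def send(self, target, command, **args):
                return self._components[target].execute(command, **args)

        container = Container()
        container.deploy(FilterWheel("fw1"))
        print(container.send("fw1", "move", position=3))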

  18. Exploring and validating physicochemical properties of mangiferin through GastroPlus® software

    PubMed Central

    Khurana, Rajneet Kaur; Kaur, Ranjot; Kaur, Manninder; Kaur, Rajpreet; Kaur, Jasleen; Kaur, Harpreet; Singh, Bhupinder

    2017-01-01

    Aim: Mangiferin (Mgf), a promising therapeutic polyphenol, exhibits poor oral bioavailability. Hence, apt delivery systems are required to facilitate its gastrointestinal absorption. The requisite details on its physicochemical properties have not yet been well documented in literature. Accordingly, in order to have explicit insight into its physicochemical characteristics, the present work was undertaken using GastroPlus™ software. Results: Aqueous solubility (0.38 mg/ml), log P (-0.65), Peff (0.16 × 10⁻⁴ cm/s) and the ability to act as a P-gp substrate were defined. Potency to act as a P-gp substrate was verified through Caco-2 cells, while Peff was estimated through single-pass intestinal perfusion studies. Characterization of Mgf through transmission electron microscopy, differential scanning calorimetry, infrared spectroscopy and powder X-ray diffraction has also been reported. Conclusion: The values of physicochemical properties for Mgf reported in the current manuscript would certainly enable researchers to develop newer delivery systems for Mgf. PMID:28344830

  19. 42 CFR 414.68 - Imaging accreditation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) Computed tomography. (iii) Nuclear medicine. (iv) Positron emission tomography. CMS-approved accreditation... if CMS takes an adverse action based on accreditation findings. (vi) Notify CMS, in writing... organization must permit its surveyors to serve as witnesses if CMS takes an adverse action based on...

  20. 42 CFR 414.68 - Imaging accreditation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) Computed tomography. (iii) Nuclear medicine. (iv) Positron emission tomography. CMS-approved accreditation... if CMS takes an adverse action based on accreditation findings. (vi) Notify CMS, in writing... organization must permit its surveyors to serve as witnesses if CMS takes an adverse action based on...

  1. 42 CFR 414.68 - Imaging accreditation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) Computed tomography. (iii) Nuclear medicine. (iv) Positron emission tomography. CMS-approved accreditation... if CMS takes an adverse action based on accreditation findings. (vi) Notify CMS, in writing... organization must permit its surveyors to serve as witnesses if CMS takes an adverse action based on...

  2. Human embryonic and induced pluripotent stem cell-derived cardiomyocytes exhibit beat rate variability and power-law behavior.

    PubMed

    Mandel, Yael; Weissman, Amir; Schick, Revital; Barad, Lili; Novak, Atara; Meiry, Gideon; Goldberg, Stanislav; Lorber, Avraham; Rosen, Michael R; Itskovitz-Eldor, Joseph; Binah, Ofer

    2012-02-21

    The sinoatrial node is the main impulse-generating tissue in the heart. Atrioventricular conduction block and arrhythmias caused by sinoatrial node dysfunction are clinically important and generally treated with electronic pacemakers. Although an excellent solution, electronic pacemakers incorporate limitations that have stimulated research on biological pacing. To assess the suitability of potential biological pacemakers, we tested the hypothesis that the spontaneous electric activity of human embryonic stem cell-derived cardiomyocytes (hESC-CMs) and induced pluripotent stem cell-derived cardiomyocytes (iPSC-CMs) exhibit beat rate variability and power-law behavior comparable to those of human sinoatrial node. We recorded extracellular electrograms from hESC-CMs and iPSC-CMs under stable conditions for up to 15 days. The beat rate time series of the spontaneous activity were examined in terms of their power spectral density and additional methods derived from nonlinear dynamics. The major findings were that the mean beat rate of hESC-CMs and iPSC-CMs was stable throughout the 15-day follow-up period and was similar in both cell types, that hESC-CMs and iPSC-CMs exhibited intrinsic beat rate variability and fractal behavior, and that isoproterenol increased and carbamylcholine decreased the beating rate in both hESC-CMs and iPSC-CMs. This is the first study demonstrating that hESC-CMs and iPSC-CMs exhibit beat rate variability and power-law behavior as in humans, thus supporting the potential capability of these cell sources to serve as biological pacemakers. Our ability to generate sinoatrial-compatible spontaneous cardiomyocytes from the patient's own hair (via keratinocyte-derived iPSCs), thus eliminating the critical need for immunosuppression, renders these myocytes an attractive cell source as biological pacemakers.

  3. What Factors Influence States' Capacity to Report Children's Health Care Quality Measures? A Multiple-Case Study.

    PubMed

    Christensen, Anna L; Petersen, Dana M; Burton, Rachel A; Forsberg, Vanessa C; Devers, Kelly J

    2017-01-01

    Objectives The objective of this study was to describe factors that influence the ability of state Medicaid agencies to report the Centers for Medicare & Medicaid Services' (CMS) core set of children's health care quality measures (Child Core Set). Methods We conducted a multiple-case study of four high-performing states participating in the Children's Health Insurance Program Reauthorization Act (CHIPRA) Quality Demonstration Grant Program: Illinois, Maine, Pennsylvania, and Oregon. Cases were purposively selected for their diverse measurement approaches and used data from 2010 to 2015, including 154 interviews, semiannual grant progress reports, and annual public reports on Child Core Set measures. We followed Yin's multiple-case study methodology to describe how and why each state increased the number of measures reported to CMS. Results All four states increased the number of Child Core Set measures reported to CMS during the grant period. Each took a different approach to reporting, depending on the available technical, organizational, and behavioral inputs in the state. Reporting capacity was influenced by a state's Medicaid data availability, ability to link to other state data systems, past experience with quality measurement, staff time and technical expertise, and demand for the measures. These factors were enhanced by CHIPRA Quality Demonstration grant funding and other federal capacity building activities, as hypothesized in our conceptual framework. These and other states have made progress reporting the Child Core Set since 2010. Conclusion With financial support and investment in state data systems and organizational factors, states can overcome challenges to reporting most of the Child Core Set measures.

  4. CMS Innovation Center Health Care Innovation Awards

    PubMed Central

    Berry, Sandra H.; Concannon, Thomas W.; Morganti, Kristy Gonzalez; Auerbach, David I.; Beckett, Megan K.; Chen, Peggy G.; Farley, Donna O.; Han, Bing; Harris, Katherine M.; Jones, Spencer S.; Liu, Hangsheng; Lovejoy, Susan L.; Marsh, Terry; Martsolf, Grant R.; Nelson, Christopher; Okeke, Edward N.; Pearson, Marjorie L.; Pillemer, Francesca; Sorbero, Melony E.; Towe, Vivian; Weinick, Robin M.

    2013-01-01

    The Center for Medicare and Medicaid Innovation within the Centers for Medicare & Medicaid Services (CMS) has funded 108 Health Care Innovation Awards, funded through the Affordable Care Act, for applicants who proposed compelling new models of service delivery or payment improvements that promise to deliver better health, better health care, and lower costs through improved quality of care for Medicare, Medicaid, and Children's Health Insurance Program enrollees. CMS is also interested in learning how new models would affect subpopulations of beneficiaries (e.g., those eligible for Medicare and Medicaid and complex patients) who have unique characteristics or health care needs that could be related to poor outcomes. In addition, the initiative seeks to identify new models of workforce development and deployment, as well as models that can be rapidly deployed and have the promise of sustainability. This article describes a strategy for evaluating the results. The goal for the evaluation design process is to create standardized approaches for answering key questions that can be customized to similar groups of awardees and that allow for rapid and comparable assessment across awardees. The evaluation plan envisions that data collection and analysis will be carried out on three levels: at the level of the individual awardee, at the level of the awardee grouping, and as a summary evaluation that includes all awardees. Key dimensions for the evaluation framework include implementation effectiveness, program effectiveness, workforce issues, impact on priority populations, and context. The ultimate goal is to identify strategies that can be employed widely to lower cost while improving care. PMID:28083297

  5. Multidisciplinary Optimization Branch Experience Using iSIGHT Software

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Korte, J. J.; Dunn, H. J.; Salas, A. O.

    1999-01-01

    The Multidisciplinary Optimization (MDO) Branch at NASA Langley Research Center is investigating frameworks for supporting multidisciplinary analysis and optimization research. An optimization framework can improve the design process while reducing time and costs. A framework provides software and system services to integrate computational tasks and allows the researcher to concentrate more on the application and less on the programming details. A framework also provides a common working environment and a full range of optimization tools, and so increases the productivity of multidisciplinary research teams. Finally, a framework enables staff members to develop applications for use by disciplinary experts in other organizations. Since the release of version 4.0, the MDO Branch has gained experience with the iSIGHT framework developed by Engineous Software, Inc. This paper describes experiences with four aerospace applications: (1) reusable launch vehicle sizing, (2) aerospike nozzle design, (3) low-noise rotorcraft trajectories, and (4) acoustic liner design. All applications have been successfully tested using the iSIGHT framework, except for the aerospike nozzle problem, which is in progress. Brief overviews of each problem are provided. The problem descriptions include the number and type of disciplinary codes, as well as an estimate of the multidisciplinary analysis execution time. In addition, the optimization methods, objective functions, design variables, and design constraints are described for each problem. Discussions of the experience gained and lessons learned are provided for each problem. These discussions include the advantages and disadvantages of using the iSIGHT framework for each case as well as the ease of use of various advanced features. Potential areas of improvement are identified.
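
    The pattern a framework like iSIGHT automates is the loop "run disciplinary codes, evaluate objective and constraints, update design variables". The Python sketch below makes that loop concrete, with SciPy standing in for the framework and a two-variable toy analysis standing in for an external disciplinary code; it is illustrative only.

        from scipy.optimize import minimize

        def run_analysis(design):
            """Stand-in for invoking a disciplinary code on the design variables."""
            thickness, radius = design
            weight = 7.8 * thickness * radius          # objective output
            stress = 100.0 / (thickness * radius)      # constraint output
            return weight, stress

        def objective(design):
            return run_analysis(design)[0]

        def stress_margin(design):
            return 50.0 - run_analysis(design)[1]      # feasible when stress <= 50

        result = minimize(objective, x0=[1.0, 1.0],
                          bounds=[(0.1, 5.0), (0.1, 5.0)],
                          constraints=[{"type": "ineq", "fun": stress_margin}])
        print(result.x, result.fun)                    # optimum sits on the stress limit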

  6. Human renal adipose tissue induces the invasion and progression of renal cell carcinoma

    PubMed Central

    Campo-Verde-Arbocco, Fiorella; López-Laur, José D.; Romeo, Leonardo R.; Giorlando, Noelia; Bruna, Flavia A.; Contador, David E.; López-Fontana, Gastón; Santiano, Flavia E.; Sasso, Corina V.; Zyla, Leila E.; López-Fontana, Constanza M.; Calvo, Juan C.; Carón, Rubén W.; Creydt, Virginia Pistone

    2017-01-01

    We evaluated the effects of conditioned media (CMs) of human adipose tissue from renal cell carcinoma located near the tumor (hRATnT) or farther away from the tumor (hRATfT) on proliferation, adhesion and migration of tumor (786-O and ACHN) and non-tumor (HK-2) human renal epithelial cell lines. Human adipose tissues were obtained from patients with renal cell carcinoma (RCC), and CMs were obtained from hRATnT and hRATfT incubation. Proliferation, adhesion and migration were quantified in 786-O, ACHN and HK-2 cell lines incubated with hRATnT-, hRATfT- or control-CMs. We evaluated versican, adiponectin and leptin expression in CMs from hRATnT and hRATfT. We evaluated AdipoR1/2, ObR, pERK, pAkt and pPI3K expression in cell lines incubated with CMs. No differences in proliferation of the cell lines were found after 24 h of treatment with CMs. All cell lines showed a significant decrease in cell adhesion and increase in cell migration after incubation with hRATnT-CMs vs. hRATfT- or control-CMs. hRATnT-CMs showed increased levels of versican and leptin compared to hRATfT-CMs. AdipoR2 in 786-O and ACHN cells decreased significantly after incubation with hRATfT- and hRATnT-CMs vs. control-CMs. We observed a decrease in the expression of pAkt in HK-2, 786-O and ACHN cells incubated with hRATnT-CMs. This result could partially explain the observed changes in migration and cell adhesion. We conclude that factors released by hRATnT, such as leptin and versican, could enhance the invasive potential of renal epithelial cell lines and could modulate the progression of the disease. PMID:29212223

  7. Mitochondrial nad2 gene is co-transcripted with CMS-associated orfB gene in cytoplasmic male-sterile stem mustard (Brassica juncea).

    PubMed

    Yang, Jing-Hua; Zhang, Ming-Fang; Yu, Jing-Quan

    2009-02-01

    The transcriptional patterns of mitochondrial respiratory-related genes were investigated in cytoplasmic male-sterile and fertile maintainer lines of stem mustard, Brassica juncea. There were numerous differences in nad2 (subunit 2 of NADH dehydrogenase) between the stem mustard CMS and its maintainer line. One novel open reading frame, hereafter named the orfB gene, was located downstream of the mitochondrial nad2 gene in the CMS line. The novel orfB gene had high similarity with the YMF19 family protein, orfB in Raphanus sativus, Helianthus annuus, Nicotiana tabacum and Beta vulgaris, orfB-CMS in Daucus carota, the atp8 gene in Arabidopsis thaliana, the 5' flanking region of orf224 in B. napus (nap CMS) and the 5' flanking region of the orf220 gene in CMS Brassica juncea. Three copies probed by a specific fragment (amplified with primers nad2F and nad2R from the CMS line) were found in the CMS line following Southern blotting after HindIII digestion, but only a single copy in its maintainer line. Two transcripts were detected in the CMS line following Northern blotting with the same specific fragment as probe, while only one transcript was detected in the maintainer line. Meanwhile, the expression of the nad2 gene was reduced in CMS buds compared to that in its maintainer line. We thus suggest that the nad2 gene may be co-transcribed with the CMS-associated orfB gene in the CMS line. In addition, the specific fragment amplified with primers nad2F and nad2R spanned partial sequences of both the nad2 gene and the orfB gene. Such alterations in the nad2 gene would impact the activity of NADH dehydrogenase and subsequent signaling, inducing the expression of nuclear genes involved in male sterility in this type of cytoplasmic male sterility.

  8. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  9. Software for Data Analysis with Graphical Models

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.; Roy, H. Scott

    1994-01-01

    Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
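
    A worked miniature of the idea (ours, with invented numbers): exact inference on a two-node model Disease -> Test by enumerating the hidden node, plus the kind of Bayes factor calculation the text alludes to.

        p_d = 0.01                                # prior P(disease)
        p_pos_given = {True: 0.95, False: 0.10}   # P(test+ | disease state)

        # Marginal P(test+): sum out the hidden node (problem decomposition).
        p_pos = sum(p_pos_given[d] * (p_d if d else 1 - p_d) for d in (True, False))

        # Posterior P(disease | test+) via Bayes' rule.
        posterior = p_pos_given[True] * p_d / p_pos

        # Bayes factor for 'disease' vs 'no disease' given the positive test.
        bayes_factor = p_pos_given[True] / p_pos_given[False]

        print(round(posterior, 3), bayes_factor)  # 0.088 9.5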

  10. Responding to GPs' information resource needs: implementation and evaluation of a complementary medicines information resource in Queensland general practice

    PubMed Central

    2011-01-01

    Background Australian General Practitioners (GPs) are in the forefront of primary health care and in an excellent position to communicate with their patients and educate them about Complementary Medicines (CMs) use. However previous studies have demonstrated that GPs lack the knowledge required about CMs to effectively communicate with patients about their CMs use and they perceive a need for information resources on CMs to use in their clinical practice. This study aimed to develop, implement, and evaluate a CMs information resource in Queensland (Qld) general practice. Methods The results of the needs assessment survey of Qld general practitioners (GPs) informed the development of a CMs information resource which was then put through an implementation and evaluation cycle in Qld general practice. The CMs information resource was a set of evidence-based herbal medicine fact sheets. This resource was utilised by 100 Qld GPs in their clinical practice for four weeks and was then evaluated. The evaluation assessed GPs' (1) utilisation of the resource (2) perceived quality, usefulness and satisfaction with the resource and (3) perceived impact of the resource on their knowledge, attitudes, and practice of CMs. Results Ninety two out of the 100 GPs completed the four week evaluation of the fact sheets and returned the post-intervention survey. The herbal medicine fact sheets produced by this study were well accepted and utilised by Qld GPs. The majority of GPs perceived that the fact sheets were a useful resource for their clinical practice. The fact sheets improved GPs' attitudes towards CMs, increased their knowledge of those herbal medicines and improved their communication with their patients about those specific herbs. Eighty-six percent of GPs agreed that if they had adequate resources on CMs, like the herbal medicine fact sheets, then they would communicate more to their patients about their use of CMs. Conclusion Further educational interventions on CMs need to be provided to GPs to increase their knowledge of CMs and to improve their communication with patients about their CMs use. PMID:21933434

  11. Repeated asenapine treatment does not participate in the mild stress induced FosB/ΔFosB expression in the rat hypothalamic paraventricular nucleus neurons.

    PubMed

    Kiss, Alexander; Majercikova, Zuzana

    2017-02-01

    The effect of repeated asenapine (ASE) treatment on FosB/ΔFosB expression was studied in the hypothalamic paraventricular nucleus (PVN) of male rats exposed to chronic mild stress (CMS) for 21 days. Our intention was to find out whether repeated ASE treatment for 14 days may: 1) induce FosB/ΔFosB expression in the PVN; 2) activate selected PVN neuronal phenotypes, synthesizing oxytocin (OXY), vasopressin (AVP), corticoliberin (CRH) or tyrosine hydroxylase (TH); and 3) interfere with the impact of CMS. Control, ASE, CMS, and CMS+ASE treated groups were used. CMS included restraint, social isolation, crowding, swimming, and cold. From the 7th day of CMS, rats received ASE (0.3 mg/kg) or saline (300 μl/rat) subcutaneously, twice a day for 14 days. They were sacrificed on day 22 (16-18 h after the last treatment). FosB/ΔFosB was visualized with an avidin-biotin peroxidase complex; OXY, AVP, CRH and TH were visualized with fluorescent-dye-labeled antibodies. Saline and ASE did not promote FosB/ΔFosB expression in the PVN. CMS and CMS+ASE elicited FosB/ΔFosB expression in the PVN, whereas ASE did not augment or attenuate the FosB/ΔFosB induction elicited by CMS. FosB/ΔFosB-CRH colocalization occurred after CMS and CMS+ASE treatments in the PVN middle sector, while FosB/ΔFosB-AVP and FosB/ΔFosB-OXY colocalizations occurred after CMS and CMS+ASE treatments in the PVN posterior sector. FosB/ΔFosB-TH colocalization was rare. Larger FosB/ΔFosB profiles, running above the PVN, did not show any colocalizations. The study provides anatomical/functional knowledge about the inert nature of prolonged ASE treatment at the level of the PVN and excludes its positive or negative interplay with the CMS effect. The data indicate that long-lasting ASE treatment might not act as a stressor at the PVN level. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  13. The elution of colistimethate sodium from polymethylmethacrylate and calcium phosphate cement beads.

    PubMed

    Waterman, Paige; Barber, Melissa; Weintrob, Amy C; VanBrakle, Regina; Howard, Robin; Kozar, Michael P; Andersen, Romney; Wortmann, Glenn

    2012-06-01

    Gram-negative bacilli resistance to all antibiotics, except for colistimethate sodium (CMS), is an emerging healthcare concern. Incorporating CMS into orthopedic cement to treat bone and soft-tissue infections due to these bacteria is attractive, but the data regarding the elution of CMS from cement are conflicting. The in vitro analysis of the elution of CMS from polymethylmethacrylate (PMMA) and calcium phosphate (CP) cement beads is reported. PMMA and CP beads containing CMS were incubated in phosphate-buffered saline and the eluate sampled at sequential time points. The inhibition of the growth of a strain of Acinetobacter baumannii complex by the eluate was measured by disk diffusion and microbroth dilution assays, and the presence of CMS in the eluate was measured by mass spectroscopy. Bacterial growth was inhibited by the eluate from both PMMA and CP beads. Mass spectroscopy demonstrated greater elution of CMS from CP beads than PMMA beads. The dose of CMS in PMMA beads was limited by failure of bead integrity. CMS elutes from both CP and PMMA beads in amounts sufficient to inhibit bacterial growth in vitro. The clinical implications of these findings require further study.

  14. 42 CFR 423.890 - Appeals.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... sponsor or by CMS before notice of the reconsidered determination is made. (6) Decision of the informal written reconsideration. CMS informs the sponsor of the decision orally or through electronic mail. CMS sends a written decision to the sponsor on the sponsor's request. (7) Effect of CMS informal written...

  15. 45 CFR 150.221 - Transition to State enforcement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement Processes for... State enforcement. (a) If CMS determines that a State for which it has assumed enforcement authority has... appropriate to return enforcement authority to the State, CMS will enter into discussions with State officials...

  16. 45 CFR 150.213 - Form and content of notice.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement Processes for Determining... consequence of a State's failure to substantially enforce HIPAA requirements is that CMS enforces them. (d... information that the State wishes CMS to consider in making the preliminary determination described in § 150...

  17. 45 CFR 150.321 - Determining the amount of penalty-aggravating circumstances.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement..., if there are substantial or several aggravating circumstances, CMS sets the aggregate amount of the.... CMS considers the following circumstances to be aggravating circumstances: (a) The frequency of...

  18. 45 CFR 150.343 - Notice of proposed penalty.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement With Respect to Issuers and Non-Federal Governmental Plans-Civil Money Penalties § 150.343 Notice of proposed penalty. If CMS... penalty. The notice includes the following: (a) A description of the HIPAA requirements that CMS has...

  19. CMS-G from Beta vulgaris ssp. maritima is maintained in natural populations despite containing an atypical cytochrome c oxidase.

    PubMed

    Meyer, Etienne H; Lehmann, Caroline; Boivin, Stéphane; Brings, Lea; De Cauwer, Isabelle; Bock, Ralph; Kühn, Kristina; Touzet, Pascal

    2018-02-23

    While mitochondrial mutants of the respiratory machinery are rare and often lethal, cytoplasmic male sterility (CMS), a mitochondrially inherited trait that results in pollen abortion, is frequently encountered in wild populations. It generates a breeding system called gynodioecy. In Beta vulgaris ssp. maritima, a gynodioecious species, we found CMS-G to be widespread across the distribution range of the species. Despite the sequencing of the mitochondrial genome of CMS-G, the mitochondrial sterilizing factor causing CMS-G is still unknown. By biochemically characterizing CMS-G, we found that the expression of several mitochondrial proteins is altered in CMS-G plants. In particular, Cox1, a core subunit of the cytochrome c oxidase (complex IV), is larger but can still assemble into complex IV. However, the CMS-G-specific complex IV was only detected as a stabilized dimer. We did not observe any alteration of the affinity of complex IV for cytochrome c; however, in CMS-G, complex IV capacity is reduced. Our results show that CMS-G is maintained in many natural populations despite being associated with an atypical complex IV. We suggest that the modified complex IV could incur the associated cost predicted by theoretical models to maintain gynodioecy in wild populations. © 2018 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.

  20. Cardiometabolic syndrome and its association with education, smoking, diet, physical activity, and social support: findings from the Pennsylvania 2007 BRFSS Survey.

    PubMed

    Liu, Longjian; Núñez, Ana E

    2010-07-01

    The authors aimed to examine the prevalence of cardiometabolic syndrome (CMS) and its association with education, smoking, diet, physical activity, and social support among white, black, and Hispanic adults using data from the 2007 Pennsylvania Behavior Risk Factor Surveillance System (BRFSS) survey, the largest population-based survey in the state. The authors examined associations between CMS and associated factors cross-sectionally using univariate and multivariate methods. The study included a representative sample of 12,629 noninstitutionalized Pennsylvanians aged ≥18. Components of CMS included obesity, hypercholesterolemia, angina (as a surrogate for decreased high-density lipoprotein), prehypertension or hypertension, and prediabetes or diabetes. CMS was identified as the presence of ≥3 CMS components. The results show that the prevalence of CMS was 20.48% in blacks, followed by Hispanics (19.14%) and whites (12.26%) (P<.01). Multivariate logistic regression analyses indicated that physical inactivity, lower educational levels, smoking, daily consumption of vegetables and/or fruits <3 servings, and lack of social support were significantly associated with the odds of having CMS. In conclusion, black and Hispanic adults have a significantly higher prevalence of CMS than whites. The significant association between CMS and risk factors provides new insights in the direction of health promotion to prevent and control CMS in those who are at high risk.

  1. The mitochondrial gene orfH79 plays a critical role in impairing both male gametophyte development and root growth in CMS-Honglian rice.

    PubMed

    Peng, Xiaojue; Wang, Kun; Hu, Chaofeng; Zhu, Youlin; Wang, Ting; Yang, Jing; Tong, Jiping; Li, Shaoqing; Zhu, Yingguo

    2010-06-24

    Cytoplasmic male sterility (CMS) has often been associated with abnormal mitochondrial open reading frames. The mitochondrial gene orfH79 is a candidate gene for causing the CMS trait in CMS-Honglian (CMS-HL) rice. However, whether the orfH79 expression can actually induce CMS in rice remains unclear. Western blot analysis revealed that the ORFH79 protein is mainly present in mitochondria of CMS-HL rice and is absent in the fertile line. To investigate the function of ORFH79 protein in mitochondria, this gene was fused to a mitochondrial transit peptide sequence and used to transform wild type rice, where its expression induced the gametophytic male sterile phenotype. In addition, excessive accumulation of reactive oxygen species (ROS) in the microspore, a reduced ATP/ADP ratio, decreased mitochondrial membrane potential and a lower respiration rate in the transgenic plants were found to be similar to those in CMS-HL rice. Moreover, retarded growth of primary and lateral roots accompanied by abnormal accumulation of ROS in the root tip was observed in both transgenic rice and CMS-HL rice (YTA). These results suggest that the expression of orfH79 in mitochondria impairs mitochondrial function, which affects the development of both male gametophytes and the roots of CMS-HL rice.

  2. CDX2 prognostic value in stage II/III resected colon cancer is related to CMS classification.

    PubMed

    Pilati, C; Taieb, J; Balogoun, R; Marisa, L; de Reyniès, A; Laurent-Puig, P

    2017-05-01

    Caudal-type homeobox transcription factor 2 (CDX2) is involved in colon cancer (CC) oncogenesis and has been proposed as a prognostic biomarker in patients with stage II or III CC. We analyzed CDX2 expression in a series of 469 CC typed for the new international consensus molecular subtype (CMS) classification, and we confirmed results in a series of 90 CC. Here, we show that lack of CDX2 expression is only present in the mesenchymal subgroup (CMS4) and in MSI-immune tumors (CMS1) and not in CMS2 and CMS3 colon cancer. Although CDX2 expression was a globally independent prognostic factor, loss of CDX2 expression is not associated with a worse prognosis in the CMS1 group, but is highly prognostic in CMS4 patients for both relapse free and overall survival. Similarly, lack of CDX2 expression was a bad prognostic factor in MSS patients, but not in MSI. Our work suggests that combination of the consensual CMS classification and lack of CDX2 expression could be a useful marker to identify CMS4/CDX2-negative patients with a very poor prognosis. © The Author 2017. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  3. Senp1 drives hypoxia-induced polycythemia via GATA1 and Bcl-xL in subjects with Monge’s disease

    PubMed Central

    Azad, Priti; Zhao, Huiwen W.; Ronen, Roy; Zhou, Dan; Poulsen, Orit; Hsiao, Yu Hsin; Bafna, Vineet

    2016-01-01

    In this study, because excessive polycythemia is a predominant trait in some high-altitude dwellers (chronic mountain sickness [CMS] or Monge’s disease) but not others living at the same altitude in the Andes, we took advantage of this human experiment of nature and used a combination of induced pluripotent stem cell technology, genomics, and molecular biology in this unique population to understand the molecular basis for hypoxia-induced excessive polycythemia. As compared with sea-level controls and non-CMS subjects who responded to hypoxia by increasing their RBCs modestly or not at all, respectively, CMS cells increased theirs remarkably (up to 60-fold). Although there was a switch from fetal to adult HgbA0 in all populations and a concomitant shift in oxygen binding, we found that CMS cells matured faster and had a higher efficiency and proliferative potential than non-CMS cells. We also established that SENP1 plays a critical role in the differential erythropoietic response of CMS and non-CMS subjects: we can convert the CMS phenotype into that of non-CMS and vice versa by altering SENP1 levels. We also demonstrated that GATA1 is an essential downstream target of SENP1 and that the differential expression and response of GATA1 and Bcl-xL are a key mechanism underlying CMS pathology. PMID:27821551

  4. Senp1 drives hypoxia-induced polycythemia via GATA1 and Bcl-xL in subjects with Monge's disease.

    PubMed

    Azad, Priti; Zhao, Huiwen W; Cabrales, Pedro J; Ronen, Roy; Zhou, Dan; Poulsen, Orit; Appenzeller, Otto; Hsiao, Yu Hsin; Bafna, Vineet; Haddad, Gabriel G

    2016-11-14

    In this study, because excessive polycythemia is a predominant trait in some high-altitude dwellers (chronic mountain sickness [CMS] or Monge's disease) but not others living at the same altitude in the Andes, we took advantage of this human experiment of nature and used a combination of induced pluripotent stem cell technology, genomics, and molecular biology in this unique population to understand the molecular basis for hypoxia-induced excessive polycythemia. As compared with sea-level controls and non-CMS subjects who responded to hypoxia by increasing their RBCs modestly or not at all, respectively, CMS cells increased theirs remarkably (up to 60-fold). Although there was a switch from fetal to adult HgbA0 in all populations and a concomitant shift in oxygen binding, we found that CMS cells matured faster and had a higher efficiency and proliferative potential than non-CMS cells. We also established that SENP1 plays a critical role in the differential erythropoietic response of CMS and non-CMS subjects: we can convert the CMS phenotype into that of non-CMS and vice versa by altering SENP1 levels. We also demonstrated that GATA1 is an essential downstream target of SENP1 and that the differential expression and response of GATA1 and Bcl-xL are a key mechanism underlying CMS pathology. © 2016 Azad et al.

  5. Enabling opportunistic resources for CMS Computing Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hufnagel, Dirk

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Finally, we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  6. Enabling opportunistic resources for CMS Computing Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hufnagel, Dirk

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources — resources not owned by, or a priori configured for CMS — to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  7. Enabling opportunistic resources for CMS Computing Operations

    DOE PAGES

    Hufnagel, Dirk

    2015-12-23

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Finally, we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  8. A Generic Ground Framework for Image Expertise Centres and Small-Sized Production Centres

    NASA Astrophysics Data System (ADS)

    Sellé, A.

    2009-05-01

    Initiated by the Pleiades Earth Observation Program, CNES (the French Space Agency) has developed a generic collaborative framework for its image quality centre, highly customisable for any upcoming expertise centre. This collaborative framework has been designed to be used by a group of experts or scientists who want to share data and processing tools and manage interfaces with external entities. Its flexible and scalable architecture complies with the core requirements: defining a user data model with no impact on the software (generic data access), integrating user processing modules with a GUI builder and built-in APIs, and offering a scalable architecture to fit any performance requirement and accompany growing projects. CNES has granted licenses to two software companies that will be able to redistribute this framework to any customer.

  9. [Recognition of psychiatric disorders with a religious content by members of the clergy of different denominations in the Netherlands].

    PubMed

    Noort, A; Braam, A W; van Gool, A R; Verhagen, P J; Beekman, A T F

    2012-01-01

    Clergy members (CMs) frequently provide support and counselling for people with psychological and psychiatric disorders. There is evidence in the literature that CMs consider themselves to be inadequately trained to recognise psychiatric disorders. To investigate to what extent CMs are able to recognise psychiatric symptoms, CMs were recruited in the south-west of the Netherlands among various denominations (Roman Catholic, strict (orthodox) Protestant, moderate Protestant and Evangelical; n = 143) by means of a regional sampling method. The participating CMs (n = 143) and a control group consisting of mental health care professionals (MHPs; n = 73) evaluated four vignettes of psychiatric problems with a religious content: two were about a psychiatric disorder (a psychotic state and a psychotic depression/melancholic state), and two concerned non-psychiatric states (a spiritual/religious experience and a mourning reaction with a religious dilemma). For each vignette the respondents scored the suitability of psychiatric medication, the desirability of mental health care, the severity of the disorder and whether there was a religious or spiritual aetiology. Some CMs were able to recognise psychiatric problems almost as well as the MHPs, but among the CMs the degree of recognition varied according to the denomination. Recognition was relatively poor among Evangelical CMs, but was best among the strict Protestant CMs. Evangelical pastors and strict Protestant CMs tended to interpret the non-psychiatric states as pathological. The findings of this study emphasise the need for collaboration between MHPs and CMs and stress the importance of consultation.

  10. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    …a temporary team is assigned to prepare layers and frameworks (Presentation Layer, Domain Layer, Data Access Layer) for future feature teams. [Slide fragments; Software Engineering Institute, Carnegie Mellon; http://www.sei.cmu.edu/training/elearning]

  11. Interim Open Source Software (OSS) Policy

    EPA Pesticide Factsheets

    This interim Policy establishes a framework to implement the requirements of the Office of Management and Budget's (OMB) Federal Source Code Policy to achieve efficiency, transparency and innovation through reusable and open source software.

  12. An Integrated Software Package to Enable Predictive Simulation Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang

    The power grid is increasing in complexity due to the deployment of smart grid technologies. Such technologies vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance-computing (HPC) techniques, but also a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that integrates HPC applications and a web-based visualization tool based on a middleware framework. This framework can support the data communication between different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in a live mode. Test results validate the effectiveness and usability of the integrated software package.
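
    As a toy sketch of the middleware pattern described (this is not the actual package; names and values are invented), a message queue can decouple an HPC producer from a visualization consumer:

        # Toy middleware sketch: a queue decouples a simulation producer from
        # a visualization consumer, standing in for the data-communication
        # layer between HPC applications and a web-based display.
        import queue, threading, time

        bus = queue.Queue()

        def simulator():
            for step in range(3):
                bus.put({"step": step, "bus_voltage": 1.0 - 0.01 * step})
                time.sleep(0.1)
            bus.put(None)                      # end-of-stream marker

        def visualizer():
            while (msg := bus.get()) is not None:
                print("render frame:", msg)    # a web UI would plot this live

        t = threading.Thread(target=simulator)
        t.start()
        visualizer()
        t.join()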

  13. 78 FR 56710 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-13

    ... the following transmissions: OMB, Office of Information and Regulatory Affairs Attention: CMS Desk... Identifiers: CMS-10199 and CMS-10266] Agency Information Collection Activities: Submission for OMB Review... an opportunity for the public to comment on CMS' intention to collect information from the public...

  14. 42 CFR 422.510 - Termination of contract by CMS.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...: (1) Termination of contract by CMS. (i) CMS notifies the MA organization in writing 90 days before... organization; or (B) The MA organization experiences financial difficulties so severe that its ability to make...) of this section. (ii) CMS notifies the MA organization in writing that its contract will be...

  15. 42 CFR 422.510 - Termination of contract by CMS.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...: (1) Termination of contract by CMS. (i) CMS notifies the MA organization in writing 90 days before... organization; or (B) The MA organization experiences financial difficulties so severe that its ability to make...) of this section. (ii) CMS notifies the MA organization in writing that its contract will be...

  16. 42 CFR 422.510 - Termination of contract by CMS.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) Termination of contract by CMS. (i) CMS notifies the MA organization in writing 90 days before the intended...; or (B) The MA organization experiences financial difficulties so severe that its ability to make...) of this section. (ii) CMS notifies the MA organization in writing that its contract will be...

  17. 42 CFR 401.108 - CMS rulings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false CMS rulings. 401.108 Section 401.108 Public Health... GENERAL ADMINISTRATIVE REQUIREMENTS Confidentiality and Disclosure § 401.108 CMS rulings. (a) After... regulations, but which has been adopted by CMS as having precedent, may be published in the Federal Register...

  18. 45 CFR 150.319 - Determining the amount of the penalty-mitigating circumstances.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement... guidelines for taking into account the factors listed in § 150.317, CMS considers the following: (a) Record... noncompliance without notice from CMS and voluntarily reported that noncompliance, provided that the responsible...

  19. 42 CFR 401.625 - Effect of CMS claims collection decisions on appeals.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Effect of CMS claims collection decisions on... Compromise § 401.625 Effect of CMS claims collection decisions on appeals. Any action taken under this..., is not an initial determination for purposes of CMS appeal procedures. ...

  20. 42 CFR 403.248 - Administrative review of CMS determinations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Administrative review of CMS determinations. 403... Certification Program: General Provisions § 403.248 Administrative review of CMS determinations. (a) This section provides for administrative review if CMS determines— (1) Not to certify a policy; or (2) That a...

  1. 78 FR 67149 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-08

    ... Identifier: CMS-R-216] Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY... & Medicaid Services (CMS) is announcing an opportunity for the public to comment on CMS' intention to collect... accepting comments. 2. By regular mail. You may mail written comments to the following address: CMS, Office...

  2. How Patronage Politics Undermines Parental Participation and Accountability: Community-Managed Schools in Honduras and Guatemala

    ERIC Educational Resources Information Center

    Altschuler, Daniel

    2013-01-01

    This article shows how patronage politics affects a popular international education model: community-managed schools (CMS). Focusing on Honduras's CMS initiative, PROHECO (Programa Hondureno de Educacion Comunitaria), I demonstrate how patronage can undermine CMS accountability. Whereas supporters argue that CMS increases accountability, partisan…

  3. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... validation inspection results. 493.571 Section 493.571 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... Program § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a... licensure program, in accordance with State law. (c) CMS validation inspection results. CMS may disclose the...

  4. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... validation inspection results. 493.571 Section 493.571 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... Program § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a... licensure program, in accordance with State law. (c) CMS validation inspection results. CMS may disclose the...

  5. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... validation inspection results. 493.571 Section 493.571 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... Program § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a... licensure program, in accordance with State law. (c) CMS validation inspection results. CMS may disclose the...

  6. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... validation inspection results. 493.571 Section 493.571 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... Program § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a... licensure program, in accordance with State law. (c) CMS validation inspection results. CMS may disclose the...

  7. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... validation inspection results. 493.571 Section 493.571 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... Program § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a... licensure program, in accordance with State law. (c) CMS validation inspection results. CMS may disclose the...

  8. The Profiles in Practice School Reporting Software.

    ERIC Educational Resources Information Center

    Griffin, Patrick

    "The Profiles in Practice: School Reporting Software" provides a framework for reports on different aspects of performance in an assessment program. This booklet is the installation guide and user manual for the Profiles in Practice software, which is included as a CD-ROM. The chapters of the guide are: (1) "Installation"; (2) "Starting the…

  9. Checklists for the Evaluation of Educational Software: Critical Review and Prospects.

    ERIC Educational Resources Information Center

    Tergan, Sigmar-Olaf

    1998-01-01

    Reviews strengths and weaknesses of checklists for the evaluation of computer software and outlines consequences for their practical application. Suggests an approach based on an instructional design model and a comprehensive framework to cope with problems of validity and predictive power of software evaluation. Discusses prospects of the…

  10. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    ERIC Educational Resources Information Center

    Williams, Lawrence H., Jr.

    2013-01-01

    This qualitative study analyzed the experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches, others allow software requirements and design to evolve and facilitate ambiguity and uncertainty by…

  11. Registration of cytoplasmic male-sterile oilseed sunflower genetic stocks CMS GIG2 and CMS GIG2-RV, and fertility restoration lines RF GIG2-MAX 1631 and RF GIG2-MAX 1631-RV

    USDA-ARS?s Scientific Manuscript database

    Two cytoplasmic male-sterile (CMS) oilseed sunflower (Helianthus annuus L.) genetic stocks, CMS GIG2 (Reg. No. xxx, PI xxxx), and CMS GIG2-RV (Reg. No. xxx, PI xxxx), and corresponding fertility restoration lines RF GIG2-MAX 1631 (Reg. No. xxx, PI xxxx) and RF GIG2-MAX 1631-RV (Reg. No. xxx, PI xxx...

  12. CMS Nonpayment Policy, Quality Improvement, and Hospital-Acquired Conditions: An Integrative Review.

    PubMed

    Bae, Sung-Heui

    This integrative review synthesized evidence on the consequences of the Centers for Medicare & Medicaid Services (CMS) nonpayment policy on quality improvement initiatives and hospital-acquired conditions. Fourteen articles were included. This review presents strong evidence that the CMS policy has spurred quality improvement initiatives; however, the relationships between the CMS policy and hospital-acquired conditions are inconclusive. In future research, a comprehensive model of implementation of the CMS nonpayment policy would help us understand the effectiveness of this policy.

  13. ATLAS particle detector CSC ROD software design and implementation, and, Addition of K physics to chi-squared analysis of FDQM

    NASA Astrophysics Data System (ADS)

    Hawkins, Donovan Lee

    In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS320C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.
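
    The plug-in Decoder idea can be sketched in outline (in Python for brevity, whereas the thesis framework is C++ running on DSPs; the format name and decoder below are hypothetical):

        # Generic plug-in decoder sketch: the framework looks decoders up in
        # a registry, so new data formats can be added without touching core
        # code. This mirrors the plug-in idea only, not the thesis code.
        DECODERS = {}

        def register(fmt):
            def wrap(cls):
                DECODERS[fmt] = cls
                return cls
            return wrap

        @register("raw")
        class RawDecoder:
            def decode(self, words):
                # sparsification: keep only non-zero channel words
                return [w for w in words if w != 0]

        def process(fmt, words):
            return DECODERS[fmt]().decode(words)

        print(process("raw", [0, 7, 0, 3]))   # -> [7, 3]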

  14. Elements of strategic capability for software outsourcing enterprises based on the resource

    NASA Astrophysics Data System (ADS)

    Shi, Wengeng

    2011-10-01

    Software outsourcing enterprises are an emerging kind of high-tech enterprise, and both their number and their speed of growth have been remarkable. Beyond the preferential policies China grants to software outsourcing, the software outsourcing business has an ability to upgrade that software companies in general have lacked. Viewed from resource-based theory, software outsourcing companies possess capabilities and resources that are rare, valuable and difficult to imitate; on this basis we try to give an initial framework for theoretical analysis.

  15. Relations between the Test of Variables of Attention (TOVA) and the Children's Memory Scale (CMS).

    PubMed

    Riccio, Cynthia A; Garland, Beth H; Cohen, Morris J

    2007-09-01

    There is considerable overlap in the constructs of attention and memory. The objective of this study was to examine the relationship between the Test of Variables of Attention (TOVA), a measure of attention, and components of memory and learning as measured by the Children's Memory Scale (CMS). Participants (N = 105) were consecutive referrals to an out-patient facility, generally for learning or behavior problems, who were administered both the TOVA and the CMS. Significant correlations were found between the omissions score on the TOVA and subscales of the CMS. TOVA variability and TOVA reaction time correlated significantly with subscales of the CMS as well. TOVA commission errors did not correlate significantly with any CMS Index. Although significant, the correlation coefficients indicate that the CMS and TOVA are measuring either different constructs or similar constructs but in different ways. As such, both measures may be useful in distinguishing memory from attention problems.

  16. 42 CFR 423.509 - Termination of contract by CMS.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...: (1) Termination of contract by CMS. (i) CMS notifies the Part D plan sponsor in writing at least 45... experiences financial difficulties so severe that its ability to make necessary health services available is...) CMS notifies the Part D plan sponsor in writing that its contract will be terminated on a date...

  17. 77 FR 70445 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-26

    ...-wide set of credentials and single sign-on capability for multiple CMS applications. In order to prove... and answers); 4. Provide the user a single sign-on, federated CMS EIDM ID and Password; 5... Terms of Service and CMS Privacy Statement on the Web. Form Numbers: CMS-10452 (OCN: 0938-New...

  18. Increasing Honest Responding on Cognitive Distortions in Child Molesters: The Bogus Pipeline Procedure

    ERIC Educational Resources Information Center

    Gannon, Theresa A.

    2006-01-01

    Professionals conclude that child molesters (CMs) hold offense-supportive beliefs (or cognitive distortions) from CMs' questionnaire responses. Because questionnaires are easily faked, we asked 32 CMs to complete a cognitive distortion scale under standard conditions (Time 1). A week later (Time 2), the same CMs completed the scale again. This…

  19. 42 CFR 417.801 - Agreements between CMS and health care prepayment plans.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Agreements between CMS and health care prepayment... CMS and health care prepayment plans. (a) General requirement. (1) In order to participate and receive... written agreement with CMS. (2) An existing group practice prepayment plan (GPPP) that continues as an...

  20. 45 CFR 150.347 - Failure to request a hearing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement With Respect to Issuers and....343, CMS may assess the proposed civil money penalty, a less severe penalty, or a more severe penalty. CMS notifies the responsible entity in writing of any penalty that has been assessed and of the means...

  1. 42 CFR 411.379 - When CMS accepts a request.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false When CMS accepts a request. 411.379 Section 411.379... Physicians and Entities Furnishing Designated Health Services § 411.379 When CMS accepts a request. (a) Upon receiving a request for an advisory opinion, CMS promptly makes an initial determination of whether the...

  2. 42 CFR 405.1834 - CMS reviewing official procedure.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false CMS reviewing official procedure. 405.1834 Section... Determinations and Appeals § 405.1834 CMS reviewing official procedure. (a) Scope. A provider that is a party to... Administrator by a designated CMS reviewing official who considers whether the decision of the intermediary...

  3. 42 CFR 457.1003 - CMS review of waiver requests.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false CMS review of waiver requests. 457.1003 Section 457.1003 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Waivers: General Provisions § 457.1003 CMS review of waiver requests. CMS will review the waiver requests...

  4. 45 CFR 150.301 - General rule regarding the imposition of civil money penalties.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS... to CMS's enforcement authority under § 150.101(b)(2), or any non-Federal governmental plan (or employer that sponsors a non-Federal governmental plan) that is subject to CMS's enforcement authority...

  5. 42 CFR 421.114 - Assignment and reassignment of providers by CMS.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Assignment and reassignment of providers by CMS. 421.114 Section 421.114 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... Assignment and reassignment of providers by CMS. CMS may assign or reassign any provider to any intermediary...

  6. 45 CFR 150.209 - Verification of exhaustion of remedies and contact with State officials.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS... § 150.209 Verification of exhaustion of remedies and contact with State officials. If CMS receives a complaint or other information indicating that a State is failing to enforce HIPAA requirements, CMS...

  7. Sleep disturbances in long-term immigrants with chronic mountain sickness: a comparison with healthy immigrants at high altitude.

    PubMed

    Guan, Wei; Ga, Qin; Li, Rong; Bai, Zhen-Zhong; Wuren, Tana; Wang, Jin; Yang, Ying-Zhong; Li, Yu-Hong; Ge, Ri-Li

    2015-01-15

    The aim of this study was to examine sleep disturbances in patients with chronic mountain sickness (CMS). The sleep of 14 patients with CMS and 11 healthy controls with or without sleep disorders (control N: without sleep disorders; control D: with sleep disorders) was studied by polysomnography. Hypopnea was the sleep disorder most commonly suffered by CMS patients and control D subjects. No major differences were observed in sleep structure between the CMS and control groups, with the exception of shorter rapid eye movement latency in controls and increased deep non-rapid eye movement sleep in the control N group. Periodic breathing was observed in only two study participants, one each in the CMS and control D groups. Oxygen saturation during sleep was significantly lower in the CMS group than in the control groups (P<0.05). CMS scores were positively correlated with the apnea-hypopnea index, and negatively correlated with oxygen saturation. These results demonstrate that sleep disorders and nocturnal hypoxia are important in the development of CMS. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Altered calcium handling and increased contraction force in human embryonic stem cell derived cardiomyocytes following short term dexamethasone exposure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kosmidis, Georgios; Bellin, Milena; Ribeiro, Marcelo C.

    One limitation in using human pluripotent stem cell derived cardiomyocytes (hPSC-CMs) for disease modeling and cardiac safety pharmacology is their immature functional phenotype compared with adult cardiomyocytes. Here, we report that treatment of human embryonic stem cell derived cardiomyocytes (hESC-CMs) with dexamethasone, a synthetic glucocorticoid, activated glucocorticoid signaling which in turn improved their calcium handling properties and contractility. L-type calcium current and action potential properties were not affected by dexamethasone, but significantly faster calcium decay, increased forces of contraction and increased sarcomere lengths were observed in hESC-CMs after dexamethasone exposure. Activating the glucocorticoid pathway can thus contribute to mediating hPSC-CM maturation. - Highlights: • Dexamethasone accelerates Ca2+ transient decay in hESC-CMs. • Dexamethasone enhances SERCA and NCX function in hESC-CMs. • Dexamethasone increases force of contraction and sarcomere length in hESC-CMs. • Dexamethasone does not alter ICa,L and action potential characteristics in hESC-CMs.

  9. Incorporating cost-benefit analyses into software assurance planning

    NASA Technical Reports Server (NTRS)

    Feather, M. S.; Sigal, B.; Cornford, S. L.; Hutchinson, P.

    2001-01-01

    The objective is to use cost-benefit analyses to identify, for a given project, optimal sets of software assurance activities. Towards this end we have incorporated cost-benefit calculations into a risk management framework.
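
    As a toy illustration of the kind of optimization involved (this is not the authors' tool; all activities and numbers are invented), a greedy benefit-per-cost selection under a fixed budget looks like this:

        # Toy cost-benefit selection: pick assurance activities greedily by
        # benefit/cost ratio until a budget is exhausted. Numbers invented.
        activities = [
            ("code review",    4.0, 2.0),   # (name, benefit, cost)
            ("unit testing",   6.0, 3.0),
            ("formal methods", 9.0, 8.0),
        ]

        budget = 5.0
        chosen = []
        for name, benefit, cost in sorted(activities,
                                          key=lambda a: a[1] / a[2],
                                          reverse=True):
            if cost <= budget:
                chosen.append(name)
                budget -= cost

        print(chosen)   # -> ['code review', 'unit testing']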

  10. Re-designing the PhEDEx Security Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, C.-H.; Wildish, T.; Zhang, X.

    2014-01-01

    PhEDEx, the data-placement tool used by the CMS experiment at the LHC, was conceived in a more trusting time. The security model provided a safe environment for site agents and operators, but offered little more protection than that. Data was not sufficiently protected against loss caused by operator error or software bugs or by deliberate manipulation of the database. Operators were given high levels of access to the database, beyond what was actually needed to accomplish their tasks. This exposed them to the risk of suspicion should an incident occur. Multiple implementations of the security model led to difficulties maintaining code, which can lead to degradation of security over time. In order to meet the simultaneous goals of protecting CMS data, protecting the operators from undue exposure to risk, increasing monitoring capabilities and improving maintainability of the security model, the PhEDEx security model was redesigned and re-implemented. Security was moved from the application layer into the database itself, fine-grained access roles were established, and tools and procedures created to control the evolution of the security model over time. In this paper we describe this work, we describe the deployment of the new security model, and we show how these enhancements improve security on several fronts simultaneously.
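
    The fine-grained role idea can be illustrated generically (this is not the PhEDEx/Oracle implementation; the role and operation names are invented):

        # Generic illustration of fine-grained access roles: each operation
        # names the capability it requires, and checks are enforced in one
        # place rather than scattered through application code.
        ROLES = {"operator": {"view", "retry_transfer"},
                 "admin":    {"view", "retry_transfer", "delete_request"}}

        def authorize(user_roles, operation):
            """Allow an operation only if some role of the user grants it."""
            if not any(operation in ROLES[r] for r in user_roles):
                raise PermissionError(f"{operation} not permitted")

        authorize({"operator"}, "retry_transfer")        # passes silently
        try:
            authorize({"operator"}, "delete_request")    # not granted
        except PermissionError as err:
            print("denied:", err)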

  11. Online data handling and storage at the CMS experiment

    NASA Astrophysics Data System (ADS)

    Andre, J.-M.; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Demiragli, Z.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gómez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, RK; Morovic, S.; Nuñez-Barranco-Fernández, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ∼62 sources produced with an aggregate rate of ∼2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250 TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system.
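
    A schematic sketch of the bookkeeping side of such a merger (illustrative only; these field names are invented and are not the real STS document schema):

        # Schematic file-merger bookkeeping sketch: each source writes a small
        # JSON document describing its output; the merger sums event counts
        # and collects file names. Field names are hypothetical.
        import json

        source_docs = [
            json.dumps({"source": i, "events": 1000 + i,
                        "file": f"run1_src{i}.dat"})
            for i in range(3)
        ]

        merged = {"events": 0, "files": []}
        for doc in source_docs:
            meta = json.loads(doc)
            merged["events"] += meta["events"]
            merged["files"].append(meta["file"])

        print(json.dumps(merged, indent=2))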

  12. Re-designing the PhEDEx Security Model

    NASA Astrophysics Data System (ADS)

    Huang, C.-H.; Wildish, T.; Zhang, X.

    2014-06-01

    PhEDEx, the data-placement tool used by the CMS experiment at the LHC, was conceived in a more trusting time. The security model provided a safe environment for site agents and operators, but offered little more protection than that. Data was not sufficiently protected against loss caused by operator error or software bugs or by deliberate manipulation of the database. Operators were given high levels of access to the database, beyond what was actually needed to accomplish their tasks. This exposed them to the risk of suspicion should an incident occur. Multiple implementations of the security model led to difficulties maintaining code, which can lead to degradation of security over time. In order to meet the simultaneous goals of protecting CMS data, protecting the operators from undue exposure to risk, increasing monitoring capabilities and improving maintainability of the security model, the PhEDEx security model was redesigned and re-implemented. Security was moved from the application layer into the database itself, fine-grained access roles were established, and tools and procedures created to control the evolution of the security model over time. In this paper we describe this work, we describe the deployment of the new security model, and we show how these enhancements improve security on several fronts simultaneously.

  13. Online Data Handling and Storage at the CMS Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andre, J. M.; et al.

    2015-12-23

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ~62 sources produced with an aggregate rate of ~2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250 TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system.

  14. Pharmacokinetics of Colistin Methansulphonate (CMS) and Colistin after CMS Nebulisation in Baboon Monkeys.

    PubMed

    Marchand, Sandrine; Bouchene, Salim; de Monte, Michèle; Guilleminault, Laurent; Montharu, Jérôme; Cabrera, Maria; Grégoire, Nicolas; Gobin, Patrice; Diot, Patrice; Couet, William; Vecellio, Laurent

    2015-10-01

    The objective of this study was to compare two different nebulizers, the Eflow Rapid® and the Pari LC Star®, by scintigraphy and PK modeling to simulate epithelial lining fluid concentrations from measured plasma concentrations after nebulization of CMS in baboons. Three baboons received CMS by IV infusion and by two types of aerosol generators, and colistin by subcutaneous infusion. Gamma imaging was performed after nebulisation to determine colistin distribution in the lungs. Blood samples were collected over 9 h and colistin and CMS plasma concentrations were measured by LC-MS/MS. A population pharmacokinetic analysis was conducted and simulations were performed to predict lung concentrations after nebulization. Higher aerosol distribution into the lungs was observed by scintigraphy when CMS was nebulized with the Pari LC Star® than with the Eflow Rapid® nebulizer. This observation was confirmed by the fraction of CMS deposited in the lung (3.5% versus 1.3%, respectively). CMS and colistin simulated concentrations in epithelial lining fluid were higher with the Pari LC Star® than with the Eflow Rapid® system. A limited fraction of CMS reaches the lungs after nebulization, but higher colistin plasma concentrations were measured and higher intrapulmonary colistin concentrations were simulated with the Pari LC Star® than with the Eflow Rapid® system.

  15. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  16. Wake Turbulence Mitigation for Departures (WTMD) Prototype System - Software Design Document

    NASA Technical Reports Server (NTRS)

    Sturdy, James L.

    2008-01-01

    This document describes the software design of a prototype Wake Turbulence Mitigation for Departures (WTMD) system that was evaluated in shadow mode operation at the Saint Louis (KSTL) and Houston (KIAH) airports. This document describes the software that provides the system framework, communications, user displays, and hosts the Wind Forecasting Algorithm (WFA) software developed by the M.I.T. Lincoln Laboratory (MIT-LL). The WFA algorithms and software are described in a separate document produced by MIT-LL.

  17. Impact of chronic maternal stress during early gestation on maternal-fetal stress transfer and fetal stress sensitivity in sheep.

    PubMed

    Dreiling, Michelle; Schiffner, Rene; Bischoff, Sabine; Rupprecht, Sven; Kroegel, Nasim; Schubert, Harald; Witte, Otto W; Schwab, Matthias; Rakers, Florian

    2018-01-01

    Acute stress-induced reduction of uterine blood flow (UBF) is an indirect mechanism of maternal-fetal stress transfer during late gestation. Effects of chronic psychosocial maternal stress (CMS) during early gestation, as may be experienced by many working women, on this stress signaling mechanism are unclear. We hypothesized that CMS in sheep during early gestation augments later acute stress-induced decreases of UBF, and aggravates the fetal hormonal, cardiovascular, and metabolic stress responses during later development. Six pregnant ewes underwent repeated isolation stress (CMS) between 30 and 100 days of gestation (dGA, term: 150 dGA) and seven pregnant ewes served as controls. At 110 dGA, ewes were chronically instrumented and underwent acute isolation stress. The acute stress decreased UBF by 19% in both the CMS and control groups (p < .05), but this was prolonged in CMS versus control ewes (74 vs. 30 min, p < .05). CMS increased fetal circulating baseline and stress-induced cortisol and norepinephrine concentrations indicating a hyperactive hypothalamus-pituitary-adrenal (HPA)-axis and sympathetic-adrenal-medullary system. Increased fetal norepinephrine is endogenous as maternal catecholamines do not cross the placenta. Cortisol in the control but not in the CMS fetuses was correlated with maternal cortisol blood concentrations; these findings indicate: (1) no increased maternal-fetal cortisol transfer with CMS, (2) cortisol production in CMS fetuses when the HPA-axis is normally inactive, due to early maturation of the fetal HPA-axis. CMS fetuses were better oxygenated, without shift towards acidosis compared to the controls, potentially reflecting adaptation to repeated stress. Hence, CMS enhances maternal-fetal stress transfer by prolonged reduction in UBF and increased fetal HPA responsiveness.

  18. Blockade of AT1 type receptors for angiotensin II prevents cardiac microvascular fibrosis induced by chronic stress in Sprague-Dawley rats.

    PubMed

    Firoozmand, Lília Taddeo; Sanches, Andrea; Damaceno-Rodrigues, Nilsa Regina; Perez, Juliana Dinéia; Aragão, Danielle Sanches; Rosa, Rodolfo Mattar; Marcondes, Fernanda Klein; Casarini, Dulce Elena; Caldini, Elia Garcia; Cunha, Tatiana Sousa

    2018-04-20

    To test the effects of chronic-stress on the cardiovascular system, the model of chronic mild unpredictable stress (CMS) has been widely used. The CMS protocol consists of the random, intermittent, and unpredictable exposure of laboratory animals to a variety of stressors, during 3 consecutive weeks. In this study, we tested the hypothesis that exposure to the CMS protocol leads to left ventricle microcirculatory remodeling that can be attenuated by angiotensin II receptor blockade. Male Sprague-Dawley rats were randomly assigned into four groups: Control, Stress, Control + losartan, and Stress + losartan (N = 6, each group, losartan: 20 mg/kg/day). The rats were euthanized 15 days after CMS exposure, and blood samples and left ventricle were collected. Rats submitted to CMS presented increased glycemia, corticosterone, noradrenaline and adrenaline concentration, and losartan reduced the concentration of the circulating amines. Cardiac angiotensin II, measured by high-performance liquid chromatography (HPLC), was significantly increased in the CMS group, and losartan treatment reduced it, while angiotensin 1-7 was significantly higher in the CMS losartan-treated group as compared with CMS. Histological analysis, verified by transmission electron microscopy, showed that rats exposed to CMS presented increased perivascular collagen and losartan effectively prevented the development of this process. Hence, CMS induced a state of microvascular disease, with increased perivascular collagen deposition, that may be the trigger for further development of cardiovascular disease. In this case, CMS fibrosis is associated with increased production of catecholamines and with a disruption of renin-angiotensin system balance, which can be prevented by angiotensin II receptor blockade.

  19. Stability of colistin methanesulfonate in pharmaceutical products and solutions for administration to patients.

    PubMed

    Wallace, Stephanie J; Li, Jian; Rayner, Craig R; Coulthard, Kingsley; Nation, Roger L

    2008-09-01

    Colistin methanesulfonate (CMS) has the potential to hydrolyze in aqueous solution to liberate colistin, its microbiologically active and more toxic parent compound. While conversion of CMS to colistin in vivo is important for bactericidal activity, liberation of colistin during storage and/or use of pharmaceutical formulations may potentiate the toxicity of CMS. To date, there has been no information available regarding the stability of CMS in pharmaceutical preparations. Two commercial CMS formulations were investigated for stability with respect to colistin content, which was measured by a specific high-performance liquid chromatography method. Coly-Mycin M Parenteral (colistimethate lyophilized powder) was stable (<0.1% of CMS present as colistin) for at least 20 weeks at 4 degrees C and 25 degrees C at 60% relative humidity. When Coly-Mycin M was reconstituted with 2 ml of water to a CMS concentration of 200 mg/ml for injection, Coly-Mycin M was stable (<0.1% colistin formed) for at least 7 days at both 4 degrees C and 25 degrees C. When further diluted to 4 mg/ml in a glucose (5%) or saline (0.9%) infusion solution as directed, CMS hydrolyzed faster at 25 degrees C (<4% colistin formed after 48 h) than at 4 degrees C (0.3% colistin formed). The second formulation, CMS Solution for Inhalation (77.5 mg/ml), was stable at 4 degrees C and 25 degrees C for at least 12 months, as determined based on colistin content (<0.1%). This study demonstrated the concentration- and temperature-dependent hydrolysis of CMS. The information provided by this study has important implications for the formulation and clinical use of CMS products.

  20. Evaluation of Flagging Criteria of United States Kidney Transplant Center Performance: How to Best Define Outliers?

    PubMed

    Schold, Jesse D; Miller, Charles M; Henry, Mitchell L; Buccini, Laura D; Flechner, Stuart M; Goldfarb, David A; Poggio, Emilio D; Andreoni, Kenneth A

    2017-06-01

    Scientific Registry of Transplant Recipients report cards of US organ transplant center performance are publicly available and used for quality oversight. Low center performance (LP) evaluations are associated with changes in practice including reduced transplant rates and increased waitlist removals. In 2014, the Scientific Registry of Transplant Recipients implemented new Bayesian methodology to evaluate performance which was not adopted by the Centers for Medicare & Medicaid Services (CMS). In May 2016, CMS altered their performance criteria, reducing the likelihood of LP evaluations. Our aims were to evaluate incidence, survival rates, and volume of LP centers with Bayesian, historical (old-CMS) and new-CMS criteria using 6 consecutive program-specific reports (PSR), January 2013 to July 2015, among adult kidney transplant centers. Bayesian, old-CMS and new-CMS criteria identified 13.4%, 8.3%, and 6.1% LP PSRs, respectively. Over the 3-year period, 31.9% (Bayesian), 23.4% (old-CMS), and 19.8% (new-CMS) of centers had 1 or more LP evaluation. For small centers (<83 transplants/PSR), there were 4-fold additional LP evaluations (52 vs 13 PSRs) for 1-year mortality with Bayesian versus new-CMS criteria. For large centers (>183 transplants/PSR), there were 3-fold additional LP evaluations for 1-year mortality with Bayesian versus new-CMS criteria, with median differences in observed and expected patient survival of -1.6% and -2.2%, respectively. A significant proportion of kidney transplant centers are identified as low performing with relatively small survival differences compared with expected. Bayesian criteria have significantly higher flagging rates and new-CMS criteria modestly reduce flagging. Critical appraisal of performance criteria is needed to assess whether quality oversight is meeting intended goals and whether further modifications could reduce risk aversion, more efficiently allocate resources, and increase transplant opportunities.

  1. Substantial Targeting Advantage Achieved by Pulmonary Administration of Colistin Methanesulfonate in a Large-Animal Model

    PubMed Central

    Nguyen, Tri-Hung; Lieu, Linh Thuy; Nguyen, Gary; Bischof, Robert J.; Meeusen, Els N.; Li, Jian; Nation, Roger L.

    2016-01-01

    Colistin, administered as its inactive prodrug colistin methanesulfonate (CMS), is often used in multidrug-resistant Gram-negative pulmonary infections. The CMS and colistin pharmacokinetics in plasma and epithelial lining fluid (ELF) following intravenous and pulmonary dosing have not been evaluated in a large-animal model with pulmonary architecture similar to that of humans. Six merino sheep (34 to 43 kg body weight) received an intravenous or pulmonary dose of 4 to 8 mg/kg CMS (sodium) or 2 to 3 mg/kg colistin (sulfate) in a 4-way crossover study. Pulmonary dosing was achieved via jet nebulization through an endotracheal tube cuff. CMS and colistin were quantified in plasma and bronchoalveolar lavage fluid (BALF) samples by high-performance liquid chromatography (HPLC). ELF concentrations were calculated via the urea method. CMS and colistin were comodeled in S-ADAPT. Following intravenous CMS or colistin administration, no concentrations were quantifiable in BALF samples. Elimination clearance was 1.97 liters/h (4% interindividual variability) for CMS (other than conversion to colistin) and 1.08 liters/h (25%) for colistin. On average, 18% of a CMS dose was converted to colistin. Following pulmonary delivery, colistin was not quantifiable in plasma and CMS was detected in only one sheep. Average ELF concentrations (standard deviations [SD]) of formed colistin were 400 (243), 384 (187), and 184 (190) mg/liter at 1, 4, and 24 h after pulmonary CMS administration. The population pharmacokinetic model described well CMS and colistin in plasma and ELF following intravenous and pulmonary administration. Pulmonary dosing provided high ELF and low plasma colistin concentrations, representing a substantial targeting advantage over intravenous administration. Predictions from the pharmacokinetic model indicate that sheep are an advantageous model for translational research. PMID:27821445

  2. Underreporting of nursing home utilization on the CMS-2728 in older incident dialysis patients and implications for assessing mortality risk.

    PubMed

    Bowling, C Barrett; Zhang, Rebecca; Franch, Harold; Huang, Yijian; Mirk, Anna; McClellan, William M; Johnson, Theodore M; Kutner, Nancy G

    2015-03-21

    The usage of nursing home (NH) services is a marker of frailty among older adults. Although the Centers for Medicare & Medicaid Services (CMS) revised the Medical Evidence Report Form CMS-2728 in 2005 to include data collection on NH institutionalization, the validity of this item has not been reported. There were 27,913 patients ≥ 75 years of age with incident end-stage renal disease (ESRD) in 2006, which constituted our analysis cohort. We determined the accuracy of the CMS-2728 using a matched cohort that included the CMS Minimum Data Set (MDS) 2.0, often employed as a "gold standard" metric for identifying patients receiving NH care. We calculated sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) for the CMS-2728 NH item. Next, we compared characteristics and mortality risk by CMS-2728 and MDS NH status agreement. The sensitivity, specificity, PPV and NPV of the CMS-2728 for NH status were 33%, 97%, 80% and 79%, respectively. Compared to those without the MDS or CMS-2728 NH indicator (No MDS/No 2728), multivariable adjusted hazard ratios (95% confidence interval) for mortality associated with NH status were 1.55 (1.46 - 1.64) for MDS/2728, 1.48 (1.42 - 1.54) for MDS/No 2728, and 1.38 (1.25 - 1.52) for No MDS/2728. NH utilization was more strongly associated with mortality than other CMS-2728 items in the model. The CMS-2728 underestimated NH utilization among older adults with incident ESRD. The potential for misclassification may have important ramifications for assessing prognosis, developing advanced care plans and providing coordinated care.

  3. Characterization and classification of one new cytoplasmic male sterility (CMS) line based on morphological, cytological and molecular markers in non-heading Chinese cabbage (Brassica rapa L.).

    PubMed

    Heng, Shuangping; Shi, Dianyi; Hu, Zhenhua; Huang, Tao; Li, Jinping; Liu, Liyan; Xia, Chunxiu; Yuan, Zhenzhen; Xu, Yuejin; Fu, Tingdong; Wan, Zhengjie

    2015-09-01

    A new non-heading Chinese cabbage CMS line M119A was characterized and specific molecular markers were developed to classify different CMS types. One new non-heading Chinese cabbage (Brassica rapa L.) cytoplasmic male sterile (CMS) line M119A was obtained by interspecific crosses between the recently discovered hau CMS line of Brassica juncea and B. rapa. Furthermore, the line was characterized and compared with five other isonuclear-alloplasmic CMS lines. The M119A line produced six stamens without pollen, and in a few flowers only two stamens were fused together. Tissue sections indicated that anther abortion in M119A may have occurred during differentiation of the archesporial cells, without pollen sac formation. All six CMS lines were grouped into three types based on the presence of three PCR fragments of 825, 465 and 772 bp amplified with primers specific to different mitochondrial genes. The 825-bp fragment was amplified both in 09-10A and H201A using the specific primer pair P-orf224-atp6, and showed 100 % identity with the mitochondrial gene of pol CMS. The 465-bp fragment was amplified in 30A and 105A using the primer pair P-orf138 and shared 100 % identity with the mitochondrial gene of ogu CMS. The 772-bp fragment was amplified in M119A and H203A using the primer pair P-orf288 and showed 100 % identity with the mitochondrial gene of hau CMS. Therefore, these markers could efficiently distinguish different types of isonuclear-alloplasmic CMS lines of non-heading Chinese cabbage, which is useful for improving the efficiency of cross-breeding and heterosis utilization in cruciferous vegetables.

  4. Stability of Colistin Methanesulfonate in Pharmaceutical Products and Solutions for Administration to Patients▿

    PubMed Central

    Wallace, Stephanie J.; Li, Jian; Rayner, Craig. R.; Coulthard, Kingsley; Nation, Roger L.

    2008-01-01

    Colistin methanesulfonate (CMS) has the potential to hydrolyze in aqueous solution to liberate colistin, its microbiologically active and more toxic parent compound. While conversion of CMS to colistin in vivo is important for bactericidal activity, liberation of colistin during storage and/or use of pharmaceutical formulations may potentiate the toxicity of CMS. To date, there has been no information available regarding the stability of CMS in pharmaceutical preparations. Two commercial CMS formulations were investigated for stability with respect to colistin content, which was measured by a specific high-performance liquid chromatography method. Coly-Mycin M Parenteral (colistimethate lyophilized powder) was stable (<0.1% of CMS present as colistin) for at least 20 weeks at 4°C and 25°C at 60% relative humidity. When Coly-Mycin M was reconstituted with 2 ml of water to a CMS concentration of 200 mg/ml for injection, Coly-Mycin M was stable (<0.1% colistin formed) for at least 7 days at both 4°C and 25°C. When further diluted to 4 mg/ml in a glucose (5%) or saline (0.9%) infusion solution as directed, CMS hydrolyzed faster at 25°C (<4% colistin formed after 48 h) than at 4°C (0.3% colistin formed). The second formulation, CMS Solution for Inhalation (77.5 mg/ml), was stable at 4°C and 25°C for at least 12 months, as determined based on colistin content (<0.1%). This study demonstrated the concentration- and temperature-dependent hydrolysis of CMS. The information provided by this study has important implications for the formulation and clinical use of CMS products. PMID:18606838

  5. Neutron imaging data processing using the Mantid framework

    NASA Astrophysics Data System (ADS)

    Pouzols, Federico M.; Draper, Nicholas; Nagella, Sri; Yang, Erica; Sajid, Ahmed; Ross, Derek; Ritchie, Brian; Hill, John; Burca, Genoveva; Minniti, Triestino; Moreton-Smith, Christopher; Kockelmann, Winfried

    2016-09-01

    Several imaging instruments are currently being constructed at neutron sources around the world. The Mantid software project provides an extensible framework that supports high-performance computing for data manipulation, analysis and visualisation of scientific data. At ISIS, IMAT (Imaging and Materials Science & Engineering) will offer unique time-of-flight neutron imaging techniques which impose several software requirements to control the data reduction and analysis. Here we outline the extensions currently being added to Mantid to provide specific support for neutron imaging requirements.
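
    For example, a routine Mantid reduction step driven from Python might look like the following (a sketch assuming a Mantid installation; the run file name is hypothetical):

        # Sketch of a Mantid-based reduction step (requires Mantid installed).
        # 'IMAT00001234.nxs' is a hypothetical run file; Load and Rebin are
        # standard Mantid algorithms exposed through the simpleapi module.
        from mantid.simpleapi import Load, Rebin

        ws = Load(Filename="IMAT00001234.nxs")    # load a run into a workspace
        ws = Rebin(InputWorkspace=ws,
                   Params="5,100,10000")          # rebin time-of-flight axis
        print(ws.getNumberHistograms())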

  6. Open source libraries and frameworks for biological data visualisation: a guide for developers.

    PubMed

    Wang, Rui; Perez-Riverol, Yasset; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2015-04-01

    Recent advances in high-throughput experimental techniques have led to an exponential increase in both the size and the complexity of the data sets commonly studied in biology. Data visualisation is increasingly used as the key to unlock this data, going from hypothesis generation to model evaluation and tool implementation. It is becoming more and more the heart of bioinformatics workflows, enabling scientists to reason and communicate more effectively. In parallel, there has been a corresponding trend towards the development of related software, which has triggered the maturation of different visualisation libraries and frameworks. For bioinformaticians, scientific programmers and software developers, the main challenge is to pick out the most fitting one(s) to create clear, meaningful and integrated data visualisation for their particular use cases. In this review, we introduce a collection of open source or free to use libraries and frameworks for creating data visualisation, covering the generation of a wide variety of charts and graphs. We will focus on software written in Java, JavaScript or Python. We truly believe this software offers the potential to turn tedious data into exciting visual stories. © 2014 The Authors. PROTEOMICS published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
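
    By way of example, a basic chart in matplotlib, one of the commonly used Python plotting libraries, takes only a few lines (the data below are invented):

        # Minimal matplotlib example: a bar chart from illustrative data.
        import matplotlib.pyplot as plt

        genes = ["BRCA1", "TP53", "EGFR"]
        counts = [12, 30, 21]                  # made-up expression counts

        plt.bar(genes, counts)
        plt.ylabel("read count")
        plt.title("Illustrative expression chart")
        plt.savefig("chart.png")               # or plt.show() interactively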

  7. Open source libraries and frameworks for biological data visualisation: A guide for developers

    PubMed Central

    Wang, Rui; Perez-Riverol, Yasset; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2015-01-01

    Recent advances in high-throughput experimental techniques have led to an exponential increase in both the size and the complexity of the data sets commonly studied in biology. Data visualisation is increasingly used as the key to unlock this data, going from hypothesis generation to model evaluation and tool implementation. It is becoming more and more the heart of bioinformatics workflows, enabling scientists to reason and communicate more effectively. In parallel, there has been a corresponding trend towards the development of related software, which has triggered the maturation of different visualisation libraries and frameworks. For bioinformaticians, scientific programmers and software developers, the main challenge is to pick out the most fitting one(s) to create clear, meaningful and integrated data visualisation for their particular use cases. In this review, we introduce a collection of open source or free to use libraries and frameworks for creating data visualisation, covering the generation of a wide variety of charts and graphs. We will focus on software written in Java, JavaScript or Python. We truly believe this software offers the potential to turn tedious data into exciting visual stories. PMID:25475079

  8. Factors affecting pharmacists’ recommendation of complementary medicines – a qualitative pilot study of Australian pharmacists

    PubMed Central

    2012-01-01

    Background Complementary medicines (CMs) are widely used by the Australian public, and pharmacies are major suppliers of these medicines. The integration of CMs into pharmacy practice is well documented, but the behaviours of pharmacists in recommending CMs to customers are less well studied. This study reports on factors that influence whether or not pharmacists in Australia recommend CMs to their customers. Methods Data were collected from semi-structured interviews with twelve practicing pharmacists based in Brisbane, Australia. The qualitative data were analysed by thematic analysis. Results The primary driver of the recommendation of CMs was a desire to provide a health benefit to the customer. Other important drivers were an awareness of evidence of efficacy, customer feedback and pharmacy protocols to recommend a CM alongside a particular pharmaceutical medication. The primary barrier to the recommendation of CMs was safety concerns around patients on multiple medications or with complex health issues. Also, a lack of knowledge of CMs, a perceived lack of evidence or a lack of time to counsel patients were identified as barriers. There was a desire to see a greater integration of CM into formal pharmacy education. Additionally, the provision of good quality educational materials was seen as important to allow pharmacists to assess levels of evidence for CMs and educate them on their safe and appropriate use. Conclusions Pharmacists who frequently recommend CMs identify many potential benefits for patients and see it as an important part of providing a ‘healthcare solution’. To encourage the informed use of CMs in pharmacy there is a need for the development of accessible, quality resources on CMs. In addition, incorporation of CM education into pharmacy curricula would better prepare graduate pharmacists for community practice. Ultimately, such moves would contribute to the safe and effective use of CMs to the benefit of consumers. PMID:23051066

  9. Pharmacokinetics of Chinese medicines: strategies and perspectives.

    PubMed

    Yan, Ru; Yang, Ying; Chen, Yijia

    2018-01-01

    The modernization and internationalization of Chinese medicines (CMs) are hampered by increasing concerns about their safety and efficacy. Pharmacokinetic (PK) study is indispensable to establish concentration-activity/toxicity relationships and facilitate target identification and new drug discovery from CMs. To cope with tremendous challenges rooted in the chemical complexity of CMs, the classic PK strategies have evolved rapidly from PK study focusing on marker/main drug components to PK-PD correlation study adopting metabolomics approaches to characterize associations between disposition of global drug-related components and host metabolic network shifts. However, the majority of PK studies of CMs have adopted the approaches tailored for western medicines and focused on the systemic exposures of drug-related components, most of which were found to be too low to account for the holistic benefits of CMs. With an area under concentration-time curve- or activity-weighted approach, integral PK attempts to understand the PK-PD relevance with the integrated PK profile of multiple co-existing structural analogs (prototypes/metabolites). Cellular PK-PD complements traditional PK-PD when drug targets localize inside the cells, instead of at the surface of cell membrane or extracellular space. Considering the validated clinical benefits of CMs, reverse pharmacology-based reverse PK strategy was proposed to facilitate target identification and new drug discovery. Recently, gut microbiota have demonstrated multifaceted roles in drug efficacy/toxicity. In traditional oral intake, the presystemic interactions of CMs with gut microbiota seem inevitable, which can contribute to the holistic benefits of CMs through biotransforming CMs components, acting as the peripheral target, and regulating host drug disposition. Hence, we propose a global PK-PD approach which includes the presystemic interaction of CMs with gut microbiota and combines omics with physiologically based pharmacokinetic modeling to offer a comprehensive understanding of the PK-PD relationship of CMs. Moreover, validated clinical benefits of CMs and poor translational potential of animal PK data urge more research efforts in human PK study.

  10. Burden of disease resulting from chronic mountain sickness among young Chinese male immigrants in Tibet

    PubMed Central

    2012-01-01

    Background In young Chinese men of the highland immigrant population, chronic mountain sickness (CMS) is a major public health problem. The aim of this study was to measure the disease burden of CMS in this population. Methods We used disability-adjusted life years (DALYs) to estimate the disease burden of CMS. Disability weights were derived using the person trade-off methodology. CMS diagnoses, symptom severity, and individual characteristics were obtained from surveys collected in Tibet in 2009 and 2010. The DALYs of individual patients and the DALYs/1,000 were calculated. Results Disability weights were obtained for 21 CMS health stages. The results of the analyses of the two surveys were consistent with each other. Across different altitudes, the CMS rates ranged from 2.1% to 37.4%, the individual DALYs of patients ranged from 0.13 to 0.33, and the DALYs/1,000 ranged from 3.60 to 52.78. The age, highland service years, blood pressure, heart rate, smoking rate, and proportion of the sample working in engineering or construction were significantly higher in the CMS group than in the non-CMS group (p < 0.05). These variables were also positively associated with the individual DALYs (p < 0.05). Among the symptoms, headaches caused the largest proportion of DALYs. Conclusion The results show that CMS imposes a considerable burden on Chinese immigrants to Tibet. Immigrants with characteristics such as a higher residential altitude, more advanced age, longer highland service years, smoking, and working in engineering or construction were more likely to develop CMS and to increase the disease burden. Higher blood pressure and heart rate as a result of CMS were also positively associated with the disease burden. The authorities should pay attention to the highland disease burden and support the development and application of DALY studies of CMS and other highland diseases. PMID:22672510
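
    For readers unfamiliar with the DALY arithmetic used in the study, the sketch below applies the standard morbidity-only formula (YLD = disability weight × duration, so DALY = YLD in the absence of mortality) to hypothetical patients; the weights, durations and population size are invented, not the study's data.

      # Illustrative DALY arithmetic for a morbidity-only condition.
      def yld(disability_weight, duration_years):
          """Years lived with disability, without age-weighting or discounting."""
          return disability_weight * duration_years

      # Hypothetical patients: (disability weight of their CMS health stage,
      # years spent in that stage).
      patients = [(0.12, 1.5), (0.25, 1.0), (0.08, 3.0)]
      individual_dalys = [yld(dw, dur) for dw, dur in patients]

      # Population burden expressed per 1,000, as in the abstract.
      population = 850  # hypothetical number of exposed immigrants
      dalys_per_1000 = 1000 * sum(individual_dalys) / population

      print([round(d, 2) for d in individual_dalys])  # [0.18, 0.25, 0.24]
      print(round(dalys_per_1000, 2))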

  11. Human adipose tissue from normal and tumoral breast regulates the behavior of mammary epithelial cells.

    PubMed

    Pistone Creydt, Virginia; Fletcher, Sabrina Johanna; Giudice, Jimena; Bruzzone, Ariana; Chasseing, Norma Alejandra; Gonzalez, Eduardo Gustavo; Sacca, Paula Alejandra; Calvo, Juan Carlos

    2013-02-01

    Stromal-epithelial interactions mediate both breast development and breast cancer progression. In the present work, we evaluated the effects of conditioned media (CMs) of human adipose tissue explants from normal (hATN) and tumoral (hATT) breast on the proliferation, adhesion, migration and metalloprotease activity of tumor (MCF-7 and IBH-7) and non-tumor (MCF-10A) human breast epithelial cell lines. Human adipose tissues were obtained from patients, and the conditioned media from hATN and hATT were collected after 24 h of incubation. MCF-10A, MCF-7 and IBH-7 cells were grown and incubated with the CMs, and the proliferation, adhesion, migration ability and metalloprotease activity of the epithelial cells exposed to hATN- or hATT-CMs were quantified. The statistical significance between different experimental conditions was evaluated by one-way ANOVA, followed by Tukey's post hoc tests. Tumor and non-tumor breast epithelial cells significantly increased their proliferation after 24 h of treatment with hATT-CMs compared to control-CMs. Furthermore, cellular adhesion of the two tumor cell lines was significantly lower with hATT-CMs than with hATN-CMs; hATT-CMs therefore seem to induce lower expression or activity of the components involved in cellular adhesion than hATN-CMs. In addition, hATT-CMs induced pro-MMP-9 and MMP-9 activity and increased the migration of MCF-7 and IBH-7 cells compared to hATN-CMs. We conclude that the tumor microenvironment interacts in a dynamic way with the mutated epithelium. This evidence raises the possibility of modifying tumor behavior/phenotype through regulation or modification of the microenvironment. We developed a model in which CMs were obtained from whole adipose tissue explants, from either normal or tumoral breast; in this way, we studied the contribution of soluble factors independently of any effects of direct cell contact.
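
    The analysis named in the abstract, one-way ANOVA followed by Tukey's post hoc test, could be run along the lines below; the proliferation values are simulated stand-ins, not the study's measurements.

      # One-way ANOVA with Tukey's post hoc test on simulated data.
      import numpy as np
      from scipy.stats import f_oneway
      from statsmodels.stats.multicomp import pairwise_tukeyhsd

      rng = np.random.default_rng(0)
      control = rng.normal(1.00, 0.08, 12)   # relative proliferation
      hatn_cm = rng.normal(1.05, 0.08, 12)
      hatt_cm = rng.normal(1.30, 0.08, 12)

      # Omnibus test across the three conditions.
      f_stat, p_value = f_oneway(control, hatn_cm, hatt_cm)
      print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

      # Pairwise comparisons with family-wise error control.
      values = np.concatenate([control, hatn_cm, hatt_cm])
      groups = ["control"] * 12 + ["hATN-CM"] * 12 + ["hATT-CM"] * 12
      print(pairwise_tukeyhsd(values, groups, alpha=0.05))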

  12. A UNIX SVR4-OS 9 distributed data acquisition for high energy physics

    NASA Astrophysics Data System (ADS)

    Drouhin, F.; Schwaller, B.; Fontaine, J. C.; Charles, F.; Pallares, A.; Huss, D.

    1998-08-01

    The distributed data acquisition (DAQ) system developed by the GRPHE (Groupe de Recherche en Physique des Hautes Energies) group is a combination of hardware and software dedicated to high energy physics. The system described here is used in the beam tests of the CMS tracker. The central processor of the system is a RISC CPU hosted in a VME card, running a POSIX compliant UNIX system. Specialized real-time OS9 VME cards perform the instrumentation control. The main data flow goes over a deterministic high speed network. The UNIX system manages a list of OS9 front-end systems with a synchronisation protocol running over a TCP/IP layer.
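
    The master/front-end synchronisation described above can be pictured with the sketch below, in which a UNIX master sends a sync command to each OS9 front-end over TCP and waits for every acknowledgement before the next cycle; the host names and one-byte message format are invented for illustration.

      # Sketch of a master-to-front-end sync exchange over TCP/IP.
      import socket

      FRONT_ENDS = [("os9-fe1", 5001), ("os9-fe2", 5001)]  # hypothetical hosts

      def broadcast_sync(command=b"S"):
          """Send a sync command to every front-end and collect the acks.

          The master proceeds only once every front-end has answered,
          which keeps the ensemble in lock-step at the protocol level.
          """
          acks = []
          for host, port in FRONT_ENDS:
              with socket.create_connection((host, port), timeout=2.0) as s:
                  s.sendall(command)
                  acks.append(s.recv(1))  # expect a one-byte acknowledgement
          return acks

      if __name__ == "__main__":
          print(broadcast_sync())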

  13. The Offline Software Framework of the NA61/SHINE Experiment

    NASA Astrophysics Data System (ADS)

    Sipos, Roland; Laszlo, Andras; Marcinek, Antoni; Paul, Tom; Szuba, Marek; Unger, Michael; Veberic, Darko; Wyszynski, Oskar

    2012-12-01

    NA61/SHINE (SHINE = SPS Heavy Ion and Neutrino Experiment) is an experiment at the CERN SPS using the upgraded NA49 hadron spectrometer. Among its physics goals are precise hadron production measurements for improving calculations of the neutrino beam flux in the T2K neutrino oscillation experiment as well as for more reliable simulations of cosmic-ray air showers. Moreover, p+p, p+Pb and nucleus+nucleus collisions will be studied extensively to allow for a study of the properties of the onset of deconfinement and a search for the critical point of strongly interacting matter. Currently NA61/SHINE uses the old NA49 software framework for reconstruction, simulation and data analysis. The core of this legacy framework was developed in the early 1990s. It is written in different programming and scripting languages (C, pgi-Fortran, shell) and provides several concurrent data formats for the event data model, which also includes obsolete parts. In this contribution we will introduce the new software framework, called Shine, which is written in C++ and designed to comprise three principal parts: a collection of processing modules which can be assembled and sequenced by the user via XML files; an event data model which contains all simulation and reconstruction information, based on STL and ROOT streaming; and a detector description which provides data on the configuration and state of the experiment. To assure a quick migration to the Shine framework, wrappers were introduced that allow legacy code to run as modules in the new framework, and we will present first results on the cross-validation of the two frameworks.
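
    The "modules assembled and sequenced via XML files" design reads naturally as a registry plus a configuration-driven pipeline; the sketch below shows that shape in Python, with the module names, XML layout and event structure all invented rather than taken from Shine.

      # Sketch of a module pipeline sequenced from an XML configuration.
      import xml.etree.ElementTree as ET

      class Module:
          """Base class for a processing module acting on the event."""
          def process(self, event):
              raise NotImplementedError

      class Calibrator(Module):
          def process(self, event):
              event["calibrated"] = True

      class TrackFinder(Module):
          def process(self, event):
              event.setdefault("tracks", []).append("track0")

      REGISTRY = {"Calibrator": Calibrator, "TrackFinder": TrackFinder}

      CONFIG = """<ModuleSequence>
        <Module name="Calibrator"/>
        <Module name="TrackFinder"/>
      </ModuleSequence>"""

      def build_sequence(xml_text):
          root = ET.fromstring(xml_text)
          return [REGISTRY[m.get("name")]() for m in root.findall("Module")]

      event = {"raw": [1, 2, 3]}            # stand-in for the event data model
      for module in build_sequence(CONFIG):
          module.process(event)
      print(event)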

  14. NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities

    NASA Technical Reports Server (NTRS)

    Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.

    2015-01-01

    Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exists with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in it is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework. The Actor Framework provides a level of code reuse and extensibility that has previously been difficult to achieve using LabVIEW.
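
    A minimal actor, in the spirit of the Actor Framework mentioned above, owns a message queue and a worker thread and is driven purely by messages; the Python sketch below mirrors that pattern conceptually (the real framework is LabVIEW code, and the class and message names here are invented).

      # Minimal queue-driven actor pattern (conceptual analog in Python).
      import queue
      import threading

      class Actor:
          def __init__(self):
              self._inbox = queue.Queue()
              self._thread = threading.Thread(target=self._run, daemon=True)
              self._thread.start()

          def send(self, msg, payload=None):
              self._inbox.put((msg, payload))

          def _run(self):
              while True:
                  msg, payload = self._inbox.get()
                  if msg == "stop":
                      break
                  self.handle(msg, payload)

          def handle(self, msg, payload):
              pass  # overridden by concrete actors

      class Recorder(Actor):
          """Hypothetical measurement-recording actor."""
          def handle(self, msg, payload):
              if msg == "record":
                  print(f"recording sample {payload}")

      rec = Recorder()
      rec.send("record", 3.14)
      rec.send("stop")
      rec._thread.join()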

  15. JRTF: A Flexible Software Framework for Real-Time Control in Magnetic Confinement Nuclear Fusion Experiments

    NASA Astrophysics Data System (ADS)

    Zhang, M.; Zheng, G. Z.; Zheng, W.; Chen, Z.; Yuan, T.; Yang, C.

    2016-04-01

    Magnetic confinement nuclear fusion experiments require various real-time control applications, such as plasma control. ITER designed the Fast Plant System Controller (FPSC) for this job and provided hardware and software standards and guidelines for building one. In order to develop various real-time FPSC applications efficiently, a flexible real-time software framework called the J-TEXT real-time framework (JRTF) was developed by the J-TEXT tokamak team. JRTF allows developers to implement different functions as independent and reusable modules called Application Blocks (ABs). AB developers only need to focus on implementing the control tasks or the algorithms; the timing, scheduling, data sharing and eventing are handled by the JRTF pipelines. JRTF provides great flexibility in developing ABs: unit tests against ABs can be developed easily, and ABs can even be used in non-JRTF applications. JRTF also provides interfaces allowing JRTF applications to be configured and monitored at runtime. JRTF is compatible with ITER standard FPSC hardware and the ITER CODAC (Control, Data Access and Communication) Core software, and it can be configured and monitored using EPICS (the Experimental Physics and Industrial Control System). Moreover, JRTF can be ported to different platforms and integrated with supervisory control software other than EPICS. The paper presents the design and implementation of JRTF as well as brief test results.
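
    The Application Block idea, where the block carries only the control logic and the pipeline owns timing, scheduling and data sharing, can be sketched as below; the class names, the shared-dictionary mechanism and the PID example are guesses for illustration, not the actual JRTF API.

      # Sketch of Application Blocks scheduled by a pipeline.
      class ApplicationBlock:
          def execute(self, shared):
              raise NotImplementedError

      class ReadSensor(ApplicationBlock):
          def execute(self, shared):
              shared["plasma_current"] = 1.2   # stand-in measurement (MA)

      class PidControl(ApplicationBlock):
          def __init__(self, setpoint, gain):
              self.setpoint, self.gain = setpoint, gain

          def execute(self, shared):
              error = self.setpoint - shared["plasma_current"]
              shared["actuator"] = self.gain * error

      def run_cycle(blocks, shared):
          """One control cycle: the pipeline, not the ABs, owns sequencing."""
          for block in blocks:
              block.execute(shared)

      state = {}
      run_cycle([ReadSensor(), PidControl(setpoint=1.0, gain=0.5)], state)
      print(state)   # an AB can be unit-tested with nothing but a plain dict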

  16. Tools and Approaches for the Construction of Knowledge Models from the Neuroscientific Literature

    PubMed Central

    Burns, Gully A. P. C.; Khan, Arshad M.; Ghandeharizadeh, Shahram; O’Neill, Mark A.; Chen, Yi-Shin

    2015-01-01

    Within this paper, we describe a neuroinformatics project (called “NeuroScholar,” http://www.neuroscholar.org/) that enables researchers to examine, manage, manipulate, and use the information contained within the published neuroscientific literature. The project is built within a multi-level, multi-component framework constructed with the use of software engineering methods that themselves provide code-building functionality for neuroinformaticians. We describe the different software layers of the system. First, we present a hypothetical usage scenario illustrating how NeuroScholar permits users to address large-scale questions in a way that would otherwise be impossible. We do this by applying NeuroScholar to a “real-world” neuroscience question: How is stress-related information processed in the brain? We then explain how the overall design of NeuroScholar enables the system to work and illustrate different components of the user interface. We then describe the knowledge management strategy we use to store interpretations. Finally, we describe the software engineering framework we have devised (called the “View-Primitive-Data Model framework,” [VPDMf]) to provide an open-source, accelerated software development environment for the project. We believe that NeuroScholar will be useful to experimental neuroscientists by helping them interact with the primary neuroscientific literature in a meaningful way, and to neuroinformaticians by providing them with useful, affordable software engineering tools. PMID:15055395
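
    One way to picture the knowledge management strategy for storing interpretations is a record that ties a curated claim back to its source; the sketch below is a purely hypothetical data shape, not NeuroScholar's or the VPDMf's actual model.

      # Hypothetical record linking a curated claim to its literature source.
      from dataclasses import dataclass, field

      @dataclass
      class Interpretation:
          claim: str               # assertion extracted by a curator
          citation: str            # identifier of the source article
          annotations: list = field(default_factory=list)

      record = Interpretation(
          claim="PVN neurons respond to restraint stress",
          citation="doi:10.0000/example",   # placeholder identifier
      )
      record.annotations.append("curated from a figure in the source")
      print(record)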

  17. Verification of Java Programs using Symbolic Execution and Invariant Generation

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
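
    To make the symbolic-execution idea concrete, the toy interpreter below runs a program over a symbolic input, forking at the branch and accumulating a path condition on each side; it is a didactic sketch, not Java PathFinder or the authors' invariant-generation machinery.

      # Toy symbolic executor: fork on branches, collect path conditions.
      def execute(stmts, state, path_cond):
          """Interpret statements over a symbolic state.

          A statement is ("assign", var, expr) or
          ("if", cond, then_stmts, else_stmts); expressions are strings
          over symbolic inputs, substituted textually for simplicity.
          """
          if not stmts:
              yield state, path_cond
              return
          head, *rest = stmts
          if head[0] == "assign":
              _, var, expr = head
              new_state = dict(state, **{var: subst(expr, state)})
              yield from execute(rest, new_state, path_cond)
          elif head[0] == "if":
              _, cond, then_s, else_s = head
              c = subst(cond, state)
              yield from execute(then_s + rest, state, path_cond + [c])
              yield from execute(else_s + rest, state, path_cond + [f"not ({c})"])

      def subst(expr, state):
          for var, val in state.items():
              expr = expr.replace(var, f"({val})")
          return expr

      # abs-like program: if x < 0: y = 0 - x else: y = x
      prog = [("if", "x < 0", [("assign", "y", "0 - x")],
                              [("assign", "y", "x")])]
      for final, pc in execute(prog, {"x": "X"}, []):
          print("path:", pc, "  y =", final["y"])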

  18. 42 CFR 423.509 - Termination of contract by CMS.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

...: (1) Termination of contract by CMS. (i) CMS notifies the Part D plan in writing 90 days before the... difficulties so severe that its ability to make necessary health services available is impaired to the point of... writing that its contract will be terminated on a date specified by CMS. If a termination is effective...

  19. 42 CFR 423.509 - Termination of contract by CMS.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

...: (1) Termination of contract by CMS. (i) CMS notifies the Part D plan in writing 90 days before the... difficulties so severe that its ability to make necessary health services available is impaired to the point of... writing that its contract will be terminated on a date specified by CMS. If a termination is effective...

  20. 42 CFR 423.758 - Collection of civil money penalties imposed by CMS.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 3 2014-10-01 2014-10-01 false Collection of civil money penalties imposed by CMS... BENEFIT Intermediate Sanctions § 423.758 Collection of civil money penalties imposed by CMS. (a) When a Part D plan sponsor does not request a hearing CMS initiates collection of the civil money penalty...
