Sample records for computation package cms

  1. Fast emulation of track reconstruction in the CMS simulation

    NASA Astrophysics Data System (ADS)

    Komm, Matthias; CMS Collaboration

    2017-10-01

    Simulated samples of various physics processes are a key ingredient of analyses seeking to unlock the physics behind LHC collision data. Samples with ever-larger statistics are required to keep up with the increasing amounts of recorded data. During sample generation, significant computing time is spent on the reconstruction of charged particle tracks from energy deposits, a cost that additionally scales with the pileup conditions. In CMS, the FastSimulation package is developed to provide a fast alternative to the standard simulation and reconstruction workflow. It employs various techniques to emulate track reconstruction effects in particle collision events. Several analysis groups in CMS utilize the package, in particular those requiring many samples to scan the parameter space of physics models (e.g. SUSY) or to estimate systematic uncertainties. The strategies for and recent developments in this emulation are presented, including a novel, flexible implementation of tracking emulation that retains a sufficient, tuneable accuracy.
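
    The core idea of such emulation, applying a reconstruction efficiency and a resolution smearing directly to generated particles instead of reconstructing hits, can be sketched in a few lines. The efficiency and resolution functions below are illustrative assumptions, not the CMS parametrization.

        import numpy as np

        rng = np.random.default_rng(42)

        def emulate_tracks(true_pt, true_eta, pileup=30):
            """Parametric track emulation: decide whether each true charged
            particle yields a reconstructed track (efficiency), then smear
            its pT (resolution). Functional forms and constants here are
            illustrative placeholders, not the CMS tuning."""
            # Toy efficiency: high on the plateau, degrading with |eta| and pileup.
            eff = 0.95 * np.exp(-0.01 * np.abs(true_eta)) - 0.0005 * pileup
            reconstructed = rng.random(true_pt.size) < eff
            # Toy relative pT resolution: constant term plus a term rising with pT.
            sigma_rel = 0.01 + 0.0001 * true_pt
            smeared_pt = true_pt * rng.normal(1.0, sigma_rel)
            return smeared_pt[reconstructed], true_eta[reconstructed]

        pt = rng.uniform(1.0, 100.0, 10000)   # GeV
        eta = rng.uniform(-2.5, 2.5, 10000)
        rec_pt, rec_eta = emulate_tracks(pt, eta)
        print(f"emulated efficiency: {rec_pt.size / pt.size:.3f}")

    The tuning step then consists of fitting such efficiency and resolution functions to full-simulation results, which is where the "sufficient, tuneable accuracy" trade-off lives.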

  2. Managing the CMS Data and Monte Carlo Processing during LHC Run 2

    NASA Astrophysics Data System (ADS)

    Wissing, C.; CMS Collaboration

    2017-10-01

    In order to cope with the challenges expected during LHC Run 2, CMS introduced a number of enhancements into the main software packages and the tools used for centrally managed processing. In the presentation we will highlight these improvements, which allow CMS to deal with the increased trigger output rate, the increased pileup and the evolution in computing technology. The overall system aims at high flexibility, improved operational efficiency and largely automated procedures. The tight coupling of workflow classes to types of sites has been drastically relaxed. Reliable and high-performing networking between most of the computing sites and the successful deployment of a data federation allow the execution of workflows using remote data access. That required the development of a largely automated system to assign workflows and to handle the necessary pre-staging of data. Another step towards flexibility has been the introduction of one large global HTCondor pool for all types of processing workflows and analysis jobs. Besides classical Grid resources, some opportunistic resources as well as Cloud resources have been integrated into that pool, which gives access to more than 200k CPU cores.

  3. A search for a heavy Majorana neutrino and a radiation damage simulation for the HF detector

    NASA Astrophysics Data System (ADS)

    Wetzel, James William

    A search for heavy Majorana neutrinos is performed using an event signature defined by two same-sign muons accompanied by two jets. This search is an extension of previous searches (L3, DELPHI, CMS, ATLAS), using 19.7 fb⁻¹ of data from the 2012 Large Hadron Collider experimental run collected by the Compact Muon Solenoid experiment. A mass window of 40-500 GeV/c² is explored. No excess of events above Standard Model backgrounds is observed, and limits are set on the mixing element squared, |VμN|², as a function of the Majorana neutrino mass. The Hadronic Forward (HF) detector's performance will degrade as a function of the number of particles delivered to the detector over time, a quantity referred to as integrated luminosity and measured in inverse femtobarns (fb⁻¹). In order to better plan detector upgrades, the CMS Forward Calorimetry Task Force (FCAL) group and the CMS Hadronic Calorimeter (HCAL) group have requested that radiation damage be simulated and the subsequent performance of the HF subdetector be studied. The simulation was implemented in both the CMS FastSim and CMS FullSim simulation packages. Standard calorimetry performance metrics were computed and are reported. The HF detector can expect to perform well through the planned delivery of 3000 fb⁻¹.

  4. Medicare's "Global" terrorism: where is the pay for performance?

    PubMed

    Reed, R Lawrence; Luchette, Fred A; Esposito, Thomas J; Pyrz, Karen; Gamelli, Richard L

    2008-02-01

    Centers for Medicare and Medicaid Services (CMS) payment policies for surgical operations are based on a global package concept. CMS' physician fee schedule splits the global package into preoperative, intraoperative, and postoperative components of each procedure. We hypothesized that these global package component valuations were often lower than comparable evaluation and management (E&M) services and that billing for E&M services instead of the operation could often be more profitable. Our billing database and Trauma Registry were queried for the operative procedures and hospital lengths of stay for trauma patients during the past 5 years. Determinations of preoperative, intraoperative, and postoperative payments were calculated for 10-day and 90-day global packages, comparing them to CMS payments for comparable E&M codes. Of 90-day and 10-day Current Procedural Terminology codes, 88% and 100%, respectively, do not pay for the comprehensive history and physical that trauma patients usually receive, whereas 41% and 98%, respectively, do not even meet payment levels for a simple history and physical. Of 90-day global package procedures, 70% would have generated more revenue had comprehensive daily visits been billed instead of the operation ($3,057,500 vs. $1,658,058). For 10-day global package procedures, 56% would have generated more revenue with merely problem-focused daily visits instead of the operation ($161,855 vs. $156,318). Medicare's global surgical package underpays E&M services in trauma patients. In most cases, trauma surgeons would fare better by not billing for operations to receive higher reimbursement for E&M services that are considered "bundled" in the global package payment.

  5. 42 CFR 422.254 - Submission of bids.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... part or may choose not to renew the contract. (4) Substantial differences between bids. An MA organization's bid submissions must reflect differences in benefit packages or plan costs that CMS determines to represent substantial differences relative to a sponsor's other bid submissions. (5) CMS may...

  6. 42 CFR 422.254 - Submission of bids.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... this part or may choose not to renew the contract. (4) Substantial differences between bids. An MA organization's bid submissions must reflect differences in benefit packages or plan costs that CMS determines to represent substantial differences relative to a sponsor's other bid submissions. (5) CMS may...

  7. 42 CFR 422.254 - Submission of bids.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... this part or may choose not to renew the contract. (4) Substantial differences between bids. An MA organization's bid submissions must reflect differences in benefit packages or plan costs that CMS determines to represent substantial differences relative to a sponsor's other bid submissions. (5) CMS may...

  8. Guide to using Cuechart, Tellagraf, and Disspla at ANL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertoncini, P.J.; Thommes, M.M.

    1986-01-01

    Guide to Using Cuechart, Tellagraf, and Disspla at ANL provides the information necessary for using the three ISSCO graphics packages at Argonne: Cuechart is a cue-and-response program available in CMS that aids users in creating bar charts, line charts, pie charts, and word charts. It is appropriate for users with little or no previous graphics experience. Cuechart provides much of the capability of Tellagraf without the user's having to learn Tellagraf commands. Tellagraf is a more powerful, easy-to-use graphics package also available in CMS. With a little training, scientists, administrators, and secretaries can produce sophisticated publication-quality log or linear plots, bar charts, pie charts, tables, or posters. Disspla is a more versatile and sophisticated graphics package. It is available in both CMS and batch and consists of several hundred Fortran-callable and PL/I-callable subroutines that enable the user to obtain professional-quality plots. In addition to log or linear plots, bar charts, pie charts, and pages of text, Disspla provides subroutines for contour plots, 3-D plots, and world maps.

  9. Enabling opportunistic resources for CMS Computing Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hufnagel, Dirk

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize "opportunistic" resources, i.e. resources not owned by, or a priori configured for, CMS, to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Finally, we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  10. Enabling opportunistic resources for CMS Computing Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hufnagel, Dirk

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources — resources not owned by, or a priori configured for CMS — to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  11. Enabling opportunistic resources for CMS Computing Operations

    DOE PAGES

    Hufnagel, Dirk

    2015-12-23

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize "opportunistic" resources, i.e. resources not owned by, or a priori configured for, CMS, to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Finally, we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  12. CMS results in the Combined Computing Readiness Challenge CCRC'08

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Bauerdick, L.; CMS Collaboration

    2009-12-01

    During February and May 2008, CMS participated in the Combined Computing Readiness Challenge (CCRC'08) together with all other LHC experiments. The purpose of this worldwide exercise was to check the readiness of the computing infrastructure for LHC data taking. Another set of major CMS tests, the Computing, Software and Analysis challenge (CSA'08), as well as CMS cosmic runs, was running at the same time: CCRC augmented the load on computing with additional tests to validate and stress-test all CMS computing workflows at full data-taking scale, also extending this to the global WLCG community. CMS exercised most aspects of the CMS computing model, with very comprehensive tests. During May 2008, CMS moved more than 3.6 Petabytes among more than 300 links in the complex Grid topology. CMS demonstrated that it is able to safely move data out of CERN to the Tier-1 sites, sustaining more than 600 MB/s as a daily average for more than seven days in a row, with enough headroom and with hourly peaks of up to 1.7 GB/s. CMS ran hundreds of simultaneous jobs at each Tier-1 site, re-reconstructing and skimming hundreds of millions of events. After re-reconstruction the fresh AOD (Analysis Object Data) has to be synchronized between Tier-1 centers: CMS demonstrated that the required inter-Tier-1 transfers are achievable within a few days. CMS also showed that skimmed analysis data sets can be transferred to Tier-2 sites for analysis at a sufficient rate, regionally as well as inter-regionally, achieving all goals on about 90% of the >200 links. Simultaneously, CMS ran a large Tier-2 analysis exercise, in which realistic analysis jobs were submitted to a large set of Tier-2 sites by a large number of people to produce a chaotic workload across the systems, with more than 400 analysis users in May. Taken all together, CMS routinely achieved submissions of 100k jobs/day, with peaks up to 200k jobs/day. The results achieved in CCRC'08, focusing on the distributed workflows, are presented and discussed.
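
    As a back-of-the-envelope plausibility check of the quoted export figures (the arithmetic below is ours, not from the paper):

        # 600 MB/s sustained as a daily average, for seven days in a row.
        daily_avg_rate = 600e6          # bytes per second
        seconds_per_day = 86400
        tb_per_day = daily_avg_rate * seconds_per_day / 1e12
        print(f"{tb_per_day:.1f} TB/day")                        # ~51.8 TB/day out of CERN
        print(f"{7 * tb_per_day / 1000:.2f} PB over 7 days")     # ~0.36 PB

    That sustained CERN export alone accounts for roughly a tenth of the 3.6 PB moved across all links during May, which is consistent with the bulk of the volume flowing on the many inter-Tier links.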

  13. Exploiting analytics techniques in CMS computing monitoring

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Kuznetsov, V.; Magini, N.; Repečka, A.; Vaandering, E.

    2017-10-01

    The CMS experiment has collected an enormous volume of metadata about its computing operations in its monitoring systems, describing its experience in operating all of the CMS workflows on all of the Worldwide LHC Computing Grid Tiers. Data mining of all this information has rarely been attempted, but is of crucial importance for a better understanding of how CMS achieved successful operations, and for reaching an adequate and adaptive modelling of the CMS operations that allows detailed optimizations and eventually a prediction of system behaviours. These data are now streamed into the CERN Hadoop data cluster for further analysis. Specific sets of information (e.g. data on how many replicas of datasets CMS wrote on disks at WLCG Tiers, data on which datasets were primarily requested for analysis, etc.) were collected on Hadoop and processed with MapReduce applications profiting from the parallelization on the Hadoop cluster. We present the implementation of new monitoring applications on Hadoop, and discuss the new possibilities in CMS computing monitoring introduced with the ability to quickly process big data sets from multiple sources, looking forward to a predictive modeling of the system.
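
    The replica-counting aggregation mentioned above is a natural MapReduce fit. A minimal pure-Python sketch of the map and reduce phases; the record schema is an invented illustration, not the actual CMS monitoring format.

        from collections import defaultdict

        # Toy records of dataset replicas at WLCG tiers (invented schema).
        records = [
            {"dataset": "/A/RunX/AOD", "site": "T1_DE_KIT"},
            {"dataset": "/A/RunX/AOD", "site": "T2_US_MIT"},
            {"dataset": "/B/RunY/MINIAOD", "site": "T2_IT_Bari"},
        ]

        def map_phase(record):
            # Emit one (key, 1) pair per replica of a dataset.
            yield record["dataset"], 1

        def reduce_phase(pairs):
            # Sum the counts per dataset, as a MapReduce reducer would.
            counts = defaultdict(int)
            for key, value in pairs:
                counts[key] += value
            return dict(counts)

        pairs = (pair for rec in records for pair in map_phase(rec))
        print(reduce_phase(pairs))   # {'/A/RunX/AOD': 2, '/B/RunY/MINIAOD': 1}

    On the Hadoop cluster the same map and reduce logic runs in parallel over file splits, which is what makes the full metadata volume tractable.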

  14. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    NASA Astrophysics Data System (ADS)

    Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.

    2011-12-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and we conclude that it is most cost-effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during usage spikes.
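
    The break-even reasoning behind that conclusion can be made concrete with a toy comparison. All prices and utilizations below are invented for illustration and do not reproduce the paper's 2011 figures.

        # Illustrative break-even between dedicated and cloud capacity.
        dedicated_cost_per_core_year = 150.0   # hardware + ops, amortized (assumed)
        cloud_cost_per_core_hour = 0.05        # on-demand price (assumed)
        hours_per_year = 8760

        for utilization in (1.0, 0.5, 0.1):
            cloud_cost = cloud_cost_per_core_hour * hours_per_year * utilization
            print(f"utilization {utilization:4.0%}: cloud {cloud_cost:7.2f} "
                  f"vs dedicated {dedicated_cost_per_core_year:7.2f} per core-year")
        # At high duty cycle the dedicated hardware wins; for short bursts the cloud does.

    The crossover point depends on the duty cycle: continuously busy "base-line" capacity favours owned hardware, while rarely used burst capacity favours pay-per-use clouds.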

  15. Bringing the CMS distributed computing system into scalable operations

    NASA Astrophysics Data System (ADS)

    Belforte, S.; Fanfani, A.; Fisk, I.; Flix, J.; Hernández, J. M.; Kress, T.; Letts, J.; Magini, N.; Miccio, V.; Sciabà, A.

    2010-04-01

    Establishing efficient and scalable operations of the CMS distributed computing system critically relies on the proper integration, commissioning and scale testing of the data and workload management tools, the various computing workflows and the underlying computing infrastructure, located at more than 50 computing centres worldwide and interconnected by the Worldwide LHC Computing Grid. Computing challenges periodically undertaken by CMS in the past years with increasing scale and complexity have revealed the need for a sustained effort on computing integration and commissioning activities. The Processing and Data Access (PADA) Task Force was established at the beginning of 2008 within the CMS Computing Program with the mandate of: validating the infrastructure for organized processing and user analysis, including the sites and the workload and data management tools; validating the distributed production system by performing functionality, reliability and scale tests; helping sites to commission, configure and optimize the networking and storage through scale testing of data transfers and data processing; and improving the efficiency of accessing data across the CMS computing system, from global transfers to local access. This contribution reports on the tools and procedures developed by CMS for computing commissioning and scale testing, as well as the improvements accomplished towards efficient, reliable and scalable computing operations. The activities include the development and operation of load generators for job submission and data transfers with the aim of stressing the experiment and Grid data management and workload management systems, site commissioning procedures and tools to monitor and improve site availability and reliability, as well as activities targeted to the commissioning of the distributed production, user analysis and monitoring systems.

  16. CMS Connect

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B.; Gardner, R., Jr.; Hurtado Anampa, K.; Jayatilaka, B.; Aftab Khan, F.; Lannon, K.; Larson, K.; Letts, J.; Marra Da Silva, J.; Mascheroni, M.; Mason, D.; Perez-Calero Yzquierdo, A.; Tiradani, A.

    2017-10-01

    The CMS experiment collects and analyzes large amounts of data coming from high energy particle collisions produced by the Large Hadron Collider (LHC) at CERN. This involves a huge amount of real and simulated data processing that needs to be handled in batch-oriented platforms. The CMS Global Pool of computing resources provides more than 100k dedicated CPU cores and another 50k to 100k CPU cores from opportunistic resources for these kinds of tasks, and even though production and event processing analysis workflows are already managed by existing tools, there is still a lack of support for submitting the final-stage, Condor-like analysis jobs familiar to users of Tier-3 or local computing facilities into these distributed resources in a way that is integrated with other CMS services and friendly to users. CMS Connect is a set of computing tools and services designed to augment existing services in the CMS physics community, focusing on this kind of Condor analysis job. It is based on the CI-Connect platform developed by the Open Science Grid and uses the CMS GlideInWMS infrastructure to transparently plug CMS global grid resources into a virtual pool accessed via a single submission machine. This paper describes the specific developments and deployment of CMS Connect beyond the CI-Connect platform to integrate the service with CMS-specific needs, including site-specific submission, accounting of jobs, and automated reporting to standard CMS monitoring resources in an effortless way for its users.
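
    As an illustration of the Condor-style submission such a service front-ends, here is a hedged sketch using the HTCondor Python bindings. The executable, file names and resource requests are invented, and the exact submission call varies across binding versions (this uses the API of bindings 8.9 and later).

        import htcondor  # HTCondor Python bindings (pip install htcondor)

        # Describe a vanilla-universe analysis job, as a user would on a
        # submission host. All names below are illustrative assumptions.
        sub = htcondor.Submit({
            "executable": "run_analysis.sh",
            "arguments": "$(ProcId)",
            "output": "job.$(ProcId).out",
            "error": "job.$(ProcId).err",
            "log": "job.log",
            "request_cpus": "1",
            "request_memory": "2GB",
        })

        schedd = htcondor.Schedd()             # local schedd on the submit node
        result = schedd.submit(sub, count=10)  # queue 10 jobs
        print("submitted cluster", result.cluster())

    A service like CMS Connect keeps this user-facing interface while routing the jobs into the global glideinWMS pool behind the scenes.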

  17. Exploiting Analytics Techniques in CMS Computing Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonacorsi, D.; Kuznetsov, V.; Magini, N.

    The CMS experiment has collected an enormous volume of metadata about its computing operations in its monitoring systems, describing its experience in operating all of the CMS workflows on all of the Worldwide LHC Computing Grid Tiers. Data mining of all this information has rarely been attempted, but is of crucial importance for a better understanding of how CMS achieved successful operations, and for reaching an adequate and adaptive modelling of the CMS operations that allows detailed optimizations and eventually a prediction of system behaviours. These data are now streamed into the CERN Hadoop data cluster for further analysis. Specific sets of information (e.g. data on how many replicas of datasets CMS wrote on disks at WLCG Tiers, data on which datasets were primarily requested for analysis, etc.) were collected on Hadoop and processed with MapReduce applications profiting from the parallelization on the Hadoop cluster. We present the implementation of new monitoring applications on Hadoop, and discuss the new possibilities in CMS computing monitoring introduced with the ability to quickly process big data sets from multiple sources, looking forward to a predictive modeling of the system.

  18. CMS Distributed Computing Integration in the LHC sustained operations era

    NASA Astrophysics Data System (ADS)

    Grandi, C.; Bockelman, B.; Bonacorsi, D.; Fisk, I.; González Caballero, I.; Farina, F.; Hernández, J. M.; Padhi, S.; Sarkar, S.; Sciabà, A.; Sfiligoi, I.; Spiga, F.; Úbeda García, M.; Van Der Ster, D. C.; Zvada, M.

    2011-12-01

    After many years of preparation the CMS computing system has reached a situation where stability in operations limits the possibility to introduce innovative features. Nevertheless, it is this same need for stability and smooth operations that requires the introduction of features that were not considered strategic in the previous phases. Examples are: adequate authorization to control and prioritize the access to storage and computing resources; improved monitoring to investigate problems and identify bottlenecks on the infrastructure; increased automation to reduce the manpower needed for operations; an effective process to deploy new releases of the software tools in production. We present the work of the CMS Distributed Computing Integration Activity, which is responsible for providing a liaison between the CMS distributed computing infrastructure and the software providers, both internal and external to CMS. In particular we describe the introduction of new middleware features during the last 18 months as well as the requirements put to Grid and Cloud software developers for the future.

  19. Monitoring techniques and alarm procedures for CMS services and sites in WLCG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molina-Perez, J.; Bonacorsi, D.; Gutsche, O.

    2012-01-01

    The CMS offline computing system is composed of roughly 80 sites (including most experienced T3s) and a number of central services to distribute, process and analyze data worldwide. A high level of stability and reliability is required from the underlying infrastructure and services, partially covered by local or automated monitoring and alarming systems such as Lemon and SLS: the former collects metrics from sensors installed on computing nodes and triggers alarms when values are out of range; the latter measures the quality of service and warns managers when service is affected. CMS has established computing shift procedures with personnel operating worldwide from remote computing centers, under the supervision of the Computing Run Coordinator at CERN. These dedicated 24/7 computing shifters help to detect and react promptly to any unexpected error, and hence ensure that CMS workflows are carried out efficiently and in a sustained manner. Synergy among all the involved actors is exploited to ensure the 24/7 monitoring, alarming and troubleshooting of the CMS computing sites and services. We review the deployment of the monitoring and alarming procedures, and report on the experience gained throughout the first two years of LHC operation. We describe the efficiency of the communication tools employed, the coherent monitoring framework, the proactive alarming systems and the proficient troubleshooting procedures that helped the CMS computing facilities and infrastructure to operate at high reliability levels.

  20. 75 FR 30839 - Privacy Act of 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-02

    ... of the Matching Program A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L.... 100-503, the Computer Matching and Privacy Protection Act (CMPPA) of 1988), the Office of Management... 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer Match No. 1048, IRS...

  1. The diverse use of clouds by CMS

    DOE PAGES

    Andronis, Anastasios; Bauer, Daniela; Chaze, Olivier; ...

    2015-12-23

    The resources CMS is using are increasingly being offered as clouds. In Run 2 of the LHC the majority of CMS CERN resources, both in Meyrin and at the Wigner Computing Centre, will be presented as cloud resources on which CMS will have to build its own infrastructure. This infrastructure will need to run all of the CMS workflows, including: Tier 0, production and user analysis. In addition, the CMS High Level Trigger will provide a compute resource comparable in scale to the total offered by the CMS Tier 1 sites, when it is not running as part of the trigger system. During these periods a cloud infrastructure will be overlaid on this resource, making it accessible for general CMS use. CMS is also starting to utilise cloud resources offered by individual institutes and is gaining experience to facilitate the use of opportunistically available cloud resources. Finally, we present a snapshot of this infrastructure and its operation at the time of the CHEP2015 conference.

  2. 78 FR 73195 - Privacy Act of 1974: CMS Computer Matching Program Match No. 2013-01; HHS Computer Matching...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-05

    .... Description of the Matching Program A. General The Computer Matching and Privacy Protection Act of 1988 (Pub... 1974: CMS Computer Matching Program Match No. 2013-01; HHS Computer Matching Program Match No. 1312...). ACTION: Notice of Computer Matching Program (CMP). SUMMARY: In accordance with the requirements of the...

  3. Software Description for the O’Hare Runway Configuration Management System. Volume I. Technical Description,

    DTIC Science & Technology

    1982-10-01

    EXECUTIVE SUMMARY: The O'Hare Runway Configuration Management System (CMS) is an interactive multi-user computer system ... MITRE Washington's Computer Center. Currently, CMS is housed in an IBM 4341 computer with the VM/SP operating system. CMS employs IBM's Display ... At O'Hare, it will operate on a dedicated minicomputer which permits multi-tasking (that is, multiple users...

  4. Implementation of NASTRAN on the IBM/370 CMS operating system

    NASA Technical Reports Server (NTRS)

    Britten, S. S.; Schumacker, B.

    1980-01-01

    The NASA Structural Analysis (NASTRAN) computer program is operational on the IBM 360/370 series computers. While execution of NASTRAN has been described and implemented under the virtual storage operating systems of the IBM 370 models, the IBM 370/168 computer can also operate in a time-sharing mode under the virtual machine operating system using the Conversational Monitor System (CMS) subset. The changes required to make NASTRAN operational under the CMS operating system are described.

  5. 42 CFR 414.68 - Imaging accreditation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) Computed tomography. (iii) Nuclear medicine. (iv) Positron emission tomography. CMS-approved accreditation... if CMS takes an adverse action based on accreditation findings. (vi) Notify CMS, in writing... organization must permit its surveyors to serve as witnesses if CMS takes an adverse action based on...

  6. 42 CFR 414.68 - Imaging accreditation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) Computed tomography. (iii) Nuclear medicine. (iv) Positron emission tomography. CMS-approved accreditation... if CMS takes an adverse action based on accreditation findings. (vi) Notify CMS, in writing... organization must permit its surveyors to serve as witnesses if CMS takes an adverse action based on...

  7. 42 CFR 414.68 - Imaging accreditation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) Computed tomography. (iii) Nuclear medicine. (iv) Positron emission tomography. CMS-approved accreditation... if CMS takes an adverse action based on accreditation findings. (vi) Notify CMS, in writing... organization must permit its surveyors to serve as witnesses if CMS takes an adverse action based on...

  8. [Personal computer-based computer monitoring system of the anesthesiologist (2-year experience in development and use)].

    PubMed

    Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I

    1995-01-01

    Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and effectively used it for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by real-time processing of the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patient safety during surgery. The possibility to assess brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.
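
    The real-time checking such a monitor performs on each sample can be sketched generically. The mean arterial pressure estimate below is the standard textbook formula, but the thresholds and code are purely illustrative, not clinical guidance or the authors' implementation.

        def mean_arterial_pressure(systolic, diastolic):
            """Standard estimate: MAP ~ DBP + (SBP - DBP) / 3, in mmHg."""
            return diastolic + (systolic - diastolic) / 3.0

        def check_alarms(sample, low=65.0, high=110.0):
            """Flag out-of-range mean arterial pressure on one sample.
            Thresholds are invented for illustration."""
            map_value = mean_arterial_pressure(sample["sbp"], sample["dbp"])
            if not low <= map_value <= high:
                return f"ALARM: MAP {map_value:.0f} mmHg outside [{low}, {high}]"
            return f"ok: MAP {map_value:.0f} mmHg"

        print(check_alarms({"sbp": 120, "dbp": 80}))   # ok: MAP 93 mmHg
        print(check_alarms({"sbp": 85, "dbp": 50}))    # ALARM: MAP 62 mmHg ...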

  9. SiteDB: Marshalling people and resources available to CMS

    NASA Astrophysics Data System (ADS)

    Metson, S.; Bonacorsi, D.; Dias Ferreira, M.; Egeland, R.

    2010-04-01

    In a collaboration the size of CMS (approx. 3000 users and almost 100 computing centres of varying size), communication and accurate information about the sites it has access to are vital in coordinating the multitude of computing tasks required for smooth running. SiteDB is a tool developed by CMS to track sites available to the collaboration, the allocation to CMS of resources available at those sites, and the associations between CMS members and the sites (as either a manager/operator of the site or a member of a group associated with the site). It is used to track the roles a person has for an associated site or group. SiteDB eases the coordination load for the operations teams by providing a consistent interface to manage communication with the people working at a site, by identifying who is responsible for a given task or service at a site, and by offering a uniform interface to information on CMS contacts and sites. SiteDB provides APIs and reports for other CMS tools to use to access the information it contains, for instance enabling CRAB to use "user friendly" names when black/white listing CEs, providing role-based authentication and authorisation for other web-based services, and populating various troubleshooting squads in the external ticketing systems in use daily by CMS Computing operations.

  10. CMS Centres Worldwide - a New Collaborative Infrastructure

    NASA Astrophysics Data System (ADS)

    Taylor, Lucas

    2011-12-01

    The CMS Experiment at the LHC has established a network of more than fifty inter-connected "CMS Centres" at CERN and in institutes in the Americas, Asia, Australasia, and Europe. These facilities are used by people doing CMS detector and computing grid operations, remote shifts, data quality monitoring and analysis, as well as education and outreach. We present the computing, software, and collaborative tools and videoconferencing systems. These include permanently running "telepresence" video links (hardware-based H.323, EVO and Vidyo), Webcasts, and generic Web tools such as CMS-TV for broadcasting live monitoring and outreach information. Being Web-based and experiment-independent, these systems could easily be extended to other organizations. We describe the experiences of using CMS Centres Worldwide in the CMS data-taking operations as well as for major media events with several hundred TV channels, radio stations, and many more press journalists simultaneously around the world.

  11. 76 FR 14669 - Privacy Act of 1974; CMS Computer Match No. 2011-02; HHS Computer Match No. 1007

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-17

    ... (CMS); and Department of Defense (DoD), Manpower Data Center (DMDC), Defense Enrollment and Eligibility... the results of the computer match and provide the information to TMA for use in its matching program... under TRICARE. DEERS will receive the results of the computer match and provide the information provided...

  12. Grid site availability evaluation and monitoring at CMS

    DOE PAGES

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; ...

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  13. Grid site availability evaluation and monitoring at CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  14. Grid site availability evaluation and monitoring at CMS

    NASA Astrophysics Data System (ADS)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; Lammel, Stephan; Sciabà, Andrea

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.
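
    At its core, the evaluation logic described in these records reduces to aggregating pass/fail test results per site against a threshold. A toy sketch; the site names, test names and the 90% threshold are invented for illustration.

        from collections import defaultdict

        # (site, test, passed) tuples, as a round of availability tests
        # might produce. All values are invented.
        results = [
            ("T1_DE_KIT", "storage", True), ("T1_DE_KIT", "cpu", True),
            ("T2_US_MIT", "storage", False), ("T2_US_MIT", "cpu", True),
        ]

        passed = defaultdict(int)
        total = defaultdict(int)
        for site, _test, ok in results:
            total[site] += 1
            passed[site] += ok

        for site in sorted(total):
            availability = passed[site] / total[site]
            status = "ok" if availability >= 0.9 else "degraded"
            print(f"{site}: {availability:.0%} {status}")

    The real system layers on top of this scheduling, weighting of test types, and the site/central-service disambiguation mentioned above.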

  15. Scaling up a CMS tier-3 site with campus resources and a 100 Gb/s network connection: what could go wrong?

    NASA Astrophysics Data System (ADS)

    Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Tovar, Benjamin; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2017-10-01

    The University of Notre Dame (ND) CMS group operates a modest-sized Tier-3 site suitable for local, final-stage analysis of CMS data. However, through the ND Center for Research Computing (CRC), Notre Dame researchers have opportunistic access to roughly 25k CPU cores of computing and a 100 Gb/s WAN network link. To understand the limits of what might be possible in this scenario, we undertook to use these resources for a wide range of CMS computing tasks, from user analysis through large-scale Monte Carlo production (including both detector simulation and data reconstruction). We will discuss the challenges inherent in effectively utilizing CRC resources for these tasks and the solutions deployed to overcome them.

  16. Exploiting volatile opportunistic computing resources with Lobster

    NASA Astrophysics Data System (ADS)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS-specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools has been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.
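
    A minimal master in the spirit of that design might look as follows, using the cctools Work Queue Python bindings that Lobster builds on. The commands and file names are placeholders, and this sketch omits Lobster's CVMFS/Parrot environment handling.

        from work_queue import WorkQueue, Task  # cctools Python bindings

        # A master that dispatches tasks to whatever workers attach.
        q = WorkQueue(port=9123)

        for i in range(100):
            t = Task("./analyze input.%d.root > output.%d.txt" % (i, i))
            t.specify_input_file("analyze")            # ship the executable
            t.specify_output_file("output.%d.txt" % i) # fetch the result back
            q.submit(t)

        while not q.empty():
            t = q.wait(5)                              # poll for finished tasks
            if t:
                print("task", t.id, "returned", t.return_status)

    Workers are then started on any borrowed node (no root access needed) and connect back to the master's port, which is what makes campus pools harvestable.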

  17. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as the central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document-oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.

  18. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE PAGES

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    2018-03-19

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as the central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document-oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.
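
    The document-oriented storage plus aggregation pipeline can be sketched with a MongoDB-style pipeline via pymongo. The database, collection and field names below are assumptions for illustration, not the actual WMArchive schema.

        from pymongo import MongoClient

        # Aggregate framework-job-report documents per task (invented schema).
        client = MongoClient("mongodb://localhost:27017")
        fwjr = client["wmarchive"]["fwjr"]

        pipeline = [
            {"$match": {"meta_data.jobstate": "success"}},   # keep good jobs
            {"$group": {
                "_id": "$task",                              # one row per task
                "jobs": {"$sum": 1},
                "avg_cpu_s": {"$avg": "$performance.cpu_total"},
            }},
            {"$sort": {"jobs": -1}},
            {"$limit": 10},
        ]
        for doc in fwjr.aggregate(pipeline):
            print(doc["_id"], doc["jobs"], doc["avg_cpu_s"])

    In the production system the short-term store absorbs the raw document stream while the long-term Hadoop layer serves exactly this kind of aggregation at O(1M)-documents-per-day scale.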

  19. Prioritizing Strategic Interests in South Asia

    DTIC Science & Technology

    2010-06-01

    ... rolled out “Aghaz-e-Haqooq Balochistan” ... by far the most serious fallout from the conflict in Afghanistan is the increasing radicalization of ... Foreign Policy, August 2006, available at <www.foreignpolicy.com/story/cms.php?story_id=3578>. “Aghaz-e-Haqooq Balochistan Package,” Dawn.com, November 16, 2009, available at <www.dawn.com/wps/wcm/connect/dawn-content-library/dawn/news/pakistan/13+aghaz-e-haqooq+balochistan+package-za-05>

  20. Cost-Benefit Analysis for ECIA Chapter 1 and State DPPF Programs Comparing Groups Receiving Regular Program Instruction and Groups Receiving Computer Assisted Instruction/Computer Management System (CAI/CMS). 1986-87.

    ERIC Educational Resources Information Center

    Chamberlain, Ed

    A cost benefit study was conducted to determine the effectiveness of a computer assisted instruction/computer management system (CAI/CMS) as an alternative to conventional methods of teaching reading within Chapter 1 and DPPF funded programs of the Columbus (Ohio) Public Schools. The Chapter 1 funded Compensatory Language Experiences and Reading…

  1. Dosimetric investigation of LDR brachytherapy ¹⁹²Ir wires by Monte Carlo and TPS calculations.

    PubMed

    Bozkurt, Ahmet; Acun, Hediye; Kemikler, Gonul

    2013-01-01

    The aim of this study was to investigate the dose rate distribution around ¹⁹²Ir wires used as radioactive sources in low-dose-rate brachytherapy applications. Monte Carlo modeling of a 0.3-mm diameter source and its surrounding water medium was performed for five different wire lengths (1-5 cm) using the MCNP software package. The computed dose rates per unit air kerma at distances from 0.1 up to 10 cm away from the source were first verified against literature data sets. Then, the simulation results were compared with the calculations from the CMS XiO commercial treatment planning system. The study results were found to be in concordance with the treatment planning system calculations except for the shorter wires at close distances.
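
    The geometry factor at play, how strongly the close-distance dose depends on wire length, can be illustrated by approximating the wire as point segments with inverse-square falloff only, ignoring attenuation and scatter. This is purely illustrative and is not a substitute for MCNP or TPS dosimetry.

        import numpy as np

        def line_source_relative_dose(wire_length_cm, r_cm, segments=1000):
            """Relative dose rate on the transverse bisector of a wire modeled
            as point segments with 1/r^2 falloff only (no attenuation or
            scatter). Illustrates geometry, not clinical dosimetry."""
            z = np.linspace(-wire_length_cm / 2, wire_length_cm / 2, segments)
            d2 = r_cm**2 + z**2              # squared distance to each segment
            return np.mean(1.0 / d2)         # per unit total source strength

        for length in (1, 5):                # cm: the study's shortest and longest
            wire = line_source_relative_dose(length, r_cm=0.5)
            point = 1.0 / 0.5**2             # point source of equal strength
            print(f"{length} cm wire at 0.5 cm: {wire / point:.2f} of point value")

    Even this crude model shows the near-field dose departing strongly from simple point-source scaling, which is the regime where TPS look-up tables and Monte Carlo are most likely to disagree.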

  2. A transient response analysis of the space shuttle vehicle during liftoff

    NASA Technical Reports Server (NTRS)

    Brunty, J. A.

    1990-01-01

    A proposed transient response method is formulated for the liftoff analysis of the space shuttle vehicles. It uses a power series approximation with unknown coefficients for the interface forces between the space shuttle and the mobile launch platform. This allows the equations of motion of the two structures to be solved separately, with the unknown coefficients determined at the end of each step. These coefficients are obtained by enforcing the interface compatibility conditions between the two structures. Once the unknown coefficients are determined, the total response is computed for that time step. The method is validated by a numerical example of a cantilevered beam and by the liftoff analysis of the space shuttle vehicles. The proposed method is compared to an iterative transient response analysis method used by Martin Marietta for their space shuttle liftoff analysis. It is shown that the proposed method uses less computer time than the iterative method and does not require as small a time step for integration. The space shuttle vehicle model is reduced using two different types of component mode synthesis (CMS) methods, the Lanczos method and the Craig and Bampton CMS method. By varying the cutoff frequency in the Craig and Bampton method it was shown that the space shuttle interface loads can be computed with reasonable accuracy. Both the Lanczos CMS method and the Craig and Bampton CMS method give similar results. A substantial amount of computer time is saved using the Lanczos CMS method over the Craig and Bampton method. However, when a large number of Lanczos vectors was computed, input/output time grew and increased the overall computer time. The application of several liftoff release mechanisms that can be adapted to the proposed method is discussed.
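
    For reference, the Craig and Bampton reduction named above combines static constraint modes for the boundary (interface) DOFs with a truncated set of fixed-interface normal modes for the interior. A minimal numpy/scipy sketch, assuming symmetric stiffness and mass matrices K and M are already assembled:

        import numpy as np
        from scipy.linalg import eigh

        def craig_bampton(K, M, boundary, n_modes):
            """Craig-Bampton reduction of (K, M) onto boundary DOFs plus
            n_modes fixed-interface normal modes of the interior."""
            boundary = np.asarray(boundary)
            interior = np.setdiff1d(np.arange(K.shape[0]), boundary)
            Kib = K[np.ix_(interior, boundary)]
            Kii = K[np.ix_(interior, interior)]
            Mii = M[np.ix_(interior, interior)]

            psi = -np.linalg.solve(Kii, Kib)   # static constraint modes
            w2, phi = eigh(Kii, Mii)           # fixed-interface modes, ascending
            phi = phi[:, :n_modes]             # truncation = the cutoff knob

            n_b = boundary.size
            T = np.zeros((K.shape[0], n_b + n_modes))
            T[boundary, :n_b] = np.eye(n_b)    # boundary DOFs kept physically
            T[np.ix_(interior, np.arange(n_b))] = psi
            T[np.ix_(interior, np.arange(n_b, n_b + n_modes))] = phi
            return T.T @ K @ T, T.T @ M @ T    # reduced stiffness and mass

    Varying n_modes (equivalently, the frequency below which fixed-interface modes are retained) plays the role of the cutoff-frequency sweep described in the abstract.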

  3. Challenging data and workload management in CMS Computing with network-aware systems

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Wildish, T.

    2014-06-01

    After a successful first run at the LHC, and during the Long Shutdown (LS1) of the accelerator, the workload and data management sectors of the CMS Computing Model are entering an operational review phase in order to concretely assess areas of possible improvement and paths to exploit new promising technology trends. In particular, since the preparation activities for the LHC start, the networks have constantly been of paramount importance for the execution of CMS workflows, exceeding the original expectations (as from the MONARC model) in terms of performance, stability and reliability. The low-latency transfer of petabytes of CMS data among dozens of WLCG Tiers worldwide using the PhEDEx dataset replication system is an example of the importance of reliable networks. Another example is the exploitation of WAN data access over data federations in CMS. A new emerging area of work is the exploitation of Intelligent Network Services, including bandwidth-on-demand concepts. In this paper, we review the work done in CMS on this, and the next steps.

  4. The Particle Physics Playground website: tutorials and activities using real experimental data

    NASA Astrophysics Data System (ADS)

    Bellis, Matthew; CMS Collaboration

    2016-03-01

    The CERN Open Data Portal provides access to data from the LHC experiments to anyone with the time and inclination to learn the analysis procedures. The CMS experiment has made a significant amount of data available in essentially the same format the collaboration itself uses, along with software tools and a virtual environment in which to run those tools. These same data have also been mined for educational exercises that range from very simple .csv files that can be analyzed in a spreadsheet to more sophisticated formats that use ROOT, a dominant software package in experimental particle physics but not used as much in the general computing community. This talk will present the Particle Physics Playground website (http://particle-physics-playground.github.io/), a project that uses data from the CMS experiment, as well as the older CLEO experiment, in tutorials and exercises aimed at high school and undergraduate students and other science enthusiasts. The data are stored as text files and the users are provided with starter Python/Jupyter notebook programs and accessor functions which can be modified to perform fairly high-level analyses. The status of the project, success stories, and future plans for the website will be presented. This work was supported in part by NSF Grant PHY-1307562.
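
    A typical starter exercise of the kind hosted there is computing a dimuon invariant mass from four-vector columns in a text file. The file name and column layout below are assumed for illustration, not the site's actual accessor functions.

        import numpy as np

        # data.csv rows (assumed layout): E1,px1,py1,pz1,E2,px2,py2,pz2 in GeV.
        data = np.loadtxt("data.csv", delimiter=",")
        E = data[:, 0] + data[:, 4]
        px = data[:, 1] + data[:, 5]
        py = data[:, 2] + data[:, 6]
        pz = data[:, 3] + data[:, 7]
        # Invariant mass: m^2 = E^2 - |p|^2 (clipped at 0 for rounding noise).
        mass = np.sqrt(np.maximum(E**2 - px**2 - py**2 - pz**2, 0.0))
        print(f"mean invariant mass: {mass.mean():.2f} GeV")

    Histogramming `mass` and spotting the J/psi or Z peaks is the usual payoff of the exercise.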

  5. Analyzing the Use of Concept Maps in Computer Science: A Systematic Mapping Study

    ERIC Educational Resources Information Center

    dos Santos, Vinicius; de Souza, Érica F.; Felizardo, Katia R; Vijaykumar, Nandamudi L.

    2017-01-01

    Context: Concept Maps (CMs) enable the creation of a schematic representation of a domain knowledge. For this reason, CMs have been applied in different research areas, including Computer Science. Objective: the objective of this paper is to present the results of a systematic mapping study conducted to collect and evaluate existing research on…

  6. Investigating Gender and Racial/Ethnic Invariance in Use of a Course Management System in Higher Education

    ERIC Educational Resources Information Center

    Li, Yi; Wang, Qiu; Campbell, John

    2015-01-01

    This study focused on learning equity in colleges and universities where teaching and learning depends heavily on computer technologies. The study used the Structural Equation Modeling (SEM) to investigate gender and racial/ethnic heterogeneity in the use of a computer based course management system (CMS). Two latent variables (CMS usage and…

  7. PyCMSXiO: an external interface to script treatment plans for the Elekta® CMS XiO treatment planning system

    NASA Astrophysics Data System (ADS)

    Xing, Aitang; Arumugam, Sankar; Holloway, Lois; Goozee, Gary

    2014-03-01

    Scripting in radiotherapy treatment planning systems not only simplifies routine planning tasks but can also be used for clinical research. Treatment planning scripting can only be utilized in a system that has a built-in scripting interface. Among the commercially available treatment planning systems, Pinnacle (Philips) and RayStation (RaySearch Laboratories) have inherent scripting functionality. CMS XiO (Elekta) is a widely used treatment planning system in radiotherapy centres around the world, but it does not have an interface that allows the user to script radiotherapy plans. In this study an external scripting interface, PyCMSXiO, was developed for XiO using the Python programming language. The interface was implemented as a python package/library using a modern object-oriented programming methodology. The package is organized as a hierarchy of different classes (objects). Each class (object) corresponds to a plan object, such as a beam of a clinical radiotherapy plan. The interface of each class is implemented as object functions. Scripting in XiO using PyCMSXiO is comparable with Pinnacle scripting. This scripting package has been used in several research projects, including commissioning of a beam model, independent three-dimensional dose verification for IMRT plans, and a setup-uncertainty study. The ease of use and the high-level functions provided in the package make it a useful research tool. It was released as an open-source tool that may benefit the medical physics community.
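
    The abstract does not spell out the interface, so the following usage sketch is hypothetical: it only illustrates the kind of plan/beam object hierarchy described, and every class and method name is invented rather than taken from the real package.

        # Hypothetical plan-scripting hierarchy in the spirit of PyCMSXiO;
        # all names below are invented for illustration only.
        class Beam:
            def __init__(self, name, gantry_angle, mu):
                self.name, self.gantry_angle, self.mu = name, gantry_angle, mu

        class Plan:
            def __init__(self, patient_id):
                self.patient_id, self.beams = patient_id, []
            def add_beam(self, beam):
                self.beams.append(beam)
            def total_mu(self):
                # Aggregate over child objects, as plan-level methods would.
                return sum(b.mu for b in self.beams)

        plan = Plan("PAT001")
        plan.add_beam(Beam("AP", gantry_angle=0, mu=120.0))
        plan.add_beam(Beam("PA", gantry_angle=180, mu=118.5))
        print(plan.patient_id, plan.total_mu())   # PAT001 238.5

    Such an object hierarchy is what lets batch studies (e.g. a setup-uncertainty sweep) loop over plans and beams programmatically instead of through the GUI.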

  8. The CMS High Level Trigger System: Experience and Future Development

    NASA Astrophysics Data System (ADS)

    Bauer, G.; Behrens, U.; Bowen, M.; Branson, J.; Bukowiec, S.; Cittolin, S.; Coarasa, J. A.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Flossdorf, A.; Gigi, D.; Glege, F.; Gomez-Reino, R.; Hartl, C.; Hegeman, J.; Holzner, A.; Hwong, Y. L.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Polese, G.; Racz, A.; Raginel, O.; Sakulin, H.; Sani, M.; Schwick, C.; Shpakov, D.; Simon, S.; Spataru, A. C.; Sumorok, K.

    2012-12-01

    The CMS experiment at the LHC features a two-level trigger system. Events accepted by the first level trigger, at a maximum rate of 100 kHz, are read out by the Data Acquisition system (DAQ) and subsequently assembled in memory in a farm of computers running a software high-level trigger (HLT), which selects interesting events for offline storage and analysis at a rate of order a few hundred Hz. The HLT algorithms consist of sequences of offline-style reconstruction and filtering modules, executed on a farm of O(10000) CPU cores built from commodity hardware. Experience from the operation of the HLT system in the 2010/2011 collider run is reported. The current architecture of the CMS HLT, and its integration with the CMS reconstruction framework and the CMS DAQ, are discussed in the light of future development. The possible short- and medium-term evolution of the HLT software infrastructure to support extensions of the HLT computing power, and to address remaining performance and maintenance issues, is discussed.
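
    Generically, an HLT path of this kind is a sequence of producer and filter modules in which an event is rejected at the first failing filter. A toy sketch of that control flow; the module names and toy selection are invented, not CMS code.

        # Toy HLT-style path: modules run in order; the event is rejected
        # at the first failing filter. All names are invented.
        def reco_muons(event):
            event["muons"] = [h for h in event["hits"] if h > 20.0]  # toy "pT" cut
            return True                        # producers never reject

        def filter_one_muon(event):
            return len(event["muons"]) >= 1    # filters may reject

        def run_path(event, modules):
            # all() short-circuits, mirroring early rejection in a path.
            return all(module(event) for module in modules)

        path = [reco_muons, filter_one_muon]
        accepted = run_path({"hits": [5.0, 32.5, 18.0]}, path)
        print("accept" if accepted else "reject")   # accept (one 32.5 "muon")

    Early rejection is what keeps the average per-event CPU cost low enough to run offline-style reconstruction at 100 kHz input.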

  9. 42 CFR 423.265 - Submission of bids and related information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... to offer in the subsequent calendar year. (2) Substantial differences between bids. Potential Part D sponsors' bid submissions must reflect differences in benefit packages or plan costs that CMS determines to...) Bid submission—(1) General. Not later than the first Monday in June, each potential Part D sponsor...

  10. 42 CFR 423.265 - Submission of bids and related information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... to offer in the subsequent calendar year. (2) Substantial differences between bids. Potential Part D sponsors' bid submissions must reflect differences in benefit packages or plan costs that CMS determines to...) Bid submission—(1) General. Not later than the first Monday in June, each potential Part D sponsor...

  11. 42 CFR 423.265 - Submission of bids and related information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... to offer in the subsequent calendar year. (2) Substantial differences between bids. Potential Part D sponsors' bid submissions must reflect differences in benefit packages or plan costs that CMS determines to...) Bid submission—(1) General. Not later than the first Monday in June, each potential Part D sponsor...

  12. 42 CFR 423.265 - Submission of bids and related information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... calendar year. (2) Substantial differences between bids. Potential Part D sponsors' bid submissions must reflect differences in benefit packages or plan costs that CMS determines to represent substantial...) General. Not later than the first Monday in June, each potential Part D sponsor must submit bids and...

  13. 42 CFR 423.265 - Submission of bids and related information.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... calendar year. (2) Substantial differences between bids. Potential Part D sponsors' bid submissions must reflect differences in benefit packages or plan costs that CMS determines to represent substantial...) General. Not later than the first Monday in June, each potential Part D sponsor must submit bids and...

  14. Experience in using commercial clouds in CMS

    NASA Astrophysics Data System (ADS)

    Bauerdick, L.; Bockelman, B.; Dykstra, D.; Fuess, S.; Garzoglio, G.; Girone, M.; Gutsche, O.; Holzman, B.; Hufnagel, D.; Kim, H.; Kennedy, R.; Mason, D.; Spentzouris, P.; Timm, S.; Tiradani, A.; Vaandering, E.; CMS Collaboration

    2017-10-01

    Historically, high energy physics computing has been performed on large purpose-built computing systems. In the beginning there were single-site computing facilities, which evolved into the Worldwide LHC Computing Grid (WLCG) used today. The vast majority of the WLCG resources are used for LHC computing, and the resources are scheduled to be continuously used throughout the year. In the last several years there has been an explosion in the capacity and capability of commercial and academic computing clouds. Cloud resources are highly virtualized and intended to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this presentation we will discuss results from the CMS experiment using the Fermilab HEPCloud Facility, which utilized both local Fermilab resources and Amazon Web Services (AWS). The goal was to work with AWS through a matching grant to demonstrate a sustained scale approximately equal to half of the worldwide processing resources available to CMS. We will discuss the planning and technical challenges involved in running the most I/O-intensive CMS workflows on a large-scale set of virtualized resources provisioned by the Fermilab HEPCloud. We will describe the data handling and data management challenges, discuss the economics, and compare cost and operational efficiency with our dedicated resources. Finally, we will consider how the working model of HEP computing changes when large-scale resources can be scheduled at peak times.

  15. Experience in using commercial clouds in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauerdick, L.; Bockelman, B.; Dykstra, D.

    Historically, high energy physics computing has been performed on large purpose-built computing systems. In the beginning there were single-site computing facilities, which evolved into the Worldwide LHC Computing Grid (WLCG) used today. The vast majority of the WLCG resources are used for LHC computing, and the resources are scheduled to be continuously used throughout the year. In the last several years there has been an explosion in the capacity and capability of commercial and academic computing clouds. Cloud resources are highly virtualized and intended to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this presentation we will discuss results from the CMS experiment using the Fermilab HEPCloud Facility, which utilized both local Fermilab resources and Amazon Web Services (AWS). The goal was to work with AWS through a matching grant to demonstrate a sustained scale approximately equal to half of the worldwide processing resources available to CMS. We will discuss the planning and technical challenges involved in running the most I/O-intensive CMS workflows on a large-scale set of virtualized resources provisioned by the Fermilab HEPCloud. We will describe the data handling and data management challenges, discuss the economics, and compare cost and operational efficiency with our dedicated resources. Finally, we will consider how the working model of HEP computing changes when large-scale resources can be scheduled at peak times.

  16. 78 FR 42080 - Privacy Act of 1974; CMS Computer Match No. 2013-07; HHS Computer Match No. 1303; DoD-DMDC Match...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-15

    ... with the Department of Defense (DoD), Defense Manpower Data Center (DMDC). We have provided background... & Medicaid Services and the Department of Defense, Defense Manpower Data Center for the Determination of...), Centers for Medicare & Medicaid Services (CMS), and Department of Defense (DoD), Defense Manpower Data...

  17. Large scale commissioning and operational experience with tier-2 to tier-2 data transfer links in CMS

    NASA Astrophysics Data System (ADS)

    Letts, J.; Magini, N.

    2011-12-01

    Tier-2 to Tier-2 data transfers have been identified as a necessary extension of the CMS computing model. The Debugging Data Transfers (DDT) Task Force in CMS was charged with commissioning Tier-2 to Tier-2 PhEDEx transfer links beginning in late 2009, originally to serve the needs of physics analysis groups for the transfer of their results between the storage elements of the Tier-2 sites associated with the groups. PhEDEx is the data transfer middleware of the CMS experiment. For analysis jobs using CRAB, the CMS Remote Analysis Builder, the challenges of remote stage-out of job output at the end of analysis jobs led to the introduction of a local fallback stage-out, and will eventually require the asynchronous transfer of user data over essentially all of the Tier-2 to Tier-2 network using the same PhEDEx infrastructure. In addition, direct file sharing of physics and Monte Carlo simulated data between Tier-2 sites can relieve the operational load of the Tier-1 sites in the original CMS Computing Model, and already represents an important component of CMS PhEDEx data transfer volume. The experience, challenges and methods used to debug and commission the thousands of data transfer links between CMS Tier-2 sites world-wide are explained and summarized. The resulting operational experience with Tier-2 to Tier-2 transfers is also presented.

  18. Storage element performance optimization for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.

    2012-12-01

    Tier-2 computing sites in the Worldwide LHC Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data from the Large Hadron Collider (LHC) experiments that needs to be processed requires good and efficient use of the available resources. Achieving good CPU efficiency for end-user analysis jobs requires that the performance of the storage system scales with the I/O requests from hundreds or even thousands of simultaneous jobs. In this presentation we report on work to improve the SE performance at the Helsinki Institute of Physics (HIP) Tier-2 used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows for easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework CMS used the JobRobot, which sent 100 analysis jobs to each site every four hours; CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the effect of site configuration changes, since the analysis workflow is kept the same for all sites and for months at a time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90%, by tuning the SE and by improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done at other CMS Tier sites, and on average the CPU efficiency for CMSSW jobs increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high. The next storage upgrade at HIP consists of SAS disk enclosures which can be stress-tested on demand with HammerCloud workflows, to make sure that the I/O performance is good.
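
    The CPU-efficiency figure quoted above is the standard grid-job metric: the ratio of CPU time consumed to wall-clock time elapsed, so time spent waiting on storage I/O drives the value down. A trivial sketch with made-up job numbers:

    def cpu_efficiency(cpu_time_s, wall_time_s):
        # fraction of wall-clock time actually spent on the CPU
        return cpu_time_s / wall_time_s

    # the same job before and after storage-element tuning (invented numbers):
    print(f"{cpu_efficiency(3600, 6000):.0%}")  # 60% -- job stalled on I/O
    print(f"{cpu_efficiency(5400, 5900):.0%}")  # 92% -- I/O keeps up with the CPU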

  19. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... when it makes the determination. (2) Enrollment. CMS makes a further adjustment to remove the cost...) Age, sex, and disability status. CMS makes adjustments to reflect the age and sex distribution and the...

  20. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... when it makes the determination. (2) Enrollment. CMS makes a further adjustment to remove the cost...) Age, sex, and disability status. CMS makes adjustments to reflect the age and sex distribution and the...

  1. WLCG scale testing during CMS data challenges

    NASA Astrophysics Data System (ADS)

    Gutsche, O.; Hajdu, C.

    2008-07-01

    The CMS computing model to process and analyze LHC collision data follows a data-location-driven approach and uses the WLCG infrastructure to provide access to GRID resources. As a preparation for data taking, CMS tests its computing model during dedicated data challenges. An important part of the challenges is the test of user analysis, which poses a special challenge for the infrastructure with its random, distributed access patterns. The CMS Remote Analysis Builder (CRAB) handles all interactions with the WLCG infrastructure transparently for the user. During the 2006 challenge, CMS set its goal to test the infrastructure at a scale of 50,000 user jobs per day using CRAB. Both direct submissions by individual users and automated submissions by robots were used to achieve this goal. A report will be given about the outcome of the user analysis part of the challenge, using both the EGEE and OSG parts of the WLCG. In particular, the difference in submission between the two GRID middlewares (resource broker vs. direct submission) will be discussed. Finally, an outlook for the 2007 data challenge is given.

  2. iSpy: a powerful and lightweight event display

    NASA Astrophysics Data System (ADS)

    Alverson, G.; Eulisse, G.; McCauley, T.; Taylor, L.

    2012-12-01

    iSpy is a general-purpose event data and detector visualization program that was developed as an event display for the CMS experiment at the LHC and has seen use by the general public as well as teachers and students in the context of education and outreach. Central to the iSpy design philosophy is ease of installation, use, and extensibility. The application itself uses the open-access packages Qt4 and Open Inventor and is distributed either as a fully-bound executable or a standard installer package: one can simply download and double-click to begin. Mac OS X, Linux, and Windows are supported. iSpy renders the standard 2D, 3D, and tabular views, and the architecture allows for a generic approach to production of new views and projections. iSpy reads and displays data in the ig format: event information is written in compressed JSON-format files designed for distribution over a network. This format is easily extensible and makes the iSpy client indifferent to the original input data source. The ig format is the one used for release of approved CMS data to the public.
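
    As a rough illustration of what such a file might contain, the sketch below packs one event's JSON into a zip archive; the key names and structure are invented for illustration, not the actual ig schema:

    import json, zipfile

    event = {
        "run": 148031, "event": 441572,
        "Tracks_V1": [{"pt": 24.7, "eta": -0.8, "phi": 1.93}],
    }

    # one JSON document per event, packed into a compressed archive for
    # distribution over the network
    with zipfile.ZipFile("sample.ig", "w", zipfile.ZIP_DEFLATED) as ig:
        ig.writestr("Events/Run_148031/Event_441572", json.dumps(event))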

  3. The Role of Computational Modeling and Simulation in the Total Product Life Cycle of Peripheral Vascular Devices

    PubMed Central

    Morrison, Tina M.; Dreher, Maureen L.; Nagaraja, Srinidhi; Angelone, Leonardo M.; Kainz, Wolfgang

    2018-01-01

    The total product life cycle (TPLC) of medical devices has been defined by four stages: discovery and ideation, regulatory decision, product launch, and postmarket monitoring. Manufacturers of medical devices intended for use in the peripheral vasculature, such as stents, inferior vena cava (IVC) filters, and stent-grafts, mainly use computational modeling and simulation (CM&S) to aid device development and design optimization, supplement bench testing for regulatory decisions, and assess postmarket changes or failures. For example, computational solid mechanics and fluid dynamics enable the investigation of design limitations in the ideation stage. To supplement bench data in regulatory submissions, manufactures can evaluate the effects of anatomical characteristics and expected in vivo loading environment on device performance. Manufacturers might also harness CM&S to aid root-cause analyses that are necessary when failures occur postmarket, when the device is exposed to broad clinical use. Once identified, CM&S tools can then be used for redesign to address the failure mode and re-establish the performance profile with the appropriate models. The Center for Devices and Radiological Health (CDRH) wants to advance the use of CM&S for medical devices and supports the development of virtual physiological patients, clinical trial simulations, and personalized medicine. Thus, the purpose of this paper is to describe specific examples of how CM&S is currently used to support regulatory submissions at different phases of the TPLC and to present some of the stakeholder-led initiatives for advancing CM&S for regulatory decision-making. PMID:29479395

  4. The Role of Computational Modeling and Simulation in the Total Product Life Cycle of Peripheral Vascular Devices.

    PubMed

    Morrison, Tina M; Dreher, Maureen L; Nagaraja, Srinidhi; Angelone, Leonardo M; Kainz, Wolfgang

    2017-01-01

    The total product life cycle (TPLC) of medical devices has been defined by four stages: discovery and ideation, regulatory decision, product launch, and postmarket monitoring. Manufacturers of medical devices intended for use in the peripheral vasculature, such as stents, inferior vena cava (IVC) filters, and stent-grafts, mainly use computational modeling and simulation (CM&S) to aid device development and design optimization, supplement bench testing for regulatory decisions, and assess postmarket changes or failures. For example, computational solid mechanics and fluid dynamics enable the investigation of design limitations in the ideation stage. To supplement bench data in regulatory submissions, manufactures can evaluate the effects of anatomical characteristics and expected in vivo loading environment on device performance. Manufacturers might also harness CM&S to aid root-cause analyses that are necessary when failures occur postmarket, when the device is exposed to broad clinical use. Once identified, CM&S tools can then be used for redesign to address the failure mode and re-establish the performance profile with the appropriate models. The Center for Devices and Radiological Health (CDRH) wants to advance the use of CM&S for medical devices and supports the development of virtual physiological patients, clinical trial simulations, and personalized medicine. Thus, the purpose of this paper is to describe specific examples of how CM&S is currently used to support regulatory submissions at different phases of the TPLC and to present some of the stakeholder-led initiatives for advancing CM&S for regulatory decision-making.

  5. Using the CMS threaded framework in a production environment

    DOE PAGES

    Jones, C. D.; Contreras, L.; Gartung, P.; ...

    2015-12-23

    During 2014, the CMS Offline and Computing Organization completed the necessary changes to use the CMS threaded framework in the full production environment. We will briefly discuss the design of the CMS threaded framework, in particular how the design affects scaling performance. We will then cover the effort involved in getting both the CMSSW application software and the workflow management system ready for using multiple threads for production. Finally, we will present metrics on the performance of the application and workflow system as well as the difficulties which were uncovered. We will end with CMS' plans for using the threaded framework to do production for LHC Run 2.

  6. Patch-Clamp Recording from Human Induced Pluripotent Stem Cell-Derived Cardiomyocytes: Improving Action Potential Characteristics through Dynamic Clamp

    PubMed Central

    Veerman, Christiaan C.; Zegers, Jan G.; Mengarelli, Isabella; Bezzina, Connie R.

    2017-01-01

    Human induced pluripotent stem cell-derived cardiomyocytes (hiPSC-CMs) hold great promise for studying inherited cardiac arrhythmias and developing drug therapies to treat such arrhythmias. Unfortunately, until now, action potential (AP) measurements in hiPSC-CMs have been hampered by the virtual absence of the inward rectifier potassium current (IK1) in hiPSC-CMs, resulting in spontaneous activity and altered function of various depolarising and repolarising membrane currents. We assessed whether AP measurements in “ventricular-like” and “atrial-like” hiPSC-CMs could be improved through a simple, highly reproducible dynamic clamp approach to provide these cells with a substantial IK1 (computed in real time according to the actual membrane potential and injected through the patch-clamp pipette). APs were measured at 1 Hz using perforated patch-clamp methodology, both in control cells and in cells treated with all-trans retinoic acid (RA) during the differentiation process to increase the number of cells with atrial-like APs. RA-treated hiPSC-CMs displayed shorter APs than control hiPSC-CMs and this phenotype became more prominent upon addition of synthetic IK1 through dynamic clamp. Furthermore, the variability of several AP parameters decreased upon IK1 injection. Computer simulations with models of ventricular-like and atrial-like hiPSC-CMs demonstrated the importance of selecting an appropriate synthetic IK1. In conclusion, the dynamic clamp-based approach of IK1 injection has broad applicability for detailed AP measurements in hiPSC-CMs. PMID:28867785
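
    The dynamic clamp loop described above computes a synthetic IK1 from the measured membrane potential at every sampling step and commands the corresponding current through the pipette. The sketch below uses a generic inward-rectifier shape with invented parameters; it is not the specific IK1 formulation used in the study:

    import math

    def synthetic_IK1(Vm_mV, g_K1=0.5, E_K=-85.0):
        # generic inward rectification: conductance falls off as the membrane
        # depolarises above E_K (parameters are invented; units pA/pF)
        rectification = 1.0 / (1.0 + math.exp((Vm_mV - E_K - 15.0) / 10.0))
        return g_K1 * rectification * (Vm_mV - E_K)

    # one iteration of the real-time loop: read Vm, compute IK1, command the
    # compensating current through the patch pipette (sign per amplifier convention)
    Vm = -60.0                      # measured membrane potential, mV
    I_command = -synthetic_IK1(Vm)  # current injected so the cell "feels" IK1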

  7. Opportunistic Resource Usage in CMS

    NASA Astrophysics Data System (ADS)

    Kreuzer, Peter; Hufnagel, Dirk; Dykstra, D.; Gutsche, O.; Tadel, M.; Sfiligoi, I.; Letts, J.; Wuerthwein, F.; McCrea, A.; Bockelman, B.; Fajardo, E.; Linares, L.; Wagner, R.; Konstantinov, P.; Blumenfeld, B.; Bradley, D.; Cms Collaboration

    2014-06-01

    CMS is using a tiered setup of dedicated computing resources provided by sites distributed over the world and organized in WLCG. These sites pledge resources to CMS and are preparing them especially for CMS to run the experiment's applications. But there are more resources available opportunistically both on the GRID and in local university and research clusters which can be used for CMS applications. We will present CMS' strategy to use opportunistic resources and prepare them dynamically to run CMS applications. CMS is able to run its applications on resources that can be reached through the GRID, through EC2 compliant cloud interfaces. Even resources that can be used through ssh login nodes can be harnessed. All of these usage modes are integrated transparently into the GlideIn WMS submission infrastructure, which is the basis of CMS' opportunistic resource usage strategy. Technologies like Parrot to mount the software distribution via CVMFS and xrootd for access to data and simulation samples via the WAN are used and will be described. We will summarize the experience with opportunistic resource usage and give an outlook for the restart of LHC data taking in 2015.

  8. A comparison between physicians and computer algorithms for form CMS-2728 data reporting.

    PubMed

    Malas, Mohammed Said; Wish, Jay; Moorthi, Ranjani; Grannis, Shaun; Dexter, Paul; Duke, Jon; Moe, Sharon

    2017-01-01

    The CMS-2728 form (Medical Evidence Report) assesses 23 comorbidities chosen to reflect poor outcomes and increased mortality risk. Previous studies have questioned the validity of physician reporting on form CMS-2728. We hypothesize that reporting of comorbidities by computer algorithms identifies more comorbidities than physician completion and is therefore more reflective of the underlying disease burden. We collected data from CMS-2728 forms for all 296 patients who had an incident ESRD diagnosis and received chronic dialysis from 2005 through 2014 at Indiana University outpatient dialysis centers. We analyzed patients' data from electronic medical record systems that collated information from multiple health care sources. Previously utilized algorithms or natural language processing were used to extract data on 10 comorbidities for a period of up to 10 years prior to ESRD incidence. These algorithms incorporate billing codes, prescriptions, and other relevant elements. We compared the presence or unchecked status of these comorbidities on the forms to the presence or absence according to the algorithms. Computer algorithms reported more comorbidities than physician form completion. This remained true when the data span was decreased to one year and when only a single health center source was used. The algorithms' determinations were well accepted by a physician panel. Importantly, use of the algorithms significantly increased the expected deaths and lowered the standardized mortality ratios. Using computer algorithms showed superior identification of comorbidities for form CMS-2728 and altered standardized mortality ratios. Adapting similar algorithms in available EMR systems may offer a more thorough evaluation of comorbidities and improve quality reporting. © 2016 International Society for Hemodialysis.
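
    The standardized mortality ratio referred to above is the ratio of observed to expected deaths, with expected deaths summed from per-patient risk estimates; capturing more comorbidities raises the expected count and thereby lowers the SMR. A minimal sketch with invented numbers:

    def standardized_mortality_ratio(observed_deaths, expected_death_probs):
        # expected deaths = sum of per-patient predicted death probabilities
        expected = sum(expected_death_probs)
        return observed_deaths / expected

    probs_form_only = [0.10, 0.12, 0.08, 0.15]  # risks from physician-completed forms
    probs_algorithm = [0.14, 0.18, 0.11, 0.22]  # risks with algorithm-found comorbidities

    print(standardized_mortality_ratio(3, probs_form_only))  # ~6.7
    print(standardized_mortality_ratio(3, probs_algorithm))  # ~4.6 (lower SMR)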

  9. A Non-Equilibrium Sediment Transport Model for Coastal Inlets and Navigation Channels

    DTIC Science & Technology

    2011-01-01

    exchange of water, sediment, and nutrients between estuaries and the ocean. Because of the multiple interacting forces (waves, wind, tide, river...in parallel using OpenMP. The CMS takes advantage of the Surface-water Modeling System (SMS) interface for grid generation and model setup, as well...as for plotting and post-processing (Zundel, 2000). The circulation model in the CMS (called CMS-Flow) computes the unsteady water level and

  10. Elastic Extension of a CMS Computing Centre Resources on External Clouds

    NASA Astrophysics Data System (ADS)

    Codispoti, G.; Di Maria, R.; Aiftimiei, C.; Bonacorsi, D.; Calligola, P.; Ciaschini, V.; Costantini, A.; Dal Pra, S.; DeGirolamo, D.; Grandi, C.; Michelotto, D.; Panella, M.; Peco, G.; Sapunenko, V.; Sgaravatto, M.; Taneja, S.; Zizzi, G.

    2016-10-01

    After the successful LHC data taking in Run-I and in view of the future runs, the LHC experiments are facing new challenges in the design and operation of the computing facilities. The computing infrastructure for Run-II is dimensioned to cope at most with the average amount of data recorded. The usage peaks, as already observed in Run-I, may however generate large backlogs, thus delaying the completion of the data reconstruction and ultimately the data availability for physics analysis. In order to cope with the production peaks, CMS - along the lines followed by other LHC experiments - is exploring the opportunity to access Cloud resources provided by external partners or commercial providers. Specific use cases have already been explored and successfully exploited during Long Shutdown 1 (LS1) and the first part of Run 2. In this work we present the proof of concept of the elastic extension of a CMS site, specifically the Bologna Tier-3, on an external OpenStack infrastructure. We focus on the "Cloud Bursting" of a CMS Grid site using a newly designed LSF configuration that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on the OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time they serve as an extension of the farm for local usage. The amount of resources allocated can thus be elastically adjusted to match the needs of the CMS experiment and local users. Moreover, direct access to and integration of OpenStack resources with the CMS workload management system is explored. In this paper we present this approach, report on the performance of the on-demand allocated resources, and discuss the lessons learned and the next steps.

  11. Opportunistic Resource Usage in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kreuzer, Peter; Hufnagel, Dirk; Dykstra, D.

    2014-01-01

    CMS is using a tiered setup of dedicated computing resources provided by sites distributed over the world and organized in WLCG. These sites pledge resources to CMS and are preparing them especially for CMS to run the experiment's applications. But there are more resources available opportunistically both on the GRID and in local university and research clusters which can be used for CMS applications. We will present CMS' strategy to use opportunistic resources and prepare them dynamically to run CMS applications. CMS is able to run its applications on resources that can be reached through the GRID, through EC2 compliant cloud interfaces. Even resources that can be used through ssh login nodes can be harnessed. All of these usage modes are integrated transparently into the GlideIn WMS submission infrastructure, which is the basis of CMS' opportunistic resource usage strategy. Technologies like Parrot to mount the software distribution via CVMFS and xrootd for access to data and simulation samples via the WAN are used and will be described. We will summarize the experience with opportunistic resource usage and give an outlook for the restart of LHC data taking in 2015.

  12. 42 CFR 423.272 - Review and negotiation of bid and approval of plans submitted by potential Part D sponsors.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... submitted by potential Part D sponsors. 423.272 Section 423.272 Public Health CENTERS FOR MEDICARE... and negotiation of bid and approval of plans submitted by potential Part D sponsors. (a) Review and...) Substantial differences between bids—(i) General. CMS approves a bid only if it finds that the benefit package...

  13. 42 CFR 423.272 - Review and negotiation of bid and approval of plans submitted by potential Part D sponsors.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... submitted by potential Part D sponsors. 423.272 Section 423.272 Public Health CENTERS FOR MEDICARE... and negotiation of bid and approval of plans submitted by potential Part D sponsors. (a) Review and...) Substantial differences between bids—(i) General. CMS approves a bid only if it finds that the benefit package...

  14. LCG Persistency Framework (CORAL, COOL, POOL): Status and Outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valassi, A.; Clemencic, M.; Dykstra, D.

    The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.

  15. Health and performance monitoring of the online computer cluster of CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauer, G.; et al.

    2012-01-01

    The CMS experiment at the LHC features over 2,500 devices that need constant monitoring in order to ensure proper data taking. The monitoring solution has been migrated from Nagios to Icinga, with several useful plugins. The motivations behind the migration and the selection of the plugins are discussed.

  16. Integration of end-user Cloud storage for CMS analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riahi, Hassen; Aimar, Alberto; Ayllon, Alejandro Alvarez

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage, named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of the end-user Cloud storage for the distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storages in the Grid, which is implemented and commissioned over the world's largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and infrastructure changes needed in order to transparently integrate end-user Cloud storage with the CMS distributed computing model. We describe the new challenges faced in data management between Grid and Cloud and how they were addressed, along with details of the support for Cloud storage recently introduced into the WLCG data movement middleware, FTS3. Finally, the commissioning experience of CERNBox for the distributed data analysis activity is also presented.

  17. Integration of end-user Cloud storage for CMS analysis

    DOE PAGES

    Riahi, Hassen; Aimar, Alberto; Ayllon, Alejandro Alvarez; ...

    2017-05-19

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage, named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of the end-user Cloud storage for the distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storages in the Grid, which is implemented and commissioned over the world's largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and infrastructure changes needed in order to transparently integrate end-user Cloud storage with the CMS distributed computing model. We describe the new challenges faced in data management between Grid and Cloud and how they were addressed, along with details of the support for Cloud storage recently introduced into the WLCG data movement middleware, FTS3. Finally, the commissioning experience of CERNBox for the distributed data analysis activity is also presented.

  18. Muons in the CMS High Level Trigger System

    NASA Astrophysics Data System (ADS)

    Verwilligen, Piet; CMS Collaboration

    2016-04-01

    The trigger systems of LHC detectors play a fundamental role in defining the physics capabilities of the experiments. A reduction of several orders of magnitude in the rate of collected events, with respect to the proton-proton bunch crossing rate generated by the LHC, is mandatory to cope with the limits imposed by the readout and storage systems. An accurate and efficient online selection mechanism is thus required to fulfill this task while keeping the acceptance for physics signals maximal. The CMS experiment operates a two-level trigger system. First, a Level-1 Trigger (L1T), implemented in custom-designed electronics, reduces the event rate to a level compatible with the CMS Data Acquisition (DAQ) capabilities. A High Level Trigger (HLT) follows, aimed at further reducing the rate of events finally stored for analysis purposes. The latter consists of a streamlined version of the CMS offline reconstruction software and operates on a computer farm. It runs algorithms optimized to trade off computational complexity, rate reduction and high selection efficiency. With the computing power available in 2012 the maximum reconstruction time at HLT was about 200 ms per event, at the nominal L1T rate of 100 kHz. An efficient selection of muons at HLT, as well as an accurate measurement of their properties, such as transverse momentum and isolation, is fundamental for the CMS physics programme. The performance of the muon HLT for single and double muon triggers achieved in Run I will be presented. Results from new developments, aimed at improving the performance of the algorithms for the harsher pile-up and luminosity conditions expected in Run II, will also be discussed.

  19. Are computational models of any use to psychiatry?

    PubMed

    Huys, Quentin J M; Moutoussis, Michael; Williams, Jonathan

    2011-08-01

    Mathematically rigorous descriptions of key hypotheses and theories are becoming more common in neuroscience and are beginning to be applied to psychiatry. In this article two fictional characters, Dr. Strong and Mr. Micawber, debate the use of such computational models (CMs) in psychiatry. We present four fundamental challenges to the use of CMs in psychiatry: (a) the applicability of mathematical approaches to core concepts in psychiatry such as subjective experiences, conflict and suffering; (b) whether psychiatry is mature enough to allow informative modelling; (c) whether theoretical techniques are powerful enough to approach psychiatric problems; and (d) the issue of communicating clinical concepts to theoreticians and vice versa. We argue that CMs have yet to influence psychiatric practice, but that they help psychiatric research in two fundamental ways: (a) to build better theories integrating psychiatry with neuroscience; and (b) to enforce explicit, global and efficient testing of hypotheses through more powerful analytical methods. CMs allow the complexity of a hypothesis to be rigorously weighed against the complexity of the data. The paper concludes with a discussion of the path ahead. It points to stumbling blocks, like the poor communication between theoretical and medical communities. But it also identifies areas in which the contributions of CMs will likely be pivotal, like an understanding of social influences in psychiatry, and of the co-morbidity structure of psychiatric diseases. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2009-11-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  1. How to Compute a Slot Marker - Calculation of Controller Managed Spacing Tools for Efficient Descents with Precision Scheduling

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas

    2012-01-01

    This paper describes the underlying principles and algorithms for computing the primary controller-managed spacing (CMS) tools developed at NASA for precisely spacing aircraft along efficient descent paths. The trajectory-based CMS tools include slot markers, delay indications and speed advisories. These tools are one of three core NASA technologies integrated in NASA's ATM Technology Demonstration-1 (ATD-1), which will operationally demonstrate the feasibility of fuel-efficient, high-throughput arrival operations using Automatic Dependent Surveillance-Broadcast (ADS-B) and ground-based and airborne NASA technologies for precision scheduling and spacing.
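
    Conceptually, a slot marker sits at the point along the nominal arrival trajectory that an on-schedule aircraft would occupy at the current time. A minimal sketch, assuming a precomputed time-to-fix versus distance-to-fix profile (all numbers invented):

    import bisect

    profile_t = [0, 120, 300, 600, 900]       # seconds remaining to the meter fix
    profile_d = [0.0, 8.0, 22.0, 48.0, 75.0]  # nautical miles from the fix

    def slot_marker_distance(sta_s, now_s):
        # where an on-schedule aircraft would be now, given its scheduled
        # time of arrival (STA) at the meter fix
        t_remaining = max(0.0, sta_s - now_s)
        i = bisect.bisect_left(profile_t, t_remaining)
        if i == 0:
            return profile_d[0]
        if i >= len(profile_t):
            return profile_d[-1]
        # linear interpolation between the two bracketing profile points
        f = (t_remaining - profile_t[i - 1]) / (profile_t[i] - profile_t[i - 1])
        return profile_d[i - 1] + f * (profile_d[i] - profile_d[i - 1])

    print(slot_marker_distance(sta_s=1000.0, now_s=550.0))  # 35.0 NM from the fix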

  2. HEPCloud, a New Paradigm for HEP Facilities: CMS Amazon Web Services Investigation

    DOE PAGES

    Holzman, Burt; Bauerdick, Lothar A. T.; Bockelman, Brian; ...

    2017-09-29

    Historically, high energy physics computing has been performed on large purpose-built computing systems. These began as single-site compute facilities, but have evolved into the distributed computing grids used today. Recently, there has been an exponential increase in the capacity and capability of commercial clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this paper, we discuss results from the CMS experiment using the Fermilab HEPCloud facility, which utilized both local Fermilab resources and virtual machines in the Amazon Web Services Elastic Compute Cloud. We discuss the planning, technical challenges, and lessons learned involved in performing physics workflows on a large-scale set of virtualized resources. Additionally, we will discuss the economics and operational efficiencies when executing workflows both in the cloud and on dedicated resources.

  3. HEPCloud, a New Paradigm for HEP Facilities: CMS Amazon Web Services Investigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holzman, Burt; Bauerdick, Lothar A. T.; Bockelman, Brian

    Historically, high energy physics computing has been performed on large purpose-built computing systems. These began as single-site compute facilities, but have evolved into the distributed computing grids used today. Recently, there has been an exponential increase in the capacity and capability of commercial clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this paper, we discuss results from the CMS experiment using the Fermilab HEPCloud facility, which utilized both local Fermilab resources and virtual machines in the Amazon Web Services Elastic Compute Cloud. We discuss the planning, technical challenges, and lessons learned involved in performing physics workflows on a large-scale set of virtualized resources. Additionally, we will discuss the economics and operational efficiencies when executing workflows both in the cloud and on dedicated resources.

  4. Theory of compressive modeling and simulation

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Cha, Jae; Espinola, Richard L.; Krapels, Keith

    2013-05-01

    Modeling and Simulation (M&S) has been evolving along two general directions: (i) a data-rich approach suffering from the curse of dimensionality and (ii) an equation-rich approach suffering from limits on computing power and turnaround time. We suggest a third approach, which we call (iii) compressive M&S (CM&S), because the basic Minimum Free-Helmholtz Energy (MFE) facilitating CM&S can reproduce and generalize the Candes, Romberg, Tao and Donoho (CRT&D) Compressive Sensing (CS) paradigm as a linear Lagrange Constraint Neural Network (LCNN) algorithm. CM&S based on MFE can generalize LCNN to second order as a nonlinear augmented LCNN. For example, at sunset we can avoid the reddish bias of sunlight illumination caused by long-range Rayleigh scattering over the horizon: with CM&S we can use a night-vision camera instead of a day camera. We decomposed the long-wave infrared (LWIR) band with a filter into two vector components (8-10 μm and 10-12 μm) and used LCNN to find, pixel by pixel, the map of Emissive-Equivalent Planck Radiation Sources (EPRS). Then, we up-shifted consistently, according to the de-mixed source map, to a sub-micron RGB color image. Moreover, night-vision imaging can also be down-shifted to Passive Millimeter Wave (PMMW) imaging, which suffers less blur from dusty-smoke scattering and enjoys the apparent smoothness of the surface reflectivity of man-made objects under the Rayleigh resolution. One loses three orders of magnitude in the spatial Rayleigh resolution, but gains two orders of magnitude in reflectivity and another two orders in propagation without obscuring smog. Since CM&S can generate missing data and hard-to-get dynamic transients, it can reduce unnecessary measurements and their associated cost and computing, in the sense of super-saving CS: measure one and get its neighborhood free.

  5. Stability and Scalability of the CMS Global Pool: Pushing HTCondor and GlideinWMS to New Limits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balcas, J.; Bockelman, B.; Hufnagel, D.

    The CMS Global Pool, based on HTCondor and glideinWMS, is the main computing resource provisioning system for all CMS workflows, including analysis, Monte Carlo production, and detector data reprocessing activities. The total resources at Tier-1 and Tier-2 grid sites pledged to CMS exceed 100,000 CPU cores, while another 50,000 to 100,000 CPU cores are available opportunistically, pushing the needs of the Global Pool to higher scales each year. These resources are becoming more diverse in their accessibility and configuration over time. Furthermore, the challenge of stably running at higher and higher scales while introducing new modes of operation such as multi-core pilots, as well as the chaotic nature of physics analysis workflows, places huge strains on the submission infrastructure. This paper details some of the most important challenges to scalability and stability that the CMS Global Pool has faced since the beginning of the LHC Run II and how they were overcome.

  6. Connecting Restricted, High-Availability, or Low-Latency Resources to a Seamless Global Pool for CMS

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B.; Hufnagel, D.; Hurtado Anampa, K.; Jayatilaka, B.; Khan, F.; Larson, K.; Letts, J.; Mascheroni, M.; Mohapatra, A.; Marra Da Silva, J.; Mason, D.; Perez-Calero Yzquierdo, A.; Piperov, S.; Tiradani, A.; Verguilov, V.; CMS Collaboration

    2017-10-01

    The connection of diverse and sometimes non-Grid-enabled resource types to the CMS Global Pool, which is based on HTCondor and glideinWMS, has been a major goal of CMS. These resources range in type from a high-availability, low-latency facility at CERN for urgent calibration studies, called the CAF, to a local user facility at the Fermilab LPC, allocation-based computing resources at NERSC and SDSC, opportunistic resources provided through the Open Science Grid, commercial clouds, and others, as well as access to opportunistic cycles on the CMS High Level Trigger farm. In addition, we have provided the capability to give local users priority on resources beyond those pledged to WLCG at CMS sites. Many of the solutions employed to bring these diverse resource types into the Global Pool have common elements, while some are very specific to a particular project. This paper details some of the strategies and solutions used to access these resources through the Global Pool in a seamless manner.

  7. Stability and scalability of the CMS Global Pool: Pushing HTCondor and glideinWMS to new limits

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B.; Hufnagel, D.; Hurtado Anampa, K.; Aftab Khan, F.; Larson, K.; Letts, J.; Marra da Silva, J.; Mascheroni, M.; Mason, D.; Perez-Calero Yzquierdo, A.; Tiradani, A.

    2017-10-01

    The CMS Global Pool, based on HTCondor and glideinWMS, is the main computing resource provisioning system for all CMS workflows, including analysis, Monte Carlo production, and detector data reprocessing activities. The total resources at Tier-1 and Tier-2 grid sites pledged to CMS exceed 100,000 CPU cores, while another 50,000 to 100,000 CPU cores are available opportunistically, pushing the needs of the Global Pool to higher scales each year. These resources are becoming more diverse in their accessibility and configuration over time. Furthermore, the challenge of stably running at higher and higher scales while introducing new modes of operation such as multi-core pilots, as well as the chaotic nature of physics analysis workflows, places huge strains on the submission infrastructure. This paper details some of the most important challenges to scalability and stability that the CMS Global Pool has faced since the beginning of the LHC Run II and how they were overcome.

  8. Validation and Application of a Real-time PCR Protocol for the Specific Detection and Quantification of Clavibacter michiganensis subsp. sepedonicus in Potato.

    PubMed

    Cho, Min Seok; Park, Duck Hwan; Namgung, Min; Ahn, Tae-Young; Park, Dong Suk

    2015-06-01

    Clavibacter michiganensis subsp. sepedonicus (Cms) multiplies very rapidly, passing through the vascular strands and into the stems and petioles of a diseased potato. Therefore, the rapid and specific detection of this pathogen is highly important for its effective control. Although several PCR assays have been developed for detection, they do not provide specific detection of Cms. Therefore, in this study, a computational genome analysis was performed to compare the sequenced genomes of the C. michiganensis subspecies and to identify an appropriate gene for the development of a subspecies-specific PCR primer set (Cms89F/R). The specificity of the primer set, based on a putative phage-related protein, was evaluated using genomic DNA from seven isolates of Cms and 27 other reference strains. The Cms89F/R primer set was more specific and sensitive than the existing assays in detecting Cms in vitro using Cms cells and its genomic DNA. This assay was able to detect at least 1.47×10² copies/μl of cloned-amplified target DNA, 5 fg of genomic DNA, or a 10⁻⁶ dilution of a cell suspension calibrated to 0.12 OD600 units per reaction.
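
    Detection limits such as the copy number quoted above are typically derived from the standard conversion between DNA mass and template copies; a short sketch follows (the plasmid length is a placeholder, not a value from the study):

    AVOGADRO = 6.022e23  # molecules per mole
    BP_MASS = 660.0      # average molar mass of one dsDNA base pair, g/mol

    def copies_per_ul(ng_per_ul, length_bp):
        grams = ng_per_ul * 1e-9
        moles = grams / (length_bp * BP_MASS)
        return moles * AVOGADRO

    # 1 pg/ul of a hypothetical 4 kb plasmid carrying the target:
    print(f"{copies_per_ul(0.001, 4000):.2e}")  # ~2.28e+05 copies/ul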

  9. Validation and Application of a Real-time PCR Protocol for the Specific Detection and Quantification of Clavibacter michiganensis subsp. sepedonicus in Potato

    PubMed Central

    Cho, Min Seok; Park, Duck Hwan; Namgung, Min; Ahn, Tae-Young; Park, Dong Suk

    2015-01-01

    Clavibacter michiganensis subsp. sepedonicus (Cms) multiplies very rapidly, passing through the vascular strands and into the stems and petioles of a diseased potato. Therefore, the rapid and specific detection of this pathogen is highly important for its effective control. Although several PCR assays have been developed for detection, they do not provide specific detection of Cms. Therefore, in this study, a computational genome analysis was performed to compare the sequenced genomes of the C. michiganensis subspecies and to identify an appropriate gene for the development of a subspecies-specific PCR primer set (Cms89F/R). The specificity of the primer set, based on a putative phage-related protein, was evaluated using genomic DNA from seven isolates of Cms and 27 other reference strains. The Cms89F/R primer set was more specific and sensitive than the existing assays in detecting Cms in vitro using Cms cells and its genomic DNA. This assay was able to detect at least 1.47×10² copies/μl of cloned-amplified target DNA, 5 fg of genomic DNA, or a 10⁻⁶ dilution of a cell suspension calibrated to 0.12 OD600 units per reaction. PMID:26060431

  10. LCG Persistency Framework (CORAL, COOL, POOL): Status and Outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valassi, A.; /CERN; Clemencic, M.

    2012-04-19

    The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.

  11. Computing Fiber/Matrix Interfacial Effects In SiC/RBSN

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Hopkins, Dale A.

    1996-01-01

    Computational study conducted to demonstrate use of boundary-element method in analyzing effects of fiber/matrix interface on elastic and thermal behaviors of representative laminated composite materials. In study, boundary-element method implemented by Boundary Element Solution Technology - Composite Modeling System (BEST-CMS) computer program.

  12. Benchmarking high performance computing architectures with CMS’ skeleton framework

    NASA Astrophysics Data System (ADS)

    Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.

    2017-10-01

    In 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high-throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel's Threading Building Blocks library, based on the measured memory and CPU overheads of the different technologies benchmarked. In 2016 CMS will get access to high performance computing resources that use new many-core architectures; machines such as Cori Phase 1&2, Theta, Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk will discuss the results of this exercise.

  13. Computer Aided Drafting Packages for Secondary Education. Edition 2. PC DOS Compatible Programs. A MicroSIFT Quarterly Report.

    ERIC Educational Resources Information Center

    Pollard, Jim

    This report reviews eight IBM-compatible software packages that are available to secondary schools to teach computer-aided drafting (CAD). Software packages to be considered were selected following reviews of CAD periodicals, computers in education periodicals, advertisements, and recommendations of teachers. The packages were then rated by…

  14. 78 FR 48169 - Privacy Act of 1974; CMS Computer Match No. 2013-02; HHS Computer Match No. 1306; DoD-DMDC Match...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-07

    ...), Defense Manpower Data Center (DMDC) and the Office of the Assistant Secretary of Defense (Health Affairs.../TRICARE. DMDC will receive the results of the computer match and provide the information to TMA for use in...

  15. 75 FR 54162 - Privacy Act of 1974

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-03

    ... Program A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503), amended the... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare and Medicaid Services [CMS Computer Match No. 2010-01; HHS Computer Match No. 1006] Privacy Act of 1974 AGENCY: Department of Health and...

  16. 78 FR 69926 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare & Medicaid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-21

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2013-0059] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare & Medicaid Services (CMS))--Match Number 1076 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching...

  17. 76 FR 21091 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare & Medicaid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-14

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2011-0022] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare & Medicaid Services (CMS))--Match Number 1076 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching...

  18. Benchmarking high performance computing architectures with CMS’ skeleton framework

    DOE PAGES

    Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.

    2017-11-23

    Here, in 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high-throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel's Threading Building Blocks library, based on the measured memory and CPU overheads of the different technologies benchmarked. In 2016 CMS will get access to high performance computing resources that use new many-core architectures; machines such as Cori Phase 1&2, Theta, Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk will discuss the results of this exercise.

  19. Exploiting multicore compute resources in the CMS experiment

    NASA Astrophysics Data System (ADS)

    Ramírez, J. E.; Pérez-Calero Yzquierdo, A.; Hernández, J. M.; CMS Collaboration

    2016-10-01

    CMS has developed a strategy to efficiently exploit the multicore architecture of the compute resources accessible to the experiment. A coherent use of the multiple cores available in a compute node yields substantial gains in terms of resource utilization. The implemented approach makes use of the multithreading support of the event processing framework and the multicore scheduling capabilities of the resource provisioning system. Multicore slots are acquired and provisioned by means of multicore pilot agents which internally schedule and execute single and multicore payloads. Multicore scheduling and multithreaded processing are currently used in production for online event selection and prompt data reconstruction. More workflows are being adapted to run in multicore mode. This paper presents a review of the experience gained in the deployment and operation of the multicore scheduling and processing system, the current status and future plans.
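
    The multicore pilots described above internally pack single-core and multicore payloads into the cores of their slot. A toy first-fit illustration of that idea follows; it is not the actual glideinWMS scheduling logic, and the payload names are invented:

    def schedule(payloads, slot_cores=8):
        """First-fit packing of (name, cores) payloads into a multicore slot."""
        free = slot_cores
        running, queued = [], []
        for name, cores in payloads:  # payloads in arrival order
            if cores <= free:
                running.append(name)
                free -= cores
            else:
                queued.append(name)   # waits until enough cores are released
        return running, queued, free

    running, queued, idle = schedule(
        [("reco", 4), ("analysis", 1), ("mc-gen", 4), ("merge", 1)])
    print(running, queued, idle)  # ['reco', 'analysis', 'merge'] ['mc-gen'] 2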

  20. Benchmarking high performance computing architectures with CMS’ skeleton framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.

    Here, in 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high-throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel's Threading Building Blocks library, based on the measured memory and CPU overheads of the different technologies benchmarked. In 2016 CMS will get access to high performance computing resources that use new many-core architectures; machines such as Cori Phase 1&2, Theta, Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk will discuss the results of this exercise.

  1. Model-free quantification of dynamic PET data using nonparametric deconvolution

    PubMed Central

    Zanderigo, Francesca; Parsey, Ramin V; Todd Ogden, R

    2015-01-01

    Dynamic positron emission tomography (PET) data are usually quantified using compartment models (CMs) or derived graphical approaches. Often, however, CMs either do not properly describe the tracer kinetics or are not identifiable, leading to nonphysiologic estimates of the tracer binding. The PET data are modeled as the convolution of the metabolite-corrected input function and the tracer impulse response function (IRF) in the tissue. Using nonparametric deconvolution methods, it is possible to obtain model-free estimates of the IRF, from which functionals related to tracer volume of distribution and binding may be computed, but this approach has rarely been applied in PET. Here, we apply nonparametric deconvolution using singular value decomposition to simulated and test–retest clinical PET data with four reversible tracers well characterized by CMs ([11C]CUMI-101, [11C]DASB, [11C]PE2I, and [11C]WAY-100635), and systematically compare the reproducibility, reliability, and identifiability of various IRF-derived functionals with those of traditional CM outcomes. Results show that nonparametric deconvolution, completely free of any model assumptions, allows for estimates of tracer volume of distribution and binding that are very close to the estimates obtained with CMs and, in some cases, show better test–retest performance than CM outcomes. PMID:25873427
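
    The convolution model above lends itself to a compact numerical illustration: discretize the convolution as a lower-triangular matrix built from the input function, invert it by SVD with truncation of small singular values, and integrate the recovered IRF. The sketch below uses synthetic, noise-free data and a made-up truncation threshold; it mirrors the general idea only, not the exact method of the paper:

    import numpy as np

    dt = 1.0
    t = np.arange(0.0, 60.0, dt)
    input_fn = np.exp(-t / 10.0)        # synthetic metabolite-corrected input function
    true_irf = 0.1 * np.exp(-t / 20.0)  # synthetic tissue impulse response function

    # lower-triangular convolution matrix: (A @ irf) discretizes input (*) irf
    n = len(t)
    A = dt * np.array([[input_fn[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    tac = A @ true_irf                  # noise-free tissue time-activity curve

    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > 0.1 * s[0], 1.0 / s, 0.0)  # truncate small singular values
    irf_est = Vt.T @ (s_inv * (U.T @ tac))          # regularized pseudo-inverse solve

    vt_est = np.sum(irf_est) * dt       # volume of distribution ~ integral of the IRF
    print(round(float(vt_est), 3))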

  2. 77 FR 33547 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare and Medicaid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-06

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0015] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare and Medicaid Services (CMS))--Match Number 1094 AGENCY: Social Security Administration (SSA). ACTION: Notice of a new computer matching program that will expire...

  3. CompHEP: developments and applications

    NASA Astrophysics Data System (ADS)

    Boos, E. E.; Bunichev, V. E.; Dubinin, M. N.; Ilyin, V. A.; Savrin, V. I.; CompHEP Collaboration

    2017-11-01

    New developments of the CompHEP package and its applications to top quark and Higgs boson physics at the LHC collider are reviewed. These developments were motivated mainly by the needs of the experimental searches of the D0 (Tevatron) and CMS (LHC) collaborations, where identification of the top quark and the Higgs boson in the framework of the Standard Model (SM) or possible extensions of the SM played an important role. New useful features of the CompHEP Graphics User Interface (GUI) are described.

  4. Human induced pluripotent stem cell‐derived versus adult cardiomyocytes: an in silico electrophysiological study on effects of ionic current block

    PubMed Central

    Paci, M; Hyttinen, J; Rodriguez, B

    2015-01-01

    Background and Purpose Two new technologies are likely to revolutionize cardiac safety and drug development: in vitro experiments on human‐induced pluripotent stem cell‐derived cardiomyocytes (hiPSC‐CMs) and in silico human adult ventricular cardiomyocyte (hAdultV‐CM) models. Their combination was recently proposed as a potential replacement for the present hERG‐based QT study for pharmacological safety assessments. Here, we systematically compared in silico the effects of selective ionic current block on hiPSC‐CM and hAdultV‐CM action potentials (APs), to identify similarities/differences and to illustrate the potential of computational models as supportive tools for evaluating new in vitro technologies. Experimental Approach In silico AP models of ventricular‐like and atrial‐like hiPSC‐CMs and hAdultV‐CM were used to simulate the main effects of four degrees of block of the main cardiac transmembrane currents. Key Results Qualitatively, hiPSC‐CM and hAdultV‐CM APs showed similar responses to current block, consistent with results from experiments. However, quantitatively, hiPSC‐CMs were more sensitive to block of (i) L‐type Ca2+ currents due to the overexpression of the Na+/Ca2+ exchanger (leading to shorter APs) and (ii) the inward rectifier K+ current due to reduced repolarization reserve (inducing diastolic potential depolarization and repolarization failure). Conclusions and Implications In silico hiPSC‐CMs and hAdultV‐CMs exhibit a similar response to selective current blocks. However, overall hiPSC‐CMs show greater sensitivity to block, which may facilitate in vitro identification of drug‐induced effects. Extrapolation of drug effects from hiPSC‐CM to hAdultV‐CM and pro‐arrhythmic risk assessment can be facilitated by in silico predictions using biophysically‐based computational models. PMID:26276951

  5. Evolution of CMS workload management towards multicore job support

    NASA Astrophysics Data System (ADS)

    Pérez-Calero Yzquierdo, A.; Hernández, J. M.; Khan, F. A.; Letts, J.; Majewski, K.; Rodrigues, A. M.; McCrea, A.; Vaandering, E.

    2015-12-01

    The successful exploitation of multicore processor architectures is a key element of the LHC distributed computing system in the coming era of the LHC Run 2. High-pileup complex-collision events represent a challenge for the traditional sequential programming in terms of memory and processing time budget. The CMS data production and processing framework is introducing the parallel execution of the reconstruction and simulation algorithms to overcome these limitations. CMS plans to execute multicore jobs while still supporting singlecore processing for other tasks difficult to parallelize, such as user analysis. The CMS strategy for job management thus aims at integrating single and multicore job scheduling across the Grid. This is accomplished by employing multicore pilots with internal dynamic partitioning of the allocated resources, capable of running payloads of various core counts simultaneously. An extensive test programme has been conducted to enable multicore scheduling with the various local batch systems available at CMS sites, with the focus on the Tier-0 and Tier-1s, responsible during 2015 of the prompt data reconstruction. Scale tests have been run to analyse the performance of this scheduling strategy and ensure an efficient use of the distributed resources. This paper presents the evolution of the CMS job management and resource provisioning systems in order to support this hybrid scheduling model, as well as its deployment and performance tests, which will enable CMS to transition to a multicore production model for the second LHC run.

  6. Evolution of CMS Workload Management Towards Multicore Job Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez-Calero Yzquierdo, A.; Hernández, J. M.; Khan, F. A.

    The successful exploitation of multicore processor architectures is a key element of the LHC distributed computing system in the coming era of the LHC Run 2. High-pileup complex-collision events represent a challenge for the traditional sequential programming in terms of memory and processing time budget. The CMS data production and processing framework is introducing the parallel execution of the reconstruction and simulation algorithms to overcome these limitations. CMS plans to execute multicore jobs while still supporting singlecore processing for other tasks difficult to parallelize, such as user analysis. The CMS strategy for job management thus aims at integrating single and multicore job scheduling across the Grid. This is accomplished by employing multicore pilots with internal dynamic partitioning of the allocated resources, capable of running payloads of various core counts simultaneously. An extensive test programme has been conducted to enable multicore scheduling with the various local batch systems available at CMS sites, with the focus on the Tier-0 and Tier-1s, responsible during 2015 of the prompt data reconstruction. Scale tests have been run to analyse the performance of this scheduling strategy and ensure an efficient use of the distributed resources. This paper presents the evolution of the CMS job management and resource provisioning systems in order to support this hybrid scheduling model, as well as its deployment and performance tests, which will enable CMS to transition to a multicore production model for the second LHC run.

  7. 78 FR 48170 - Privacy Act of 1974; CMS Computer Match No. 2013-12; HHS Computer Match No. 1307; SSA Computer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-07

    ... Wesolowski, Director, Verifications Policy & Operations Branch, Division of Eligibility and Enrollment Policy..., electronic interfaces and an on-line system for the verification of eligibility. PURPOSE(S) OF THE MATCHING... Security number (SSN) verifications, (2) a death indicator, (3) an indicator of a finding of disability by...

  8. Function Package for Computing Quantum Resource Measures

    NASA Astrophysics Data System (ADS)

    Huang, Zhiming

    2018-05-01

    In this paper, we present a function package to calculate quantum resource measures and the dynamics of open systems. Our package includes common operators and operator lists, and frequently-used functions for computing quantum entanglement, quantum correlation, quantum coherence, quantum Fisher information, and dynamics in noisy environments. We briefly explain the functions of the package and illustrate how to use it with several typical examples. We expect this package to be a useful tool for future research and education.
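
    Two illustrative stand-ins for the kinds of measures such a package computes (the abstract does not show the package's actual API, so the function names here are hypothetical): the l1-norm of coherence and the von Neumann entropy of a density matrix.

        import numpy as np

        def l1_coherence(rho):
            """Sum of the absolute off-diagonal elements of a density matrix."""
            return float(np.abs(rho).sum() - np.abs(np.diag(rho)).sum())

        def von_neumann_entropy(rho):
            """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
            evals = np.linalg.eigvalsh(rho)
            evals = evals[evals > 1e-12]
            return float(-(evals * np.log2(evals)).sum())

        plus = np.array([[0.5, 0.5], [0.5, 0.5]])  # |+><+|, maximally coherent qubit
        print(l1_coherence(plus))          # -> 1.0
        print(von_neumann_entropy(plus))   # -> 0.0 (pure state)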

  9. Fast access to the CMS detector condition data employing HTML5 technologies

    NASA Astrophysics Data System (ADS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    This paper focuses on using HTML version 5 (HTML5) for accessing condition data for the CMS experiment, evaluating the benefits and risks posed by the use of this technology. According to the authors of HTML5, this technology attempts to solve issues found in previous iterations of HTML and addresses the needs of web applications, an area previously not adequately covered by HTML. We demonstrate that employing HTML5 brings important benefits in terms of access performance to the CMS condition data. The combined use of web storage and web sockets increases performance and reduces costs in terms of computation power, memory usage, and network bandwidth for both client and server. Above all, web workers allow creating scripts that execute in multi-threaded mode, exploiting multi-core microprocessors. Web workers have been employed to substantially decrease the web page rendering time needed to display the condition data stored in the CMS condition database.

  10. Implementation of radiation shielding calculation methods. Volume 1: Synopsis of methods and summary of results

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.

    1971-01-01

    The work performed in the following areas is summarized: (1) a realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package, which includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes; (2) techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system; and (3) the MSFC master data libraries were updated.

  11. 78 FR 49525 - Privacy Act of 1974; CMS Computer Match No. 2013-06; HHS Computer Match No. 1308

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... Care Act of 2010 (Pub. L. 111-148), as amended by the Health Care and Education Reconciliation Act of..., 2009). INCLUSIVE DATES OF THE MATCH: The CMP will become effective no sooner than 40 days after the...

  12. 78 FR 49524 - Privacy Act of 1974; CMS Computer Match No. 2013-08; HHS Computer Match No. 1309

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... by the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111-152) (collectively, the ACA...). INCLUSIVE DATES OF THE MATCH: The CMP will become effective no sooner than 40 days after the report of the...

  13. 78 FR 50419 - Privacy Act of 1974; CMS Computer Match No. 2013-10; HHS Computer Match No. 1310

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... (Pub. L. 111- 148), as amended by the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111... Entitlements Program System of Records Notice, 77 FR 47415 (August 8, 2012). Inclusive Dates of the Match: The...

  14. Search for Contact Interactions in Dilepton Final State in the CMS Experiment: Generator-Level Studies

    NASA Astrophysics Data System (ADS)

    Zaleski, Shawn

    2017-01-01

    A set of contact interaction (CI) Monte Carlo events, for which Standard Model Drell-Yan events are background, is generated using a leading-order parton-shower generator, Pythia8. We consider three isoscalar models with three different helicity structures, left-left (LL), left-right/right-left (LR), and right-right (RR), each with destructive and constructive interference. For each of these models, 150,000 events are generated for analysis of CI interactions in the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) with a centre-of-mass energy of 13 TeV. This is a generator-level study, and detector effects are accounted for by applying kinematic cuts on the generator-level quantities rather than by running a detailed detector simulation package (e.g. GEANT). Distributions of dilepton invariant mass, Collins-Soper angle, and the forward-backward asymmetry are compared with those arising from pure Drell-Yan events.

  15. Standard duplex criteria overestimate the degree of stenosis after eversion carotid endarterectomy.

    PubMed

    Benzing, Travis; Wilhoit, Cameron; Wright, Sharee; McCann, P Aaron; Lessner, Susan; Brothers, Thomas E

    2015-06-01

    The eversion technique for carotid endarterectomy (eCEA) offers an alternative to longitudinal arteriotomy and patch closure (pCEA) for open carotid revascularization. In some reports, eCEA has been associated with a higher rate of >50% restenosis of the internal carotid when it is defined as peak systolic velocity (PSV) >125 cm/s by duplex imaging. Because the conformation of the carotid bifurcation may differ after eCEA compared with native carotid arteries, it was hypothesized that standard duplex criteria might not accurately reflect the presence of restenosis after eCEA. In a case-control study, the outcomes of all patients undergoing carotid endarterectomy by one surgeon during the last 10 years were analyzed retrospectively, with a primary end point of PSV >125 cm/s. Duplex flow velocities were compared with luminal diameter measurements for any carotid computed tomography arteriography or magnetic resonance angiography study obtained within 2 months of duplex imaging, with the degree of stenosis calculated by the methodology used in the North American Symptomatic Carotid Endarterectomy Trial (NASCET) and the European Carotid Surgery Trial (ECST) as well as cross-sectional area (CSA) reduction. Simulations were generated and analyzed by computational model simulations of the eCEA and pCEA arteries. Eversion and longitudinal arteriotomy with patch techniques were used in 118 and 177 carotid arteries, respectively. Duplex follow-up was available in 90 eCEA arteries at a median of 16 (range, 2-136) months and in 150 pCEA arteries at a median of 41 (range, 3-115) months postoperatively. PSV >125 cm/s was present at some time during follow-up in 31% of eCEA and pCEA carotid arteries, each, and in the most recent duplex examination in 7% after eCEA and 21% after pCEA (P = .003), with no eCEA and two pCEA arteries occluding completely during follow-up (P = .29). In 19 carotid arteries with PSV >125 cm/s after angle correction (median, 160 cm/s; interquartile range, 146-432 cm/s) after eCEA that were subsequently examined by axial imaging, the mean percentage stenosis was 8% ± 11% by NASCET, 11% ± 5% by ECST, and 20% ± 9% by CSA criteria. For eight pCEA arteries with PSV >125 cm/s (median velocity, 148 cm/s; interquartile range, 139-242 cm/s), the corresponding NASCET, ECST, and CSA stenoses were 8% ± 35%, 26% ± 32%, and 25% ± 33%, respectively. NASCET internal carotid diameter reduction of at least 50% was noted by axial imaging after two of the eight pCEAs, and the PSV exceeded 200 cm/s in each case. The presence of hemodynamically significant carotid artery restenosis may be overestimated by standard duplex criteria after eCEA and perhaps after pCEA. Insufficient information currently exists to determine what PSV does correspond to hemodynamically significant restenosis. Published by Elsevier Inc.
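
    For reference, the NASCET and ECST degrees of stenosis used above reduce to simple diameter ratios; a minimal sketch with hypothetical measurements in mm (not study data):

        def nascet_stenosis(min_lumen, distal_ica):
            """NASCET: narrowing relative to the normal distal ICA diameter."""
            return (1.0 - min_lumen / distal_ica) * 100.0

        def ecst_stenosis(min_lumen, local_bulb):
            """ECST: narrowing relative to the estimated local (bulb) diameter."""
            return (1.0 - min_lumen / local_bulb) * 100.0

        print(nascet_stenosis(min_lumen=4.5, distal_ica=5.0))  # -> 10.0 (mild)
        print(ecst_stenosis(min_lumen=4.5, local_bulb=8.0))    # -> 43.75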

  16. Intelligent systems and advanced user interfaces for design, operation, and maintenance of command management systems

    NASA Technical Reports Server (NTRS)

    Potter, William J.; Mitchell, Christine M.

    1993-01-01

    Historically, command management systems (CMS) have been large and expensive spacecraft-specific software systems that were costly to build, operate, and maintain. Current and emerging hardware, software, and user interface technologies may offer an opportunity to facilitate the initial formulation and design of a spacecraft-specific CMS as well as to develop a more generic CMS system. New technologies, in addition to a core CMS common to a range of spacecraft, may facilitate the training and enhance the efficiency of CMS operations. Current mission operations center (MOC) hardware and software include Unix workstations, the C/C++ programming languages, and an X window interface. This configuration provides the power and flexibility to support sophisticated and intelligent user interfaces that exploit state-of-the-art technologies in human-machine interaction, artificial intelligence, and software engineering. One of the goals of this research is to explore the extent to which technologies developed in the research laboratory can be productively applied in a complex system such as spacecraft command management. Initial examination of some of these issues in CMS design and operation suggests that application of technologies such as intelligent planning, case-based reasoning, human-machine systems design and analysis tools (e.g., operator and designer models), and human-computer interaction tools (e.g., graphics, visualization, and animation) may provide significant savings in the design, operation, and maintenance of the CMS for a specific spacecraft as well as continuity for CMS design and development across spacecraft. The first six months of this research saw a broad investigation by Georgia Tech researchers into the function, design, and operation of current and planned command management systems at Goddard Space Flight Center. As the first step, the researchers attempted to understand the current and anticipated horizons of command management systems at Goddard. Preliminary results are given on CMS commonalities and causes of low re-use, and methods are proposed to facilitate increased re-use.

  17. Computer designed compensation filters for use in radiation therapy. Master's thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Higgins, R. Jr.

    1982-12-01

    A computer program was written in the MUMPS language to design filters for use in cancer radiotherapy. The filter corrects for patient surface irregularities and allows homogeneous dose distribution with depth in the patient. The program does not correct for variations in the density of the patient. The program uses data available from the software in Computerized Medical Systems Inc.'s Radiation Treatment Planning package. External contours of General Electric CAT scans are made using the RTP software. The program uses the data from these external contours in designing the compensation filters. The program is written to process from 3 to 31 CAT scan slices, each 1 cm thick. The output from the program can be in one of two different forms. The first option will drive the probe of a CMS Water Phantom in three dimensions as if it were the bit of a routing machine. Thus a routing machine constructed to run from the same output that drives the Water Phantom probe would produce a three-dimensional filter mold. The second option is a listing of thicknesses for an array of aluminum blocks to filter the radiation. The size of the filter array is 10 in. by 10 in. The Printronix printer provides an array of blocks 1/2 in. by 1/2 in. with the thickness in millimeters printed inside each block.

  18. Theoretical analysis of HVAC duct hanger systems

    NASA Technical Reports Server (NTRS)

    Miller, R. D.

    1987-01-01

    Several methods are presented which, together, may be used in the analysis of duct hanger systems over a wide range of frequencies. The finite element method (FEM) and component mode synthesis (CMS) method are used for low- to mid-frequency range computations and have been shown to yield reasonably close results. The statistical energy analysis (SEA) method yields predictions which agree with the CMS results for the 800 to 1000 Hz range provided that a sufficient number of modes participate. The CMS approach has been shown to yield valuable insight into the mid-frequency range of the analysis. It has been demonstrated that it is possible to conduct an analysis of a duct/hanger system in a cost-effective way for a wide frequency range, using several methods which overlap for several frequency bands.

  19. 78 FR 39730 - Privacy Act of 1974; CMS Computer Match No. 2013-11; HHS Computer Match No. 1302

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-02

    ... (Pub. L. 111-148), as amended by the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111... 78 FR 32256 on May 29, 2013. Inclusive Dates of the Match: The CMP shall become effective no sooner...

  20. 45 CFR 150.429 - Computation of time and extensions of time.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Computation of time and extensions of time. 150.429 Section 150.429 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings...

  1. 45 CFR 150.429 - Computation of time and extensions of time.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Computation of time and extensions of time. 150.429 Section 150.429 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings...

  2. Tracking at High Level Trigger in CMS

    NASA Astrophysics Data System (ADS)

    Tosi, M.

    2016-04-01

    The trigger systems of the LHC detectors play a crucial role in determining the physics capabilities of the experiments. A reduction of several orders of magnitude of the event rate is needed to reach values compatible with detector readout, offline storage and analysis capability. The CMS experiment has been designed with a two-level trigger system: the Level-1 Trigger (L1T), implemented on custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a trade-off between the complexity of the algorithms, the sustainable output rate, and the selection efficiency. With the computing power available during the 2012 data taking, the maximum reconstruction time at the HLT was about 200 ms per event, at the nominal L1T rate of 100 kHz. Track reconstruction algorithms are widely used in the HLT, for the reconstruction of physics objects as well as in the identification of b-jets and lepton isolation. Reconstructed tracks are also used to distinguish the primary vertex, which identifies the hard interaction process, from the pileup vertices. This task is particularly important in the LHC environment, given the large number of interactions per bunch crossing: on average 25 in 2012, and expected to be around 40 in Run II. We will present the performance of the HLT tracking algorithms, discussing their impact on the CMS physics program, as well as new developments made towards the next data taking in 2015.

  3. EFTofPNG: a package for high precision computation with the effective field theory of post-Newtonian gravity

    NASA Astrophysics Data System (ADS)

    Levi, Michele; Steinhoff, Jan

    2017-12-01

    We present a novel public package ‘EFTofPNG’ for high precision computation in the effective field theory of post-Newtonian (PN) gravity, including spins. We created this package in view of the timely need to publicly share automated computation tools, which integrate the various types of physics manifested in the expected increasing influx of gravitational wave (GW) data. Hence, we created a free and open source package, which is self-contained, modular, all-inclusive, and accessible to the classical gravity community. The ‘EFTofPNG’ Mathematica package also uses the power of the ‘xTensor’ package, suited for complicated tensor computation, where our coding also strategically approaches the generic generation of Feynman contractions, which is universal to all perturbation theories in physics, by efficiently treating n-point functions as tensors of rank n. The package currently contains four independent units, which serve as subsidiaries to the main one. Its final unit serves as a pipeline chain for obtaining the final GW templates, and provides the full computation of derivatives and physical observables of interest. The upcoming ‘EFTofPNG’ package version 1.0 should cover the point mass sector, and all the spin sectors, up to the fourth PN order, and the two-loop level. We expect and strongly encourage public development of the package to improve its efficiency, and to extend it to further PN sectors and observables useful for waveform modelling.
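
    A toy illustration of the "n-point functions as tensors of rank n" remark, using NumPy's einsum as a generic contraction engine (the package itself works symbolically with xTensor in Mathematica; nothing here reflects its actual code):

        import numpy as np

        dim = 4                                       # index range per tensor leg
        three_point = np.random.rand(dim, dim, dim)   # toy rank-3 "3-point function"
        propagator = np.random.rand(dim, dim)         # toy rank-2 "2-point function"

        # Contract two legs of the vertex with propagators, leaving the
        # third leg and the propagators' outer indices free - the generic
        # pattern behind automated Feynman contractions.
        result = np.einsum('abc,ad,be->cde', three_point, propagator, propagator)
        print(result.shape)  # (4, 4, 4)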

  4. First Experiences with CMS Data Storage on the GEMSS System at the INFN-CNAF Tier-1

    NASA Astrophysics Data System (ADS)

    Andreotti, D.; Bonacorsi, D.; Cavalli, A.; Pra, S. Dal; Dell'Agnello, L.; Forti, Alberto; Grandi, C.; Gregori, D.; Gioi, L. Li; Martelli, B.; Prosperini, A.; Ricci, P. P.; Ronchieri, Elisabetta; Sapunenko, V.; Sartirana, A.; Vagnoni, V.; Zappi, Riccardo

    A brand new Mass Storage System solution called "Grid-Enabled Mass Storage System" (GEMSS), based on the Storage Resource Manager (StoRM) developed by INFN, on the General Parallel File System by IBM, and on the Tivoli Storage Manager by IBM, has been tested and deployed at the INFN-CNAF Tier-1 Computing Centre in Italy. After a successful stress test phase, the solution is now used in production for the data custodiality of the CMS experiment at CNAF. All data previously recorded on the CASTOR system have been transferred to GEMSS. As a final validation of the GEMSS system, some of the computing tests done in the context of the WLCG "Scale Test for the Experiment Program" (STEP'09) challenge were repeated in September-October 2009 and compared with the results previously obtained with CASTOR in June 2009. In this paper, the GEMSS system basics, the stress test activity and the deployment phase, as well as the reliability and performance of the system, are overviewed. The experiences in the use of GEMSS at CNAF in preparing for the first months of data taking of the CMS experiment at the Large Hadron Collider are also presented.

  5. CMS Readiness for Multi-Core Workload Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez-Calero Yzquierdo, A.; Balcas, J.; Hernandez, J.

    In the present run of the LHC, CMS data reconstruction and simulation algorithms benefit greatly from being executed as multiple threads running on several processor cores. The complexity of the Run 2 events requires parallelization of the code to reduce the memory-per-core footprint constraining serial execution programs, thus optimizing the exploitation of present multi-core processor architectures. The allocation of computing resources for multi-core tasks, however, becomes a complex problem in itself. The CMS workload submission infrastructure employs multi-slot partitionable pilots, built on HTCondor and GlideinWMS native features, to enable scheduling of single and multi-core jobs simultaneously. This provides a solution for the scheduling problem in a uniform way across grid sites running a diversity of gateways to compute resources and batch system technologies. This paper presents this strategy and the tools on which it has been implemented. The experience of managing multi-core resources at the Tier-0 and Tier-1 sites during 2015, along with the deployment phase to Tier-2 sites during early 2016, is reported. The process of performance monitoring and optimization to achieve efficient and flexible use of the resources is also described.

  6. CMS readiness for multi-core workload scheduling

    NASA Astrophysics Data System (ADS)

    Perez-Calero Yzquierdo, A.; Balcas, J.; Hernandez, J.; Aftab Khan, F.; Letts, J.; Mason, D.; Verguilov, V.

    2017-10-01

    In the present run of the LHC, CMS data reconstruction and simulation algorithms benefit greatly from being executed as multiple threads running on several processor cores. The complexity of the Run 2 events requires parallelization of the code to reduce the memory-per-core footprint constraining serial execution programs, thus optimizing the exploitation of present multi-core processor architectures. The allocation of computing resources for multi-core tasks, however, becomes a complex problem in itself. The CMS workload submission infrastructure employs multi-slot partitionable pilots, built on HTCondor and GlideinWMS native features, to enable scheduling of single and multi-core jobs simultaneously. This provides a solution for the scheduling problem in a uniform way across grid sites running a diversity of gateways to compute resources and batch system technologies. This paper presents this strategy and the tools on which it has been implemented. The experience of managing multi-core resources at the Tier-0 and Tier-1 sites during 2015, along with the deployment phase to Tier-2 sites during early 2016, is reported. The process of performance monitoring and optimization to achieve efficient and flexible use of the resources is also described.

  7. The gputools package enables GPU computing in R.

    PubMed

    Buckner, Joshua; Wilson, Justin; Seligman, Mark; Athey, Brian; Watson, Stanley; Meng, Fan

    2010-01-01

    By default, the R statistical environment does not make use of parallelism. Researchers may resort to expensive solutions such as cluster hardware for large analysis tasks. Graphics processing units (GPUs) provide an inexpensive and computationally powerful alternative. Using R and the CUDA toolkit from Nvidia, we have implemented several functions commonly used in microarray gene expression analysis for GPU-equipped computers. R users can take advantage of the better performance provided by an Nvidia GPU. The package is available from CRAN, the R project's repository of packages, at http://cran.r-project.org/web/packages/gputools. More information about our gputools R package is available at http://brainarray.mbni.med.umich.edu/brainarray/Rgpgpu
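
    A Python analogue of the same offloading idea (gputools itself is an R package): move a gene-gene correlation computation, a typical microarray task, onto the GPU with CuPy, whose corrcoef mirrors numpy.corrcoef. Assumes a CUDA-capable GPU and an installed CuPy.

        import numpy as np
        import cupy as cp

        expr = np.random.rand(5000, 100)   # genes x samples, in host memory
        expr_gpu = cp.asarray(expr)        # copy the matrix to the GPU
        corr_gpu = cp.corrcoef(expr_gpu)   # 5000 x 5000 gene-gene correlations
        corr = cp.asnumpy(corr_gpu)        # copy the result back to the host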

  8. Harmonisation of Global Land-Use Scenarios for the Period 1500-2100 for IPCC-AR5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurtt, George; Chini, Louise Parsons; Frolking, Steve

    2009-06-01

    In preparation for the fifth Intergovernmental Panel on Climate Change climate change assessment (IPCC-AR5), the international community is developing new advanced computer models (CMs) to address the combined effects of human activities (e.g. land-use and fossil fuel emissions) on the carbon-climate system. In addition, four Representative Concentration Pathway (RCP) scenarios of the future (2005-2100) are being developed by four Integrated Assessment Modeling teams (IAMs) to be used as input to the CMs for future climate projections. The diversity of requirements and approaches among CMs and IAMs for tracking land-use changes (past, present, and future) presents major challenges for treating land-use comprehensively and consistently between these communities. As part of an international working group, we have been working to meet these challenges by developing a "harmonized" set of land-use change scenarios that smoothly connects gridded historical reconstructions of land-use with future projections, in a format required by CMs. This approach to harmonizing the treatment of land-use between two key modeling communities, CMs and IAMs, represents a major advance that will facilitate more consistent and fuller treatments of land-use/land-use change effects including both CO2 emissions and corresponding land-surface changes.

  9. Comparison of the accuracy of maxillary position between conventional model surgery and virtual surgical planning.

    PubMed

    Ritto, F G; Schmitt, A R M; Pimentel, T; Canellas, J V; Medeiros, P J

    2018-02-01

    The aim of this study was to determine whether virtual surgical planning (VSP) is an accurate method for positioning the maxilla when compared to conventional articulator model surgery (CMS), through the superimposition of computed tomography (CT) images. This retrospective study included the records of 30 adult patients submitted to bimaxillary orthognathic surgery. Two groups were created according to the treatment planning performed: CMS and VSP. The treatment planning protocol was the same for all patients. Pre- and postoperative CT images were superimposed and the linear distances between upper jaw reference points were measured. Measurements were then compared to the treatment planning, and the difference in accuracy between CMS and VSP was determined using the t-test for independent samples. The success criterion adopted was a mean linear difference of <2mm. The mean linear difference between planned and obtained movements for CMS was 1.27±1.05mm, and for VSP was 1.20±1.08mm. With CMS, 80% of overlapping reference points had a difference of <2mm, while for VSP this value was 83.6%. There was no statistically significant difference between the two techniques regarding accuracy (P>0.05). Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
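
    The accuracy comparison described above reduces to an independent-samples t-test on the planned-versus-achieved linear differences; a sketch with synthetic values drawn to match the reported means and standard deviations (not the study data):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        cms_diff = rng.normal(1.27, 1.05, 15)  # mm, conventional model surgery
        vsp_diff = rng.normal(1.20, 1.08, 15)  # mm, virtual surgical planning

        t, p = stats.ttest_ind(cms_diff, vsp_diff)
        print(f"t = {t:.2f}, p = {p:.3f}")     # p > 0.05: no significant difference
        # Success criterion from the study: mean linear difference < 2 mm.
        print(cms_diff.mean() < 2.0, vsp_diff.mean() < 2.0)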

  10. Progress in Machine Learning Studies for the CMS Computing Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonacorsi, Daniele; Kuznetsov, Valentin; Magini, Nicolo

    Here, computing systems for LHC experiments have developed together with Grids worldwide. While a complete description of the original Grid-based infrastructure and services for LHC experiments and its recent evolutions can be found elsewhere, it is worth mentioning here the scale of the computing resources needed to fulfill the needs of LHC experiments in Run-1 and Run-2 so far.

  11. Progress in Machine Learning Studies for the CMS Computing Infrastructure

    DOE PAGES

    Bonacorsi, Daniele; Kuznetsov, Valentin; Magini, Nicolo; ...

    2017-12-06

    Here, computing systems for LHC experiments have developed together with Grids worldwide. While a complete description of the original Grid-based infrastructure and services for LHC experiments and its recent evolutions can be found elsewhere, it is worth mentioning here the scale of the computing resources needed to fulfill the needs of LHC experiments in Run-1 and Run-2 so far.

  12. 42 CFR 413.337 - Methodology for calculating the prospective payment rates.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... excluded from the data base used to compute the Federal payment rates. In addition, allowable costs related to exceptions payments under § 413.30(f) are excluded from the data base used to compute the Federal... prospective payment rates. (a) Data used. (1) To calculate the prospective payment rates, CMS uses— (i...

  13. 42 CFR 413.337 - Methodology for calculating the prospective payment rates.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... excluded from the data base used to compute the Federal payment rates. In addition, allowable costs related to exceptions payments under § 413.30(f) are excluded from the data base used to compute the Federal... prospective payment rates. (a) Data used. (1) To calculate the prospective payment rates, CMS uses— (i...

  14. 42 CFR 413.337 - Methodology for calculating the prospective payment rates.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... excluded from the data base used to compute the Federal payment rates. In addition, allowable costs related to exceptions payments under § 413.30(f) are excluded from the data base used to compute the Federal... prospective payment rates. (a) Data used. (1) To calculate the prospective payment rates, CMS uses— (i...

  15. Morphometric analysis of stab wounds by MSCT and MRI after the instillation of contrast medium.

    PubMed

    Fais, Paolo; Cecchetto, Giovanni; Boscolo-Berto, Rafael; Toniolo, Matteo; Viel, Guido; Miotto, Diego; Montisci, Massimo; Tagliaro, Franco; Giraudo, Chiara

    2016-06-01

    To analyze the morphology and depth of stab wounds experimentally produced on human legs amputated for medical reasons using multislice computed tomography (MSCT) and magnetic resonance imaging (MRI) after the instillation of a single contrast medium solution (CMS). For morphological analysis, MSCT and MRI scans were performed before and after the instillation of CMS into the wound cavity. Depth measurements were performed on the sagittal view only after CMS instillation. Subsequently, each wound was dissected using the layer-by-layer technique and the depth was measured by a ruler. One-way between-groups pairwise analysis of variance (ANOVA) and Bland-Altman plot analysis were used for comparing radiological and anatomical measurements. Unenhanced MSCT images did not identify the wound channels, whereas unenhanced MRI evidenced the wound cavity in 50 % of cases. After the instillation of CMS, both MSCT and MRI depicted the wound channel in all the investigated stabbings, although the morphology of the cavity was irregular and did not resemble the shape of the blade. The radiological measurements of the wounds' depth, after the application of CMS, exhibited a high level of agreement (about 95 % at Bland-Altman plot analysis) with the anatomical measurements at dissection. A similar systematic underestimation, however, has been evidenced for MSCT (average 11.4 %; 95 % CI 7-17) and MRI (average 9.6 %; 95 % CI 6-13) data after the instillation of CMS with respect to wound dissection measurements. MSCT and MRI after the instillation of CMS can be used for depicting the morphometric features of stab wounds, although depth measurements are affected by a slight systematic underestimation compared to layer-by-layer dissection.
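
    The Bland-Altman comparison used above is straightforward to compute; a minimal sketch with hypothetical depth measurements in mm (not the study's data):

        import numpy as np

        def bland_altman(method_a, method_b):
            """Return the mean bias and 95% limits of agreement between two methods."""
            diff = np.asarray(method_a, float) - np.asarray(method_b, float)
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)
            return bias, (bias - half_width, bias + half_width)

        mri_depth = [30.1, 42.5, 25.3, 38.0]     # CMS-enhanced MRI readings
        dissection = [33.0, 47.0, 28.5, 43.0]    # layer-by-layer dissection
        bias, limits = bland_altman(mri_depth, dissection)
        print(bias, limits)  # a negative bias reflects systematic underestimation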

  16. tkLayout: a design tool for innovative silicon tracking detectors

    NASA Astrophysics Data System (ADS)

    Bianchi, G.

    2014-03-01

    A new CMS tracker is scheduled to become operational for the LHC Phase 2 upgrade in the early 2020s. tkLayout is a software package developed to create 3D models for the design of the CMS tracker and to evaluate its fundamental performance figures. The new tracker will have to cope with much higher luminosity conditions, resulting in increased track density, harsher radiation exposure and, especially, much higher data acquisition bandwidth, such that equipping the tracker with triggering capabilities is envisaged. The design of an innovative detector involves deciding on an architecture offering the best trade-off among many figures of merit, such as tracking resolution, power dissipation, bandwidth, and cost. Quantitatively evaluating these figures of merit as early as possible in the design phase is of capital importance, and it is best done with the aid of software models. tkLayout is a flexible modeling tool: new performance estimates and support for different detector geometries can be quickly added, thanks to its modular structure. In addition, the software executes very quickly (in about two minutes), so that many possible architectural variations can be rapidly modeled and compared, to help in the choice of a viable detector layout and then to optimize it. A tracker geometry is generated from simple configuration files defining the module types, layout and materials. Support structures are automatically added and services routed to provide a realistic tracker description. The tracker geometries thus generated can be exported to the standard CMS simulation framework (CMSSW) for full Monte Carlo studies. tkLayout has proven essential in giving guidance to CMS in studying different detector layouts and exploring the feasibility of innovative solutions for tracking detectors, in terms of design, performance and projected costs. This tool has been one of the keys to making important design decisions for over five years now and has also enabled project engineers and simulation experts to focus their efforts on other important or specific issues. Even if tkLayout was designed for the CMS tracker upgrade project, its flexibility makes it experiment-agnostic, so that it could be easily adapted to model other tracking detectors. The technology behind tkLayout is presented, as well as some of the results obtained in the context of the CMS silicon tracker design studies.

  17. The HEPCloud Facility: elastic computing for High Energy Physics - The NOvA Use Case

    NASA Astrophysics Data System (ADS)

    Fuess, S.; Garzoglio, G.; Holzman, B.; Kennedy, R.; Norman, A.; Timm, S.; Tiradani, A.

    2017-10-01

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the “elastic” provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores, from 150,000 cores (a 38 percent increase), in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the locally allocated resources and utilizing AWS S3 storage to optimize data handling operations and costs. NOvA was using the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper describes the Fermilab HEPCloud Facility and the challenges overcome for the CMS and NOvA communities.

  18. From the CMS Computing Experience in the WLCG STEP'09 Challenge to the First Data Taking of the LHC Era

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Gutsche, O.

    The Worldwide LHC Computing Grid (WLCG) project decided in March 2009 to perform scale tests of parts of its overall Grid infrastructure before the start of the LHC data taking. The "Scale Test for the Experiment Program" (STEP'09) was performed mainly in June 2009, with further selected tests in September and October 2009, and emphasized the simultaneous test of the computing systems of all 4 LHC experiments. CMS tested its Tier-0 tape writing and processing capabilities. The Tier-1 tape systems were stress tested using the complete range of Tier-1 work-flows: transfer from Tier-0 and custody of data on tape, processing and subsequent archival, redistribution of datasets amongst all Tier-1 sites as well as burst transfers of datasets to Tier-2 sites. The Tier-2 analysis capacity was tested using bulk analysis job submissions to backfill normal user activity. In this talk, we will report on the different performed tests and present their post-mortem analysis.

  19. 10 CFR 431.92 - Definitions concerning commercial air conditioners and heat pumps.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... expressed in identical units of measurement. Commercial package air-conditioning and heating equipment means... application. Computer Room Air Conditioner means a basic model of commercial package air-conditioning and heating equipment (packaged or split) that is: Used in computer rooms, data processing rooms, or other...

  20. THE USE OF COMPUTER MODELING PACKAGES TO ILLUSTRATE UNCERTAINTY IN RISK ASSESSMENTS: AN EASE OF USE AND INTERPRETATION COMPARISON

    EPA Science Inventory

    Consistent improvements in processor speed and computer access have substantially increased the use of computer modeling by experts and non-experts alike. Several new computer modeling packages operating under graphical operating systems (i.e. Microsoft Windows or Macintosh) m...

  1. Open-source Software for Exoplanet Atmospheric Modeling

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Blecic, Jasmina; Harrington, Joseph

    2018-01-01

    I will present a suite of self-standing open-source tools to model and retrieve exoplanet spectra implemented for Python. These include: (1) a Bayesian-statistical package to run Levenberg-Marquardt optimization and Markov-chain Monte Carlo posterior sampling, (2) a package to compress line-transition data from HITRAN or Exomol without loss of information, (3) a package to compute partition functions for HITRAN molecules, (4) a package to compute collision-induced absorption, and (5) a package to produce radiative-transfer spectra of transit and eclipse exoplanet observations and atmospheric retrievals.

  2. SAHARA: A package of PC computer programs for estimating both log-hyperbolic grain-size parameters and standard moments

    NASA Astrophysics Data System (ADS)

    Christiansen, Christian; Hartmann, Daniel

    This paper documents a package of menu-driven POLYPASCAL87 computer programs for handling grouped observations data from both sieving (increment data) and settling tube procedures (cumulative data). The package is designed deliberately for use on IBM-compatible personal computers. Two of the programs solve the numerical problem of determining the estimates of the four (main) parameters of the log-hyperbolic distribution and their derivatives. The package also contains a program for determining the mean, sorting, skewness, and kurtosis according to the standard moments. Moreover, the package contains procedures for smoothing and grouping of settling tube data. A graphic part of the package plots the data in a log-log plot together with the estimated log-hyperbolic curve, accompanied by all of the estimated parameters. Another graphic option is a plot of the log-hyperbolic shape triangle with the (χ,ζ) position of the sample.
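
    The "standard moments" half of the package amounts to weighted moment statistics over grouped sieve classes; a minimal sketch assuming phi-scale class midpoints and weight percentages (illustrative numbers, not from the paper):

        import numpy as np

        def moment_statistics(phi_mid, weight_pct):
            w = np.asarray(weight_pct, float)
            w = w / w.sum()
            x = np.asarray(phi_mid, float)
            mean = (w * x).sum()
            var = (w * (x - mean) ** 2).sum()
            sorting = np.sqrt(var)                            # standard deviation
            skewness = (w * (x - mean) ** 3).sum() / sorting ** 3
            kurtosis = (w * (x - mean) ** 4).sum() / var ** 2
            return mean, sorting, skewness, kurtosis

        phi = [0.5, 1.5, 2.5, 3.5]       # class midpoints (phi units)
        pct = [10.0, 45.0, 35.0, 10.0]   # weight percent retained per class
        print(moment_statistics(phi, pct))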

  3. NMRbox: A Resource for Biomolecular NMR Computation.

    PubMed

    Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C

    2017-04-25

    Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.

  4. Intelligent Systems and Advanced User Interfaces for Design, Operation, and Maintenance of Command Management Systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1998-01-01

    Historically Command Management Systems (CMS) have been large, expensive, spacecraft-specific software systems that were costly to build, operate, and maintain. Current and emerging hardware, software, and user interface technologies may offer an opportunity to facilitate the initial formulation and design of a spacecraft-specific CMS as well as to develop a more generic CMS or a set of core components for CMS systems. Current MOC (mission operations center) hardware and software include Unix workstations, the C/C++ and Java programming languages, and X and Java window interface representations. This configuration provides the power and flexibility to support sophisticated systems and intelligent user interfaces that exploit state-of-the-art technologies in human-machine systems engineering, decision making, artificial intelligence, and software engineering. One of the goals of this research is to explore the extent to which technologies developed in the research laboratory can be productively applied in a complex system such as spacecraft command management. Initial examination of some of the issues in CMS design and operation suggests that application of technologies such as intelligent planning, case-based reasoning, design and analysis tools from a human-machine systems engineering point of view (e.g., operator and designer models), and human-computer interaction tools (e.g., graphics, visualization, and animation) may provide significant savings in the design, operation, and maintenance of a spacecraft-specific CMS, as well as continuity for CMS design and development across spacecraft with varying needs. The savings in this case is in software reuse at all stages of the software engineering process.

  5. The CMS Tier0 goes cloud and grid for LHC Run 2

    DOE PAGES

    Hufnagel, Dirk

    2015-12-23

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. Furthermore, this contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  6. The CMS Tier0 goes Cloud and Grid for LHC Run 2

    NASA Astrophysics Data System (ADS)

    Hufnagel, Dirk

    2015-12-01

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. This contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  7. Spin wave Feynman diagram vertex computation package

    NASA Astrophysics Data System (ADS)

    Price, Alexander; Javernick, Philip; Datta, Trinanjan

    Spin wave theory is a well-established theoretical technique that can correctly predict the physical behavior of ordered magnetic states. However, computing the effects of an interacting spin wave theory incorporating magnons involves a laborious by-hand derivation of Feynman diagram vertices. The process is tedious and time consuming. Hence, to improve productivity and to have another means to check the analytical calculations, we have devised a Feynman Diagram Vertex Computation package. In this talk, we will describe our research group's effort to implement a Mathematica-based symbolic Feynman diagram vertex computation package that computes spin wave vertices. Utilizing the non-commutative algebra package NCAlgebra as an add-on to Mathematica, symbolic expressions for the Feynman diagram vertices of a Heisenberg quantum antiferromagnet are obtained. Our existing code reproduces the well-known expressions for a nearest-neighbor square lattice Heisenberg model. We also discuss the case of a triangular lattice Heisenberg model, where non-collinear terms contribute to the vertex interactions.
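
    A tiny symbolic check in the same spirit, using SymPy's quantum module to verify the bosonic (magnon) commutator on which any spin-wave vertex derivation rests (the package itself is built on Mathematica and NCAlgebra; this Python sketch is only analogous):

        from sympy.physics.quantum import Commutator, Dagger
        from sympy.physics.quantum.boson import BosonOp

        a = BosonOp('a')                        # magnon annihilation operator
        print(Commutator(a, Dagger(a)).doit())  # -> 1, i.e. [a, a+] = 1
        # Normal-ordering quartic terms such as a+ a a+ a by hand is exactly
        # the labor that symbolic vertex-computation tools remove.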

  8. Cardiometabolic Syndrome in People With Spinal Cord Injury/Disease: Guideline-Derived and Nonguideline Risk Components in a Pooled Sample.

    PubMed

    Nash, Mark S; Tractenberg, Rochelle E; Mendez, Armando J; David, Maya; Ljungberg, Inger H; Tinsley, Emily A; Burns-Drecq, Patricia A; Betancourt, Luisa F; Groah, Suzanne L

    2016-10-01

    To assess cardiometabolic syndrome (CMS) risk definitions in spinal cord injury/disease (SCI/D). Cross-sectional analysis of a pooled sample. Two SCI/D academic medical and rehabilitation centers. Baseline data from subjects in 7 clinical studies were pooled; not all variables were collected in all studies; therefore, participant numbers varied from 119 to 389. The pooled sample included men (79%) and women (21%) with SCI/D >1 year at spinal cord levels spanning C3-T2 (American Spinal Injury Association Impairment Scale [AIS] grades A-D). Not applicable. We computed the prevalence of CMS using the American Heart Association/National Heart, Lung, and Blood Institute guideline (CMS diagnosis as sum of risks ≥3 method) for the following risk components: overweight/obesity, insulin resistance, hypertension, and dyslipidemia. We compared this prevalence with the risk calculated from 2 routinely used nonguideline CMS risk assessments: (1) key cut scores identifying insulin resistance derived from the homeostatic model 2 (HOMA2) method or quantitative insulin sensitivity check index (QUICKI), and (2) a cardioendocrine risk ratio based on an inflammation (C-reactive protein [CRP])-adjusted total cholesterol/high-density lipoprotein cholesterol ratio. After adjustment for multiple comparisons, injury level and AIS grade were unrelated to CMS or risk factors. Of the participants, 13% and 32.1% had CMS when using the sum of risks or HOMA2/QUICKI model, respectively. Overweight/obesity and (pre)hypertension were highly prevalent (83% and 62.1%, respectively), with risk for overweight/obesity being significantly associated with CMS diagnosis (sum of risks, χ²=10.105; adjusted P=.008). Insulin resistance was significantly associated with CMS when using the HOMA2/QUICKI model (χ²(2)=21.23, adjusted P<.001). Of the subjects, 76.4% were at moderate to high risk from elevated CRP, which was significantly associated with CMS determination (both methods; sum of risks, χ²(2)=10.198; adjusted P=.048 and HOMA2/QUICKI, χ²(2)=10.532; adjusted P=.04). As expected, guideline-derived CMS risk factors were prevalent in individuals with SCI/D. Overweight/obesity, hypertension, and elevated CRP were common in SCI/D and, because they may compound risks associated with CMS, should be considered population-specific risk determinants. Heightened surveillance for risk, and adoption of healthy living recommendations specifically directed toward weight reduction, hypertension management, and inflammation control, should be incorporated as a priority for disease prevention and management. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
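
    The guideline decision rule applied above, "CMS diagnosis as sum of risks ≥3", over the four components listed in the abstract; a minimal sketch (the boolean inputs would come from the guideline cut points applied upstream):

        def cms_diagnosis(overweight_obese, insulin_resistant,
                          hypertensive, dyslipidemic):
            """Return (risk_count, has_cms) under the sum-of-risks rule."""
            risk_count = sum([overweight_obese, insulin_resistant,
                              hypertensive, dyslipidemic])
            return risk_count, risk_count >= 3

        print(cms_diagnosis(True, False, True, False))  # -> (2, False)
        print(cms_diagnosis(True, True, True, False))   # -> (3, True)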

  9. Introduction of conditional mean spectrum and conditional spectrum in the practice of seismic safety evaluation in China

    NASA Astrophysics Data System (ADS)

    Ji, Kun; Bouaanani, Najib; Wen, Ruizhi; Ren, Yefei

    2018-05-01

    This paper aims at implementing and introducing the use of conditional mean spectrum (CMS) and conditional spectrum (CS) as the main input parameters in the practice of seismic safety evaluation (SSE) in China, instead of the currently used uniform hazard spectrum (UHS). For this purpose, a procedure for M-R-epsilon seismic hazard deaggregation in China was first developed. For illustration purposes, two different typical sites in China, with one to two dominant seismic zones, were considered as examples to carry out seismic hazard deaggregation and illustrate the construction of CMS/CS. Two types of correlation coefficients were used to generate CMS and the results were compared over a vibration period range of interest. Ground motion records were selected from the NSMONS (2007-2015) and PEER NGA-West2 databases to correspond to the target CMS and CS. Hazard consistency of the spectral accelerations of the selected ground motion records was evaluated and validated by computing the annual exceedance probability rate of the response spectra and comparing the results to the hazard curve corresponding to each site of concern at different periods. The tools developed in this work and their illustrative application to specific case studies in China are a first step towards the adoption of CMS and CS into the practice of seismic safety evaluation in this country.
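
    The CMS construction referred to above follows Baker's standard formulation, ln CMS(T) = μ_lnSa(T) + ρ(T, T*)·ε(T*)·σ_lnSa(T). The sketch below is a minimal illustration in which the GMPE log-mean and log-standard-deviation at the deaggregated (M, R), and the period-to-period correlation model (the paper compares two), are assumed to be supplied.

    ```python
    # Minimal sketch of a conditional mean spectrum: mu_ln_sa and sigma_ln_sa
    # are GMPE log-mean and log-stddev arrays over the period range,
    # rho_to_tstar is the correlation rho(T, T*) from a chosen correlation
    # model, and eps_tstar is the deaggregated epsilon at the conditioning
    # period T*. All inputs are assumed given.
    import numpy as np

    def conditional_mean_spectrum(mu_ln_sa, sigma_ln_sa, rho_to_tstar, eps_tstar):
        ln_cms = mu_ln_sa + rho_to_tstar * eps_tstar * sigma_ln_sa
        return np.exp(ln_cms)  # spectral acceleration in the units of exp(mu_ln_sa)
    ```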

  10. Computer program documentation: modified version of the JA70 aerodynamic heating computer program H800 (MINIVER) with a DISSPLA plot package

    NASA Technical Reports Server (NTRS)

    Olmedo, L.

    1980-01-01

    The changes, modifications, and inclusions adapted to the current version of the MINIVER program are discussed. Extensive modifications were made to various subroutines, and a new plot package was added. This plot package is the Johnson Space Center DISSPLA Graphics System, currently driven under an 1110 EXEC 8 configuration. User instructions for executing the MINIVER program are provided, and the plot package is described.

  11. A histological ontology of the human cardiovascular system.

    PubMed

    Mazo, Claudia; Salazar, Liliana; Corcho, Oscar; Trujillo, Maria; Alegre, Enrique

    2017-10-02

    In this paper, we describe a histological ontology of the human cardiovascular system developed in collaboration between histology experts and computer scientists. The histological ontology is developed following an existing methodology using Conceptual Models (CMs) and validated using OOPS!, expert evaluation with CMs, and an assessment of how accurately the ontology can answer its Competency Questions (CQs). It is publicly available at http://bioportal.bioontology.org/ontologies/HO and https://w3id.org/def/System. The histological ontology is developed to support complex tasks, such as supporting teaching activities, medical practices, and biomedical research, or enabling natural language interactions.

  12. Comparative Effects of Two Modes of Computer-Assisted Instructional Package on Solid Geometry Achievement

    ERIC Educational Resources Information Center

    Gambari, Isiaka Amosa; Ezenwa, Victoria Ifeoma; Anyanwu, Romanus Chogozie

    2014-01-01

    The study examined the effects of two modes of computer-assisted instructional package on solid geometry achievement amongst senior secondary school students in Minna, Niger State, Nigeria. Also, the influence of gender on the performance of students exposed to the CAI(AT) and CAI(AN) packages was examined. This study adopted a pretest-posttest…

  13. Computers and Writing. Learning Package No. 33.

    ERIC Educational Resources Information Center

    Simic, Marge, Comp.; Smith, Carl, Ed.

    Originally developed as part of a project for the Department of Defense Schools (DoDDS) system, this learning package on computers and writing is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes an overview of the project; a comprehensive search of the ERIC database; a lecture giving an…

  14. Cluster-based adaptive power control protocol using Hidden Markov Model for Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Vinutha, C. B.; Nalini, N.; Nagaraja, M.

    2017-06-01

    This paper presents strategies for an efficient and dynamic transmission power control technique intended to reduce packet drops, and hence the energy consumption of power-hungry sensor nodes operating under the highly non-linear channel conditions of Wireless Sensor Networks. We also aim to prolong network lifetime and scalability by designing a cluster-based network structure. Specifically, we consider a weight-based clustering approach in which the Cluster Head (CH) is chosen using a weight computed from distance, remaining battery power, and received signal strength (RSS). Transmission power control schemes that adapt to dynamic channel conditions are implemented using a Hidden Markov Model (HMM), whose probability transition matrix is formulated from the observed RSS measurements. Typically, the CH estimates the initial transmission power of its cluster members (CMs) from RSS using the HMM and broadcasts this value to its CMs to initialize their power levels. Further, if the CH finds variations in the link quality and RSS of the CMs, it re-computes and optimizes the transmission power levels of the nodes using the HMM to avoid packet loss due to noise interference. Simulation results demonstrate that the technique efficiently controls the power levels of sensing nodes, saving a significant quantity of energy for networks of different sizes.
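
    The first HMM ingredient described above — a probability transition matrix formulated from observed RSS measurements — can be estimated by binning RSS readings into channel states and counting transitions between consecutive readings. The sketch below is illustrative, not the paper's implementation.

    ```python
    # Minimal sketch: estimate a row-stochastic transition matrix from a
    # sequence of RSS readings quantized into channel states via fixed bins.
    import numpy as np

    def rss_transition_matrix(rss_readings, bin_edges):
        states = np.digitize(rss_readings, bin_edges)  # state index per reading
        n = len(bin_edges) + 1                         # number of quantized states
        counts = np.zeros((n, n))
        for s, s_next in zip(states[:-1], states[1:]):
            counts[s, s_next] += 1
        rows = counts.sum(axis=1, keepdims=True)
        # Rows with no observed transitions are left as zeros.
        return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)
    ```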

  15. An Interactive Computer Aided Design and Analysis Package.

    DTIC Science & Technology

    1986-03-01

    An Interactive Computer Aided Design and Analysis Package. Master's thesis by Terrence L. Ewald, Naval Postgraduate School, Monterey, California, March 1986 (report AD-A167 114, unclassified).

  16. A randomized, controlled, single-blind trial of teaching provided by a computer-based multimedia package versus lecture.

    PubMed

    Williams, C; Aubin, S; Harkin, P; Cottrell, D

    2001-09-01

    Computer-based teaching may allow effective teaching of important psychiatric knowledge and skills. To investigate the effectiveness and acceptability of computer-based teaching. A single-blind, randomized, controlled study of 166 undergraduate medical students at the University of Leeds, involving an educational intervention of either a structured lecture or a computer-based teaching package (both of equal duration). There was no difference in knowledge between the groups at baseline or immediately after teaching. Both groups made significant gains in knowledge after teaching. Students who attended the lecture rated their subjective knowledge and skills at a statistically significantly higher level than students who had used the computers. Students who had used the computer package scored higher on an objective measure of assessment skills. Students did not perceive the computer package to be as useful as the traditional lecture format, despite finding it easy to use and recommending its use to other students. Medical students rate themselves subjectively as learning less from computer-based as compared with lecture-based teaching. Objective measures suggest equivalence in knowledge acquisition and significantly greater skills acquisition for computer-based teaching.

  17. Computer Aided Drafting Packages for Secondary Education. Edition 1. Apple II and Macintosh. A MicroSIFT Quarterly Report.

    ERIC Educational Resources Information Center

    Pollard, Jim

    This report reviews software packages for Apple Macintosh and Apple II computers available to secondary schools to teach computer-aided drafting (CAD). Products for the report were gathered through reviews of CAD periodicals, computers in education periodicals, advertisements, and teacher recommendations. The first section lists the primary…

  18. Advance Directives and Do Not Resuscitate Orders

    MedlinePlus

    ... a form. Call a lawyer. Use a computer software package for legal documents. Advance directives and living ... you write by yourself or with a computer software package should follow your state laws. You may ...

  19. EQS Goes R: Simulations for SEM Using the Package REQS

    ERIC Educational Resources Information Center

    Mair, Patrick; Wu, Eric; Bentler, Peter M.

    2010-01-01

    The REQS package is an interface between the R environment of statistical computing and the EQS software for structural equation modeling. The package consists of 3 main functions that read EQS script files and import the results into R, call EQS script files from R, and run EQS script files from R and import the results after EQS computations.…

  20. The Computer as an Aid to Reading Instruction. Learning Package No. 27.

    ERIC Educational Resources Information Center

    Simic, Marge, Comp.; Smith, Carl, Ed.

    Originally developed for the Department of Defense Schools (DoDDS) system, this learning package on computer use in reading is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes an overview of the project; a comprehensive search of the ERIC database; a lecture giving an overview on the…

  1. Monitoring data transfer latency in CMS computing operations

    DOE PAGES

    Bonacorsi, Daniele; Diotalevi, Tommaso; Magini, Nicolo; ...

    2015-12-23

    During the first LHC run, the CMS experiment collected tens of Petabytes of collision and simulated data, which need to be distributed among dozens of computing centres with low latency in order to make efficient use of the resources. While the desired level of throughput has been successfully achieved, it is still common to observe transfer workflows that cannot reach full completion in a timely manner due to a small fraction of stuck files which require operator intervention. For this reason, in 2012 the CMS transfer management system, PhEDEx, was instrumented with a monitoring system to measure file transfer latencies, and to predict the completion time for the transfer of a data set. The operators can detect abnormal patterns in transfer latencies while the transfer is still in progress, and monitor the long-term performance of the transfer infrastructure to plan the data placement strategy. Based on the data collected for one year with the latency monitoring system, we present a study on the different factors that contribute to transfer completion time. As case studies, we analyze several typical CMS transfer workflows, such as distribution of collision event data from CERN or upload of simulated event data from the Tier-2 centres to the archival Tier-1 centres. For each workflow, we present the typical patterns of transfer latencies that have been identified with the latency monitor. We identify the areas in PhEDEx where a development effort can reduce the latency, and we show how we are able to detect stuck transfers which need operator intervention. Lastly, we propose a set of metrics to alert about stuck subscriptions and prompt for manual intervention, with the aim of improving transfer completion times.
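
    A stuck-transfer check of the kind described — a subscription almost complete but making no further progress — can be expressed compactly. The sketch below uses hypothetical record fields, not PhEDEx's actual schema.

    ```python
    # Minimal sketch: flag subscriptions that are nearly complete but have not
    # progressed for longer than `stale_seconds` (field names are hypothetical).
    import time

    def find_stuck_subscriptions(subscriptions, done_frac=0.95, stale_seconds=48 * 3600):
        now = time.time()
        stuck = []
        for sub in subscriptions:
            frac = sub["files_done"] / sub["files_total"]
            stale = now - sub["last_progress_ts"] > stale_seconds
            if done_frac <= frac < 1.0 and stale:
                stuck.append(sub["name"])
        return stuck
    ```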

  3. A New Streamflow-Routing (SFR1) Package to Simulate Stream-Aquifer Interaction with MODFLOW-2000

    USGS Publications Warehouse

    Prudic, David E.; Konikow, Leonard F.; Banta, Edward R.

    2004-01-01

    The increasing concern for water and its quality requires improved methods to evaluate the interaction between streams and aquifers and the strong influence that streams can have on the flow and transport of contaminants through many aquifers. For this reason, a new Streamflow-Routing (SFR1) Package was written for use with the U.S. Geological Survey's MODFLOW-2000 ground-water flow model. The SFR1 Package is linked to the Lake (LAK3) Package, and both have been integrated with the Ground-Water Transport (GWT) Process of MODFLOW-2000 (MODFLOW-GWT). SFR1 replaces the previous Stream (STR1) Package, with the most important difference being that stream depth is computed at the midpoint of each reach instead of at the beginning of each reach, as was done in the original Stream Package. This approach allows for the addition and subtraction of water from runoff, precipitation, and evapotranspiration within each reach. Because the SFR1 Package computes stream depth differently than the original package, a different name was used to distinguish it from the original Stream (STR1) Package. The SFR1 Package has five options for simulating stream depth and four options for computing diversions from a stream. The options for computing stream depth are: a specified value; Manning's equation (using a wide rectangular channel or an eight-point cross section); a power equation; or a table of values that relate flow to depth and width. Each stream segment can have a different option. Outflow from lakes can be computed using the same options. Because the wetted perimeter is computed for the eight-point cross section and width is computed for the power equation and table of values, the streambed conductance term no longer needs to be calculated externally whenever the area of streambed changes as a function of flow. The concentration of solute is computed in a stream network when MODFLOW-GWT is used in conjunction with the SFR1 Package. The concentration of a solute in a stream reach is based on a mass-balance approach and accounts for exchanges with (inputs from or losses to) ground-water systems. Two test examples are used to illustrate some of the capabilities of the SFR1 Package. The first test simulation was designed to illustrate how pumping of ground water from an aquifer connected to streams can affect streamflow, depth, width, and streambed conductance using the different options. The second test simulation was designed to illustrate solute transport through interconnected lakes, streams, and aquifers. Because of the need to examine time series results from the model simulations, the Gage Package first described in the LAK3 documentation was revised to include time series results of selected variables (streamflows, stream depth and width, streambed conductance, solute concentrations, and solute loads) for specified stream reaches. The mass-balance or continuity approach for routing flow and solutes through a stream network may not be applicable for all interactions between streams and aquifers. The SFR1 Package is best suited for modeling long-term changes (months to hundreds of years) in ground-water flow and solute concentrations using averaged flows in streams. The Package is not recommended for modeling the transient exchange of water between streams and aquifers when the objective is to examine short-term (minutes to days) effects caused by rapidly changing streamflows.
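
    Of the five depth options, the wide-rectangular Manning case has a convenient closed form, since for a wide channel the hydraulic radius approaches the depth: Q = (C/n)·w·d^(5/3)·S^(1/2). Below is a minimal sketch, assuming SI units (C = 1.0; 1.486 would be used for foot-second units); it shows only the midpoint depth calculation, not the SFR1 code itself.

    ```python
    # Minimal sketch: depth at the midpoint of a reach from Manning's equation
    # for a wide rectangular channel, where hydraulic radius ~ depth.
    def manning_depth_wide_rect(q, width, slope, n, c=1.0):
        """Depth (m) for flow q (m^3/s), channel width (m), slope (-), Manning's n."""
        return (q * n / (c * width * slope ** 0.5)) ** 0.6
    ```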

  4. Candidate Reference Genes Selection and Application for RT-qPCR Analysis in Kenaf with Cytoplasmic Male Sterility Background

    PubMed Central

    Zhou, Bujin; Chen, Peng; Khan, Aziz; Zhao, Yanhong; Chen, Lihong; Liu, Dongmei; Liao, Xiaofang; Kong, Xiangjun; Zhou, Ruiyang

    2017-01-01

    Cytoplasmic male sterility (CMS) is a maternally inherited trait that results in the production of dysfunctional pollen. Based on reliable reference gene-normalized real-time quantitative PCR (RT-qPCR) data, examining gene expression profiles can provide valuable information on the molecular mechanism of kenaf CMS. However, studies have not been conducted regarding selection of reference genes for normalizing RT-qPCR data in the CMS and maintainer lines of kenaf crop. Therefore, we studied 10 candidate reference genes (ACT3, ELF1A, G6PD, PEPKR1, TUB, TUA, CYP, GAPDH, H3, and 18S) to assess their expression stability at three stages of pollen development in CMS line 722A and maintainer line 722B of kenaf. Five computational statistical approaches (GeNorm, NormFinder, ΔCt, BestKeeper, and RefFinder) were used to evaluate the expression stability levels of these genes. According to RefFinder and GeNorm, the combination of TUB, CYP, and PEPKR1 was identified as an internal control for accurate normalization across the whole sample set, which was further confirmed by validating the expression of HcPDIL5-2a. Furthermore, the combination of TUB, CYP, and PEPKR1 was used to differentiate the expression pattern of five mitochondrial F1F0-ATPase subunit genes (atp1, atp4, atp6, atp8, and atp9) by RT-qPCR during pollen development in CMS line 722A and maintainer line 722B. We found that atp1, atp6, and atp9 exhibited significantly different expression patterns during pollen development in line 722A compared with line 722B. This is the first systematic study of reference gene selection for CMS and will provide useful information for future research on the gene expressions and molecular mechanisms underlying CMS in kenaf. PMID:28919905
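
    The geNorm-style stability measure used by several of the tools above is simple to state: a gene's M value is the average standard deviation of its log expression ratios with every other candidate, with lower M indicating more stable expression. Below is a minimal Python sketch of only this core calculation; the cited tools are separate programs with more functionality.

    ```python
    # Minimal sketch of the geNorm M stability value. `expr` is a
    # samples x genes array of relative expression quantities.
    import numpy as np

    def genorm_m(expr):
        log_expr = np.log2(expr)
        n_genes = expr.shape[1]
        m = np.empty(n_genes)
        for j in range(n_genes):
            # log-ratios of gene j against every other candidate gene
            ratios = log_expr[:, [j]] - np.delete(log_expr, j, axis=1)
            m[j] = ratios.std(axis=0, ddof=1).mean()
        return m  # lower M = more stable
    ```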

  5. Optimal segmentation and packaging process

    DOEpatents

    Kostelnik, Kevin M.; Meservey, Richard H.; Landon, Mark D.

    1999-01-01

    A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D&D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation, and sequence of the segmentation and packaging of the contaminated items is determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are then actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded.

  6. Impact of Reimbursement Cuts on the Sustainability and Accessibility of Dopamine Transporter Imaging.

    PubMed

    Covington, Matthew F; McMillan, Natalie A; Kuo, Phillip H

    2016-09-01

    Dopamine transporter single-photon emission computed tomography imaging utilizing iodine-123 ioflupane is accurate for differentiation of Parkinson disease from essential tremor. This study evaluates how reimbursement for I-123 ioflupane imaging changed between 2011 (year of FDA approval) and 2014 (year after loss of pass-through status for hospital-based outpatient imaging from CMS). I-123 ioflupane reimbursement data for our institution's hospital-based imaging were compared between two periods: (1) July 2011 to October 2012, and (2) 2014. For each time period separately and in combination, averages and ranges of reimbursement for private insurance and CMS were analyzed and compared. A model to ensure recouping of radiopharmaceutical costs was developed. Review yielded 247 studies from July 2011 to October 2012 and 94 studies from 2014. Average reimbursement per study fell from $2,469 (US dollars) in 2011 to 2012 to $1,657 in 2014. CMS reduced average reimbursement by $1,148 in 2014 because of loss of radiopharmaceutical pass-through status. Average reimbursements from CMS versus private payors markedly differed in 2011 to 2012 at $2,266 versus $2,861, respectively, and in 2014 at $1,118 versus $3,470, respectively. Between 2011 to 2012 and 2014, the CMS percentage increased from 54% to 78%. Assuming that I-123 ioflupane cost $2,000, our model based on 2014 data predicts a practice with greater than 60% CMS patients would no longer recover radiopharmaceutical costs. Reimbursement levels, payor mix, scanner location, and radiopharmaceutical costs are all critical, variable factors for modeling the financial viability of I-123 ioflupane imaging and, by extrapolation, future radiopharmaceuticals. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
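
    The >60% threshold reported above follows from a simple blended-reimbursement break-even: average reimbursement is the payor-mix-weighted blend of CMS and private payments, and the radiopharmaceutical cost is recovered only while that blend exceeds the dose cost. A minimal sketch using the 2014 figures quoted in the abstract:

    ```python
    # Minimal sketch: largest CMS payor fraction f at which the blended payment
    # f*cms + (1-f)*private still covers the radiopharmaceutical dose cost.
    def breakeven_cms_fraction(cms_payment, private_payment, dose_cost):
        return (private_payment - dose_cost) / (private_payment - cms_payment)

    # 2014 figures from the abstract: $1,118 CMS, $3,470 private, $2,000 dose.
    print(breakeven_cms_fraction(1118, 3470, 2000))  # ~0.625, i.e. >60% CMS breaks even
    ```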

  7. Use of computer modeling to investigate a dynamic interaction problem in the Skylab TACS quad-valve package

    NASA Technical Reports Server (NTRS)

    Hesser, R. J.; Gershman, R.

    1975-01-01

    A valve opening-response problem encountered during development of a control valve for the Skylab thruster attitude control system (TACS) is described. The problem involved effects of dynamic interaction among valves in the quad-redundant valve package. Also described is a detailed computer simulation of the quad-valve package which was helpful in resolving the problem.

  8. CMS tier structure and operation of the experiment-specific tasks in Germany

    NASA Astrophysics Data System (ADS)

    Nowack, A.

    2008-07-01

    In Germany, several university institutes and research centres take part in the CMS experiment. Concerning data analysis, a number of computing centres at different Tier levels, ranging from Tier 1 to Tier 3, exist at these places. The German Tier 1 centre GridKa at the research centre at Karlsruhe serves all four LHC experiments as well as four non-LHC experiments. With respect to the CMS experiment, GridKa is mainly involved in central tasks. The Tier 2 centre in Germany consists of two sites, one at the research centre DESY at Hamburg and one at RWTH Aachen University, forming a federated Tier 2 centre. Both parts cover different aspects of a Tier 2 centre. The German Tier 3 centres are located at the research centre DESY at Hamburg, at RWTH Aachen University, and at the University of Karlsruhe. Furthermore, the building of a German user analysis facility is planned. Since the CMS community in Germany is rather small, good cooperation between the different sites is essential. This cooperation includes physics topics as well as technical and operational issues. All available communication channels, such as email, phone, monthly video conferences, and regular personal meetings, are used. For example, the distribution of data sets is coordinated globally within Germany. The CMS-specific services, such as the data transfer tool PhEDEx and the Monte Carlo production, are also operated by people from different sites in order to spread the knowledge widely and increase redundancy in terms of operators.

  9. The complications and the position of the Codman MicroSensor™ ICP device: an analysis of 549 patients and 650 Sensors.

    PubMed

    Koskinen, Lars-Owe D; Grayson, David; Olivecrona, Magnus

    2013-11-01

    Complications and insertion depth of the Codman MicroSensor ICP monitoring device (CMS) are not well studied. To study complications and insertion depth of the CMS in a clinical setting, we identified all patients who had their intracranial pressure (ICP) monitored using a CMS device between 2002 and 2010. The medical records and post-implantation computed tomography (CT) scans were analyzed for occurrence of infection, hemorrhage, and insertion depth. In all, 549 patients were monitored using 650 CMS devices. Mean monitoring time was 7.0 ± 4.9 days. The mean implantation depth was 21.3 ± 11.1 mm (0-88 mm). In 27 of the patients, a haematoma was identified; 26 of these were less than 1 ml, and one was 8 ml. No clinically significant bleeding was found. There was no statistically significant increase in the number of hemorrhages in presumed coagulopathic patients. The infection rate was 0.6%, and the calculated infection rate per 1,000 catheter days was 0.8. The risk for hemorrhagic and infectious complications when using the CMS for ICP monitoring is low. The depth of insertion varies considerably and should be taken into account if patients are treated with head elevation, since the pressure is measured at the tip of the sensor. To meet the need for ICP monitoring, an intraparenchymal ICP monitoring device should be preferred to an external ventricular drainage (EVD).

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trentadue, R.; Clemencic, M.; Dykstra, D.

    The LCG Persistency Framework consists of three software packages (CORAL, COOL and POOL) that address the data access requirements of the LHC experiments in several different areas. The project is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that are using some or all of the Persistency Framework components to access their data. POOL is a hybrid technology store for C++ objects, using a mixture of streaming and relational technologies to implement both object persistency and object metadata catalogs and collections. CORAL is an abstraction layer with an SQL-free API for accessing data stored using relational database technologies. COOL provides specific software components and tools for the handling of the time variation and versioning of the experiment conditions data. This presentation reports on the status and outlook of each of the three sub-projects at the time of the CHEP2012 conference, reviewing the usage of each package in the three LHC experiments.

  11. The HEPCloud Facility: elastic computing for High Energy Physics – The NOvA Use Case

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuess, S.; Garzoglio, G.; Holzman, B.

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the "elastic" provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a 25 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the local allocated resources and utilizing the local AWS S3 storage to optimize data handling operations and costs. NOvA was using the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper describes the Fermilab HEPCloud Facility and the challenges overcome for the CMS and NOvA communities.

  12. Computer Managed Instruction: An Application in Teaching Introductory Statistics.

    ERIC Educational Resources Information Center

    Hudson, Walter W.

    1985-01-01

    This paper describes a computer managed instruction package for teaching introductory or advanced statistics. The instructional package is described, and anecdotal information concerning its performance and student responses to its use over two semesters is given. (Author/BL)

  13. Computer package for the design and optimization of absorption air conditioning system operated by solar energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sofrata, H.; Khoshaim, B.; Megahed, M.

    1980-12-01

    In this paper a computer package for the design and optimization of the simple Li-Br absorption air conditioning system, operated by solar energy, is developed in order to study its performance. This was necessary, as a first step, before carrying out any computations regarding the dual system (1-3). The computer package has facilities for examining any parameter which may control the system, namely the generator, evaporator, condenser, and absorber temperatures and the pumping factor. The output may be tabulated and also fed to the graph plotter. The flow chart of the programme is explained in an easy way and a typical example is included.

  14. Design Tool

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Developed under a Small Business Innovation Research (SBIR) contract, RAMPANT is a CFD software package for computing flow around complex shapes. The package is flexible, fast and easy to use. It has found a great number of applications, including computation of air flow around a Nordic ski jumper, prediction of flow over an airfoil and computation of the external aerodynamics of motor vehicles.

  15. A Freeware Path to Neutron Computed Tomography

    NASA Astrophysics Data System (ADS)

    Schillinger, Burkhard; Craft, Aaron E.

    Neutron computed tomography has become a routine method at many neutron sources due to the availability of digital detection systems, powerful computers and advanced software. The commercial packages Octopus by Inside Matters and VGStudio by Volume Graphics have been established as a quasi-standard for high-end computed tomography. However, these packages require a stiff investment and are available to the users only on-site at the imaging facility to do their data processing. There is a demand from users to have image processing software at home to do further data processing; in addition, neutron computed tomography is now being introduced even at smaller and older reactors. Operators need to show a first working tomography setup before they can obtain a budget to build an advanced tomography system. Several packages are available on the web for free; however, these have been developed for X-rays or synchrotron radiation and are not immediately useable for neutron computed tomography. Three reconstruction packages and three 3D-viewers have been identified and used even for Gigabyte datasets. This paper is not a scientific publication in the classic sense, but is intended as a review to provide searchable help to make the described packages usable for the tomography community. It presents the necessary additional preprocessing in ImageJ, some workarounds for bugs in the software, and undocumented or badly documented parameters that need to be adapted for neutron computed tomography. The result is a slightly complicated, but surprisingly high-quality path to neutron computed tomography images in 3D, but not a replacement for the even more powerful commercial software mentioned above.
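
    For readers who want a fully free path end to end, the reconstruction step itself is also available in open libraries. The sketch below uses scikit-image's filtered back projection — one free option, not one of the specific packages reviewed in the paper — and assumes the projections have already been normalized (flat-field/dark-field corrected, e.g. in ImageJ as described above).

    ```python
    # Minimal sketch: filtered back projection of one sinogram slice with
    # scikit-image. `sinogram` is detector_pixels x n_projections;
    # `angles_deg` lists the projection angles in degrees.
    from skimage.transform import iradon

    def reconstruct_slice(sinogram, angles_deg):
        return iradon(sinogram, theta=angles_deg, filter_name="ramp", circle=True)
    ```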

  16. Computer assisted learning (CAL) of oral manifestations of HIV disease.

    PubMed

    Porter, S R; Telford, A; Chandler, K; Furber, S; Williams, J; Price, S; Scully, C; Triantos, D; Bain, L

    1996-09-07

    General dental practitioners (GDPs) in the UK may want additional education on relevant aspects of human immunodeficiency virus (HIV) disease. The aim of the present study was to develop and assess a computer assisted learning package on the oral manifestations of HIV disease of relevance to GDPs. A package was developed using a commercially available software development tool and assessed by a group of 75 GDPs interested in education and computers. Fifty-four (72%) of the GDPs completed a self-administered questionnaire on their opinions of the package. The majority reported the package to be easy to load and run, that it provided clear instructions and displays, and that it was a more effective educational tool than videotapes, audiotapes, professional journals, and textbooks, and of similar benefit to postgraduate courses. The GDPs often commented favourably on the effectiveness of the clinical images and the use of questions and answers, although some had criticisms of these and other aspects of the package. As a consequence of this investigation the package has been modified and distributed to GDPs in England and Wales.

  17. Electrons and photons at High Level Trigger in CMS for Run II

    NASA Astrophysics Data System (ADS)

    Anuar, Afiq A.

    2015-12-01

    The CMS experiment has been designed with a 2-level trigger system. The first level is implemented using custom-designed electronics. The second level is the so-called High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. For Run II of the Large Hadron Collider, the increase in center-of-mass energy and luminosity will raise the event rate to a level challenging for the HLT algorithms. New approaches have been studied to keep the HLT output rate manageable while maintaining thresholds low enough to cover physics analyses. The strategy mainly relies on porting online the ingredients that have been successfully applied in the offline reconstruction, allowing the HLT selection to move closer to the offline cuts. Improvements in HLT electron and photon definitions will be presented, focusing in particular on the updated clustering algorithm and energy calibration procedure, a new Particle-Flow-based isolation approach with pileup mitigation techniques, and the electron-dedicated track fitting algorithm based on a Gaussian Sum Filter.

  18. An Ada Linear-Algebra Software Package Modeled After HAL/S

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.; Lawson, Charles L.

    1990-01-01

    New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPAK solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM/AT personal computer running under PC DOS, v.3.1.
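
    To illustrate the kind of quaternion routine such a package provides — in Python here rather than Ada or HAL/S — the sketch below implements the Hamilton product and vector rotation for a unit quaternion q = (w, x, y, z).

    ```python
    # Minimal sketch: Hamilton product and rotation of a 3-vector by a unit
    # quaternion via q * (0, v) * conj(q).
    import numpy as np

    def quat_mul(q1, q2):
        w1, x1, y1, z1 = q1
        w2, x2, y2, z2 = q2
        return np.array([
            w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2,
        ])

    def rotate(v, q):
        qv = np.concatenate(([0.0], v))            # embed v as a pure quaternion
        q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
        return quat_mul(quat_mul(q, qv), q_conj)[1:]
    ```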

  19. Accounting utility for determining individual usage of production level software systems

    NASA Technical Reports Server (NTRS)

    Garber, S. C.

    1984-01-01

    An accounting package was developed which determines the computer resources utilized by a user during the execution of a particular program and updates a file containing accumulated resource totals. The accounting package is divided into two separate programs. The first program determines the total amount of computer resources utilized by a user during the execution of a particular program. The second program uses these totals to update a file containing accumulated totals of computer resources utilized by a user for a particular program. This package is useful to those persons who have several other users continually accessing and running programs from their accounts. The package provides the ability to determine which users are accessing and running specified programs along with their total level of usage.
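
    The accumulate-and-update step of the second program can be sketched compactly; the JSON totals file and field names below are hypothetical stand-ins for the original accounting files of the era.

    ```python
    # Minimal sketch: fold one run's measured resource usage into a per-user,
    # per-program accumulated-totals file (file format is hypothetical).
    import json, os

    TOTALS_FILE = "usage_totals.json"

    def update_totals(user, program, cpu_seconds, io_ops):
        totals = {}
        if os.path.exists(TOTALS_FILE):
            with open(TOTALS_FILE) as f:
                totals = json.load(f)
        entry = totals.setdefault(user, {}).setdefault(program, {"cpu": 0.0, "io": 0})
        entry["cpu"] += cpu_seconds
        entry["io"] += io_ops
        with open(TOTALS_FILE, "w") as f:
            json.dump(totals, f, indent=2)
    ```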

  20. Optimal segmentation and packaging process

    DOEpatents

    Kostelnik, K.M.; Meservey, R.H.; Landon, M.D.

    1999-08-10

    A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D and D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation, and sequence of the segmentation and packaging of the contaminated items is determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are then actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded. 3 figs.

  1. Hypertext-based computer vision teaching packages

    NASA Astrophysics Data System (ADS)

    Marshall, A. David

    1994-10-01

    The World Wide Web Initiative has provided a means of delivering hypertext- and multimedia-based information across the whole INTERNET. Many applications have been developed on such http servers. At Cardiff we have developed a http hypertext-based multimedia server, the Cardiff Information Server, using the widely available Mosaic system. The server provides a variety of information, ranging from teaching modules, on-line documentation, and timetables for departmental activities to more light-hearted hobby interests. One important and novel development on the server has been courseware facilities, ranging from on-line lecture notes, exercises, and their solutions to more interactive teaching packages. A variety of disciplines have benefitted, notably Computer Vision and Image Processing, but also C programming, X Windows, Computer Graphics, and Parallel Computing. This paper addresses the implementation of the Computer Vision and Image Processing packages and the advantages gained from using a hypertext-based system, and relates practical experiences of using the packages in a class environment. The paper addresses how best to provide information in such a hypertext-based system and how interactive image processing packages can be developed and integrated into courseware. The suite of tools developed facilitates a flexible and powerful courseware package that has proved popular in the classroom and over the Internet. The paper also details many future developments we see as possible. One of the key points raised in the paper is that Mosaic's hypertext language (html) is extremely powerful and yet relatively straightforward to use. It is also possible to link in Unix calls so that programs and shells can be executed, providing a powerful suite of utilities that can be exploited to develop many packages.

  2. Analysis of reference transactions using packaged computer programs.

    PubMed

    Calabretta, N; Ross, R

    1984-01-01

    Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.

  3. Comparison of Computer Based Instruction to Behavior Skills Training for Teaching Staff Implementation of Discrete-Trial Instruction with an Adult with Autism

    ERIC Educational Resources Information Center

    Nosik, Melissa R.; Williams, W. Larry; Garrido, Natalia; Lee, Sarah

    2013-01-01

    In the current study, behavior skills training (BST) is compared to a computer based training package for teaching staff to implement discrete-trial instruction with an adult with autism. The computer based training package consisted of instructions, video modeling, and feedback. BST consisted of instructions, modeling, rehearsal, and feedback. Following…

  4. Virginia Transit Performance Evaluation Package (VATPEP).

    DOT National Transportation Integrated Search

    1987-01-01

    The Virginia Transit Performance Evaluation Package (VATPEP), a computer software package, is documented. This is the computerized version of the methodology used by the Virginia Department of Transportation to evaluate the performance of public tran...

  5. 21 CFR 1314.110 - Reports for mail-order sales.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...

  6. 21 CFR 1314.110 - Reports for mail-order sales.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...

  7. 21 CFR 1314.110 - Reports for mail-order sales.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...

  8. 21 CFR 1314.110 - Reports for mail-order sales.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...

  9. 21 CFR 1314.110 - Reports for mail-order sales.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...

  10. spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains

    NASA Astrophysics Data System (ADS)

    Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo

    2016-09-01

    The paper presents the spatial Markov Chains (spMC) R-package and a case study of subsoil simulation/prediction at a plain site in Northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection; in addition, spMC implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Furthermore, simulation methods based on well-known prediction methods (such as indicator kriging and co-kriging) are implemented in the spMC package. Other, more advanced methods are also available for simulations, e.g. path methods and Bayesian procedures that exploit maximum entropy. Since the spMC package was developed for intensive geostatistical computations, part of the code is implemented for parallel computation via OpenMP constructs. A final analysis of computational efficiency compares the simulation/prediction algorithms using different numbers of CPU cores, considering the example data set of the case study included in the package.
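
    spMC itself is an R package working with directional transiograms in three dimensions; as a conceptual illustration only, the Python sketch below estimates one-step transition probabilities from a single categorical lithology log — the simplest one-dimensional analogue of what the package fits.

    ```python
    # Conceptual sketch: one-step transition probabilities between lithological
    # categories along a borehole (1D analogue; spMC handles full 3D directions).
    import numpy as np

    def transition_probabilities(litho_codes, n_categories):
        counts = np.zeros((n_categories, n_categories))
        for a, b in zip(litho_codes[:-1], litho_codes[1:]):
            counts[a, b] += 1
        rows = counts.sum(axis=1, keepdims=True)
        return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)
    ```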

  11. Study of the TRAC Airfoil Table Computational System

    NASA Technical Reports Server (NTRS)

    Hu, Hong

    1999-01-01

    The report documents the application of the TRAC airfoil table computational package (TRACFOIL) to the prediction of 2D airfoil force and moment data over a wide range of angles of attack and Mach numbers. TRACFOIL generates the standard C-81 airfoil table for input into rotorcraft comprehensive codes such as CAMRAD. The existing TRACFOIL computer package was successfully modified to run on Digital Alpha workstations and on Cray C90 supercomputers. Step-by-step instructions for using the package on both computer platforms are provided. The newer version of TRACFOIL is applied to two airfoil sections; the C-81 data obtained with TRACFOIL are compared with wind-tunnel data, and results are presented.

  12. The Hidden Cost of Buying a Computer.

    ERIC Educational Resources Information Center

    Johnson, Michael

    1983-01-01

    In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)

  13. Implementation and use of direct-flow connections in a coupled ground-water and surface-water model

    USGS Publications Warehouse

    Swain, Eric D.

    1994-01-01

    The U.S. Geological Survey's MODFLOW finite-difference ground-water flow model has been coupled with three surface-water packages - the MODBRANCH, River, and Stream packages - to simulate surface water and its interaction with ground water. Prior to the development of the coupling packages, the only interaction between these modeling packages was that leakage values could be passed between MODFLOW and the three surface-water packages. To facilitate wider and more flexible uses of the models, a computer program was developed and added to MODFLOW to allow direct flows or stages to be passed between any of the packages and MODFLOW. The flows or stages calculated in one package can be set as boundary discharges or stages to be used in another package. Several modeling packages can be used in the same simulation depending upon the level of sophistication needed in the various reaches being modeled. This computer program is especially useful when any of the River, Stream, or MODBRANCH packages are used to model a river flowing directly into or out of wetlands in direct connection with the aquifer and represented in the model as an aquifer block. A field case study is shown to illustrate an application.

  14. Performance of the CMS Event Builder

    NASA Astrophysics Data System (ADS)

    Andre, J.-M.; Behrens, U.; Branson, J.; Brummer, P.; Chaze, O.; Cittolin, S.; Contescu, C.; Craigs, B. G.; Darlea, G.-L.; Deldicque, C.; Demiragli, Z.; Dobson, M.; Doualot, N.; Erhan, S.; Fulcher, J. F.; Gigi, D.; Gładki, M.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Janulis, M.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; O'Dell, V.; Orsini, L.; Paus, C.; Petrova, P.; Pieri, M.; Racz, A.; Reis, T.; Sakulin, H.; Schwick, C.; Simelevicius, D.; Zejdl, P.

    2017-10-01

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of O(100 GB/s) to the high-level trigger farm. The DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbit/s Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbit/s Infiniband FDR Clos network has been chosen for the event builder. This paper presents the implementation and performance of the event-building system.

  15. Development of a computer-assisted learning software package on dental traumatology.

    PubMed

    Tolidis, K; Crawford, P; Stephens, C; Papadogiannis, Y; Plakias, C

    1998-10-01

    The development of computer-assisted learning software packages is a relatively new field of computer application. The progress made in personal computer technology toward more user-friendly operating systems has stimulated the academic community to develop computer-assisted learning for pre- and postgraduate students. The ability of computers to combine audio and visual data in an interactive form provides a powerful educational tool. The purpose of this study was to develop and evaluate a computer-assisted learning package on dental traumatology. This program contains background information on the diagnosis, classification, and management of dental injuries in both the permanent and the deciduous dentitions. It is structured into chapters according to the nature of the injury and whether injury has occurred in the primary or permanent dentition. At the end of each chapter there is a self-assessment questionnaire as well as references to relevant literature. Extensive use of pictures and video provides a comprehensive overview of the subject.

  16. On the escape of oxygen and hydrogen from Mars

    NASA Technical Reports Server (NTRS)

    Fox, J. L.

    1993-01-01

    Escape rates of oxygen atoms from dissociative recombination of O2(+) above the Martian exobase are computed in light of new information from ab initio calculations of the dissociative recombination process and our recently revised understanding of the Martian dayside ionosphere. Only about 60 percent of the dissociative recombinations occur in channels in which the O atoms are released with energies in excess of the escape velocity. Furthermore, we find that the computed escape fluxes for O depend greatly on the nature of the ion loss process that has been found necessary to reproduce the topside ion density profiles measured by Viking. If it is assumed that the ions are not lost from the gravitational field of the planet, as required by an analysis of nitrogen escape, the computed average O escape rate is 3 x 10 exp 6/sq cm/s, much less than half the H escape rates inferred from measurements of the Lyman-alpha dayglow, which are in the range (1-2) x 10 exp 8/sq cm/s. Suggestions for restoring the relative escape rates of H and O to the stoichiometric ratio of water are explored.

  17. An Introduction to Research and the Computer: A Self-Instructional Package.

    ERIC Educational Resources Information Center

    Vasu, Ellen Storey; Palmer, Richard I.

    This self-instructional package includes learning objectives, definitions, exercises, and feedback for learning some basic concepts and skills involved in using computers for analyzing data and understanding basic research terminology. Learning activities are divided into four sections: research and research hypotheses; variables, cases, and…

  18. Revealing Neurocomputational Mechanisms of Reinforcement Learning and Decision-Making With the hBayesDM Package

    PubMed Central

    Ahn, Woo-Young; Haines, Nathaniel; Zhang, Lei

    2017-01-01

    Reinforcement learning and decision-making (RLDM) provide a quantitative framework and computational theories with which we can disentangle psychiatric conditions into the basic dimensions of neurocognitive functioning. RLDM offer a novel approach to assessing and potentially diagnosing psychiatric patients, and there is growing enthusiasm for both RLDM and computational psychiatry among clinical researchers. Such a framework can also provide insights into the brain substrates of particular RLDM processes, as exemplified by model-based analysis of data from functional magnetic resonance imaging (fMRI) or electroencephalography (EEG). However, researchers often find the approach too technical and have difficulty adopting it for their research. Thus, a critical need remains to develop a user-friendly tool for the wide dissemination of computational psychiatric methods. We introduce an R package called hBayesDM (hierarchical Bayesian modeling of Decision-Making tasks), which offers computational modeling of an array of RLDM tasks and social exchange games. The hBayesDM package offers state-of-the-art hierarchical Bayesian modeling, in which both individual and group parameters (i.e., posterior distributions) are estimated simultaneously in a mutually constraining fashion. At the same time, the package is extremely user-friendly: users can perform computational modeling, output visualization, and Bayesian model comparisons, each with a single line of coding. Users can also extract the trial-by-trial latent variables (e.g., prediction errors) required for model-based fMRI/EEG. With the hBayesDM package, we anticipate that anyone with minimal knowledge of programming can take advantage of cutting-edge computational-modeling approaches to investigate the underlying processes of and interactions between multiple decision-making (e.g., goal-directed, habitual, and Pavlovian) systems. In this way, we expect that the hBayesDM package will contribute to the dissemination of advanced modeling approaches and enable a wide range of researchers to easily perform computational psychiatric research within different populations. PMID:29601060
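
    hBayesDM is an R package built on Stan; to make the modeling concrete, the Python sketch below shows only the likelihood core of one model family it covers — a Rescorla-Wagner learner with a softmax choice rule for a two-armed bandit — not the hierarchical Bayesian estimation itself. Names and the two-arm setup are illustrative.

    ```python
    # Minimal sketch: log-likelihood of a Rescorla-Wagner + softmax model for a
    # two-armed bandit. alpha = learning rate, beta = inverse temperature.
    import numpy as np

    def rw_log_likelihood(choices, rewards, alpha, beta):
        q = np.zeros(2)                    # action values
        ll = 0.0
        for c, r in zip(choices, rewards):
            p = np.exp(beta * q)
            p /= p.sum()                   # softmax choice probabilities
            ll += np.log(p[c])
            q[c] += alpha * (r - q[c])     # prediction-error update
        return ll
    ```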

  19. Development of high performance scientific components for interoperability of computing packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gulabani, Teena Pratap

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of each package. Chemistry algorithms are hard and time-consuming to develop; integration of large quantum chemistry packages allows resource sharing and avoids reinventing the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  20. Review and analysis of dense linear system solver package for distributed memory machines

    NASA Technical Reports Server (NTRS)

    Narang, H. N.

    1993-01-01

    A dense linear system solver package recently developed at the University of Texas at Austin for distributed memory machines (e.g., the Intel Paragon) has been reviewed and analyzed. The package contains about 45 software routines, some written in FORTRAN and some in C, and forms the basis for parallel/distributed solutions of systems of linear equations encountered in many problems of a scientific and engineering nature. The package, being studied by the Computer Applications Branch of the Analysis and Computation Division, may provide a significant computational resource for NASA scientists and engineers in parallel/distributed computing. Since the package is new and not well tested or documented, many of its underlying concepts and implementations were unclear; our task was to review, analyze, and critique the package as a step in the process that will enable scientists and engineers to apply it to the solution of their problems. All routines in the package were reviewed and analyzed. Underlying theory or concepts, where they exist in the form of published papers, technical reports, or memos, were obtained either from the author or from the scientific literature; and general algorithms, explanations, examples, and critiques have been provided to explain the workings of these programs. Wherever things remained unclear, the developer (author) was contacted, by telephone or electronic mail, to clarify the workings of the routines. Whenever possible, tests were made to verify the concepts and logic employed in the implementations. A detailed report is being separately documented to explain the workings of these routines.

  1. Department of Defense In-House RDT and E Activities: Management Analysis Report for Fiscal Year 1993

    DTIC Science & Technology

    1994-11-01

    A worldwide unique lab because it houses a high-speed modeling and simulation system, a prototype...E Division, San Diego, CA: High Performance Computing Laboratory providing a wide range of advanced computer systems for the scientific investigation...Machines CM-200 and a 256-node Thinking Machines CM-5. The CM-5 is in a very large memory, high-performance (32 Gbytes, >40 GFlop) configuration,

  2. Challenges to Software/Computing for Experimentation at the LHC

    NASA Astrophysics Data System (ADS)

    Banerjee, Sunanda

    The demands that future high-energy physics experiments place on software and computing have led the experiments to plan the related activities as full-fledged projects and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently put together in an LHC-wide framework based on Grid technology. The current status and understanding are broadly outlined.

  3. MATREX Leads the Way in Implementing New DOD VV&A Documentation Standards

    DTIC Science & Technology

    2007-05-24

    Acquisition Operations & Support B C Sustainment FRP Decision Review FOC LRIP/IOT&E Critical Design Review Pre-Systems Acquisition Concept...Communications Human Performance Model • C3GRID – Command & Control, Computer GRID • CES – Communications Effects Server • CMS2 – Comprehensive

  4. Can I Trust This Software Package? An Exercise in Validation of Computational Results

    ERIC Educational Resources Information Center

    Shacham, Mordechai; Brauner, Neima; Ashurst, W. Robert; Cutlip, Michael B.

    2008-01-01

    Mathematical software packages such as Polymath, MATLAB, and Mathcad are currently widely used for engineering problem solving. Applications of several of these packages to typical chemical engineering problems have been demonstrated by Cutlip et al. The main characteristic of these packages is that they provide a "problem-solving environment…

  5. An Interactive Computer Aided Electrical Engineering Education Package.

    ERIC Educational Resources Information Center

    Cavati, Cicero Romao

    This paper describes an educational package to help the learning process. A case study is presented of an energy distribution course in the Electrical Engineering Department at the Federal University of Espirito Santo (UFES). The advantages of the developed package are shown by comparing it with the traditional academic book. This package presents…

  6. Eddylicious: A Python package for turbulent inflow generation

    NASA Astrophysics Data System (ADS)

    Mukha, Timofey; Liefvendahl, Mattias

    2018-01-01

    A Python package for generating inflow for scale-resolving computer simulations of turbulent flow is presented. The purpose of the package is to unite existing inflow generation methods in a single code-base and make them accessible to users of various Computational Fluid Dynamics (CFD) solvers. The currently existing functionality consists of an accurate inflow generation method suitable for flows with a turbulent boundary layer inflow and input/output routines for coupling with the open-source CFD solver OpenFOAM.

  7. Application of GA package in functional packaging

    NASA Astrophysics Data System (ADS)

    Belousova, D. A.; Noskova, E. E.; Kapulin, D. V.

    2018-05-01

    An approach is offered to the application-program task of configuring the elements of a commutation circuit in the design of radio-electronic equipment, based on a genetic algorithm. The efficiency of the approach is demonstrated for commutation circuits with different characteristics in computer-aided design for radio-electronic manufacturing. A prototype of the computer-aided design subsystem, built on the GA package for R with its set of general functions for the optimization of multivariate models, has been programmed.
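
    As an illustration of how the GA package for R can drive such a configuration task, the sketch below treats component placement as a permutation-optimization problem; the wiring data and fitness function are hypothetical stand-ins, not the subsystem described above:

        # Place n circuit elements into n board slots so as to minimize
        # total weighted wire length, using the GA package's permutation GA.
        library(GA)

        set.seed(1)
        n    <- 10                          # hypothetical number of elements
        pos  <- seq_len(n)                  # candidate slot coordinates
        wire <- matrix(runif(n * n), n)     # hypothetical connection weights
        wire <- (wire + t(wire)) / 2        # symmetrize

        # Fitness: negative total wire length for a placement perm,
        # where element i is assigned to slot perm[i].
        fitness <- function(perm) {
          d <- abs(outer(pos[perm], pos[perm], "-"))
          -sum(wire * d) / 2
        }

        res <- ga(type = "permutation", fitness = fitness,
                  lower = 1, upper = n, popSize = 50, maxiter = 200)
        summary(res)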

  8. A Maple package for computing Gröbner bases for linear recurrence relations

    NASA Astrophysics Data System (ADS)

    Gerdt, Vladimir P.; Robertz, Daniel

    2006-04-01

    A Maple package for computing Gröbner bases of linear difference ideals is described. The underlying algorithm is based on Janet and Janet-like monomial divisions associated with finite difference operators. The package can be used, for example, for automatic generation of difference schemes for linear partial differential equations and for reduction of multiloop Feynman integrals. These two possible applications are illustrated by simple examples of the Laplace equation and a one-loop scalar integral of propagator type.
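
    For reference, the kind of output such a computation produces for the two-dimensional Laplace equation u_xx + u_yy = 0 on a uniform grid with step h is the standard five-point difference scheme (quoted here as the textbook result, not from the paper itself):

        \[ \frac{u_{i+1,j} - 2u_{i,j} + u_{i-1,j}}{h^2} + \frac{u_{i,j+1} - 2u_{i,j} + u_{i,j-1}}{h^2} = 0 \quad\Longrightarrow\quad u_{i,j} = \tfrac{1}{4}\left( u_{i+1,j} + u_{i-1,j} + u_{i,j+1} + u_{i,j-1} \right). \]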

  9. Morphological and genetic characterization of a new cytoplasmic male sterility system (oxa CMS) in stem mustard (Brassica juncea).

    PubMed

    Heng, Shuangping; Liu, Sansan; Xia, Chunxiu; Tang, HongYu; Xie, Fei; Fu, Tingdong; Wan, Zhengjie

    2018-01-01

    KEY MESSAGE: oxa CMS is a new cytoplasmic male sterility type in Brassica juncea. oxa CMS is a cytoplasmic male sterility (CMS) line that has been widely used in the production and cultivation of stem mustard in southwestern China. In this study, different CMS-type-specific mitochondrial markers were used to confirm that oxa CMS is distinct from the pol CMS, ogu CMS, nap CMS, hau CMS, tour CMS, Moricandia arvensis CMS, orf220-type CMS, etc., that have been previously reported in Brassica crops. Pollen grains of the oxa CMS line are sterile with a self-fertility rate of almost 0%, and the sterility strain rate and sterility degree of oxa CMS are 100% due to a specific flower structure and flowering habit. Scanning electron microscopy revealed that most pollen grains in mature anthers of the oxa CMS line are empty, flat and deflated. Semi-thin sectioning further showed that the abortive stage of anther development in oxa CMS is initiated at the late uninucleate stage. Abnormally vacuolated microspores cause male sterility in the oxa CMS line. This cytological study, combined with marker-assisted selection, showed that oxa CMS is a novel CMS type in stem mustard (Brassica juncea). Interestingly, the abortive stage of oxa CMS is later than those in other CMS types reported in Brassica crops, and there is no negative effect on the oxa CMS line's growth period. This study demonstrated that this novel oxa CMS has a unique flower structure with sterile pollen grains at the late uninucleate stage. Our results may help to uncover the mechanism of oxa CMS in Brassica juncea.

  10. An Innovative Learning Model for Computation in First Year Mathematics

    ERIC Educational Resources Information Center

    Tonkes, E. J.; Loch, B. I.; Stace, A. W.

    2005-01-01

    MATLAB is a sophisticated software tool for numerical analysis and visualization. The University of Queensland has adopted MATLAB as its official teaching package across large first year mathematics courses. In the past, the package has met severe resistance from students who have not appreciated their computational experience. Several main…

  11. Computerised data reduction.

    PubMed

    Datson, D J; Carter, N G

    1988-10-01

    The use of personal computers in accountancy and business generally has been stimulated by the availability of flexible software packages. We describe the implementation of a commercial software package designed for interfacing with laboratory instruments and highlight the ease with which it can be implemented, without the need for specialist computer programming staff.

  12. WiLE: A Mathematica package for weak coupling expansion of Wilson loops in ABJ(M) theory

    NASA Astrophysics Data System (ADS)

    Preti, M.

    2018-06-01

    We present WiLE, a Mathematica® package designed to perform the weak coupling expansion of any Wilson loop in ABJ(M) theory at arbitrary perturbative order. For a given set of fields on the loop and internal vertices, the package displays all the possible Feynman diagrams and their integral representations. The user can also choose to exclude non-planar diagrams, tadpoles and self-energies. Through the use of interactive input windows, the package should be easily accessible to users with little or no previous experience. The package manual provides some pedagogical examples and the computation of all ladder diagrams at three loops relevant for the cusp anomalous dimension in ABJ(M). The latter application also lends support to some recent results computed in different contexts.

  13. The effect of resolution on viscous dissipation measured with 4D flow MRI in patients with Fontan circulation: Evaluation using computational fluid dynamics

    PubMed Central

    Cibis, Merih; Jarvis, Kelly; Markl, Michael; Rose, Michael; Rigsby, Cynthia; Barker, Alex J.; Wentzel, Jolanda J.

    2016-01-01

    Viscous dissipation inside the Fontan circulation, a parameter associated with the exercise intolerance of Fontan patients, can be derived from computational fluid dynamics (CFD) or 4D flow MRI velocities. However, the impact of spatial resolution and measurement noise on the estimation of viscous dissipation is unclear. Our aim was to evaluate the influence of these parameters on viscous dissipation calculation. Six Fontan patients underwent whole heart 4D flow MRI. Subject-specific CFD simulations were performed. The CFD velocities were down-sampled to isotropic spatial resolutions of 0.5 mm, 1 mm, 2 mm and to MRI resolution. Viscous dissipation was compared between (1) high-resolution CFD velocities, (2) CFD velocities down-sampled to MRI resolution, (3) down-sampled CFD velocities with MRI-mimicked noise levels, and (4) in-vivo 4D flow MRI velocities. Relative viscous dissipation between subjects was also calculated. 4D flow MRI velocities (15.6±3.8 cm/s) were higher, although not significantly different from CFD velocities (13.8±4.7 cm/s, p=0.16), down-sampled CFD velocities (12.3±4.4 cm/s, p=0.06) and the down-sampled CFD velocities with noise (13.2±4.2 cm/s, p=0.06). CFD-based viscous dissipation (0.81±0.55 mW) was significantly higher than those based on down-sampled CFD (0.25±0.19 mW, p=0.03), down-sampled CFD with noise (0.49±0.26 mW, p=0.03) and 4D flow MRI (0.56±0.28 mW, p=0.06). Nevertheless, relative viscous dissipation between different subjects was maintained irrespective of resolution and noise, suggesting that comparison of viscous dissipation between patients is still possible. PMID:26298492
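
    For reference, the viscous dissipation computed from either velocity field is the standard volume integral of the incompressible dissipation function (a textbook definition stated here for clarity, not a formula quoted from the paper):

        \[ \Phi_v = \int_V \frac{\mu}{2} \sum_{i,j} \left( \frac{\partial u_i}{\partial x_j} + \frac{\partial u_j}{\partial x_i} \right)^{2} \mathrm{d}V, \]

    where \( \mu \) is the dynamic viscosity and \( u_i \) the velocity components, which makes the estimate directly sensitive to how well the velocity gradients are resolved.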

  14. Use of DAGMan in CRAB3 to Improve the Splitting of CMS User Jobs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolf, M.; Mascheroni, M.; Woodard, A.

    CRAB3 is a workload management tool used by CMS physicists to analyze data acquired by the Compact Muon Solenoid (CMS) detector at the CERN Large Hadron Collider (LHC). Research in high energy physics often requires the analysis of large collections of files, referred to as datasets. The task is divided into jobs that are distributed among a large collection of worker nodes throughout the Worldwide LHC Computing Grid (WLCG). Splitting a large analysis task into optimally sized jobs is critical to efficient use of distributed computing resources. Jobs that are too big will have excessive runtimes and will not distribute the work across all of the available nodes. However, splitting the project into a large number of very small jobs is also inefficient, as each job creates additional overhead which increases load on infrastructure resources. Currently this splitting is done manually, using parameters provided by the user. However the resources needed for each job are difficult to predict because of frequent variations in the performance of the user code and the content of the input dataset. As a result, dividing a task into jobs by hand is difficult and often suboptimal. In this work we present a new feature called “automatic splitting” which removes the need for users to manually specify job splitting parameters. We discuss how HTCondor DAGMan can be used to build dynamic Directed Acyclic Graphs (DAGs) to optimize the performance of large CMS analysis jobs on the Grid. We use DAGMan to dynamically generate interconnected DAGs that estimate the processing time the user code will require to analyze each event. This is used to calculate an estimate of the total processing time per job, and a set of analysis jobs are run using this estimate as a specified time limit. Some jobs may not finish within the allotted time; they are terminated at the time limit, and the unfinished data is regrouped into smaller jobs and resubmitted.

  15. Use of DAGMan in CRAB3 to improve the splitting of CMS user jobs

    NASA Astrophysics Data System (ADS)

    Wolf, M.; Mascheroni, M.; Woodard, A.; Belforte, S.; Bockelman, B.; Hernandez, J. M.; Vaandering, E.

    2017-10-01

    CRAB3 is a workload management tool used by CMS physicists to analyze data acquired by the Compact Muon Solenoid (CMS) detector at the CERN Large Hadron Collider (LHC). Research in high energy physics often requires the analysis of large collections of files, referred to as datasets. The task is divided into jobs that are distributed among a large collection of worker nodes throughout the Worldwide LHC Computing Grid (WLCG). Splitting a large analysis task into optimally sized jobs is critical to efficient use of distributed computing resources. Jobs that are too big will have excessive runtimes and will not distribute the work across all of the available nodes. However, splitting the project into a large number of very small jobs is also inefficient, as each job creates additional overhead which increases load on infrastructure resources. Currently this splitting is done manually, using parameters provided by the user. However the resources needed for each job are difficult to predict because of frequent variations in the performance of the user code and the content of the input dataset. As a result, dividing a task into jobs by hand is difficult and often suboptimal. In this work we present a new feature called “automatic splitting” which removes the need for users to manually specify job splitting parameters. We discuss how HTCondor DAGMan can be used to build dynamic Directed Acyclic Graphs (DAGs) to optimize the performance of large CMS analysis jobs on the Grid. We use DAGMan to dynamically generate interconnected DAGs that estimate the processing time the user code will require to analyze each event. This is used to calculate an estimate of the total processing time per job, and a set of analysis jobs are run using this estimate as a specified time limit. Some jobs may not finish within the allotted time; they are terminated at the time limit, and the unfinished data is regrouped into smaller jobs and resubmitted.

  16. Common Ada (tradename) Missile Package (CAMP) Project. Missile Software Parts. Volume 8. Detail Design Document

    DTIC Science & Technology

    1988-03-01

    PACKAGE BODY ) TLCSC P661 (CATALOG #P106-0) This package contains the CAMP parts required to do the waypoint steering portion of navigation. The...3.3.4.1.6 PROCESSING The following describes the processing performed by this part: package body WaypointSteering is package body ...Steering_Vector_Operations is separate; package body Steering_Vector_Operations_with_Arcsin is separate; procedure Compute_Turn_Angle_and_Direction (UnitNormal C

  17. Packaging printed circuit boards: A production application of interactive graphics

    NASA Technical Reports Server (NTRS)

    Perrill, W. A.

    1975-01-01

    The structure and use of an Interactive Graphics Packaging Program (IGPP), conceived to apply computer graphics to the design of packaging electronic circuits onto printed circuit boards (PCB), were described. The intent was to combine the data storage and manipulative power of the computer with the imaginative, intuitive power of a human designer. The hardware includes a CDC 6400 computer and two CDC 777 terminals with CRT screens, light pens, and keyboards. The program is written in FORTRAN 4 extended with the exception of a few functions coded in COMPASS (assembly language). The IGPP performs four major functions for the designer: (1) data input and display, (2) component placement (automatic or manual), (3) conductor path routing (automatic or manual), and (4) data output. The most complex PCB packaged to date measured 16.5 cm by 19 cm and contained 380 components, two layers of ground planes and four layers of conductors mixed with ground planes.

  18. Software package for modeling spin-orbit motion in storage rings

    NASA Astrophysics Data System (ADS)

    Zyuzin, D. V.

    2015-12-01

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10⁶–10⁹ particles in a beam during 10⁹ turns in an accelerator (about 10¹²–10¹⁵ integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.

  19. The `TTIME' Package: Performance Evaluation in a Cluster Computing Environment

    NASA Astrophysics Data System (ADS)

    Howe, Marico; Berleant, Daniel; Everett, Albert

    2011-06-01

    The objective of translating developmental event time across mammalian species is to gain an understanding of the timing of human developmental events based on the known times of those events in animals. The potential benefits include improvements to diagnostic and intervention capabilities. The CRAN `ttime' package provides the functionality to infer unknown event timings and to investigate phylogenetic proximity using hierarchical clustering of both known and predicted event timings. The original generic mammalian model included nine eutherian mammals: Felis domestica (cat), Mustela putorius furo (ferret), Mesocricetus auratus (hamster), Macaca mulatta (monkey), Homo sapiens (human), Mus musculus (mouse), Oryctolagus cuniculus (rabbit), Rattus norvegicus (rat), and Acomys cahirinus (spiny mouse). However, the data for this model are expected to grow as more data about developmental events are identified and incorporated into the analysis. Evaluating the performance of the `ttime' package in a cluster computing environment against a comparative analysis in a serial computing environment provides an important computational performance assessment. A theoretical analysis is the first stage of a process in which the second stage, if justified by the theoretical analysis, is to investigate an actual implementation of the `ttime' package in a cluster computing environment and to understand the parallelization process that underlies the implementation.
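
    The clustering step described above is standard in R; a self-contained sketch of hierarchically clustering species by event timings (with synthetic data, independent of the `ttime' package's own interface) might look like:

        # Synthetic event-timing matrix: rows = species, columns = events.
        # Values are illustrative placeholders, not real developmental data.
        set.seed(42)
        timings <- matrix(rexp(9 * 5, rate = 0.1), nrow = 9,
                          dimnames = list(c("cat", "ferret", "hamster",
                                            "monkey", "human", "mouse",
                                            "rabbit", "rat", "spiny mouse"),
                                          paste0("event", 1:5)))

        # Hierarchical clustering of species profiles, analogous to the
        # phylogenetic-proximity view the package provides.
        hc <- hclust(dist(timings))
        plot(hc, main = "Species clustered by event timings (synthetic)")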

  20. I-deas TMG to NX Space Systems Thermal Model Conversion and Computational Performance Comparison

    NASA Technical Reports Server (NTRS)

    Somawardhana, Ruwan

    2011-01-01

    CAD/CAE packages change on a continuous basis as the power of the tools increases to meet demands. End-users must adapt to new products as they come to market and replace legacy packages. CAE modeling has continued to evolve and is constantly becoming more detailed and complex, though this comes at the cost of increased computing requirements. Parallel processing coupled with appropriate hardware can minimize computation time. Users of Maya Thermal Model Generator (TMG) are faced with transitioning from NX I-deas to NX Space Systems Thermal (SST). It is important to understand what differences there are when changing software packages; we are looking for consistency in results.

  1. Use of symbolic computation in robotics education

    NASA Technical Reports Server (NTRS)

    Vira, Naren; Tunstel, Edward

    1992-01-01

    An application of symbolic computation in robotics education is described. A software package is presented which combines generality, user interaction, and user-friendliness with the systematic usage of symbolic computation and artificial intelligence techniques. The software utilizes MACSYMA, a LISP-based symbolic algebra language, to automatically generate closed-form expressions representing forward and inverse kinematics solutions, the Jacobian transformation matrices, robot pose error-compensation model equations, and the Lagrange dynamics formulation for N-degree-of-freedom, open-chain robotic manipulators. The goal of such a package is to aid faculty and students in the robotics course by removing the burdensome tasks of mathematical manipulation. The software package has been successfully tested for its accuracy using commercially available robots.

  2. Extending the IEEE-LTSA.

    ERIC Educational Resources Information Center

    Voskamp, Jorg; Hambach, Sybille

    An Internet-based course management system has been under development at the Fraunhofer-Institute for Computer Graphics Rostock (Germany) for the past 5 years. It is used by experts for distributing their courses via the Internet and by students for learning with the material distributed. The "Course Management System for WWW--CMS-W3"…

  3. MATREX: A Unifying Modeling and Simulation Architecture for Live-Virtual-Constructive Applications

    DTIC Science & Technology

    2007-05-23

    Deployment Systems Acquisition Operations & Support B C Sustainment FRP Decision Review FOC LRIP/IOT&E Critical Design Review Pre-Systems...CMS2 – Comprehensive Munitions & Sensor Server • CSAT – C4ISR Static Analysis Tool • C4ISR – Command & Control, Communications, Computers

  4. The CMS tracker control system

    NASA Astrophysics Data System (ADS)

    Dierlamm, A.; Dirkes, G. H.; Fahrer, M.; Frey, M.; Hartmann, F.; Masetti, L.; Militaru, O.; Shah, S. Y.; Stringer, R.; Tsirou, A.

    2008-07-01

    The Tracker Control System (TCS) is a distributed control software to operate about 2000 power supplies for the silicon modules of the CMS Tracker and monitor its environmental sensors. TCS must thus be able to handle about 10⁴ power supply parameters, about 10³ environmental probes from the Programmable Logic Controllers of the Tracker Safety System (TSS), and about 10⁵ parameters read via DAQ from the DCUs in all front-end hybrids and from CCUs in all control groups. TCS is built on top of an industrial SCADA program (PVSS) extended with a framework developed at CERN (JCOP) and used by all LHC experiments. The logical partitioning of the detector is reflected in the hierarchical structure of the TCS, where commands move down to the individual hardware devices, while states are reported up to the root, which is interfaced to the broader CMS control system. The system computes and continuously monitors the mean and maximum values of critical parameters and updates the percentage of currently operating hardware. Automatic procedures switch off selected parts of the detector using detailed granularity and avoiding widespread TSS intervention.

  5. Tailoring Mathematical Models to Stem-Cell Derived Cardiomyocyte Lines Can Improve Predictions of Drug-Induced Changes to Their Electrophysiology.

    PubMed

    Lei, Chon Lok; Wang, Ken; Clerx, Michael; Johnstone, Ross H; Hortigon-Vinagre, Maria P; Zamora, Victor; Allan, Andrew; Smith, Godfrey L; Gavaghan, David J; Mirams, Gary R; Polonchuk, Liudmila

    2017-01-01

    Human induced pluripotent stem cell derived cardiomyocytes (iPSC-CMs) have applications in disease modeling, cell therapy, drug screening and personalized medicine. Computational models can be used to interpret experimental findings in iPSC-CMs, provide mechanistic insights, and translate these findings to adult cardiomyocyte (CM) electrophysiology. However, different cell lines display different expression of ion channels, pumps and receptors, and show differences in electrophysiology. In this exploratory study, we use a mathematical model based on iPSC-CMs from Cellular Dynamics International (CDI, iCell), and compare its predictions to novel experimental recordings made with the Axiogenesis Cor.4U line. We show that tailoring this model to the specific cell line, even using limited data and a relatively simple approach, leads to improved predictions of baseline behavior and response to drugs. This demonstrates the need and the feasibility to tailor models to individual cell lines, although a more refined approach will be needed to characterize individual currents, address differences in ion current kinetics, and further improve these results.

  6. Towards a centralized Grid Speedometer

    NASA Astrophysics Data System (ADS)

    Dzhunov, I.; Andreeva, J.; Fajardo, E.; Gutsche, O.; Luyckx, S.; Saiz, P.

    2014-06-01

    Given the distributed nature of the Worldwide LHC Computing Grid and the way CPU resources are pledged and shared around the globe, Virtual Organizations (VOs) face the challenge of monitoring the use of these resources. For CMS and the operation of centralized workflows, the monitoring of how many production jobs are running and pending in the Glidein WMS production pools is very important. The Dashboard Site Status Board (SSB) provides a very flexible framework to collect, aggregate and visualize data. The CMS production monitoring team uses the SSB to define the metrics that have to be monitored and the alarms that have to be raised. During the integration of CMS production monitoring into the SSB, several enhancements to the core functionality of the SSB were required; they were implemented in a generic way, so that other VOs using the SSB can exploit them. Alongside these enhancements, there were a number of changes to the core of the SSB framework. This paper presents the details of the implementation and the advantages for current and future usage of the new features in SSB.

  7. METLIN-PC: An applications-program package for problems of mathematical programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pshenichnyi, B.N.; Sobolenko, L.A.; Sosnovskii, A.A.

    1994-05-01

    The METLIN-PC applications-program package (APP) was developed at the V.M. Glushkov Institute of Cybernetics of the Academy of Sciences of Ukraine on IBM PC XT and AT computers. The present version of the package was written in Turbo Pascal and Fortran-77. METLIN-PC is chiefly designed for the solution of smooth problems of mathematical programming and is a further development of the METLIN prototype, which was created earlier on a BESM-6 computer. The principal property of the previous package is retained: the applications modules employ a single approach based on the linearization method of B.N. Pshenichnyi, hence the name "METLIN."

  8. Language Analysis Package (L.A.P.) Version I System Design.

    ERIC Educational Resources Information Center

    Porch, Ann

    To permit researchers to use the speed and versatility of the computer to process natural language text as well as numerical data without undergoing special training in programing or computer operations, a language analysis package has been developed partially based on several existing programs. An overview of the design is provided and system…

  9. Macintosh Computer Classroom and Laboratory Security: Preventing Unwanted Changes to the System.

    ERIC Educational Resources Information Center

    Senn, Gary J.; Smyth, Thomas J. C.

    Because of the graphical interface and "openness" of the operating system, Macintosh computers are susceptible to undesirable changes by the user. This presentation discusses the advantages and disadvantages of software packages that offer protection for the Macintosh system. The two basic forms of software security packages include a…

  10. CMS: A Web-Based System for Visualization and Analysis of Genome-Wide Methylation Data of Human Cancers

    PubMed Central

    Huang, Yi-Wen; Roa, Juan C.; Goodfellow, Paul J.; Kizer, E. Lynette; Huang, Tim H. M.; Chen, Yidong

    2013-01-01

    Background DNA methylation of promoter CpG islands is associated with gene suppression, and its unique genome-wide profiles have been linked to tumor progression. Coupled with high-throughput sequencing technologies, genome-wide methylation profiles in cancer cells can now be determined efficiently. Also, experimental and computational technologies make it possible to find the functional relationship between cancer-specific methylation patterns and their clinicopathological parameters. Methodology/Principal Findings Cancer methylome system (CMS) is a web-based database application designed for the visualization, comparison and statistical analysis of human cancer-specific DNA methylation. Methylation intensities were obtained from MBDCap-sequencing, pre-processed and stored in the database. 191 patient samples (169 tumor and 22 normal specimens) and 41 breast cancer cell lines are deposited in the database, comprising about 6.6 billion uniquely mapped sequence reads. This provides comprehensive, genome-wide epigenetic portraits of human breast cancer and endometrial cancer to date. Two views are proposed for users to better understand methylation structure at the genomic level or systemic methylation alteration at the gene level. In addition, a variety of annotation tracks are provided to cover genomic information. CMS includes important analytic functions for interpretation of methylation data, such as the detection of differentially methylated regions, statistical calculation of global methylation intensities, multiple gene sets of biologically significant categories, and interactivity with UCSC via custom-track data. We also present examples of discoveries utilizing the framework. Conclusions/Significance CMS provides visualization and analytic functions for cancer methylome datasets. A comprehensive collection of datasets, a variety of embedded analytic functions and extensive applications with biological and translational significance make this system powerful and unique in cancer methylation research. CMS is freely accessible at: http://cbbiweb.uthscsa.edu/KMethylomes/. PMID:23630576

  11. CMS: a web-based system for visualization and analysis of genome-wide methylation data of human cancers.

    PubMed

    Gu, Fei; Doderer, Mark S; Huang, Yi-Wen; Roa, Juan C; Goodfellow, Paul J; Kizer, E Lynette; Huang, Tim H M; Chen, Yidong

    2013-01-01

    DNA methylation of promoter CpG islands is associated with gene suppression, and its unique genome-wide profiles have been linked to tumor progression. Coupled with high-throughput sequencing technologies, genome-wide methylation profiles in cancer cells can now be determined efficiently. Also, experimental and computational technologies make it possible to find the functional relationship between cancer-specific methylation patterns and their clinicopathological parameters. Cancer methylome system (CMS) is a web-based database application designed for the visualization, comparison and statistical analysis of human cancer-specific DNA methylation. Methylation intensities were obtained from MBDCap-sequencing, pre-processed and stored in the database. 191 patient samples (169 tumor and 22 normal specimens) and 41 breast cancer cell lines are deposited in the database, comprising about 6.6 billion uniquely mapped sequence reads. This provides comprehensive, genome-wide epigenetic portraits of human breast cancer and endometrial cancer to date. Two views are proposed for users to better understand methylation structure at the genomic level or systemic methylation alteration at the gene level. In addition, a variety of annotation tracks are provided to cover genomic information. CMS includes important analytic functions for interpretation of methylation data, such as the detection of differentially methylated regions, statistical calculation of global methylation intensities, multiple gene sets of biologically significant categories, and interactivity with UCSC via custom-track data. We also present examples of discoveries utilizing the framework. CMS provides visualization and analytic functions for cancer methylome datasets. A comprehensive collection of datasets, a variety of embedded analytic functions and extensive applications with biological and translational significance make this system powerful and unique in cancer methylation research. CMS is freely accessible at: http://cbbiweb.uthscsa.edu/KMethylomes/.

  12. Final Technical Report for ARRA Funding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rusack, Roger; Mans, Jeremiah; Poling, Ronald

    Final technical report of the University of Minnesota experimental high energy physics group for ARRA support. The Cryogenic Dark Matter Search (CDMS) experiment used the funds received to construct a new passive shield to protect a high-purity germanium detector located in the Soudan mine in northern Minnesota from cosmic rays. The BESIII and CMS groups purchased computing hardware to assemble computer farms for data analysis and to generate large volumes of simulated data for comparison with the data collected.

  13. Detection of the Diversity of Cytoplasmic Male Sterility Sources in Broccoli (Brassica Oleracea var. Italica) Using Mitochondrial Markers.

    PubMed

    Shu, Jinshuai; Liu, Yumei; Li, Zhansheng; Zhang, Lili; Fang, Zhiyuan; Yang, Limei; Zhuang, Mu; Zhang, Yangyong; Lv, Honghao

    2016-01-01

    Broccoli (Brassica oleracea var. italica) is an important commercial vegetable crop. As part of an efficient pollination system, cytoplasmic male sterility (CMS) has been widely used for broccoli hybrid production. Identifying the original sources of CMS in broccoli accessions has become an important part of broccoli breeding. In this study, the diversity of the CMS sources of 39 broccoli accessions, including 19 CMS lines and 20 hybrids, was analyzed using mitochondrial markers. All CMS accessions contained the ogu orf138-related DNA fragment, and the key genes of nap CMS, pol CMS, and tour CMS were not detected. The 39 CMS accessions were divided into five groups using six orf138-related and two simple sequence repeat markers. We observed that ogu CMS R3 constituted 79.49% of the CMS sources. CMS6 and CMS26 were differentiated from the other accessions using a specific primer. CMS32 was distinguished from the other accessions based on a 78-nucleotide deletion at the same locus as the orf138-related sequence. When the coefficient was about 0.90, five CMS accessions (13CMS6, 13CMS23, 13CMS24, 13CMS37, and 13CMS39) exhibiting abnormal floral organs with poor seed setting were grouped together. The polymerase chain reaction amplification profiles for these five accessions differed from those of the other accessions. We identified eight useful molecular markers that can be used to detect CMS types during broccoli breeding. Our data also provide important information relevant to future studies on the possible origins and molecular mechanisms of CMS in broccoli.

  14. Detection of the Diversity of Cytoplasmic Male Sterility Sources in Broccoli (Brassica Oleracea var. Italica) Using Mitochondrial Markers

    PubMed Central

    Shu, Jinshuai; Liu, Yumei; Li, Zhansheng; Zhang, Lili; Fang, Zhiyuan; Yang, Limei; Zhuang, Mu; Zhang, Yangyong; Lv, Honghao

    2016-01-01

    Broccoli (Brassica oleracea var. italica) is an important commercial vegetable crop. As part of an efficient pollination system, cytoplasmic male sterility (CMS) has been widely used for broccoli hybrid production. Identifying the original sources of CMS in broccoli accessions has become an important part of broccoli breeding. In this study, the diversity of the CMS sources of 39 broccoli accessions, including 19 CMS lines and 20 hybrids, was analyzed using mitochondrial markers. All CMS accessions contained the ogu orf138-related DNA fragment, and the key genes of nap CMS, pol CMS, and tour CMS were not detected. The 39 CMS accessions were divided into five groups using six orf138-related and two simple sequence repeat markers. We observed that ogu CMS R3 constituted 79.49% of the CMS sources. CMS6 and CMS26 were differentiated from the other accessions using a specific primer. CMS32 was distinguished from the other accessions based on a 78-nucleotide deletion at the same locus as the orf138-related sequence. When the coefficient was about 0.90, five CMS accessions (13CMS6, 13CMS23, 13CMS24, 13CMS37, and 13CMS39) exhibiting abnormal floral organs with poor seed setting were grouped together. The polymerase chain reaction amplification profiles for these five accessions differed from those of the other accessions. We identified eight useful molecular markers that can be used to detect CMS types during broccoli breeding. Our data also provide important information relevant to future studies on the possible origins and molecular mechanisms of CMS in broccoli. PMID:27446156

  15. Electron-Ion Recombination Rate Coefficient Measurements in a Flowing Afterglow Plasma

    NASA Technical Reports Server (NTRS)

    Gougousi, Theodosia; Golde, Michael F.; Johnsen, Rainer

    1996-01-01

    The flowing-afterglow technique in conjunction with computer modeling of the flowing plasma has been used to determine accurate dissociative-recombination rate coefficients α for the ions O₂⁺, HCO⁺, CH₅⁺, C₂H₅⁺, H₃O⁺, CO₂⁺, HCO₂⁺, HN₂O⁺, and N₂O⁺ at 295 K. We find that the simple form of data analysis that was employed in earlier experiments was adequate and we largely confirm earlier results. In the case of HCO⁺ ions, published coefficients range from 1.1 × 10⁻⁷ to 2.8 × 10⁻⁷ cm³/s, while our measurements give a value of 1.9 × 10⁻⁷ cm³/s.

  16. SPARSKIT: A basic tool kit for sparse matrix computations

    NASA Technical Reports Server (NTRS)

    Saad, Youcef

    1990-01-01

    Presented here are the main features of a tool package for manipulating and working with sparse matrices. One of the goals of the package is to provide basic tools to facilitate the exchange of software and data between researchers in sparse matrix computations. The starting point is the Harwell/Boeing collection of matrices for which the authors provide a number of tools. Among other things, the package provides programs for converting data structures, printing simple statistics on a matrix, plotting a matrix profile, and performing linear algebra operations with sparse matrices.

  17. Comparative Gene Expression Analyses Reveal Distinct Molecular Signatures between Differentially Reprogrammed Cardiomyocytes.

    PubMed

    Zhou, Yang; Wang, Li; Liu, Ziqing; Alimohamadi, Sahar; Yin, Chaoying; Liu, Jiandong; Qian, Li

    2017-09-26

    Cardiomyocytes derived from induced pluripotent stem cells (iPSC-CMs) or directly reprogrammed from non-myocytes (induced cardiomyocytes [iCMs]) are promising sources for heart regeneration or disease modeling. However, the similarities and differences between iPSC-CMs and iCMs are still unknown. Here, we performed transcriptome analyses of beating iPSC-CMs and iCMs generated from cardiac fibroblasts (CFs) of the same origin. Although both iPSC-CMs and iCMs establish CM-like molecular features globally, iPSC-CMs exhibit a relatively hyperdynamic epigenetic status, whereas iCMs exhibit a maturation status that more closely resembles that of adult CMs. Based on gene expression of metabolic enzymes, iPSC-CMs primarily employ glycolysis, whereas iCMs utilize fatty acid oxidation as the main pathway. Importantly, iPSC-CMs and iCMs exhibit different cell-cycle status, alteration of which influenced their maturation. Therefore, our study provides a foundation for understanding the pros and cons of different reprogramming approaches. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  18. Novel Ruggedized Packaging Technology for VCSELs

    DTIC Science & Technology

    2017-03-01

    Novel Ruggedized Packaging Technology for VCSELs Charlie Kuznia ckuznia@ultracomm-inc.com Ultra Communications, Inc. Vista, CA, USA, 92081...can achieve low-power, EMI-immune links within high-performance military computing and sensor systems. Figure 1. Chip-scale packaging of

  19. Introduction to Software Packages. [Final Report.

    ERIC Educational Resources Information Center

    Frankel, Sheila, Ed.; And Others

    This document provides an introduction to applications computer software packages that support functional managers in government and encourages the use of such packages as an alternative to in-house development. A review of current application areas includes budget/project management, financial management/accounting, payroll, personnel,…

  20. 45 CFR 150.203 - Circumstances requiring CMS enforcement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare... Circumstances requiring CMS enforcement. 150.203... CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement Processes for... requiring CMS enforcement. CMS enforces HIPAA requirements to the extent warranted (as determined by CMS) in...

  1. Detailed facies analysis of the Upper Cretaceous Tununk Shale Member, Henry Mountains Region, Utah: Implications for mudstone depositional models in epicontinental seas

    NASA Astrophysics Data System (ADS)

    Li, Zhiyang; Schieber, Juergen

    2018-02-01

    Lower-Middle Turonian strata of the Tununk Shale Member of the greater Mancos Shale were deposited along the western margin of the Cretaceous Western Interior Seaway during the Greenhorn second-order sea-level cycle. In order to examine depositional controls on facies development in this mudstone-rich succession, this study delineates temporal and spatial relationships using a process-sedimentologic approach. The 3-dimensional expression of mudstone facies associations and their stratal architecture is assessed through a fully integrative physical and biologic characterization as exposed in outcrops in south-central Utah. Sedimentologic characteristics from the millimeter- to kilometer-scale are documented in order to fully address the complex nature of the sediment transport mechanisms observed in this muddy shelf environment. The resulting facies model developed from this characterization consists of a stack of four lithofacies packages: 1) carbonate-bearing, silty and sandy mudstone (CSSM), 2) silt-bearing, calcareous mudstone (SCM), 3) carbonate-bearing, silty mudstone to muddy siltstone (CMS), and 4) non-calcareous, silty and sandy mudstone (SSM). Spatial and temporal variations in lithofacies type and sedimentary facies characteristics indicate that the depositional environments of the Tununk Shale shifted in response to the second-order Greenhorn transgressive-regressive sea-level cycle. During this eustatic event, the Tununk shows a characteristic vertical shift from distal middle shelf to outer shelf (CSSM to SCM facies), then from outer shelf to inner shelf environment (SCM to CMS, and to SSM facies). Shifting depositional environments, as well as changes in dominant paleocurrent direction throughout this succession, indicate multiple source areas and transport mechanisms (i.e., longshore currents, offshore-directed underflows, storm reworking). This study provides rare documentation of the Greenhorn cycle as exposed across the entire shelf setting. High-resolution mapping of genetically related packages facilitates the development of process-based depositional models that can be utilized for lateral correlations into the equivalent foredeep strata of the Cretaceous Interior.

  2. Effects of Computer Animation Instructional Package on Students' Achievement in Practical Biology

    ERIC Educational Resources Information Center

    Hamzat, Abdulrasaq; Bello, Ganiyu; Abimbola, Isaac Olakanmi

    2017-01-01

    This study examined the effects of computer animation instructional package on secondary school students' achievement in practical biology in Ilorin, Nigeria. The study adopted a pre-test, post-test, control group, non-randomised and nonequivalent quasi-experimental design, with a 2x2x3 factorial design. Two intact classes from two secondary…

  3. Sigma 2 Graphic Display Software Program Description

    NASA Technical Reports Server (NTRS)

    Johnson, B. T.

    1973-01-01

    A general purpose, user oriented graphic support package was implemented. A comprehensive description of the two software components comprising this package is given: Display Librarian and Display Controller. These programs have been implemented in FORTRAN on the XDS Sigma 2 Computer Facility. This facility consists of an XDS Sigma 2 general purpose computer coupled to a Computek Display Terminal.

  4. Paperless Work Package Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kilgore, Jr., William R.; Morrell, Jr., Otto K.; Morrison, Dan

    2014-07-31

    The Paperless Work Package (PWP) System is a computer program that takes information from Asset Suite, provides a platform for other electronic inputs, processes the inputs into an electronic package that can be downloaded onto an electronic work tablet or laptop computer, provides a platform for electronic inputs into the work tablet, and then transposes those inputs back into Asset Suite and to permanent SRS records. The PWP System essentially eliminates paper requirements from the maintenance work control system. The program electronically relays the instructions given by the planner to work on a piece of equipment, which are currently relayed via a printed work package. The program does not control or approve what is done. The planner will continue to plan the work package, and the package will continue to be routed, approved, and scheduled. The supervisor reviews and approves the work to be performed and assigns work to individuals or to a work group. (The supervisor conducts pre-job briefings with the workers involved in the job.) The Operations Manager (Work Controlling Entity) approves the work package electronically for the work that will be done in his facility prior to work starting. The PWP System provides the package in electronic form. All the reviews, approvals, and safety measures taken by people outside the electronic package do not change from the paper-driven work packages.

  5. 10 CFR 431.92 - Definitions concerning commercial air conditioners and heat pumps.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... measurement. Commercial package air-conditioning and heating equipment means air-cooled, water-cooled... Conditioner means a basic model of commercial package air-conditioning and heating equipment (packaged or split) that is: Used in computer rooms, data processing rooms, or other information technology cooling...

  6. Packaging strategies for printed circuit board components. Volume I, materials & thermal stresses.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neilsen, Michael K.; Austin, Kevin N.; Adolf, Douglas Brian

    2011-09-01

    Decisions on material selections for electronics packaging can be quite complicated by the need to balance the criteria to withstand severe impacts yet survive deep thermal cycles intact. Many times, material choices are based on historical precedent, perhaps ignorant of whether those initial choices were carefully investigated or whether the requirements on the new component match those of previous units. The goal of this program focuses on developing both increased intuition for generic packaging guidelines and computational methodologies for optimizing packaging in specific components. Initial efforts centered on characterization of classes of materials common to packaging strategies and computational analyses of stresses generated during thermal cycling to identify strengths and weaknesses of various material choices. Future studies will analyze the same example problems incorporating the effects of curing stresses as needed and analyzing dynamic loadings to compare trends with the quasi-static conclusions.

  7. ParallelStructure: A R Package to Distribute Parallel Runs of the Population Genetics Program STRUCTURE on Multi-Core Computers

    PubMed Central

    Besnier, Francois; Glover, Kevin A.

    2013-01-01

    This software package provides an R-based framework to make use of multi-core computers when running analyses in the population genetics program STRUCTURE. It is especially addressed to those users of STRUCTURE dealing with numerous and repeated data analyses, who could take advantage of an efficient script to automatically distribute STRUCTURE jobs among multiple processors. It also provides additional functions to divide analyses among combinations of populations within a single data set without the need to manually produce multiple projects, as is currently the case in STRUCTURE. The package consists of two main functions, MPI_structure() and parallel_structure(), as well as an example data file. We compared the performance in computing time for this example data on two computer architectures and showed that the use of the present functions can result in several-fold improvements in terms of computation time. ParallelStructure is freely available at https://r-forge.r-project.org/projects/parallstructure/. PMID:23923012
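
    A minimal sketch of a call to the package's parallel_structure() entry point follows; only the two function names are given above, so the argument names and file paths here are assumptions for illustration, and the package documentation should be consulted for the actual interface:

        library(ParallelStructure)

        # Distribute the STRUCTURE runs listed in a job file across 4 cores.
        # All paths and argument names below are illustrative assumptions.
        parallel_structure(structure_path = "/usr/local/bin/structure",
                           joblist = "joblist.txt",  # one STRUCTURE run per line
                           infile  = "genotypes.str",
                           outpath = "results/",
                           n_cpu   = 4)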

  8. 42 CFR 447.514 - Upper limits for multiple source drugs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... State agency plus an amount established by CMS that is equal to 250 percent of the AMP (as computed... will consider the following additional criteria: (1) The AMP of a terminated NDC will not be used to... section, the AMP of the lowest priced therapeutically and pharmaceutically equivalent drug that is not...

  9. Building Real World Domain-Specific Social Network Websites as a Capstone Project

    ERIC Educational Resources Information Center

    Yue, Kwok-Bun; De Silva, Dilhar; Kim, Dan; Aktepe, Mirac; Nagle, Stewart; Boerger, Chris; Jain, Anubha; Verma, Sunny

    2009-01-01

    This paper describes our experience of using Content Management Software (CMS), specifically Joomla, to build a real world domain-specific social network site (SNS) as a capstone project for graduate information systems and computer science students. As Web 2.0 technologies become increasingly important in driving business application development,…

  10. Integrating an Educational Game in Moodle LMS

    ERIC Educational Resources Information Center

    Minovic, Miroslav; Milovanovic, Milos; Minovic, Jelena; Starcevic, Dusan

    2012-01-01

    The authors present a learning platform based on a computer game. Learning games combine two industries: education and entertainment, which is often called "Edutainment." The game is realized as a strategic game (similar to Risk™), implemented as a module for Moodle CMS, utilizing Java Applet technology. Moodle is an open-source course…

  11. Collaborative Concept Mapping Activities in a Classroom Scenario

    ERIC Educational Resources Information Center

    Elorriaga, J. A.; Arruarte, A.; Calvo, I.; Larrañaga, M.; Rueda, U.; Herrán, E.

    2013-01-01

    The aim of this study is to test collaborative concept mapping activities using computers in a classroom scenario and to evaluate the possibilities that Elkar-CM offers for collaboratively learning non-technical topics. Elkar-CM is a multi-lingual and multi-media software program designed for drawing concept maps (CMs) collaboratively. Concept…

  12. Education review: applied medical informatics--informatics in medical education.

    PubMed

    Naeymi-Rad, F; Trace, D; Moidu, K; Carmony, L; Booden, T

    1994-05-01

    The importance of informatics training within a health sciences program is well recognized and is being implemented on an increasing scale. At Chicago Medical School (CMS), the Informatics program incorporates information technology at every stage of medical education. First-year students are offered an elective in computer topics that concentrates on basic computer literacy. Second-year students learn information management such as entry and information retrieval skills. For example, during the Introduction to Clinical Medicine course, the student is exposed to the Intelligent Medical Record-Entry (IMR-E), allowing the student to enter and organize information gathered from patient encounters. In the third year, students in the Internal Medicine rotation at Norwalk Hospital use Macintosh PowerBooks to enter and manage their patients. Patient data gathered by the student are stored in a local server in Norwalk Hospital. In the final year, we teach students the role of informatics in clinical decision making. The present senior class at CMS has been exposed to the power of medical informatics tools for several years. The use of these informatics tools at the point of care is stressed.

  13. Faster than Real-Time Dynamic Simulation for Large-Size Power System with Detailed Dynamic Models using High-Performance Computing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Jin, Shuangshuang; Chen, Yousu

    This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACK™ high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.

  14. Dose-ranging pharmacokinetics of colistin methanesulphonate (CMS) and colistin in rats following single intravenous CMS doses.

    PubMed

    Marchand, Sandrine; Lamarche, Isabelle; Gobin, Patrice; Couet, William

    2010-08-01

    The aim of this study was to evaluate the effect of colistin methanesulphonate (CMS) dose on CMS and colistin pharmacokinetics in rats. Three rats per group received an intravenous bolus of CMS at a dose of 5, 15, 30, 60 or 120 mg/kg. Arterial blood samples were drawn at 0, 5, 15, 30, 60, 90, 120, 150 and 180 min. CMS and colistin plasma concentrations were determined by liquid chromatography-tandem mass spectrometry (LC-MS/MS). The pharmacokinetic parameters of CMS and colistin were calculated by non-compartmental analysis. Linear relationships were observed between CMS and colistin AUCs to infinity and CMS doses, as well as between CMS and colistin Cmax values and CMS doses. CMS and colistin pharmacokinetics were linear over a range of colistin concentrations covering the values encountered and recommended in patients, even during treatment with higher doses.
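
    As a concrete illustration of the non-compartmental quantities involved, the sketch below computes AUC to infinity by trapezoidal integration with a log-linear tail and then checks dose proportionality. It is a minimal Python sketch; the mono-exponential concentration profiles are synthetic stand-ins, not the study's data.

      import numpy as np

      # Sampling times from the study design (min) and the five CMS doses (mg/kg)
      t = np.array([0, 5, 15, 30, 60, 90, 120, 150, 180], dtype=float)
      doses = np.array([5, 15, 30, 60, 120], dtype=float)

      # Synthetic mono-exponential profiles (assumed, for illustration only)
      profiles = [d * np.exp(-t / 60.0) for d in doses]

      def auc_to_infinity(conc, times):
          """Trapezoidal AUC to the last sample plus log-linear tail extrapolation."""
          auc_last = np.sum((conc[1:] + conc[:-1]) / 2.0 * np.diff(times))
          lam_z = -np.polyfit(times[-3:], np.log(conc[-3:]), 1)[0]  # terminal slope
          return auc_last + conc[-1] / lam_z

      aucs = np.array([auc_to_infinity(c, t) for c in profiles])
      slope, intercept = np.polyfit(doses, aucs, 1)
      print(f"AUC ~ {slope:.2f}*dose + {intercept:.2f}")  # near-zero intercept => linear PK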

  15. Software package for modeling spin–orbit motion in storage rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zyuzin, D. V., E-mail: d.zyuzin@fz-juelich.de

    2015-12-15

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of the obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6–10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12–10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin–orbit dynamics.

  16. Space-Shuttle Emulator Software

    NASA Technical Reports Server (NTRS)

    Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram

    2007-01-01

    A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations. Heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer [consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements]; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.

  17. The “Common Solutions” Strategy of the Experiment Support group at CERN for the LHC Experiments

    NASA Astrophysics Data System (ADS)

    Girone, M.; Andreeva, J.; Barreiro Megino, F. H.; Campana, S.; Cinquilli, M.; Di Girolamo, A.; Dimou, M.; Giordano, D.; Karavakis, E.; Kenyon, M. J.; Kokozkiewicz, L.; Lanciotti, E.; Litmaath, M.; Magini, N.; Negri, G.; Roiser, S.; Saiz, P.; Saiz Santos, M. D.; Schovancova, J.; Sciabà, A.; Spiga, D.; Trentadue, R.; Tuckett, D.; Valassi, A.; Van der Ster, D. C.; Shiers, J. D.

    2012-12-01

    After two years of LHC data taking, processing and analysis, and with numerous changes in computing technology, a number of aspects of the experiments' computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also to lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed, including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management monitoring, File Transfer monitoring) and the Site Status Board. This talk focuses primarily on the strategic aspects of providing such common solutions and how this relates to the overall goals of long-term sustainability and the relationship to the various WLCG Technical Evolution Groups. The success of the service components has given us confidence in the process, and has developed the trust of the stakeholders. We are now attempting to expand the development of common solutions into the more critical workflows. The first is a feasibility study of common analysis workflow execution elements between ATLAS and CMS. We look forward to additional common development in the future.

  18. Projected Clinical, Resource Use, and Fiscal Impacts of Implementing Low-Dose Computed Tomography Lung Cancer Screening in Medicare.

    PubMed

    Roth, Joshua A; Sullivan, Sean D; Goulart, Bernardo H L; Ravelo, Arliene; Sanderson, Joanna C; Ramsey, Scott D

    2015-07-01

    The Centers for Medicare and Medicaid Services (CMS) recently issued a national coverage determination that provides reimbursement for low-dose computed tomography (CT) lung cancer screening for enrollees age 55 to 77 years with ≥ 30-pack-year smoking history who currently smoke or quit in the last 15 years. The clinical, resource use, and fiscal impacts of this change in screening coverage policy remain uncertain. We developed a simulation model to forecast the 5-year health outcome impacts of the CMS low-dose CT screening policy in Medicare compared with no screening. The model used data from the National Lung Screening Trial, CMS enrollment statistics and reimbursement schedules, and peer-reviewed literature. Outcomes included counts of screening examinations, patient cases of lung cancer detected, stage distribution, and total and per-enrollee per-month fiscal impact. Over 5 years, we project that low-dose CT screening will result in 10.7 million more low-dose CT scans, 52,000 more lung cancers detected, and increased overall expenditure of $6.8 billion ($2.22 per Medicare enrollee per month). The most fiscally impactful factors were the average cost-per-screening episode, proportion of enrollees eligible for screening, and cost of treating stage I lung cancer. Low-dose CT screening is expected to increase lung cancer diagnoses, shift stage at diagnosis toward earlier stages, and substantially increase Medicare expenditures over a 5-year time horizon. These projections can inform planning efforts by Medicare administrators, contracted health care providers, and other stakeholders. Copyright © 2015 by American Society of Clinical Oncology.
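
    The headline per-enrollee figure follows from simple arithmetic, as the back-of-envelope check below shows; the 51 million enrollee count is an assumed round number (the paper itself models CMS enrollment statistics).

      total_increase = 6.8e9   # projected 5-year expenditure increase ($)
      enrollees = 51e6         # assumed average Medicare enrollment over the period
      months = 5 * 12
      pmpm = total_increase / (enrollees * months)
      print(f"${pmpm:.2f} per enrollee per month")  # ~$2.22, matching the abstract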

  19. Results of computer assisted mini-incision subvastus approach for total knee arthroplasty.

    PubMed

    Turajane, Thana; Larbpaiboonpong, Viroj; Kongtharvonskul, Jatupon; Maungsiri, Samart

    2009-12-01

    The mini-incision subvastus approach preserves the soft tissue of the knee. Its advantages include reduced blood loss, reduced pain, self-directed rehabilitation and faster recovery. However, whether it offers sufficient visualization, component alignment and blood preservation to achieve better outcomes and prevent early failure of total knee arthroplasty (TKA) has been debated. Computer navigation has been introduced to improve alignment and reduce blood loss. The purpose of this study was to evaluate the short-term outcomes of the combination of computer assisted mini-incision subvastus approach for Total Knee Arthroplasty (CMS-TKA). A prospective case series of the initial 80 patients who underwent CMS-TKA from January 2007 to October 2008 was carried out. The patients' conditions were classified into 2 groups: the simple OA knee (varus deformity less than 15 degrees, BMI less than 20%, no associated deformities) and the complex deformity (varus deformity more than 15 degrees, BMI more than 20%, associated flexion contracture). There were 59 patients in group 1 and 21 patients in group 2. Of the 80 knees, 38 were on the left and 42 on the right. The results of CMS-TKA [mean (range)] in group 1 : group 2 were, respectively: incision length [10.88 (8-13) : 11.92 (10-14)], operation time [118 (111.88-125.12) : 131 (119.29-143.71) minutes], lateral releases (0 in both groups), postoperative range of motion in flexion [94.5 (90-100) : 95.25 (90-105) degrees] and extension [1.75 (0-5) : 1.5 (0-5) degrees], blood loss in 24 hours [489.09 (414.7-563.48) : 520 (503.46-636.54) ml], blood transfusion [1 (0-1) unit in both groups], preoperative tibiofemoral angle [varus 4 (0-10) : varus 17.14 (15.7-18.5) degrees], postoperative tibiofemoral angle [valgus 1.38 (0-4) : valgus 2.85 (2.1-3.5) degrees], tibiofemoral angle outliers (85% in both groups), and preoperative and postoperative Knee Society scores [64.6 (59.8-69.4) and 93.7 (90.8-96.65) : 69 (63.6-74.39) and 92.36 (88.22-96.5)]. The complications found in both groups were similar. No deep vein thrombosis, fracture of the femur or tibia, vascular injury, or pin-tract pain or infection was found in either group. Computer-assisted mini-incision subvastus TKA is an appropriate procedure for any varus deformity, with no limitation from associated bone loss, flexion contracture or BMI, the exception being fixed valgus deformity. To ensure good clinical outcomes, several key steps must be performed with appropriate technique: accurate registration, precise bone cuts, ligament balancing, and good cementing.

  20. Bethe-Salpeter Eigenvalue Solver Package (BSEPACK) v0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shao, Meiyue; Yang, Chao

    2017-04-25

    BSEPACK contains a set of subroutines for solving the Bethe-Salpeter eigenvalue (BSE) problem. This type of problem arises in the study of optical excitation of nanoscale materials. The BSE problem is a structured non-Hermitian eigenvalue problem. The BSEPACK software can be used to compute all or a subset of the eigenpairs of a BSE Hamiltonian. It can also be used to compute the optical absorption spectrum without computing BSE eigenvalues and eigenvectors explicitly. The package makes use of the ScaLAPACK, LAPACK and BLAS libraries.
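
    For readers unfamiliar with the structure, the BSE Hamiltonian has the 2x2 block form H = [[A, B], [-conj(B), -conj(A)]] with A Hermitian and B complex symmetric. The toy sketch below builds such a matrix and solves it densely with a generic NumPy solver; it illustrates the structure only, not BSEPACK's distributed ScaLAPACK algorithms.

      import numpy as np

      n = 4
      rng = np.random.default_rng(0)
      M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
      A = (M + M.conj().T) / 2          # Hermitian block
      S = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
      B = (S + S.T) / 2                 # complex symmetric block

      H = np.block([[A, B], [-B.conj(), -A.conj()]])
      evals = np.linalg.eigvals(H)
      # The block structure forces eigenvalues to occur in (lam, -conj(lam)) pairs
      print(np.sort_complex(evals))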

  1. Attitudes and Achievement in Introductory Psychological Statistics Classes: Traditional versus Computer-Supported Instruction.

    ERIC Educational Resources Information Center

    Gratz, Zandra S.; And Others

    A study was conducted at a large, state-supported college in the Northeast to establish a mechanism by which a popular software package, Statistical Package for the Social Sciences (SPSS), could be used in psychology program statistics courses in such a way that no prior computer expertise would be needed on the part of the faculty or the…

  2. CS2 analysis in presence of non-Gaussian background noise - Effect on traditional estimators and resilience of log-envelope indicators

    NASA Astrophysics Data System (ADS)

    Borghesani, P.; Antoni, J.

    2017-06-01

    Second-order cyclostationary (CS2) analysis has become popular in the field of machine diagnostics, and a series of digital signal processing techniques have been developed to extract CS2 components from the background noise. Among those techniques, the squared envelope spectrum (SES) and the cyclic modulation spectrum (CMS) have gained popularity thanks to their high computational efficiency and simple implementation. The effectiveness of CMS and SES has been previously quantified based on the hypothesis of Gaussian background noise and has led to statistical tests for the presence of CS2 peaks in squared envelope spectra and cyclic modulation spectra. However, a recently established link of CMS with SES, and of SES with kurtosis, has exposed a potential weakness of those indicators in the case of highly leptokurtic background noise. This case is often present in practice when the machine is subjected to highly impulsive phenomena, either due to harsh operating conditions or to electric noise generated by power electronics and captured by the sensor. This study investigates and quantifies for the first time the effect of leptokurtic noise on the capabilities of SES and CMS by analysing three progressively harsher situations: high kurtosis, infinite kurtosis, and alpha-stable background noise (for which even first- and second-order moments are not defined). The resilience of a recently proposed family of CS2 indicators, based on the log-envelope, is then verified analytically, numerically and experimentally in the case of highly leptokurtic noise.
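
    A minimal sketch of the squared envelope spectrum, one of the CS2 indicators discussed above: the envelope is obtained from the analytic signal, squared, and Fourier transformed, so a peak appears at the cyclic frequency. The test signal (a 3 kHz carrier amplitude-modulated at 120 Hz, plus Gaussian noise) is assumed for illustration.

      import numpy as np
      from scipy.signal import hilbert

      fs = 10_000                                  # sampling rate (Hz)
      t = np.arange(0, 1.0, 1 / fs)
      modulation = 1 + 0.5 * np.sin(2 * np.pi * 120 * t)   # 120 Hz cyclic component
      x = modulation * np.sin(2 * np.pi * 3000 * t) + 0.1 * np.random.randn(t.size)

      env_sq = np.abs(hilbert(x)) ** 2             # squared envelope
      env_sq -= env_sq.mean()                      # drop the DC component
      ses = np.abs(np.fft.rfft(env_sq)) / t.size   # squared envelope spectrum
      freqs = np.fft.rfftfreq(t.size, 1 / fs)
      print(freqs[np.argmax(ses)])                 # peaks at the 120 Hz cyclic frequency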

  3. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    The three-dimensional quasi-analytical sensitivity analysis and the ancillary driver programs needed to carry out the studies and perform comparisons were developed. The code is essentially contained in one unified package which includes the following: (1) a three-dimensional transonic wing analysis program (ZEBRA); (2) a quasi-analytical portion which determines the matrix elements in the quasi-analytical equations; (3) a method for computing the sensitivity coefficients from the resulting quasi-analytical equations; (4) a package that, for comparison purposes, determines sensitivity coefficients via the finite-difference approach; and (5) a graphics package.

  4. Understanding Climate Policy Data Needs. NASA Carbon Monitoring System Briefing: Characterizing Flux Uncertainty, Washington D.C., 11 January 2012

    NASA Technical Reports Server (NTRS)

    Brown, Molly E.; Macauley, Molly

    2012-01-01

    Climate policy in the United States is currently guided by public-private partnerships and actions at the local and state levels. This mitigation strategy is made up of programs that focus on energy efficiency, renewable energy, agricultural practices and implementation of technologies to reduce greenhouse gases. How will policy makers know if these strategies are working, particularly at the scales at which they are being implemented? The NASA Carbon Monitoring System (CMS) will provide information on carbon dioxide fluxes derived from observations of earth's land, ocean and atmosphere used in state-of-the-art models describing their interactions. This new modeling system could be used to assess the impact of specific policy interventions on CO2 reductions, enabling an iterative, results-oriented policy process. In January of 2012, the CMS team held a meeting in Washington DC to describe the developing modeling system to carbon policy and decision makers. The NASA CMS will develop pilot studies to provide information across a range of spatial scales, consider carbon storage in biomass, and improve measures of the atmospheric distribution of carbon dioxide. The pilot involves multiple institutions (four NASA centers as well as several universities) and over 20 scientists in its work. This pilot study will generate CO2 flux maps for two years using observational constraints in NASA's state-of-the-art models. Bottom-up surface flux estimates will be computed using data-constrained land and ocean models; comparison of the different techniques will provide some knowledge of uncertainty in these estimates. Ensembles of atmospheric carbon distributions will be computed using an atmospheric general circulation model (GEOS-5), with perturbations to the surface fluxes and to transport. Top-down flux estimates will be computed from observed atmospheric CO2 distributions (ACOS/GOSAT retrievals) alongside the forward-model fields, in conjunction with an inverse approach based on the GEOS-Chem CO2 model. The forward-model ensembles will be used to build understanding of relationships among surface flux perturbations, transport uncertainty and atmospheric carbon concentration. This will help construct uncertainty estimates and information on the true spatial resolution of the top-down flux calculations. The relationship between the top-down and bottom-up flux distributions will be documented. Because the goal of NASA CMS is to be policy relevant, the scientists involved in the flux modeling pilot need to understand and be focused on the needs of the climate policy and decision making community. If policy makers are to use CMS products, they must be aware of the modeling effort and begin to design policies that can be evaluated with this information. Improving estimates of carbon sequestered in forests, for example, will require information on the spatial variability of forest biomass that is far more explicit than is presently possible using only ground observations. Carbon mitigation policies being implemented by cities around the United States could be designed with the CMS data in mind, enabling sequential evaluation and subsequent improvements in incentives, structures and programs. The success of climate mitigation programs being implemented in the United States today will hinge on the depth of the relationship between scientists and their policy and decision making counterparts. Ensuring that there is two-way communication between data providers and users is important for the success both of the policies and of the scientific products meant to support them.

  5. Matrigel Mattress: A Method for the Generation of Single Contracting Human-Induced Pluripotent Stem Cell-Derived Cardiomyocytes.

    PubMed

    Feaster, Tromondae K; Cadar, Adrian G; Wang, Lili; Williams, Charles H; Chun, Young Wook; Hempel, Jonathan E; Bloodworth, Nathaniel; Merryman, W David; Lim, Chee Chew; Wu, Joseph C; Knollmann, Björn C; Hong, Charles C

    2015-12-04

    The lack of measurable single-cell contractility of human-induced pluripotent stem cell-derived cardiac myocytes (hiPSC-CMs) currently limits the utility of hiPSC-CMs for evaluating contractile performance for both basic research and drug discovery. We sought to develop a culture method that rapidly generates contracting single hiPSC-CMs and allows quantification of cell shortening with standard equipment used for studying adult CMs. Single hiPSC-CMs were cultured for 5 to 7 days on a 0.4- to 0.8-mm thick mattress of undiluted Matrigel (mattress hiPSC-CMs) and compared with hiPSC-CMs maintained on a control substrate (<0.1-mm thick 1:60 diluted Matrigel, control hiPSC-CMs). Compared with control hiPSC-CMs, mattress hiPSC-CMs had more rod-shaped morphology and significantly increased sarcomere length. Contractile parameters of mattress hiPSC-CMs measured with video-based edge detection were comparable with those of freshly isolated adult rabbit ventricular CMs. Morphological and contractile properties of mattress hiPSC-CMs were consistent across cryopreserved hiPSC-CMs generated independently at another institution. Unlike control hiPSC-CMs, mattress hiPSC-CMs display robust contractile responses to positive inotropic agents, such as myofilament calcium sensitizers. Mattress hiPSC-CMs exhibit molecular changes that include increased expression of the maturation marker cardiac troponin I and significantly increased action potential upstroke velocity because of a 2-fold increase in sodium current (INa). The Matrigel mattress method enables the rapid generation of robustly contracting hiPSC-CMs and enhances maturation. This new method allows quantification of contractile performance at the single-cell level, which should be valuable for disease modeling, drug discovery, and preclinical cardiotoxicity testing. © 2015 American Heart Association, Inc.

  6. High-Performance Computing for the Electromagnetic Modeling and Simulation of Interconnects

    NASA Technical Reports Server (NTRS)

    Schutt-Aine, Jose E.

    1996-01-01

    The electromagnetic modeling of packages and interconnects plays a very important role in the design of high-speed digital circuits, and is most efficiently performed by using computer-aided design algorithms. In recent years, packaging has become a critical area in the design of high-speed communication systems and fast computers, and the importance of the software support for their development has increased accordingly. Throughout this project, our efforts have focused on the development of modeling and simulation techniques and algorithms that permit the fast computation of the electrical parameters of interconnects and the efficient simulation of their electrical performance.

  7. 77 FR 31618 - Medicaid Program; Announcement of Requirements and Registration for CMS Provider Screening...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-29

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare & Medicaid Services (CMS) [CMS-2382-N... Challenge AGENCY: Centers for Medicare & Medicaid Services (CMS), HHS. ACTION: Notice. SUMMARY: The Centers for Medicare & Medicaid Services (CMS), is announcing the launch of the ``CMS Provider Screening...

  8. Application of the Linux cluster for exhaustive window haplotype analysis using the FBAT and Unphased programs.

    PubMed

    Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun

    2008-05-28

    Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and subsequently require high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing, commonly used statistical packages for genetic studies are non-parallel. Alternatively, one may use grid computing technology to run non-parallel genetic statistical packages on a centralized HPC system or on distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Analysis of both consecutive and combinational window haplotypes was conducted with the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute nodes, FBAT jobs ran about 14.4-15.9 times faster, and Unphased jobs 1.1-18.6 times faster, than the accumulated computation duration. Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance.
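
    A sketch of how such an exhaustive window scan might be farmed out to a Grid Engine queue, one job per haplotype window; the wrapper script name and the cap on window width are assumptions, not the authors' actual scripts.

      import subprocess

      LOCI = 26        # loci in the dataset described above
      MAX_WIDTH = 5    # assumed cap on window width

      windows = [(start, start + width - 1)
                 for width in range(1, MAX_WIDTH + 1)
                 for start in range(1, LOCI - width + 2)]

      for start, stop in windows:
          # -b y: treat the command as a binary; -cwd: run in the current directory
          subprocess.run(["qsub", "-b", "y", "-cwd",
                          "./run_fbat.sh", str(start), str(stop)],  # hypothetical wrapper
                         check=False)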

  9. Application of the Linux cluster for exhaustive window haplotype analysis using the FBAT and Unphased programs

    PubMed Central

    Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun

    2008-01-01

    Background Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and subsequently require high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing, commonly used statistical packages for genetic studies are non-parallel. Alternatively, one may use grid computing technology to run non-parallel genetic statistical packages on a centralized HPC system or on distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Results Analysis of both consecutive and combinational window haplotypes was conducted with the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute nodes, FBAT jobs ran about 14.4–15.9 times faster, and Unphased jobs 1.1–18.6 times faster, than the accumulated computation duration. Conclusion Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance. PMID:18541045

  10. Diagnostic Testing Package DX v 2.0 Technical Specification. Methodology Project.

    ERIC Educational Resources Information Center

    McArthur, David

    This paper contains the technical specifications, schematic diagrams, and program printout for a computer software package for the development and administration of diagnostic tests. The second version of the Diagnostic Testing Package DX consists of a PASCAL-based set of modules located in two main programs: (1) EDITTEST creates, modifies, and…

  11. Scoria: a Python module for manipulating 3D molecular data.

    PubMed

    Ropp, Patrick; Friedman, Aaron; Durrant, Jacob D

    2017-09-18

    Third-party packages have transformed the Python programming language into a powerful computational-biology tool. Package installation is easy for experienced users, but novices sometimes struggle with dependencies and compilers. This presents a barrier that can hinder the otherwise broad adoption of new tools. We present Scoria, a Python package for manipulating three-dimensional molecular data. Unlike similar packages, Scoria requires no dependencies, compilation, or system-wide installation. Users can incorporate the Scoria source code directly into their own programs. But Scoria is not designed to compete with other similar packages. Rather, it complements them. Our package leverages others (e.g. NumPy, SciPy), if present, to speed and extend its own functionality. To show its utility, we use Scoria to analyze a molecular dynamics trajectory. Our FootPrint script colors the atoms of one chain by the frequency of their contacts with a second chain. We are hopeful that Scoria will be a useful tool for the computational-biology community. A copy is available for download free of charge (Apache License 2.0) at http://durrantlab.com/scoria/ .
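
    The FootPrint analysis reduces to counting, frame by frame, which atoms of one chain come within a cutoff distance of the other chain. The sketch below shows that computation in plain NumPy with synthetic coordinates; it illustrates the idea and is not Scoria's actual API.

      import numpy as np

      rng = np.random.default_rng(1)
      # (frame, chain, atom, xyz): 100 frames, 2 chains of 50 atoms each (synthetic)
      frames = rng.uniform(0.0, 30.0, size=(100, 2, 50, 3))
      CUTOFF = 4.0   # assumed contact distance (Angstroms)

      contact_freq = np.zeros(50)
      for a, b in frames:                               # chain A, chain B coordinates
          d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
          contact_freq += d.min(axis=1) < CUTOFF        # any B atom within the cutoff
      contact_freq /= len(frames)
      print(contact_freq.round(2))                      # per-atom values to color by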

  12. cisPath: an R/Bioconductor package for cloud users for visualization and management of functional protein interaction networks.

    PubMed

    Wang, Likun; Yang, Luhe; Peng, Zuohan; Lu, Dan; Jin, Yan; McNutt, Michael; Yin, Yuxin

    2015-01-01

    With the burgeoning development of cloud technology and services, an increasing number of users prefer the cloud to run their applications. All software and associated data are hosted on the cloud, allowing users to access them via a web browser from any computer, anywhere. This paper presents cisPath, an R/Bioconductor package deployed on cloud servers for client users to visualize, manage, and share functional protein interaction networks. With this R package, users can easily integrate downloaded protein-protein interaction information from different online databases with private data to construct new and personalized interaction networks. Additional functions allow users to generate specific networks based on private databases. Since the results produced with this package take the form of web pages, cloud users can easily view and edit the network graphs via the browser, using a mouse or touch screen, without the need to download them to a local computer. This package can also be installed and run on a local desktop computer. Depending on user preference, results can be publicized or shared by uploading to a web server or cloud drive, allowing other users to directly access results via a web browser. This package can be installed and run on a variety of platforms. Since all network views are shown in web pages, such a package is particularly useful for cloud users. The easy installation and operation is an attractive quality for R beginners and users with no previous experience with cloud services.

  13. cisPath: an R/Bioconductor package for cloud users for visualization and management of functional protein interaction networks

    PubMed Central

    2015-01-01

    Background With the burgeoning development of cloud technology and services, an increasing number of users prefer the cloud to run their applications. All software and associated data are hosted on the cloud, allowing users to access them via a web browser from any computer, anywhere. This paper presents cisPath, an R/Bioconductor package deployed on cloud servers for client users to visualize, manage, and share functional protein interaction networks. Results With this R package, users can easily integrate downloaded protein-protein interaction information from different online databases with private data to construct new and personalized interaction networks. Additional functions allow users to generate specific networks based on private databases. Since the results produced with this package take the form of web pages, cloud users can easily view and edit the network graphs via the browser, using a mouse or touch screen, without the need to download them to a local computer. This package can also be installed and run on a local desktop computer. Depending on user preference, results can be publicized or shared by uploading to a web server or cloud drive, allowing other users to directly access results via a web browser. Conclusions This package can be installed and run on a variety of platforms. Since all network views are shown in web pages, such a package is particularly useful for cloud users. The easy installation and operation is an attractive quality for R beginners and users with no previous experience with cloud services. PMID:25708840

  14. Efficient analysis of large-scale genome-wide data with two R packages: bigstatsr and bigsnpr.

    PubMed

    Privé, Florian; Aschard, Hugues; Ziyatdinov, Andrey; Blum, Michael G B

    2017-03-30

    Genome-wide datasets produced for association studies have dramatically increased in size over the past few years, with modern datasets commonly including millions of variants measured in tens of thousands of individuals. This increase in data size is a major challenge severely slowing down genomic analyses, leading to some software becoming obsolete and researchers having limited access to diverse analysis tools. Here we present two R packages, bigstatsr and bigsnpr, allowing the analysis of large-scale genomic data to be performed within R. To address large data size, the packages use memory-mapping for accessing data matrices stored on disk instead of in RAM. To perform data pre-processing and data analysis, the packages integrate most of the tools that are commonly used, either through transparent system calls to existing software, or through updated or improved implementations of existing methods. In particular, the packages implement fast and accurate computations of principal component analysis and association studies, functions to remove SNPs in linkage disequilibrium, and algorithms to learn polygenic risk scores on millions of SNPs. We illustrate applications of the two R packages by analyzing a case-control genomic dataset for celiac disease, performing an association study and computing polygenic risk scores. Finally, we demonstrate the scalability of the R packages by analyzing a simulated genome-wide dataset including 500,000 individuals and 1 million markers on a single desktop computer. https://privefl.github.io/bigstatsr/ & https://privefl.github.io/bigsnpr/. florian.prive@univ-grenoble-alpes.fr & michael.blum@univ-grenoble-alpes.fr. Supplementary materials are available at Bioinformatics online.
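
    The core trick, memory-mapping the matrix so that only the block currently in use occupies RAM, can be illustrated in a few lines. The sketch uses Python's numpy.memmap rather than the packages' R backing files, with sizes scaled down for illustration.

      import numpy as np

      n_ind, n_snp = 1_000, 5_000    # scaled-down stand-ins for the sizes cited above
      G = np.memmap("genotypes.dat", dtype=np.int8, mode="w+",
                    shape=(n_ind, n_snp))       # on-disk matrix of 0/1/2 genotype calls

      block = 500
      for j in range(0, n_snp, block):          # RAM use is bounded by the block size
          cols = np.asarray(G[:, j:j + block], dtype=float)
          freqs = cols.mean(axis=0) / 2.0       # allele frequencies for this block
      G.flush()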

  15. Tissue and Animal Models of Sudden Cardiac Death

    PubMed Central

    Sallam, Karim; Li, Yingxin; Sager, Philip T.; Houser, Steven R.; Wu, Joseph C.

    2015-01-01

    Sudden Cardiac Death (SCD) is a common cause of death in patients with structural heart disease, genetic mutations or acquired disorders affecting cardiac ion channels. A wide range of platforms exist to model and study disorders associated with SCD. Human clinical studies are cumbersome and constrained by the limited extent of investigation that can be performed on human subjects. Animal models are limited by their degree of homology to human cardiac electrophysiology, including ion channel expression. The most commonly used cellular models are cellular transfection models, which can mimic the expression of a single ion channel but offer incomplete insight into changes of the action potential profile. Induced pluripotent stem cell derived cardiomyocytes (iPSC-CMs) resemble, but are not identical to, adult human cardiomyocytes, and provide a new platform for studying arrhythmic disorders leading to SCD. A variety of platforms exist to phenotype cellular models, including conventional and automated patch clamp, multi-electrode array, and computational modeling. iPSC-CMs have been used to study Long QT syndrome, catecholaminergic polymorphic ventricular tachycardia, hypertrophic cardiomyopathy and other hereditary cardiac disorders. Although iPSC-CMs are distinct from adult cardiomyocytes, they provide a robust platform to advance the science and clinical care of SCD. PMID:26044252

  16. The Effectiveness of a Computer-Assisted Instruction Package in Supplementing Teaching of Selected Concepts in High School Chemistry: Writing Formulas and Balancing Chemical Equations.

    ERIC Educational Resources Information Center

    Wainwright, Camille L.

    Four classes of high school chemistry students (N=108) were randomly assigned to experimental and control groups to investigate the effectiveness of a computer assisted instruction (CAI) package during a unit on writing/naming of chemical formulas and balancing equations. Students in the experimental group received drill, review, and reinforcement…

  17. 42 CFR 484.215 - Initial establishment of the calculation of the national 60-day episode payment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... costs for the period. CMS determines the national mean cost per visit. (b) Determining HHA utilization... mean utilization for each of the six disciplines using home health claims data. (c) Use of the market... computing the national mean utilization for each discipline. (3) By multiplying the mean national cost per...

  18. Program Helps Generate Boundary-Element Mathematical Models

    NASA Technical Reports Server (NTRS)

    Goldberg, R. K.

    1995-01-01

    The Composite Model Generation-Boundary Element Method (COM-GEN-BEM) computer program significantly reduces the time and effort needed to construct boundary-element mathematical models of continuous-fiber composite materials at the micro-mechanical (constituent) scale. It generates boundary-element models compatible with the BEST-CMS boundary-element code for analysis of the micromechanics of composite materials. Written in PATRAN Command Language (PCL).

  19. Computers in medical education 1: evaluation of a problem-orientated learning package.

    PubMed

    Devitt, P; Palmer, E

    1998-04-01

    A computer-based learning package has been developed, aimed at expanding students' knowledge base as well as improving data-handling abilities and clinical problem-solving skills. The program was evaluated by monitoring its use by students, canvassing users' opinions and measuring its effectiveness as a learning tool compared to tutorials on the same material. Evaluation was undertaken using three methods: first, a questionnaire on computers as a learning tool and the applicability of the content; second, monitoring by the computer of student use, decisions and performance; and third, pre- and post-test assessment of fifth-year students who either used the computer package or attended a tutorial on equivalent material. Most students provided positive comments on the learning material and expressed a willingness to see computer-aided learning (CAL) introduced into the curriculum. Over a 3-month period, 26 modules in the program were used on 1246 occasions. Objective measurement showed a significant gain in knowledge, data handling and problem-solving skills. Computer-aided learning is a valuable learning resource that deserves better attention in medical education. When used appropriately, the computer can be an effective learning resource, not only for the delivery of knowledge, but also to help students develop their problem-solving skills.

  20. PICSiP: new system-in-package technology using a high bandwidth photonic interconnection layer for converged microsystems

    NASA Astrophysics Data System (ADS)

    Tekin, Tolga; Töpper, Michael; Reichl, Herbert

    2009-05-01

    Technological frontiers between semiconductor technology, packaging, and system design are disappearing. Scaling down geometries [1] alone no longer delivers improved performance, lower power, smaller size, and lower cost. Achieving these will require "More than Moore" [2] through the tighter integration of system-level components at the package level. System-in-Package (SiP) will deliver the efficient use of three dimensions (3D) through innovation in packaging and interconnect technology. A key bottleneck to the implementation of high-performance microelectronic systems, including SiP, is the lack of low-latency, high-bandwidth, and high-density off-chip interconnects. The challenges in achieving high-bandwidth chip-to-chip communication using electrical interconnects include high losses in the substrate dielectric, reflections and impedance discontinuities, and susceptibility to crosstalk [3]. The use of photonics to overcome these challenges and provide low-latency, high-bandwidth communication will enable the vision of optical computing within next-generation architectures. Today's supercomputers offer sustained performance of more than a petaflop, which can be increased by utilizing optical interconnects. Next-generation computing architectures are needed that combine ultra-low power consumption and ultra-high performance with novel interconnection technologies. In this paper we discuss a CMOS-compatible underlying technology to enable next-generation optical computing architectures. By introducing a new optical layer within the 3D SiP, the development of converged microsystems and their deployment in next-generation optical computing architectures will be enabled.

  1. Circulation patterns in the deep Subtropical Northeast Atlantic with ARGO data

    NASA Astrophysics Data System (ADS)

    Calheiros, Tomas; Bashmachnikov, Igor

    2014-05-01

    In this work we study the dominant circulation patterns in the Subtropical Northeast Atlantic [25-45°N, 5-35°W] using ARGO data. The data were obtained from the Coriolis operational data center (ftp://ftp.ifremer.fr) for the years 1999-2013. During this period, 376 floats were available in the study area, providing 15,062 float-months of total observation time. The floats were deployed at depths between 300 and 2000 m, but most were concentrated at 1000 m (2000 float-months) and 1500 m (3400 float-months). There were also about 1000 float-months in the upper 400-m layer, but their number and distribution did not allow analysis of the mean currents over the study region. For each float, the Lagrangian current velocity was computed as the displacement between the position where the float started sinking to its reference depth and its subsequent surfacing position, divided by the corresponding time interval. This reduced the noise related to sea-surface drift of the floats during data-transmission periods. The mean Eulerian velocity and its error were computed in each 2°x2° square. Whenever a 2°x2° square contained more than 150 Lagrangian velocity observations, it was split into four smaller 1°x1° squares, in each of which the mean Eulerian velocities and their errors were estimated. Eulerian currents at both 1000 m and 1500 m depth formed an overall anticyclonic circulation pattern in the study region. The modal velocity of all floats at the 1000 m level was 4 cm/s with an error of the mean of 1.8 cm/s. The modal velocity of all floats at 1500 m was 3 cm/s with an error of the mean of 1.4 cm/s. The southwestward flow near Madeira Island and the further westward flow along the 25-30°N zonal band at 1500 m depth corresponded well to the extension of the deep fraction of the Mediterranean Water salt tongue.
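
    A minimal sketch of the Lagrangian velocity estimate described above: the displacement between the dive position and the next surfacing position, divided by the elapsed time. The positions and times in the example are assumed, and a spherical-Earth approximation is used.

      import numpy as np

      R_EARTH = 6.371e6   # mean Earth radius (m)

      def lagrangian_velocity(lon0, lat0, t0, lon1, lat1, t1):
          """Zonal/meridional velocity (cm/s) from two positions (deg) and times (s)."""
          lat_mid = np.radians((lat0 + lat1) / 2)
          dx = np.radians(lon1 - lon0) * R_EARTH * np.cos(lat_mid)  # eastward displacement
          dy = np.radians(lat1 - lat0) * R_EARTH                    # northward displacement
          dt = t1 - t0
          return 100 * dx / dt, 100 * dy / dt

      # e.g. a float displaced 0.05 deg westward over a 10-day cycle at 30 N
      u, v = lagrangian_velocity(-20.0, 30.0, 0.0, -20.05, 30.0, 10 * 86400)
      print(f"u = {u:.2f} cm/s, v = {v:.2f} cm/s")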

  2. Computational modelling of a thermoforming process for thermoplastic starch

    NASA Astrophysics Data System (ADS)

    Szegda, D.; Song, J.; Warby, M. K.; Whiteman, J. R.

    2007-05-01

    Plastic packaging waste currently forms a significant part of municipal solid waste and as such is causing increasing environmental concerns. Such packaging is largely non-biodegradable and is particularly difficult to recycle or to reuse due to its complex composition. Apart from limited recycling of some easily identifiable packaging wastes, such as bottles, most packaging waste ends up in landfill sites. In recent years, in an attempt to address this problem in the case of plastic packaging, the development of packaging materials from renewable plant resources has received increasing attention and a wide range of bioplastic materials based on starch are now available. Environmentally these bioplastic materials also reduce reliance on oil resources and have the advantage that they are biodegradable and can be composted upon disposal to reduce the environmental impact. Many food packaging containers are produced by thermoforming processes in which thin sheets are inflated under pressure into moulds to produce the required thin wall structures. Hitherto these thin sheets have almost exclusively been made of oil-based polymers and it is for these that computational models of thermoforming processes have been developed. Recently, in the context of bioplastics, commercial thermoplastic starch sheet materials have been developed. The behaviour of such materials is influenced both by temperature and, because of the inherent hydrophilic characteristics of the materials, by moisture content. Both of these aspects affect the behaviour of bioplastic sheets during the thermoforming process. This paper describes experimental work and work on the computational modelling of thermoforming processes for thermoplastic starch sheets in an attempt to address the combined effects of temperature and moisture content. After a discussion of the background of packaging and biomaterials, a mathematical model for the deformation of a membrane into a mould is presented, together with its finite element discretisation. This model depends on material parameters of the thermoplastic and details of tests undertaken to determine these and the results produced are given. Finally the computational model is applied for a thin sheet of commercially available thermoplastic starch material which is thermoformed into a specific mould. Numerical results of thickness and shape for this problem are given.

  3. A multimedia adult literacy program: Combining NASA technology, instructional design theory, and authentic literacy concepts

    NASA Technical Reports Server (NTRS)

    Willis, Jerry W.

    1993-01-01

    For a number of years, the Software Technology Branch of the Information Systems Directorate has been involved in the application of cutting edge hardware and software technologies to instructional tasks related to NASA projects. The branch has developed intelligent computer aided training shells, instructional applications of virtual reality and multimedia, and computer-based instructional packages that use fuzzy logic for both instructional and diagnostic decision making. One outcome of the work on space-related technology-supported instruction has been the creation of a significant pool of human talent in the branch with current expertise on the cutting edges of instructional technologies. When the human talent is combined with advanced technologies for graphics, sound, video, CD-ROM, and high speed computing, the result is a powerful research and development group that both contributes to the applied foundations of instructional technology and creates effective instructional packages that take advantage of a range of advanced technologies. Several branch projects are currently underway that apply NASA-developed expertise to significant instructional problems in public education. The branch, for example, has developed intelligent computer aided software to help high school students learn physics, and staff are currently working on a project to produce educational software for young children with language deficits. This report deals with another project, the adult literacy tutor. Unfortunately, while there are a number of computer-based instructional packages available for adult literacy instruction, most of them are based on the same instructional models that failed these students when they were in school. The teacher-centered, discrete skill-and-drill instructional strategies that form the foundation of most computer-based literacy packages currently on the market, even when supported by color computer graphics and animation, may not be the most effective or most desirable way to use computer technology in literacy programs. This project is developing a series of instructional packages that are based on a different instructional model - authentic instruction. The instructional development model used to create these packages is also different. Instead of using the traditional five-stage linear, sequential model based on behavioral learning theory, the project uses the recursive, reflective design and development model (R2D2) that is based on cognitive learning theory, particularly the social constructivism of Vygotsky, and an epistemology based on critical theory. Using alternative instructional and instructional development theories, the result of the summer faculty fellowship is LiteraCity, a multimedia adult literacy instructional package that is a simulation of finding and applying for a job. The program, which is about 120 megabytes, is distributed on CD-ROM.

  4. Increasing Student Retention Through Application of Attitude Change Packages (and) Increasing GPA and Student Retention of Low Income Minority Community College Students Through Application of Nightengale Conant Change Packages; A Pilot STUDY.

    ERIC Educational Resources Information Center

    Preising, Paul P.; Frost, Robert

    The first of two studies reported was conducted to determine whether unemployed aerospace engineers who received computer science training as well as the Nightengale-Conant attitude change packages would have a significantly higher course completion rate than control classes who were given the same training without the attitude change packages.…

  5. FREQ: A computational package for multivariable system loop-shaping procedures

    NASA Technical Reports Server (NTRS)

    Giesy, Daniel P.; Armstrong, Ernest S.

    1989-01-01

    Many approaches in the field of linear, multivariable, time-invariant systems analysis and controller synthesis employ loop-shaping procedures wherein design parameters are chosen to shape frequency-response singular value plots of selected transfer matrices. A software package, FREQ, is documented for computing within one unified framework many of the most used multivariable transfer matrices for both continuous and discrete systems. The matrices are evaluated at user-selected frequency values, and singular values are plotted against frequency. Example computations are presented to demonstrate the use of the FREQ code.
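
    The loop-shaping computation that FREQ automates can be sketched in a few lines: evaluate the transfer matrix G(jw) = C(jwI - A)^(-1)B + D on a frequency grid and record its singular values. The state-space matrices below are assumed for illustration.

      import numpy as np

      # Assumed 2x2 state-space model (A, B, C, D) for illustration
      A = np.array([[-1.0, 0.5], [0.0, -2.0]])
      B = np.eye(2)
      C = np.eye(2)
      D = np.zeros((2, 2))

      omegas = np.logspace(-2, 2, 200)             # frequency grid (rad/s)
      sv = np.empty((omegas.size, 2))
      for k, w in enumerate(omegas):
          G = C @ np.linalg.solve(1j * w * np.eye(2) - A, B) + D
          sv[k] = np.linalg.svd(G, compute_uv=False)
      # sv[:, 0] and sv[:, -1] give the max/min singular value curves to plot
      print(sv[0], sv[-1])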

  6. The development of an engineering computer graphics laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, D. C.; Garrett, R. E.

    1975-01-01

    Hardware and software systems developed to further research and education in interactive computer graphics were described, as well as several of the ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN 4 subroutine package, in conjunction with a PDP 11/40 minicomputer as the primary computation processor and the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.

  7. Motivation, values, and work design as drivers of participation in the R open source project for statistical computing

    PubMed Central

    Mair, Patrick; Hofmann, Eva; Gruber, Kathrin; Hatzinger, Reinhold; Zeileis, Achim; Hornik, Kurt

    2015-01-01

    One of the cornerstones of the R system for statistical computing is the multitude of packages contributed by numerous package authors. This amount of packages makes an extremely broad range of statistical techniques and other quantitative methods freely available. Thus far, no empirical study has investigated psychological factors that drive authors to participate in the R project. This article presents a study of R package authors, collecting data on different types of participation (number of packages, participation in mailing lists, participation in conferences), three psychological scales (types of motivation, psychological values, and work design characteristics), and various socio-demographic factors. The data are analyzed using item response models and subsequent generalized linear models, showing that the most important determinants for participation are a hybrid form of motivation and the social characteristics of the work design. Other factors are found to have less impact or influence only specific aspects of participation. PMID:26554005

  8. Motivation, values, and work design as drivers of participation in the R open source project for statistical computing.

    PubMed

    Mair, Patrick; Hofmann, Eva; Gruber, Kathrin; Hatzinger, Reinhold; Zeileis, Achim; Hornik, Kurt

    2015-12-01

    One of the cornerstones of the R system for statistical computing is the multitude of packages contributed by numerous package authors. This amount of packages makes an extremely broad range of statistical techniques and other quantitative methods freely available. Thus far, no empirical study has investigated psychological factors that drive authors to participate in the R project. This article presents a study of R package authors, collecting data on different types of participation (number of packages, participation in mailing lists, participation in conferences), three psychological scales (types of motivation, psychological values, and work design characteristics), and various socio-demographic factors. The data are analyzed using item response models and subsequent generalized linear models, showing that the most important determinants for participation are a hybrid form of motivation and the social characteristics of the work design. Other factors are found to have less impact or influence only specific aspects of participation.

  9. X based interactive computer graphics applications for aerodynamic design and education

    NASA Technical Reports Server (NTRS)

    Benson, Thomas J.; Higgs, C. Fred, III

    1995-01-01

    Six computer application packages have been developed to solve a variety of aerodynamic problems in an interactive environment on a single workstation. The packages perform classical one-dimensional analysis under the control of a graphical user interface and can be used for preliminary design or educational purposes. The programs were originally developed on a Silicon Graphics workstation and used the GL version of the FORMS library as the graphical user interface. These programs have recently been converted to the XFORMS library of X-based graphics widgets and have been tested on SGI, IBM, Sun, HP and PC-Linux computers. The paper shows results from the new VU-DUCT program as a prime example. VU-DUCT has been developed as an educational package for the study of subsonic open- and closed-loop wind tunnels.

  10. BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.

    PubMed

    Huang, Hailiang; Tata, Sandeep; Prill, Robert J

    2013-01-01

    Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity, outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation, and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
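
    The permutation-based empirical p-value that BlueSNP distributes via MapReduce is shown serially below for a single SNP; the genotype and phenotype vectors are synthetic, and absolute correlation is a stand-in for the package's test statistics.

      import numpy as np

      rng = np.random.default_rng(2)
      genotype = rng.integers(0, 3, size=1_000)            # 0/1/2 allele counts
      phenotype = 0.1 * genotype + rng.standard_normal(1_000)

      def score(g, y):
          return abs(np.corrcoef(g, y)[0, 1])              # stand-in test statistic

      observed = score(genotype, phenotype)
      n_perm = 10_000
      perm = np.array([score(genotype, rng.permutation(phenotype))
                       for _ in range(n_perm)])
      # add-one correction keeps the estimate strictly positive
      p_empirical = (1 + np.sum(perm >= observed)) / (n_perm + 1)
      print(f"empirical p = {p_empirical:.4g}")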

  11. Community-driven computational biology with Debian Linux.

    PubMed

    Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles

    2010-12-21

    The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.

  12. MGtoolkit: A python package for implementing metagraphs

    NASA Astrophysics Data System (ADS)

    Ranathunga, D.; Nguyen, H.; Roughan, M.

    In this paper we present MGtoolkit: an open-source Python package for implementing metagraphs - a first of its kind. Metagraphs are commonly used to specify and analyse business and computer-network policies alike. MGtoolkit can help verify such policies and promotes learning and experimentation with metagraphs. The package currently provides purely textual output for visualising metagraphs and their analysis results.
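
    A minimal sketch of the metagraph idea that MGtoolkit implements: unlike an ordinary graph, each edge connects a set of elements to a set of elements, and reachability becomes a fixed-point closure over satisfied edges. The policy elements below are assumed examples, and this is not MGtoolkit's actual API.

      from dataclasses import dataclass

      @dataclass(frozen=True)
      class Edge:
          invertex: frozenset    # source set of elements
          outvertex: frozenset   # target set of elements

      # Toy network-policy metagraph (elements are assumed examples)
      edges = [
          Edge(frozenset({"user", "tcp_80"}), frozenset({"web_server"})),
          Edge(frozenset({"web_server"}), frozenset({"database"})),
      ]

      def reachable(start, edges):
          """Fixed-point closure: follow every edge whose invertex is satisfied."""
          known = set(start)
          changed = True
          while changed:
              changed = False
              for e in edges:
                  if e.invertex <= known and not e.outvertex <= known:
                      known |= e.outvertex
                      changed = True
          return known

      print(reachable({"user", "tcp_80"}, edges))  # web_server and database become reachable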

  13. General-Purpose Ada Software Packages

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.

    1991-01-01

    Collection of subprograms brings to Ada many features from other programming languages. All generic packages designed to be easily instantiated for types declared in user's facility. Most packages have widespread applicability, although some oriented for avionics applications. All designed to facilitate writing new software in Ada. Written on IBM/AT personal computer running under PC DOS, v.3.1.

  14. CMIP: a software package capable of reconstructing genome-wide regulatory networks using gene expression data.

    PubMed

    Zheng, Guangyong; Xu, Yaochen; Zhang, Xiujun; Liu, Zhi-Ping; Wang, Zhuo; Chen, Luonan; Zhu, Xin-Guang

    2016-12-23

    A gene regulatory network (GRN) represents the interactions of genes inside a cell or tissue, in which vertexes and edges stand for genes and their regulatory interactions respectively. Reconstruction of gene regulatory networks, in particular genome-scale networks, is essential for comparative exploration of different species and mechanistic investigation of biological processes. Currently, most network inference methods are computationally intensive; they are usually effective for small-scale tasks (e.g., networks with a few hundred genes) but have difficulty constructing GRNs at genome scale. Here, we present a software package for gene regulatory network reconstruction at a genomic level, in which gene interaction is measured by the conditional mutual information measurement using a parallel computing framework (hence the package is named CMIP). The package is a greatly improved implementation of our previous PCA-CMI algorithm. In CMIP, we provide not only an automatic threshold determination method but also an effective parallel computing framework for network inference. Performance tests on benchmark datasets show that the accuracy of CMIP is comparable to most current network inference methods. Moreover, running tests on synthetic datasets demonstrate that CMIP can handle large datasets, especially genome-wide datasets, within an acceptable time period. In addition, successful application on a real genomic dataset confirms the practical applicability of the package. This new software package provides a powerful tool for genomic network reconstruction to the biological community. The software can be accessed at http://www.picb.ac.cn/CMIP/ .
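
    The core quantity here is the conditional mutual information (CMI) between two genes given a set of conditioning genes. Under a Gaussian assumption, as in PCA-CMI-style methods, CMI reduces to a ratio of covariance determinants; the Python sketch below (our own helper names, not CMIP's interface) shows the estimator.

      import numpy as np

      def gaussian_cmi(x, y, z):
          """I(X;Y|Z) under a Gaussian assumption.

          x, y: 1-D sample vectors; z: (n_samples, k) array of conditioning genes.
          Implements 0.5 * log( det(C_xz) det(C_yz) / (det(C_z) det(C_xyz)) ).
          """
          def logdet_cov(*cols):
              c = np.atleast_2d(np.cov(np.column_stack(cols), rowvar=False))
              return np.linalg.slogdet(c)[1]

          if z.size == 0:  # no conditioning: plain Gaussian mutual information
              return 0.5 * (logdet_cov(x) + logdet_cov(y) - logdet_cov(x, y))
          return 0.5 * (logdet_cov(x, z) + logdet_cov(y, z)
                        - logdet_cov(z) - logdet_cov(x, y, z))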

  15. 42 CFR 493.1773 - Standard: Basic inspection requirements for all laboratories issued a CLIA certificate and CLIA...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... issued a certificate of accreditation, must permit CMS or a CMS agent to conduct validation and complaint inspections. (b) General requirements. As part of the inspection process, CMS or a CMS agent may require the... testing process (preanalytic, analytic, and postanalytic). (4) Permit CMS or a CMS agent access to all...

  16. 42 CFR 493.1773 - Standard: Basic inspection requirements for all laboratories issued a CLIA certificate and CLIA...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... issued a certificate of accreditation, must permit CMS or a CMS agent to conduct validation and complaint inspections. (b) General requirements. As part of the inspection process, CMS or a CMS agent may require the... testing process (preanalytic, analytic, and postanalytic). (4) Permit CMS or a CMS agent access to all...

  17. 42 CFR 493.1773 - Standard: Basic inspection requirements for all laboratories issued a CLIA certificate and CLIA...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... issued a certificate of accreditation, must permit CMS or a CMS agent to conduct validation and complaint inspections. (b) General requirements. As part of the inspection process, CMS or a CMS agent may require the... testing process (preanalytic, analytic, and postanalytic). (4) Permit CMS or a CMS agent access to all...

  18. 42 CFR 493.1773 - Standard: Basic inspection requirements for all laboratories issued a CLIA certificate and CLIA...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... issued a certificate of accreditation, must permit CMS or a CMS agent to conduct validation and complaint inspections. (b) General requirements. As part of the inspection process, CMS or a CMS agent may require the... testing process (preanalytic, analytic, and postanalytic). (4) Permit CMS or a CMS agent access to all...

  19. 42 CFR 493.1773 - Standard: Basic inspection requirements for all laboratories issued a CLIA certificate and CLIA...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... issued a certificate of accreditation, must permit CMS or a CMS agent to conduct validation and complaint inspections. (b) General requirements. As part of the inspection process, CMS or a CMS agent may require the... testing process (preanalytic, analytic, and postanalytic). (4) Permit CMS or a CMS agent access to all...

  20. The Amber Biomolecular Simulation Programs

    PubMed Central

    CASE, DAVID A.; CHEATHAM, THOMAS E.; DARDEN, TOM; GOHLKE, HOLGER; LUO, RAY; MERZ, KENNETH M.; ONUFRIEV, ALEXEY; SIMMERLING, CARLOS; WANG, BING; WOODS, ROBERT J.

    2006-01-01

    We describe the development, current features, and some directions for future development of the Amber package of computer programs. This package evolved from a program that was constructed in the late 1970s to do Assisted Model Building with Energy Refinement, and now contains a group of programs embodying a number of powerful tools of modern computational chemistry, focused on molecular dynamics and free energy calculations of proteins, nucleic acids, and carbohydrates. PMID:16200636

  1. [Intranarcotic infusion therapy -- a computer interpretation using the program package SPSS (Statistical Package for the Social Sciences)].

    PubMed

    Link, J; Pachaly, J

    1975-08-01

    In a retrospective 18-month study, the infusion therapy applied at a large anesthesia institute was examined. The anesthesia course data, routinely recorded on magnetic tape, were analysed for this purpose by computer with the statistical program SPSS. The analysis demonstrated that infusion practice varied considerably among the individual anesthetists. Various correlations are discussed.

  2. 1986 Petroleum Software Directory. [800 mini, micro and mainframe computer software packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1985-01-01

    Pennwell's 1986 Petroleum Software Directory is a complete listing of software created specifically for the petroleum industry. Details are provided on over 800 mini, micro and mainframe computer software packages from more than 250 different companies. An accountant can locate programs to automate bookkeeping functions in large oil and gas production firms. A pipeline engineer will find programs designed to calculate line flow and wellbore pressure drop.

  3. Scilab software as an alternative low-cost computing in solving the linear equations problem

    NASA Astrophysics Data System (ADS)

    Agus, Fahrul; Haviluddin

    2017-02-01

    Numerical computation packages are widely used both in teaching and research. These packages may be proprietary (licensed) or open source (non-proprietary). One reason to use such a package is the complexity of mathematical functions (e.g., linear problems); moreover, the number of variables in linear and non-linear functions keeps increasing. The aim of this paper was to reflect on key aspects of method, didactics and creative praxis in the teaching of linear equations in higher education; if implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative low-cost computing environment. In this paper, Scilab activities related to the mathematical models were proposed. In the experiment, four numerical methods were implemented: Gaussian Elimination, Gauss-Jordan, Inverse Matrix, and Lower-Upper (LU) Decomposition. The results of this study show that routines for these numerical methods can be created and explored using Scilab procedures, and that such routines can serve as teaching material for a course.
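
    For comparison, the same textbook approaches take only a few lines in any numerical environment. The sketch below uses Python with NumPy/SciPy, the language used for examples in this document, rather than Scilab itself, where the equivalents would be A\b, inv(A)*b and lu(A); Gauss-Jordan is subsumed here by the generic solver.

      import numpy as np
      from scipy.linalg import lu_factor, lu_solve

      # Example system Ax = b with known solution x = (2, 3, -1).
      A = np.array([[ 2.0,  1.0, -1.0],
                    [-3.0, -1.0,  2.0],
                    [-2.0,  1.0,  2.0]])
      b = np.array([8.0, -11.0, -3.0])

      x_elim = np.linalg.solve(A, b)    # Gaussian elimination (LAPACK)
      x_inv = np.linalg.inv(A) @ b      # inverse-matrix method (for teaching only)
      x_lu = lu_solve(lu_factor(A), b)  # explicit LU decomposition

      assert np.allclose(x_elim, x_inv) and np.allclose(x_elim, x_lu)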

  4. CMS users data management service integration and first experiences with its NoSQL data storage

    NASA Astrophysics Data System (ADS)

    Riahi, H.; Spiga, D.; Boccali, T.; Ciangottini, D.; Cinquilli, M.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Santocchia, A.

    2014-06-01

    The distributed data analysis workflow in CMS assumes that jobs run in a different location to where their results are finally stored. Typically the user outputs must be transferred from one site to another by a dedicated CMS service, AsyncStageOut. This service was originally developed to address the inefficient use of CMS computing resources when analysis job outputs were transferred synchronously from the execution node to the remote site as soon as they were produced. The AsyncStageOut is designed as a thin application relying only on a NoSQL database (CouchDB) for input and data storage. It has progressed from a limited prototype to a highly adaptable service which manages and monitors all stages of user file handling, namely file transfer and publication. The AsyncStageOut is integrated with the Common CMS/Atlas Analysis Framework. It is expected to manage nearly 200k user files per day from close to 1000 individual users per month with minimal delays, while providing real-time monitoring and reports to users and service operators and remaining highly available. The associated data volume represents a new set of challenges in the areas of database scalability and service performance and efficiency. In this paper, we present an overview of the AsyncStageOut model and the integration strategy with the Common Analysis Framework. The motivations for using NoSQL technology are also presented, as well as the data design and the techniques used for efficient indexing and monitoring of the data. We describe the deployment model for high availability and scalability of the service. We also discuss the hardware requirements and the results achieved, as determined by testing with actual data and realistic loads during commissioning and the initial production phase with the Common Analysis Framework.

  5. CMS Analysis and Data Reduction with Apache Spark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutsche, Oliver; Canali, Luca; Cremer, Illia

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies, have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover, these new tools are typically actively developed by large communities, often profiting from industry resources, and under open source licensing. These factors result in a boost for adoption and maturity of the tools and for the communities supporting them, at the same time helping to reduce the cost of ownership for the end-users. In this talk, we are presenting studies of using Apache Spark for end user data analysis. We are studying the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets and the end-analysis up to the publication plot. Studying the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility. The goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We are presenting the progress of this 2-year project with first results of scaling up Spark-based HEP analysis. Studying the second thrust, we are presenting studies on using Apache Spark for a CMS Dark Matter physics search, comparing Spark's feasibility, usability and performance to the ROOT-based analysis.
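
    A reduction step of the first thrust might look like the following PySpark sketch; the input path, column names and cuts are hypothetical placeholders, not the actual CMS Big Data Reduction Facility code.

      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("cms-data-reduction").getOrCreate()

      events = spark.read.parquet("/data/cms/official/")  # hypothetical input dataset
      reduced = (events
                 .filter(F.col("nMuon") >= 2)             # event selection
                 .filter(F.col("MET_pt") > 200.0)         # dark-matter-style MET cut
                 .select("run", "event", "MET_pt", "Muon_pt", "Muon_eta"))

      reduced.write.mode("overwrite").parquet("/data/cms/reduced/")  # analysis ntuple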

  6. Estimating job runtime for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Sfiligoi, I.

    2014-06-01

    The basic premise of pilot systems is to create an overlay scheduling system on top of leased resources. By definition, leases have a limited lifetime, so any job that is scheduled on such resources must finish before the lease is over, or it will be killed and all its computation wasted. In order to schedule jobs to resources effectively, the pilot system thus requires the expected runtime of the users' jobs. Past studies have shown that relying on user-provided estimates is not a valid strategy, so the system should try to make an estimate by itself. This paper provides a study of the historical data obtained from the Compact Muon Solenoid (CMS) experiment's Analysis Operations submission system. Clear patterns are observed, suggesting that predicting an expected job lifetime range is achievable with a high confidence level in this environment.
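
    A minimal history-based estimate of this kind, predicting a runtime range per task from past jobs of the same task, could be sketched as follows (made-up column names, not the CMS Analysis Operations code):

      import pandas as pd

      # One row per finished job: the task it belonged to and its wall-clock runtime.
      history = pd.DataFrame({
          "task": ["t1", "t1", "t1", "t2", "t2", "t2"],
          "runtime_s": [3600, 4100, 3900, 900, 1100, 950],
      })

      # Per-task runtime range from the 5th and 95th percentiles of past jobs.
      bounds = history.groupby("task")["runtime_s"].quantile([0.05, 0.95]).unstack()
      bounds.columns = ["low_s", "high_s"]
      print(bounds)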

  7. Documentation of a computer program to simulate aquifer-system compaction using the modular finite-difference ground-water flow model

    USGS Publications Warehouse

    Leake, S.A.; Prudic, David E.

    1991-01-01

    Removal of ground water by pumping from aquifers may result in compaction of compressible fine-grained beds that are within or adjacent to the aquifers. Compaction of the sediments and resulting land subsidence may be permanent if the head declines result in vertical stresses beyond the previous maximum stress. The process of permanent compaction is not routinely included in simulations of ground-water flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U.S. Geological Survey modular finite-difference ground-water flow model. The new program, the Interbed-Storage Package, is designed to be incorporated into this model. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to change in head. The constant of proportionality is the product of the skeletal component of elastic specific storage and the thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the ground-water flow model by adding an additional term to the right-hand side of the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of simulated head to the previous minimum (preconsolidation) head. Two tests were performed to verify that the package works correctly. The first test compared model-calculated storage and compaction changes to hand-calculated values for a three-dimensional simulation. Model and hand-calculated values were essentially equal. The second test was performed to compare the results of the Interbed-Storage Package with results of the one-dimensional Helm compaction model. This test problem simulated compaction in doubly draining confining beds stressed by head changes in adjacent aquifers. The Interbed-Storage Package and the Helm model computed essentially equal values of compaction. Documentation of the Interbed-Storage Package includes data input instructions, flow charts, narratives, and listings for each of the five modules included in the package. The documentation also includes an appendix describing input instructions and a listing of a computer program for time-variant specified-head boundaries. That package was developed to reduce the amount of data input and output associated with one of the Interbed-Storage Package test problems.
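
    The apportioning logic described above can be rendered schematically in Python (a simplified sketch of the described behavior with hypothetical names; the actual package is Fortran code within the modular model):

      def compaction_increment(h_old, h_new, h_precons, sske, sskv, thickness):
          """Split one time step's compaction into elastic and inelastic parts.

          Head changes above the preconsolidation head cause recoverable
          (elastic) strain only; declines below it cause permanent (inelastic)
          compaction and lower the preconsolidation head. Positive values mean
          compaction, negative values expansion.
          """
          if h_new >= h_precons:
              # Entire change stays in the elastic range.
              elastic = sske * thickness * (h_old - h_new)
              inelastic = 0.0
              h_precons_new = h_precons
          else:
              # Elastic down to the preconsolidation head, inelastic below it.
              elastic = sske * thickness * (max(h_old, h_precons) - h_precons)
              inelastic = sskv * thickness * (h_precons - h_new)
              h_precons_new = h_new  # new previous-minimum head
          return elastic, inelastic, h_precons_new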

  8. Effects of intravenous bolus injection of nicorandil on renal artery flow velocity assessed by color Doppler ultrasound.

    PubMed

    Shimamoto, Yukiko; Kubo, Takashi; Tanabe, Kazumi; Emori, Hiroki; Katayama, Yosuke; Nishiguchi, Tsuyoshi; Taruya, Akira; Kameyama, Takeyoshi; Orii, Makoto; Yamano, Takashi; Kuroi, Akio; Yamaguchi, Tomoyuki; Takemoto, Kazushi; Matsuo, Yoshiki; Ino, Yasushi; Tanaka, Atsushi; Hozumi, Takeshi; Terada, Masaki; Akasaka, Takashi

    2017-01-01

    Previous animal studies have shown that a potassium channel opener, nicorandil, provokes vasodilation in renal microvasculature and increases renal blood flow. We conducted a clinical study that aimed to evaluate the effect of nicorandil on renal artery blood flow in comparison with nitroglycerin by using color Doppler ultrasound. The present study enrolled 40 patients with stable coronary artery disease who had no renal artery stenosis or renal parenchymal disease. The patients received intravenous administration of nicorandil (n=20) or nitroglycerin (n=20). Before and after the administration, renal artery blood flow velocity was measured by color-guided pulsed-wave Doppler. The peak-systolic, end-diastolic, and mean renal artery blood flow velocities before the administration were not different between the nicorandil group and the nitroglycerin group. The peak-systolic (79±15cm/s to 99±21cm/s, p<0.001; and 78±19cm/s to 85±19cm/s, p=0.004), end-diastolic (22±5cm/s to 28±8cm/s, p<0.001; and 24±6cm/s to 26±6cm/s, p=0.005) and mean (41±6cm/s to 49±9cm/s, p<0.001; and 43±9cm/s to 45±9cm/s, p=0.009) renal artery flow velocities increased significantly in both groups. The nominal changes in the peak-systolic (20±10cm/s vs. 7±8cm/s, p<0.001), end-diastolic (5±4cm/s vs. 2±3cm/s, p=0.001), and mean (8±5cm/s vs. 2±2cm/s, p<0.001) renal artery blood flow velocities were significantly greater in the nicorandil group compared with the nitroglycerin group. Intravenous nicorandil increased renal artery blood flow velocity in comparison with nitroglycerin. Nicorandil has a significant effect on renal hemodynamics. Copyright © 2016 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.

  9. Altered iPSC-derived neurons’ sodium channel properties in subjects with Monge's disease

    PubMed Central

    Zhao, Huiwen W.; Gu, Xiang Q.; Chailangkarn, Thanathom; Perkins, Guy; Callacondo, David; Appenzeller, Otto; Poulsen, Orit; Zhou, Dan; Muotri, Alysson R.; Haddad, Gabriel G.

    2015-01-01

    Monge's disease, also known as chronic mountain sickness (CMS), potentially threatens more than 140 million highlanders living for extended periods at high altitude (over 2,500 m). The prevalence of CMS in Andeans is about 15-20%, suggesting that the majority of highlanders (non-CMS) remain healthy at high altitude; CMS subjects, however, experience severe hypoxemia, erythrocytosis and many neurologic manifestations including migraine, headache, mental fatigue, confusion, and memory loss. The underlying mechanisms of CMS neuropathology are not well understood, and no ideal treatment is available to prevent or cure CMS, except for phlebotomy. In the current study, we reprogrammed fibroblast cells from skin biopsies of both CMS and non-CMS subjects into induced pluripotent stem cells (iPSCs), differentiated them into neurons, and compared their neuronal properties. We discovered that CMS neurons were much less excitable (higher rheobase) than non-CMS neurons. This decreased excitability was not caused by differences in passive neuronal properties, but instead by a significantly lowered Na+ channel current density and by a shift of the voltage-conductance curve in the depolarization direction. Our findings provide, for the first time, evidence of a neuronal abnormality in CMS subjects as compared to non-CMS subjects; we hope that such studies can pave the way to a better understanding of the neuropathology in CMS. PMID:25559931

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grant, Robert

    Under this grant, three significant software packages were developed or improved, all with the goal of improving the ease-of-use of HPC libraries. The first component is a Python package, named DistArray (originally named Odin), that provides a high-level interface to distributed array computing. This interface is based on the popular and widely used NumPy package and is integrated with the IPython project for enhanced interactive parallel distributed computing. The second Python package is the Distributed Array Protocol (DAP) that enables separate distributed array libraries to share arrays efficiently without copying or sending messages. If a distributed array library supports the DAP, it is then automatically able to communicate with any other library that also supports the protocol. This protocol allows DistArray to communicate with the Trilinos library via PyTrilinos, which was also enhanced during this project. A third package, PyTrilinos, was extended to support distributed structured arrays (in addition to the unstructured arrays of its original design), allow more flexible distributed arrays (i.e., the restriction to double precision data was lifted), and implement the DAP. DAP support includes both exporting the protocol so that external packages can use distributed Trilinos data structures, and importing the protocol so that PyTrilinos can work with distributed data from external packages.
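
    Schematically, the protocol idea is that the exporting library describes how each process-local block fits into the global array so that a consumer can adopt the buffer without copying. The Python sketch below reflects our reading of that idea; the method and field names are assumptions and may differ from the published protocol.

      import numpy as np

      class LocalBlock:
          """One process's block of a larger, block-distributed global array."""
          def __init__(self, local_array, global_size, start):
              self.local_array = local_array
              self.global_size = global_size
              self.start = start

          def __distarray__(self):
              # Descriptor a consuming library could use for zero-copy sharing.
              return {
                  "buffer": self.local_array,
                  "dim_data": tuple(
                      {"dist_type": "b",  # block distribution
                       "size": g, "start": s, "stop": s + n}
                      for g, s, n in zip(self.global_size, self.start,
                                         self.local_array.shape)),
              }

      block = LocalBlock(np.zeros((50, 100)), global_size=(200, 100), start=(50, 0))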

  11. Experience in Grid Site Testing for ATLAS, CMS and LHCb with HammerCloud

    NASA Astrophysics Data System (ADS)

    Elmsheuser, Johannes; Medrano Llamas, Ramón; Legger, Federica; Sciabà, Andrea; Sciacca, Gianfranco; Úbeda García, Mario; van der Ster, Daniel

    2012-12-01

    Frequent validation and stress testing of the network, storage and CPU resources of a grid site is essential to achieve high performance and reliability. HammerCloud was previously introduced with the goals of enabling VO- and site-administrators to run such tests in an automated or on-demand manner. The ATLAS, CMS and LHCb experiments have all developed VO plugins for the service and have successfully integrated it into their grid operations infrastructures. This work will present the experience in running HammerCloud at full scale for more than 3 years and present solutions to the scalability issues faced by the service. First, we will show the particular challenges faced when integrating with CMS and LHCb offline computing, including customized dashboards to show site validation reports for the VOs and a new API to tightly integrate with the LHCbDIRAC Resource Status System. Next, a study of the automatic site exclusion component used by ATLAS will be presented along with results for tuning the exclusion policies. A study of the historical test results for ATLAS, CMS and LHCb will be presented, including comparisons between the experiments’ grid availabilities and a search for site-based or temporal failure correlations. Finally, we will look to future plans that will allow users to gain new insights into the test results; these include developments to allow increased testing concurrency, increased scale in the number of metrics recorded per test job (up to hundreds), and increased scale in the historical job information (up to many millions of jobs per VO).

  12. Downscaling Global Emissions and Its Implications Derived from Climate Model Experiments

    PubMed Central

    Abe, Manabu; Kinoshita, Tsuguki; Hasegawa, Tomoko; Kawase, Hiroaki; Kushida, Kazuhide; Masui, Toshihiko; Oka, Kazutaka; Shiogama, Hideo; Takahashi, Kiyoshi; Tatebe, Hiroaki; Yoshikawa, Minoru

    2017-01-01

    In climate change research, future scenarios of greenhouse gas and air pollutant emissions generated by integrated assessment models (IAMs) are used in climate models (CMs) and earth system models to analyze future interactions and feedback between human activities and climate. However, the spatial resolutions of IAMs and CMs differ. IAMs usually disaggregate the world into 10–30 aggregated regions, whereas CMs require a grid-based spatial resolution. Therefore, downscaling emissions data from IAMs into a finer scale is necessary to input the emissions into CMs. In this study, we examined whether differences in downscaling methods significantly affect climate variables such as temperature and precipitation. We tested two downscaling methods using the same regionally aggregated sulfur emissions scenario obtained from the Asian-Pacific Integrated Model/Computable General Equilibrium (AIM/CGE) model. The downscaled emissions were fed into the Model for Interdisciplinary Research on Climate (MIROC). One of the methods assumed a strong convergence of national emissions intensity (e.g., emissions per gross domestic product), while the other was based on inertia (i.e., the base-year remained unchanged). The emissions intensities in the downscaled spatial emissions generated from the two methods markedly differed, whereas the emissions densities (emissions per area) were similar. We investigated whether the climate change projections of temperature and precipitation would significantly differ between the two methods by applying a field significance test, and found little evidence of a significant difference between the two methods. Moreover, there was no clear evidence of a difference between the climate simulations based on these two downscaling methods. PMID:28076446
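
    Stylized one-step versions of the two allocation rules are easy to state. In the Python sketch below (our own function and variable names; the real methods operate on gridded inventories and full scenario time paths), "inertia" preserves each country's base-year share, while "convergence" allocates in proportion to GDP so that emissions intensity equalizes within the region.

      import numpy as np

      def downscale(regional_emission, gdp, base_year_emission, method="inertia"):
          """Split one region's emission total across its countries."""
          gdp = np.asarray(gdp, dtype=float)
          base = np.asarray(base_year_emission, dtype=float)
          if method == "inertia":
              shares = base / base.sum()    # keep base-year shares
          elif method == "convergence":
              shares = gdp / gdp.sum()      # common emissions-per-GDP intensity
          else:
              raise ValueError(method)
          return regional_emission * shares

      # Three hypothetical countries in one IAM region:
      downscale(100.0, gdp=[5.0, 3.0, 2.0],
                base_year_emission=[70.0, 20.0, 10.0],
                method="convergence")  # -> array([50., 30., 20.])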

  13. Multi-core processing and scheduling performance in CMS

    NASA Astrophysics Data System (ADS)

    Hernández, J. M.; Evans, D.; Foulkes, S.

    2012-12-01

    Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to utilize the multi-core architecture effectively. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in a much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model of computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs to have control over a larger quantum of resources, since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g. I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues, compared to the standard single-core processing workflows.

  14. The Development of a Distributive Interactive Computing Model in Consumer Economics, Utilizing Jerome S. Bruner's Theory of Instruction.

    ERIC Educational Resources Information Center

    Morrison, James L.

    A computerized delivery system in consumer economics developed at the University of Delaware uses the PLATO system to provide a basis for analyzing consumer behavior in the marketplace. The 16 sequential lessons, part of the Consumer in the Marketplace Series (CMS), demonstrate consumer economic theory in layman's terms and are structured to focus…

  15. The structure and content of telephonic scripts found useful in a Medicaid Chronic Disease Management Program.

    PubMed

    Roth, Alexis M; Ackermann, Ronald T; Downs, Stephen M; Downs, Anne M; Zillich, Alan J; Holmes, Ann M; Katz, Barry P; Murray, Michael D; Inui, Thomas S

    2010-06-01

    In 2003, the Indiana Office of Medicaid Policy and Planning launched the Indiana Chronic Disease Management Program (ICDMP), a programme intended to improve the health and healthcare utilization of 15,000 Aged, Blind and Disabled Medicaid members living with diabetes and/or congestive heart failure in Indiana. Within ICDMP, programme components derived from the Chronic Care Model and education based on an integrated theoretical framework were utilized to create a telephonic care management intervention that was delivered by trained, non-clinical Care Managers (CMs) working under the supervision of a Registered Nurse. CMs utilized computer-assisted health education scripts to address clinically important topics, including medication adherence, diet, exercise and prevention of disease-specific complications. Employing reflective listening techniques, barriers to optimal self-management were assessed and members were encouraged to engage in health-improving actions. ICDMP evaluation results suggest that this low-intensity telephonic intervention shifted utilization and lowered costs. We discuss this patient-centred method for motivating behaviour change, the theoretical constructs underlying the scripts and the branched-logic format that makes them suitable to use as a computer-based application. Our aim is to share these public-domain materials with other programmes.

  16. Fabrication of fibrillized collagen microspheres with the microstructure resembling an extracellular matrix.

    PubMed

    Matsuhashi, Aki; Nam, Kwangwoo; Kimura, Tsuyoshi; Kishida, Akio

    2015-04-14

    Microspheres made from artificial or natural materials have been widely applied in the field of tissue engineering and drug delivery systems. Collagen is widely used for microspheres because of its abundance in the extracellular matrix (ECM) and its good biocompatibility. The purpose of this study is to establish appropriate conditions for preparing collagen microspheres (CMS) and fibrillized collagen microspheres (fCMS) using a water-in-oil (W/O) emulsion. Collagen can be tailored to mimic the native cell environment, possessing a microstructure similar to that of the ECM, by conditioning the aqueous solution. We focused on the preparation of stable, injectable CMS and fCMS that would promote the healing response. By controlling the interfacial properties through the hydrophilic-lipophilic balance (HLB), we obtained CMS and fCMS of various sizes and morphologies. Microspheres prepared with wetting agents showed good microsphere formation, but too low or too high an HLB value caused low yield and an uncontrollable size distribution. The surfactant amount and the rotor speed also affected the formation of the CMS and fCMS: a low surfactant amount and a fast rotor speed produced smaller CMS and fCMS. In the case of fCMS, the presence of NaCl made it possible to prepare stable fCMS without using any cross-linker, owing to fibrillogenesis and gelling of the collagen molecules. The microstructure of the fCMS was similar to that of native tissue, indicating that fCMS would replicate its function in vivo.

  17. Penetration, Completeness, and Representativeness of The Society of Thoracic Surgeons Adult Cardiac Surgery Database.

    PubMed

    Jacobs, Jeffrey P; Shahian, David M; He, Xia; O'Brien, Sean M; Badhwar, Vinay; Cleveland, Joseph C; Furnary, Anthony P; Magee, Mitchell J; Kurlansky, Paul A; Rankin, J Scott; Welke, Karl F; Filardo, Giovanni; Dokholyan, Rachel S; Peterson, Eric D; Brennan, J Matthew; Han, Jane M; McDonald, Donna; Schmitz, DeLaine; Edwards, Fred H; Prager, Richard L; Grover, Frederick L

    2016-01-01

    The Society of Thoracic Surgeons (STS) Adult Cardiac Surgery Database (ACSD) has been successfully linked to the Centers for Medicare & Medicaid Services (CMS) Medicare database, thereby facilitating comparative effectiveness research and providing information about long-term follow-up and cost. The present study uses this link to determine contemporary completeness, penetration, and representativeness of the STS ACSD. Using variables common to both STS and CMS databases, STS operations were linked to CMS data for all CMS coronary artery bypass graft (CABG) surgery hospitalizations discharged between 2000 and 2012, inclusive. For each CMS CABG hospitalization, it was determined whether a matching STS record existed. Center-level penetration (number of CMS sites with at least one matched STS participant divided by the total number of CMS CABG sites) increased from 45% in 2000 to 90% in 2012. In 2012, 973 of 1,081 CMS CABG sites (90%) were linked to an STS site. Patient-level penetration (number of CMS CABG hospitalizations done at STS sites divided by the total number of CMS CABG hospitalizations) increased from 51% in 2000 to 94% in 2012. In 2012, 71,634 of 76,072 CMS CABG hospitalizations (94%) occurred at an STS site. Completeness of case inclusion at STS sites (number of CMS CABG cases at STS sites linked to STS records divided by the total number of CMS CABG cases at STS sites) increased from 88% in 2000 to 98% in 2012. In 2012, 69,213 of 70,932 CMS CABG hospitalizations at STS sites (98%) were linked to an STS record. Linkage of STS and CMS databases demonstrates high and increasing penetration and completeness of the STS database. Linking STS and CMS data facilitates studying long-term outcomes and costs of cardiothoracic surgery. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  18. AlgoRun: a Docker-based packaging system for platform-agnostic implemented algorithms.

    PubMed

    Hosny, Abdelrahman; Vera-Licona, Paola; Laubenbacher, Reinhard; Favre, Thibauld

    2016-08-01

    There is a growing need in bioinformatics for easy-to-use software implementations of algorithms that are usable across platforms. At the same time, reproducibility of computational results is critical and often a challenge due to source code changes over time and dependencies. The approach introduced in this paper addresses both of these needs with AlgoRun, a dedicated packaging system for implemented algorithms, using Docker technology. Implemented algorithms, packaged with AlgoRun, can be executed through a user-friendly interface directly from a web browser or via a standardized RESTful web API to allow easy integration into more complex workflows. The packaged algorithm includes the entire software execution environment, thereby eliminating the common problem of software dependencies and the irreproducibility of computations over time. AlgoRun-packaged algorithms can be published on http://algorun.org, a centralized searchable directory to find existing AlgoRun-packaged algorithms. AlgoRun is available at http://algorun.org and the source code under GPL license is available at https://github.com/algorun laubenbacher@uchc.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
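
    Invoking a packaged algorithm through the web API could then look like the sketch below; the route, port and payload shape are assumptions made for illustration, so consult the AlgoRun documentation for the actual endpoint.

      import requests

      # POST the input data to a locally running AlgoRun container (hypothetical route).
      with open("input.json", "rb") as f:
          resp = requests.post(
              "http://localhost:8765/v1/run",  # assumed container address and route
              data=f.read(),
              headers={"Content-Type": "application/json"},
              timeout=60,
          )
      resp.raise_for_status()
      print(resp.text)  # algorithm output returned by the container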

  19. System and Method for Providing a Climate Data Persistence Service

    NASA Technical Reports Server (NTRS)

    Schnase, John L. (Inventor); Ripley, III, William David (Inventor); Duffy, Daniel Q. (Inventor); Thompson, John H. (Inventor); Strong, Savannah L. (Inventor); McInerney, Mark (Inventor); Sinno, Scott (Inventor); Tamkin, Glenn S. (Inventor); Nadeau, Denis (Inventor)

    2018-01-01

    A system, method and computer-readable storage devices for providing a climate data persistence service. A system configured to provide the service can include a climate data server that performs data and metadata storage and management functions for climate data objects, a compute-storage platform that provides the resources needed to support a climate data server, provisioning software that allows climate data server instances to be deployed as virtual climate data servers in a cloud computing environment, and a service interface, wherein persistence service capabilities are invoked by software applications running on a client device. The climate data objects can be in various formats, such as International Organization for Standards (ISO) Open Archival Information System (OAIS) Reference Model Submission Information Packages, Archive Information Packages, and Dissemination Information Packages. The climate data server can enable scalable, federated storage, management, discovery, and access, and can be tailored for particular use cases.

  20. diffuStats: an R package to compute diffusion-based scores on biological networks.

    PubMed

    Picart-Armada, Sergio; Thompson, Wesley K; Buil, Alfonso; Perera-Lluna, Alexandre

    2018-02-01

    Label propagation and diffusion over biological networks are a common mathematical formalism in computational biology for giving context to molecular entities and prioritizing novel candidates in the area of study. There are several choices in conceiving the diffusion process, involving the graph kernel, the score definitions and the presence of a posterior statistical normalization, which have an impact on the results. This manuscript describes diffuStats, an R package that provides a collection of graph kernels and diffusion scores, as well as a parallel permutation analysis for the normalized scores, which eases the computation of the scores and their benchmarking for an optimal choice. The R package diffuStats is publicly available in Bioconductor, https://bioconductor.org, under the GPL-3 license. sergi.picart@upc.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
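
    The underlying computation can be illustrated generically. The Python sketch below implements one classic diffusion score, normalized label propagation, standing in for the family of kernels and scores the R package offers; it is not the diffuStats API itself.

      import numpy as np

      def diffusion_scores(adjacency, labels, alpha=0.5, n_iter=100):
          """Propagate 0/1 seed labels over a network.

          Iterates f <- alpha * W_norm @ f + (1 - alpha) * labels, the classic
          propagation scheme behind many diffusion-score definitions.
          """
          a = np.asarray(adjacency, dtype=float)
          deg = a.sum(axis=1)
          d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
          w = d_inv_sqrt @ a @ d_inv_sqrt  # symmetric normalization
          y = np.asarray(labels, dtype=float)
          f = y.copy()
          for _ in range(n_iter):
              f = alpha * (w @ f) + (1.0 - alpha) * y
          return f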

  1. Enhancement of the CAVE computer code. [aerodynamic heating package for nose cones and scramjet engine sidewalls

    NASA Technical Reports Server (NTRS)

    Rathjen, K. A.; Burk, H. O.

    1983-01-01

    The computer code CAVE (Conduction Analysis via Eigenvalues) is a convenient and efficient computer code for predicting two-dimensional temperature histories within thermal protection systems for hypersonic vehicles. The capabilities of CAVE were enhanced by incorporating the following features into the code: real-gas effects in the aerodynamic heating predictions; a geometry and aerodynamic heating package for analyses of cone-shaped bodies; an input option to change from laminar to turbulent heating predictions on leading edges; a modification to account for the reduction in adiabatic wall temperature with increasing leading-edge sweep; a geometry package for two-dimensional scramjet engine sidewalls, with an option for heat transfer to external and internal surfaces; print-out modifications to provide tables of selected temperatures for plotting and storage; and modifications to the radiation calculation procedure to eliminate temperature oscillations induced by high heating rates. These new features are described.

  2. Hermetic electronic packaging of an implantable brain-machine-interface with transcutaneous optical data communication.

    PubMed

    Schuettler, Martin; Kohler, Fabian; Ordonez, Juan S; Stieglitz, Thomas

    2012-01-01

    Future brain-computer-interfaces (BCIs) for severely impaired patients are implanted to make electrical contact with the brain tissue. Avoiding percutaneous cables requires that the amplifier and telemetry electronics be implanted too. We developed a hermetic package that protects the electronic circuitry of a BCI from body moisture while permitting infrared communication through the package wall, which is made from alumina ceramic. The ceramic package is cast in medical-grade silicone adhesive, for which we identified MED2-4013 as a promising candidate.

  3. ESRD Managed Care Demonstration: Financial Implications

    PubMed Central

    Dykstra, Dawn M.; Beronja, Nancy; Menges, Joel; Gaylin, Daniel S.; Oppenheimer, Caitlin Carroll; Shapiro, Jennifer R.; Wolfe, Robert A.; Rubin, Robert J.; Held, Philip J.

    2003-01-01

    In 1996, CMS launched the end stage renal disease (ESRD) managed care demonstration to study the experience of offering managed care to ESRD patients. This article analyzes the financial impact of the demonstration, which sought to assess its economic impact on the Federal Government, the sites, and the ESRD Medicare beneficiaries. Medicare's costs for demonstration enrollees were greater than they would have been if these enrollees had remained in the fee-for-service (FFS) system. This loss was driven by the lower than average predicted Medicare spending given the demonstration patients' conditions. The sites experienced losses or only modest gains, primarily because they provided a larger benefit package than traditional Medicare coverage, including no patient obligations and other benefits, especially prescription drugs. Patient financial benefits were approximately $9,000 annually. PMID:14628400

  4. 3D interconnect metrology in CMS/ITRI

    NASA Astrophysics Data System (ADS)

    Ku, Y. S.; Shyu, D. M.; Hsu, W. T.; Chang, P. Y.; Chen, Y. C.; Pang, H. L.

    2011-05-01

    Semiconductor device packaging technology is rapidly advancing, in response to the demand for thinner and smaller electronic devices. Three-dimensional chip/wafer stacking that uses through-silicon vias (TSV) is a key technical focus area, and the continuous development of this novel technology has created a need for non-contact characterization. Many of these challenges are novel to the industry due to the relatively large variety of via sizes and density, and new processes such as wafer thinning and stacked wafer bonding. This paper summarizes the developing metrology that has been used during via-middle & via-last TSV process development at EOL/ITRI. While there is a variety of metrology and inspection applications for 3D interconnect processing, the main topics covered here are via CD/depth measurement, thinned wafer inspection and wafer warpage measurement.

  5. 78 FR 38986 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-28

    ... that information was collected under Part B. The QIMS Account Registration and the ESRD Application..., CMS-1728-94, CMS-10174, CMS-10305 and CMS-10488] Agency Information Collection Activities: Proposed... comment on CMS' intention to collect information from the public. Under the Paperwork Reduction Act of...

  6. 42 CFR 482.74 - Condition of participation: Notification to CMS.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition of participation: Notification to CMS... participation: Notification to CMS. (a) A transplant center must notify CMS immediately of any significant... conditions of participation. Instances in which CMS should receive information for follow up, as appropriate...

  7. Perceptions, use and attitudes of pharmacy customers on complementary medicines and pharmacy practice.

    PubMed

    Braun, Lesley A; Tiralongo, Evelin; Wilkinson, Jenny M; Spitzer, Ondine; Bailey, Michael; Poole, Susan; Dooley, Michael

    2010-07-20

    Complementary medicines (CMs) are popular amongst Australians and community pharmacy is a major supplier of these products. This study explores pharmacy customer use, attitudes and perceptions of complementary medicines, and their expectations of pharmacists as they relate to these products. Pharmacy customers randomly selected from sixty large and small, metropolitan and rural pharmacies in three Australian states completed an anonymous, self administered questionnaire that had been pre-tested and validated. 1,121 customers participated (response rate 62%). 72% had used CMs within the previous 12 months, 61% used prescription medicines daily and 43% had used both concomitantly. Multivitamins, fish oils, vitamin C, glucosamine and probiotics were the five most popular CMs. 72% of people using CMs rated their products as 'very effective' or 'effective enough'. CMs were as frequently used by customers aged 60 years or older as younger customers (69% vs. 72%) although the pattern of use shifted with older age. Most customers (92%) thought pharmacists should provide safety information about CMs, 90% thought they should routinely check for interactions, 87% thought they should recommend effective CMs, 78% thought CMs should be recorded in customer's medication profile and 58% thought pharmacies stocking CMs should also employ a complementary medicine practitioner. Of those using CMs, 93% thought it important for pharmacists to be knowledgeable about CMs and 48% felt their pharmacist provides useful information about CMs. CMs are widely used by pharmacy customers of all ages who want pharmacists to be more involved in providing advice about these products.

  8. Recombination Events Involving the atp9 Gene Are Associated with Male Sterility of CMS PET2 in Sunflower.

    PubMed

    Reddemann, Antje; Horn, Renate

    2018-03-11

    Cytoplasmic male sterility (CMS) systems represent ideal mutants to study the role of mitochondria in pollen development. In sunflower, CMS PET2 also has the potential to become an alternative CMS source for commercial sunflower hybrid breeding. CMS PET2 originates from an interspecific cross of H. petiolaris and H. annuus, as does CMS PET1, but results in a different CMS mechanism. Southern analyses revealed differences for atp6, atp9 and cob between CMS PET2, CMS PET1 and the male-fertile line HA89. A second identical copy of atp6 is present on an additional CMS PET2-specific fragment. In addition, the atp9 gene is duplicated. This duplication was followed by an insertion of 271 bp of unknown origin in the 5' coding region of the atp9 gene in CMS PET2, which created two unique open reading frames, orf288 and orf231. The first 53 bp of orf288 are identical to the 5' end of atp9. Apart from its first 3 bp, which are part of the 271-bp insertion, orf231 consists of the last 228 bp of atp9. These CMS PET2-specific orfs are co-transcribed. All 11 editing sites of the atp9 gene present in orf231 are fully edited. The anther-specific reduction of the co-transcript in fertility-restored hybrids supports its involvement in the male sterility based on CMS PET2.

  9. A review of evaluative studies of computer-based learning in nursing education.

    PubMed

    Lewis, M J; Davies, R; Jenkins, D; Tait, M I

    2001-01-01

    Although there have been numerous attempts to evaluate the learning benefits of computer-based learning (CBL) packages in nursing education, the results obtained have been equivocal. A literature search conducted for this review found 25 reports of the evaluation of nursing CBL packages since 1966. Detailed analysis of the evaluation methods used in these reports revealed that most had significant design flaws, including the use of too small a sample group, the lack of a control group, etc. Because of this, the conclusions reached were not always valid. More effort is required in the design of future evaluation studies of nursing CBL packages. Copyright 2001 Harcourt Publishers Ltd.

  10. Pse-Analysis: a python package for DNA/RNA and protein/ peptide sequence analysis based on pseudo components and kernel methods.

    PubMed

    Liu, Bin; Wu, Hao; Zhang, Deyuan; Wang, Xiaolong; Chou, Kuo-Chen

    2017-02-21

    To expedite genome/proteome analysis, we have developed a Python package called Pse-Analysis. The package can automatically complete the following five procedures: (1) sample feature extraction, (2) optimal parameter selection, (3) model training, (4) cross validation, and (5) evaluation of prediction quality. All a user needs to do is input a benchmark dataset along with the query biological sequences concerned. Based on the benchmark dataset, Pse-Analysis automatically constructs an ideal predictor and then yields the predicted results for the submitted query samples. All of the aforementioned tedious jobs are done automatically by the computer. Moreover, a multiprocessing technique was adopted to enhance computational speed by about 6-fold. The Pse-Analysis Python package is freely accessible to the public at http://bioinformatics.hitsz.edu.cn/Pse-Analysis/, and can be run directly on Windows, Linux, and Unix.
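
    The five automated procedures correspond to a standard supervised-learning pipeline. The Python sketch below illustrates them with scikit-learn on toy DNA sequences; it is a generic illustration, not Pse-Analysis's own functions.

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.model_selection import GridSearchCV, cross_val_score
      from sklearn.svm import SVC

      seqs = ["ACGTACGT", "TTGACCGA", "ACGTTTGA", "GGCATCGA"]  # toy sequences
      labels = [1, 0, 1, 0]

      # (1) feature extraction: k-mer counts
      x = CountVectorizer(analyzer="char", ngram_range=(3, 3)).fit_transform(seqs)

      # (2) parameter selection and (3) model training
      search = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, cv=2).fit(x, labels)

      # (4) cross validation and (5) evaluation of prediction quality
      print(cross_val_score(search.best_estimator_, x, labels, cv=2).mean())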

  11. NORTICA—a new code for cyclotron analysis

    NASA Astrophysics Data System (ADS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-12-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis), a set of computer codes for beam dynamics simulations, is under development at NSCL. The package was started as a replacement for the code MONSTER [1] developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is to assist in setting up and tuning the cyclotrons, taking into account the main field and extraction channel imperfections. The computer platform for the package is an Alpha Station with the UNIX operating system and an X-Windows graphic interface. A multiple-programming-language approach was used in order to combine the reliability of the numerical algorithms developed in the laboratory over a long period of time with the friendliness of a modern-style user interface. This paper describes the capabilities and features of the codes in their present state.

  12. Eigensolver for a Sparse, Large Hermitian Matrix

    NASA Technical Reports Server (NTRS)

    Tisdale, E. Robert; Oyafuso, Fabiano; Klimeck, Gerhard; Brown, R. Chris

    2003-01-01

    A parallel-processing computer program finds a few eigenvalues in a sparse Hermitian matrix that contains as many as 100 million diagonal elements. This program finds the eigenvalues faster, using less memory, than do other, comparable eigensolver programs. This program implements a Lanczos algorithm in the American National Standards Institute/ International Organization for Standardization (ANSI/ISO) C computing language, using the Message Passing Interface (MPI) standard to complement an eigensolver in PARPACK. [PARPACK (Parallel Arnoldi Package) is an extension, to parallel-processing computer architectures, of ARPACK (Arnoldi Package), which is a collection of Fortran 77 subroutines that solve large-scale eigenvalue problems.] The eigensolver runs on Beowulf clusters of computers at the Jet Propulsion Laboratory (JPL).
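
    The same Lanczos/ARPACK approach is available serially in SciPy, which can serve as a small-scale stand-in for the parallel PARPACK-based program described above; the matrix below is an illustrative sparse symmetric operator, not the JPL code.

      import numpy as np
      from scipy.sparse import diags
      from scipy.sparse.linalg import eigsh

      # A sparse symmetric test matrix: 1-D Laplacian with 100,000 diagonal elements.
      n = 100_000
      h = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")

      # Lanczos iteration via ARPACK: a few extreme eigenvalues without ever
      # forming a dense matrix (shift-invert targets the smallest ones).
      vals = eigsh(h, k=5, sigma=0.0, return_eigenvectors=False)
      print(np.sort(vals))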

  13. PRROC: computing and visualizing precision-recall and receiver operating characteristic curves in R.

    PubMed

    Grau, Jan; Grosse, Ivo; Keilwagen, Jens

    2015-08-01

    Precision-recall (PR) and receiver operating characteristic (ROC) curves are valuable measures of classifier performance. Here, we present the R-package PRROC, which allows for computing and visualizing both PR and ROC curves. In contrast to available R-packages, PRROC allows for computing PR and ROC curves and areas under these curves for soft-labeled data using a continuous interpolation between the points of PR curves. In addition, PRROC provides a generic plot function for generating publication-quality graphics of PR and ROC curves. © The Author 2015. Published by Oxford University Press.
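
    An analogous computation in Python uses scikit-learn rather than the PRROC R API; note that PRROC's continuous PR interpolation differs slightly from the trapezoidal area taken here.

      import numpy as np
      from sklearn.metrics import auc, precision_recall_curve, roc_curve

      # Synthetic classifier scores for positive and negative examples.
      rng = np.random.default_rng(0)
      pos = rng.normal(1.0, 1.0, 300)
      neg = rng.normal(0.0, 1.0, 700)
      y_true = np.r_[np.ones_like(pos), np.zeros_like(neg)]
      y_score = np.r_[pos, neg]

      prec, rec, _ = precision_recall_curve(y_true, y_score)
      fpr, tpr, _ = roc_curve(y_true, y_score)
      print("AUC-ROC:", auc(fpr, tpr))
      print("AUC-PR :", auc(rec, prec))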

  14. Quantum Computing Architectural Design

    NASA Astrophysics Data System (ADS)

    West, Jacob; Simms, Geoffrey; Gyure, Mark

    2006-03-01

    Large scale quantum computers will invariably require scalable architectures in addition to high fidelity gate operations. Quantum computing architectural design (QCAD) addresses the problems of actually implementing fault-tolerant algorithms given physical and architectural constraints beyond those of basic gate-level fidelity. Here we introduce a unified framework for QCAD that enables the scientist to study the impact of varying error correction schemes, architectural parameters including layout and scheduling, and physical operations native to a given architecture. Our software package, aptly named QCAD, provides compilation, manipulation/transformation, multi-paradigm simulation, and visualization tools. We demonstrate various features of the QCAD software package through several examples.

  15. An Implemented Strategy for Campus Connectivity and Cooperative Computing.

    ERIC Educational Resources Information Center

    Halaris, Antony S.; Sloan, Lynda W.

    1989-01-01

    ConnectPac, a software package developed at Iona College to allow a computer user to access all services from a single personal computer, is described. ConnectPac uses mainframe computing to support a campus computing network, integrating personal and centralized computing into a menu-driven user environment. (Author/MLW)

  16. Aspects of perturbation theory in quantum mechanics: The BenderWu MATHEMATICA® package

    NASA Astrophysics Data System (ADS)

    Sulejmanpasic, Tin; Ünsal, Mithat

    2018-07-01

    We discuss a general setup which allows the study of the perturbation theory of an arbitrary, locally harmonic 1D quantum mechanical potential as well as its multi-variable (many-body) generalization. The latter may form a prototype for regularized quantum field theory. We first generalize the method of Bender-Wu, and derive exact recursion relations which allow the determination of the perturbative wave-function and energy corrections to an arbitrary order, at least in principle. For 1D systems, we implement these equations in an easy-to-use MATHEMATICA® package we call BenderWu. Our package enables quick home-computer computation of high orders of perturbation theory (about 100 orders in 10-30 s, and 250 orders in 1-2 h) and enables practical study of a large class of problems in Quantum Mechanics. We have two hopes concerning the BenderWu package. One is that, due to resurgence, a large amount of non-perturbative information, such as non-perturbative energies and wave-functions (e.g. WKB wave functions), can in principle be extracted from the perturbative data. We also hope that the package may be used as a teaching tool, providing an effective bridge between perturbation theory and non-perturbative physics in textbooks. Finally, we show that for the multi-variable case, the recursion relation acquires a geometric character, and has a structure which allows parallelization to computer clusters.
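
    In schematic form, the order-by-order relations are those of Rayleigh-Schrodinger perturbation theory with intermediate normalization; the package derives analogous recursions directly for the wave-function coefficients. The equations below are the textbook form, shown for orientation rather than as the package's exact internal relations.

      % H = H_0 + lambda V, with intermediate normalization
      % <psi^(0)|psi^(k)> = 0 for k >= 1:
      \begin{align}
        (H_0 - E^{(0)})\,|\psi^{(k)}\rangle
          &= \sum_{j=1}^{k} E^{(j)}\,|\psi^{(k-j)}\rangle - V\,|\psi^{(k-1)}\rangle,\\
        E^{(k)} &= \langle \psi^{(0)}|\,V\,|\psi^{(k-1)}\rangle .
      \end{align}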

  17. 42 CFR 423.2063 - Applicability of laws, regulations and CMS Rulings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Applicability of laws, regulations and CMS Rulings..., ALJ Hearings, MAC review, and Judicial Review § 423.2063 Applicability of laws, regulations and CMS... on ALJs and the MAC. (b) CMS Rulings are published under the authority of the CMS Administrator...

  18. 42 CFR 426.517 - CMS' statement regarding new evidence.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false CMS' statement regarding new evidence. 426.517... DETERMINATIONS Review of an NCD § 426.517 CMS' statement regarding new evidence. (a) CMS may review any new... experts; and (5) Presented during any hearing. (b) CMS may submit a statement regarding whether the new...

  19. 42 CFR 405.1063 - Applicability of laws, regulations and CMS Rulings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Applicability of laws, regulations and CMS Rulings... Medicare Coverage Policies § 405.1063 Applicability of laws, regulations and CMS Rulings. (a) All laws and... the MAC. (b) CMS Rulings are published under the authority of the Administrator, CMS. Consistent with...

  20. How to Spot Congenital Myasthenic Syndromes Resembling the Lambert-Eaton Myasthenic Syndrome? A Brief Review of Clinical, Electrophysiological, and Genetics Features.

    PubMed

    Lorenzoni, Paulo José; Scola, Rosana Herminia; Kay, Claudia Suemi Kamoi; Werneck, Lineu Cesar; Horvath, Rita; Lochmüller, Hanns

    2018-06-01

    Congenital myasthenic syndromes (CMS) are heterogeneous genetic diseases in which neuromuscular transmission is compromised. CMS resembling the Lambert-Eaton myasthenic syndrome (CMS-LEMS) are emerging as a rare group of distinct presynaptic CMS that share the same electrophysiological features. They show a low compound muscle action potential amplitude that increments after brief exercise (facilitation) or high-frequency repetitive nerve stimulation. Although clinical signs similar to LEMS can be present, the main hallmark is the electrophysiological findings, which are identical to those of autoimmune LEMS. CMS-LEMS occurs due to deficits in acetylcholine vesicle release caused by dysfunction of different components in its pathway. To date, the genes that have been associated with CMS-LEMS are AGRN, SYT2, MUNC13-1, VAMP1, and LAMA5. Clinicians should keep these newest subtypes of CMS-LEMS in mind to achieve the correct diagnosis and therapy. We believe that CMS-LEMS must be included as an important diagnostic clue prompting genetic investigation in the diagnostic algorithms for CMS. We briefly review the main features of CMS-LEMS.

  1. The equipment access software for a distributed UNIX-based accelerator control system

    NASA Astrophysics Data System (ADS)

    Trofimov, Nikolai; Zelepoukine, Serguei; Zharkov, Eugeny; Charrue, Pierre; Gareyte, Claire; Poirier, Hervé

    1994-12-01

    This paper presents a generic equipment access software package for a distributed control system using computers with UNIX or UNIX-like operating systems. The package consists of three main components: an application Equipment Access Library, a Message Handler, and an Equipment Data Base. An application task, which may run on any computer in the network, sends requests to access equipment through Equipment Library calls. The basic request is of the form Equipment-Action-Data and is routed via a remote procedure call to the computer to which the given equipment is connected. On this computer the request is received by the Message Handler. According to the type of the equipment connection, the Message Handler either passes the request to the specific process software on the same computer or forwards it to a lower-level network of equipment controllers using MIL1553B, GPIB, RS232 or BITBUS communication. The answer is then returned to the calling application. Descriptive information required for request routing and processing is stored in the real-time Equipment Data Base. The package has been written to be portable and is currently available on DEC Ultrix, LynxOS, HP-UX, XENIX, OS-9 and Apollo Domain.
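
    The Equipment-Action-Data routing described above can be sketched in a few lines of Python. This fragment is purely illustrative (hypothetical equipment names and fields, not the package's actual C API), showing how a Message Handler might consult the Equipment Data Base to dispatch a request.

      from dataclasses import dataclass

      @dataclass
      class Request:
          equipment: str       # e.g. "PS.QUAD17"
          action: str          # e.g. "READ_CURRENT"
          data: object = None  # payload for writes, None for reads

      # Equipment Data Base: descriptive routing information per device.
      EQUIPMENT_DB = {
          "PS.QUAD17": {"bus": "MIL1553B", "address": 0x21},
          "VAC.GAUGE3": {"bus": "LOCAL"},
      }

      def forward_to_fieldbus(entry, req):
          # Stand-in for MIL1553B/GPIB/RS232/BITBUS traffic to a controller.
          return f"{entry['bus']}@{entry['address']:#x} <- {req.action}"

      def local_process(req):
          # Stand-in for equipment-specific process software on this host.
          return f"local <- {req.action}"

      def message_handler(req):
          """Receives the RPC on the host owning the equipment and dispatches it."""
          entry = EQUIPMENT_DB[req.equipment]
          return local_process(req) if entry["bus"] == "LOCAL" else forward_to_fieldbus(entry, req)

      print(message_handler(Request("PS.QUAD17", "READ_CURRENT")))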

  2. QDENSITY—A Mathematica quantum computer simulation

    NASA Astrophysics Data System (ADS)

    Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank

    2009-03-01

    This Mathematica 6.0 package is a simulation of a quantum computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m, which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc. New version program summary: Program title: QDENSITY 2.0. Catalogue identifier: ADXH_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXH_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 26 055. No. of bytes in distributed program, including test data, etc.: 227 540. Distribution format: tar.gz. Programming language: Mathematica 6.0. Operating system: any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4. Catalogue identifier of previous version: ADXH_v1_0. Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 914. Classification: 4.15. Does the new version supersede the previous version?: It offers an alternative, more up-to-date implementation. Nature of problem: analysis and design of quantum circuits, quantum algorithms and quantum clusters. Solution method: a Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples (Teleportation, Shor's algorithm and Grover's search) are explained in detail. A tutorial, Tutorial.nb, is also enclosed. Reasons for new version: the package has been updated to make it fully compatible with Mathematica 6.0. Summary of revisions: the package has been updated to make it fully compatible with Mathematica 6.0. Running time: most examples included in the package, e.g., the tutorial, Shor's examples, Teleportation examples and Grover's search, run in less than a minute on a Pentium 4 processor (2.6 GHz). The running time for a quantum computation depends crucially on the number of qubits employed.
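
    The density-matrix approach the package emphasizes boils down to evolving rho by conjugation, rho -> U rho U†. A small numpy sketch of that idea (illustrative only; QDENSITY itself is a Mathematica package):

      import numpy as np

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate

      def lift(gate, target, nqubits):
          """Embed a one-qubit gate into an n-qubit operator via Kronecker products."""
          ops = [np.eye(2)] * nqubits
          ops[target] = gate
          out = ops[0]
          for op in ops[1:]:
              out = np.kron(out, op)
          return out

      nq = 2
      rho = np.zeros((2**nq, 2**nq)); rho[0, 0] = 1.0   # |00><00|
      U = lift(H, 0, nq)
      rho = U @ rho @ U.conj().T                        # apply H to qubit 0
      print(np.real(np.diag(rho)))                      # measurement probabilities: [0.5 0. 0.5 0.]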

  3. A novel photo-grafting of acrylamide onto carboxymethyl starch. 1. Utilization of CMS-g-PAAm in easy care finishing of cotton fabrics.

    PubMed

    El-Sheikh, Manal A

    2016-11-05

    The photosensitized grafting of vinyl monomers onto a range of polymeric substrates has been the subject of particular interest in the recent past. A carboxymethyl starch (CMS)-polyacrylamide (PAAm) graft copolymer (CMS-g-PAAm) with high graft yield was successfully prepared by grafting acrylamide onto CMS using UV irradiation in the presence of the water-soluble 4-(trimethylammoniummethyl) benzophenone chloride photoinitiator. CMS-g-PAAm with a nitrogen content of 8.3% and a grafting efficiency of up to 98.9% was obtained using 100% AAm, a material-to-liquor ratio of 1:14 and 1% photoinitiator at 30°C for 1 h of UV irradiation. The synthesis of CMS-g-PAAm was confirmed by FTIR and nitrogen content (%). The surface morphology of CMS, and its changes after grafting with AAm, were studied using SEM. Thermal properties of both CMS and CMS-g-PAAm were studied using TGA and DSC. To impart easy-care finishing to cotton fabrics, aqueous formulations of CMS-g-PAAm, dimethylol dihydroxy ethylene urea (DMDHEU), a CMS-g-PAAm-DMDHEU mixture or methylolated CMS-g-PAAm were used. Cotton fabrics were padded in these formulations, squeezed to a wet pick-up of 100%, dried at 100°C for 5 min, cured at 150°C for 5 min, washed at 50°C for 10 min and air-dried. The CRA (crease recovery angle) of untreated fabrics and of fabrics finished with a mixture of 2% CMS-g-PAAm and 10% DMDHEU or with methylolated CMS-g-PAAm (10% formaldehyde) was 136°, 190° and 288°, respectively. Increasing the number of washing cycles up to five results in an insignificant decrease in the CRA and a significant decrease in the RF (releasable formaldehyde) of finished fabric samples. The morphologies of the finished and unfinished cotton fabrics were examined by SEM.

  4. Engaging Undergraduate Math Majors in Geoscience Research using Interactive Simulations and Computer Art

    NASA Astrophysics Data System (ADS)

    Matott, L. S.; Hymiak, B.; Reslink, C. F.; Baxter, C.; Aziz, S.

    2012-12-01

    As part of the NSF-sponsored 'URGE (Undergraduate Research Group Experiences) to Compute' program, Dr. Matott has been collaborating with talented Math majors to explore the design of cost-effective systems to safeguard groundwater supplies from contaminated sites. Such activity is aided by a combination of groundwater modeling, simulation-based optimization, and high-performance computing: disciplines largely unfamiliar to the students at the outset of the program. To help train and engage the students, a number of interactive and graphical software packages were utilized. Examples include: (1) a tutorial for exploring the behavior of evolutionary algorithms and other heuristic optimizers commonly used in simulation-based optimization; (2) an interactive groundwater modeling package for exploring alternative pump-and-treat containment scenarios at a contaminated site in Billings, Montana; (3) the R software package for visualizing various concepts related to subsurface hydrology; and (4) a job visualization tool for exploring the behavior of numerical experiments run on a large distributed computing cluster. Further engagement and excitement in the program was fostered by entering (and winning) a computer art competition run by the Coalition for Academic Scientific Computation (CASC). The winning submission visualizes an exhaustively mapped optimization cost surface and dramatically illustrates the phenomenon of artificial minima: valley locations that correspond to designs whose costs are only partially optimal.

  5. Seismic waveform modeling over cloud

    NASA Astrophysics Data System (ADS)

    Luo, Cong; Friederich, Wolfgang

    2016-04-01

    With fast-growing computational technology, numerical simulation of seismic wave propagation has achieved great success, and obtaining synthetic waveforms through numerical simulation receives increasing attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve: users are expected to master a considerable amount of computer knowledge and data-processing skill. Training users to operate the numerical packages and to access and utilize computational resources correctly is a laborious task, and access to HPC is a further difficulty for many users. To solve these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating both software and hardware in a multilayer architecture: a well-designed SQL database serves as the data layer, while the HPC resources and a dedicated pipeline for them form the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By providing professional access to the computational code through its interfaces and delivering computational resources over the cloud, the platform lets users customize simulations at an expert level and submit and run jobs through it.

  6. A Distributed-Memory Package for Dense Hierarchically Semi-Separable Matrix Computations Using Randomization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rouet, François-Henry; Li, Xiaoye S.; Ghysels, Pieter

    In this paper, we present a distributed-memory library for computations with dense structured matrices. A matrix is considered structured if its off-diagonal blocks can be approximated by a rank-deficient matrix with low numerical rank. Here, we use Hierarchically Semi-Separable (HSS) representations. Such matrices appear in many applications, for example, finite-element methods, boundary element methods, and so on. Exploiting this structure allows for fast solution of linear systems and/or fast computation of matrix-vector products, which are the two main building blocks of matrix computations. The compression algorithm that we use, which computes the HSS form of an input dense matrix, relies on randomized sampling with a novel adaptive sampling mechanism. We discuss the parallelization of this algorithm and also present the parallelization of structured matrix-vector product, structured factorization, and solution routines. The efficiency of the approach is demonstrated on large problems from different academic and industrial applications, on up to 8,000 cores. Finally, this work is part of a more global effort, the STRUctured Matrices PACKage (STRUMPACK) software package for computations with sparse and dense structured matrices. Hence, although useful in their own right, the routines also represent a step in the direction of a distributed-memory sparse solver.

  7. A Distributed-Memory Package for Dense Hierarchically Semi-Separable Matrix Computations Using Randomization

    DOE PAGES

    Rouet, François-Henry; Li, Xiaoye S.; Ghysels, Pieter; ...

    2016-06-30

    In this paper, we present a distributed-memory library for computations with dense structured matrices. A matrix is considered structured if its off-diagonal blocks can be approximated by a rank-deficient matrix with low numerical rank. Here, we use Hierarchically Semi-Separable (HSS) representations. Such matrices appear in many applications, for example, finite-element methods, boundary element methods, and so on. Exploiting this structure allows for fast solution of linear systems and/or fast computation of matrix-vector products, which are the two main building blocks of matrix computations. The compression algorithm that we use, which computes the HSS form of an input dense matrix, relies on randomized sampling with a novel adaptive sampling mechanism. We discuss the parallelization of this algorithm and also present the parallelization of structured matrix-vector product, structured factorization, and solution routines. The efficiency of the approach is demonstrated on large problems from different academic and industrial applications, on up to 8,000 cores. Finally, this work is part of a more global effort, the STRUctured Matrices PACKage (STRUMPACK) software package for computations with sparse and dense structured matrices. Hence, although useful in their own right, the routines also represent a step in the direction of a distributed-memory sparse solver.
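
    The randomized-sampling building block the abstract refers to can be illustrated with a basic range finder (in the spirit of Halko, Martinsson and Tropp). The Python sketch below compresses a single low-rank block; it is not STRUMPACK's adaptive scheme or its HSS block hierarchy.

      import numpy as np

      def randomized_basis(A, rank, oversample=10, seed=0):
          """Orthonormal basis Q with A ~ Q @ (Q.T @ A), found by random sampling."""
          rng = np.random.default_rng(seed)
          Y = A @ rng.standard_normal((A.shape[1], rank + oversample))  # sample the range
          Q, _ = np.linalg.qr(Y)                                        # orthonormalize
          return Q

      n = 400
      G = np.random.default_rng(1).standard_normal((n, 15))
      A = G @ G.T                                    # a numerically rank-15 block
      Q = randomized_basis(A, rank=15)
      err = np.linalg.norm(A - Q @ (Q.T @ A)) / np.linalg.norm(A)
      print(f"relative compression error: {err:.1e}")   # ~ machine precision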

  8. CRAB3: Establishing a new generation of services for distributed analysis at CMS

    NASA Astrophysics Data System (ADS)

    Cinquilli, M.; Spiga, D.; Grandi, C.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Riahi, H.; Vaandering, E.

    2012-12-01

    In CMS Computing the highest priorities for analysis tools are the improvement of the end users' ability to produce and publish reliable samples and analysis results, as well as a transition to a sustainable development and operations model. To achieve these goals CMS decided to incorporate analysis processing into the same framework as data and simulation processing. This strategy foresees that all workload tools (Tier0, Tier1, production, analysis) share a common core with long-term maintainability, as well as the standardization of the operator interfaces. The re-engineered analysis workload manager, called CRAB3, makes use of newer technologies, such as RESTful web services and NoSQL databases, aiming to increase the scalability and reliability of the system. As opposed to CRAB2, in CRAB3 all work is centrally injected and managed in a global queue. A pool of agents, which can be geographically distributed, consumes work from the central services serving the user tasks. The new architecture of CRAB substantially changes the deployment model and operations activities. In this paper we present the implementation of CRAB3, emphasizing how the new architecture improves workflow automation and simplifies maintainability. In particular, we highlight the impact of the new design on daily operations.
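
    In miniature, the central-queue/agent model reads like the following Python sketch (hypothetical task fields; CRAB3's real implementation sits on RESTful services and a global HTCondor pool):

      import queue
      import threading

      central_queue = queue.Queue()                  # all work is centrally injected
      for i in range(6):
          central_queue.put({"task_id": i, "dataset": f"/store/user/sample_{i}"})

      def agent(name):
          """A (possibly geographically remote) agent pulls tasks until none remain."""
          while True:
              try:
                  task = central_queue.get_nowait()
              except queue.Empty:
                  return
              print(f"{name}: running task {task['task_id']} on {task['dataset']}")
              central_queue.task_done()

      agents = [threading.Thread(target=agent, args=(f"agent-{k}",)) for k in range(2)]
      for a in agents: a.start()
      for a in agents: a.join()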

  9. Perceptions, use and attitudes of pharmacy customers on complementary medicines and pharmacy practice

    PubMed Central

    2010-01-01

    Background Complementary medicines (CMs) are popular amongst Australians and community pharmacy is a major supplier of these products. This study explores pharmacy customer use, attitudes and perceptions of complementary medicines, and their expectations of pharmacists as they relate to these products. Methods Pharmacy customers randomly selected from sixty large and small, metropolitan and rural pharmacies in three Australian states completed an anonymous, self administered questionnaire that had been pre-tested and validated. Results 1,121 customers participated (response rate 62%). 72% had used CMs within the previous 12 months, 61% used prescription medicines daily and 43% had used both concomitantly. Multivitamins, fish oils, vitamin C, glucosamine and probiotics were the five most popular CMs. 72% of people using CMs rated their products as 'very effective' or 'effective enough'. CMs were as frequently used by customers aged 60 years or older as younger customers (69% vs. 72%) although the pattern of use shifted with older age. Most customers (92%) thought pharmacists should provide safety information about CMs, 90% thought they should routinely check for interactions, 87% thought they should recommend effective CMs, 78% thought CMs should be recorded in customer's medication profile and 58% thought pharmacies stocking CMs should also employ a complementary medicine practitioner. Of those using CMs, 93% thought it important for pharmacists to be knowledgeable about CMs and 48% felt their pharmacist provides useful information about CMs. Conclusions CMs are widely used by pharmacy customers of all ages who want pharmacists to be more involved in providing advice about these products. PMID:20646290

  10. Recombination Events Involving the atp9 Gene Are Associated with Male Sterility of CMS PET2 in Sunflower

    PubMed Central

    Reddemann, Antje; Horn, Renate

    2018-01-01

    Cytoplasmic male sterility (CMS) systems represent ideal mutants to study the role of mitochondria in pollen development. In sunflower, CMS PET2 also has the potential to become an alternative CMS source for commercial sunflower hybrid breeding. CMS PET2 originates from an interspecific cross of H. petiolaris and H. annuus, as does CMS PET1, but results in a different CMS mechanism. Southern analyses revealed differences for atp6, atp9 and cob between CMS PET2, CMS PET1 and the male-fertile line HA89. A second identical copy of atp6 was present on an additional CMS PET2-specific fragment. In addition, the atp9 gene was duplicated. However, this duplication was followed by an insertion of 271 bp of unknown origin in the 5′ coding region of the atp9 gene in CMS PET2, which led to the creation of two unique open reading frames, orf288 and orf231. The first 53 bp of orf288 are identical to the 5′ end of atp9. Apart from its first 3 bp, which are part of the 271-bp insertion, orf231 consists of the last 228 bp of atp9. These CMS PET2-specific orfs are co-transcribed. All 11 editing sites of the atp9 gene present in orf231 are fully edited. The anther-specific reduction of the co-transcript in fertility-restored hybrids supports its involvement in the male sterility based on CMS PET2. PMID:29534485

  11. Developments in blade shape design for a Darrieus vertical axis wind turbine

    NASA Astrophysics Data System (ADS)

    Ashwill, T. D.; Leonard, T. M.

    1986-09-01

    A new computer program package has been developed that determines the troposkein shape for a Darrieus Vertical Axis Wind Turbine blade with any geometrical configuration or rotation rate. This package allows users to interactively develop a buildable blade whose shape closely approximates the troposkein. Use of this package can significantly reduce flatwise mean bending stresses in the blade and increase fatigue life.

  12. MC-GenomeKey: a multicloud system for the detection and annotation of genomic variants.

    PubMed

    Elshazly, Hatem; Souilmi, Yassine; Tonellato, Peter J; Wall, Dennis P; Abouelhoda, Mohamed

    2017-01-20

    Next-generation genome sequencing techniques have become affordable for massive sequencing efforts devoted to the clinical characterization of human diseases. However, the cost of providing cloud-based data analysis of the mounting datasets remains a concerning bottleneck for providing cost-effective clinical services. To address this computational problem, it is important to optimize the variant analysis workflow and the analysis tools used, in order to reduce the overall computational processing time and concomitantly reduce the processing cost. Furthermore, it is important to capitalize on recent developments in the cloud computing market, which has witnessed more providers competing in terms of products and prices. In this paper, we present a new package called MC-GenomeKey (Multi-Cloud GenomeKey) that efficiently executes the variant analysis workflow for detecting and annotating mutations using cloud resources from different commercial cloud providers. Our package supports Amazon, Google, and Azure clouds, as well as any other cloud platform based on OpenStack. It allows different scenarios of execution with different levels of sophistication, up to one where a workflow can be executed using a cluster whose nodes come from different clouds. MC-GenomeKey also supports scenarios that exploit the spot-instance model of Amazon in combination with other cloud platforms to provide significant cost reductions. To the best of our knowledge, this is the first solution that optimizes the execution of the workflow using computational resources from different cloud providers. MC-GenomeKey provides an efficient multicloud-based solution to detect and annotate mutations. The package can run on different commercial cloud platforms, which enables the user to seize the best offers. The package also provides a reliable means to make use of the low-cost spot-instance model of Amazon, as it offers an efficient solution to the sudden termination of spot machines as a result of a sudden price increase. The package has a web interface and is available for free for academic use.

  13. Turbofan noise generation. Volume 2: Computer programs

    NASA Technical Reports Server (NTRS)

    Ventres, C. S.; Theobald, M. A.; Mark, W. D.

    1982-01-01

    The use of a package of computer programs developed to calculate the in-duct acoustic modes excited by a fan/stator stage operating at subsonic tip speed is described. The following three noise source mechanisms are included: (1) sound generated by the rotor blades interacting with turbulence ingested into, or generated within, the inlet duct; (2) sound generated by the stator vanes interacting with the turbulent wakes of the rotor blades; and (3) sound generated by the stator vanes interacting with the velocity deficits in the mean wakes of the rotor blades. The computations for the three different noise mechanisms are coded as three separate computer program packages. The computer codes are described by means of block diagrams, tables of data and variables, and example program executions; FORTRAN listings are included.

  14. Turbofan noise generation. Volume 2: Computer programs

    NASA Astrophysics Data System (ADS)

    Ventres, C. S.; Theobald, M. A.; Mark, W. D.

    1982-07-01

    The use of a package of computer programs developed to calculate the in-duct acoustic modes excited by a fan/stator stage operating at subsonic tip speed is described. The following three noise source mechanisms are included: (1) sound generated by the rotor blades interacting with turbulence ingested into, or generated within, the inlet duct; (2) sound generated by the stator vanes interacting with the turbulent wakes of the rotor blades; and (3) sound generated by the stator vanes interacting with the velocity deficits in the mean wakes of the rotor blades. The computations for the three different noise mechanisms are coded as three separate computer program packages. The computer codes are described by means of block diagrams, tables of data and variables, and example program executions; FORTRAN listings are included.

  15. The Design and Implementation of NASA's Advanced Flight Computing Module

    NASA Technical Reports Server (NTRS)

    Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce

    1995-01-01

    This paper describes a working flight computer Multichip Module (MCM) developed jointly by JPL and TRW, in a collaborative fashion, under their respective research programs. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch Al package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program. Further development of the Mass Memory and the programmable I/O MCM modules will follow. The three building block modules will then be stacked into a 3D MCM configuration. The mass and volume of the flight computer MCM, at 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep space as well as commercial remote sensing applications.

  16. MULTIVARIATERESIDUES: A Mathematica package for computing multivariate residues

    NASA Astrophysics Data System (ADS)

    Larsen, Kasper J.; Rietkerk, Robbert

    2018-01-01

    Multivariate residues appear in many different contexts in theoretical physics and algebraic geometry. In theoretical physics they give, for example, the proper definition of generalized-unitarity cuts, and they play a central role in the Grassmannian formulation of the S-matrix by Arkani-Hamed et al. In realistic cases their evaluation can be non-trivial. In this paper we provide a Mathematica package for the efficient evaluation of multivariate residues based on methods from computational algebraic geometry.
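
    For the special case of a factorized denominator, a multivariate residue reduces to iterated univariate residues, which sympy can take directly; the general non-factorizable case (the package's real target) requires the Gröbner-basis machinery of computational algebraic geometry. A toy check in Python:

      import sympy as sp

      z1, z2 = sp.symbols('z1 z2')
      f = sp.exp(z1 + 2*z2) / (z1**2 * z2)

      # Residue at the pole z1 = 0, then at z2 = 0; this iterated approach is
      # valid here because the denominator factorizes into univariate pieces.
      r = sp.residue(sp.residue(f, z1, 0), z2, 0)
      print(r)   # 1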

  17. The Computer: An Effective Research Assistant

    PubMed Central

    Gancher, Wendy

    1984-01-01

    The development of software packages such as data management systems and statistical packages has made it possible to process large amounts of research data. Data management systems make the organization and manipulation of such data easier. Floppy disks ease the problem of storing and retrieving records. Patient information can be kept confidential by limiting access to computer passwords linked with research files, or by using floppy disks. These attributes make the microcomputer essential to modern primary care research. PMID:21279042

  18. Forthon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grote, D. P.

    Forthon generates links between Fortran and Python. Python is a high-level, object-oriented, interactive scripting language that allows a flexible and versatile interface to computational tools. The Forthon package generates the necessary wrapping code which allows access to the Fortran database and to the Fortran subroutines and functions. This provides a development package where the computationally intensive parts of a code can be written in efficient Fortran, and the high-level controlling code can be written in the much more versatile Python language.
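
    Forthon generates such wrappers automatically; as a point of comparison, the hand-rolled equivalent for a single Fortran subroutine, using only Python's standard ctypes module, might look like the sketch below (file and symbol names are hypothetical).

      # Assumes a Fortran source like the following, compiled with
      #   gfortran -shared -fPIC dotprod.f90 -o libdotprod.so
      #
      #   subroutine dotprod(a, b, n, res) bind(c, name="dotprod")
      #     use iso_c_binding
      #     integer(c_int), value :: n
      #     real(c_double) :: a(n), b(n), res
      #     res = sum(a * b)
      #   end subroutine
      import ctypes
      import numpy as np

      lib = ctypes.CDLL("./libdotprod.so")           # hypothetical shared library
      lib.dotprod.argtypes = [
          np.ctypeslib.ndpointer(np.float64),
          np.ctypeslib.ndpointer(np.float64),
          ctypes.c_int,
          ctypes.POINTER(ctypes.c_double),
      ]

      a = np.arange(4.0)
      b = np.ones(4)
      res = ctypes.c_double()
      lib.dotprod(a, b, 4, ctypes.byref(res))        # heavy lifting stays in Fortran
      print(res.value)                               # 6.0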

  19. An Analysis of Computer Aided Design (CAD) Packages Used at MSFC for the Recent Initiative to Integrate Engineering Activities

    NASA Technical Reports Server (NTRS)

    Smith, Leigh M.; Parker, Nelson C. (Technical Monitor)

    2002-01-01

    This paper analyzes the use of Computer Aided Design (CAD) packages at NASA's Marshall Space Flight Center (MSFC). It examines the effectiveness of recent efforts to standardize CAD practices across MSFC engineering activities. The roles played by management, designers, analysts, and manufacturers in this initiative are assessed. Finally, solutions are presented for better integration of CAD across MSFC in the future.

  20. Hybrid male sterility in Mimulus (Phrymaceae) is associated with a geographically restricted mitochondrial rearrangement.

    PubMed

    Case, Andrea L; Willis, John H

    2008-05-01

    Cytoplasmic male sterility (CMS) and nuclear fertility restoration (Rf) involve intergenomic coevolution. Although male-sterile phenotypes are rarely expressed in natural populations of angiosperms, CMS genes are thought to be common. The evolutionary dynamics of CMS/Rf systems are poorly understood, leaving gaps in our understanding of the mechanisms and consequences of cytonuclear interactions. We characterized the molecular basis and geographic distribution of a CMS gene in Mimulus guttatus. We crossed outcrossing M. guttatus (with CMS and Rf) to self-fertilizing M. nasutus (lacking CMS and Rf) to generate hybrids segregating for CMS. Mitochondrial transcripts containing an essential gene (nad6) were perfectly associated with male sterility. The CMS mitotype was completely absent in M. nasutus, present in all genotypes collected from the original collection site, but in only two individuals from 34 other M. guttatus populations. This pattern suggests that the CMS likely originated at a single locality and spread to fixation within that population, but has not spread to other populations, indicating possible ecological or genetic constraints on the dispersal of this CMS mitotype between populations. Extreme localization may be characteristic of CMS in hermaphroditic species, in contrast to the geographically widespread mitotypes commonly found in gynodioecious species, and could directly contribute to hybrid incompatibilities in nature.

  1. Angle-corrected imaging transcranial doppler sonography versus imaging and nonimaging transcranial doppler sonography in children with sickle cell disease.

    PubMed

    Krejza, J; Rudzinski, W; Pawlak, M A; Tomaszewski, M; Ichord, R; Kwiatkowski, J; Gor, D; Melhem, E R

    2007-09-01

    Nonimaging transcranial Doppler sonography (TCD) and imaging TCD (TCDI) are used for determination of the risk of stroke in children with sickle cell disease (SCD). The purpose was to compare angle-corrected TCDI, uncorrected TCDI, and TCD blood flow velocities in children with SCD. A total of 37 children (mean age, 7.8 +/- 3.0 years) without intracranial arterial narrowing, as determined with MR angiography, were studied with TCD and TCDI at the same session. Depth of insonation and TCDI mean velocities with and without correction for the angle of insonation in the terminal internal carotid artery (ICA) and middle (MCA), anterior (ACA), and posterior (PCA) cerebral arteries were compared with TCD velocities using a paired t test. Two arteries were not found on TCDI, compared with 15 not found on TCD. The average angle of insonation in the MCA, ACA, ICA, and PCA was 31 degrees, 44 degrees, 25 degrees, and 29 degrees, respectively. TCDI and TCD mean depth of insonation for all arteries did not differ significantly; however, individual differences varied substantially. TCDI velocities were significantly lower than TCD velocities, respectively, for the right and left sides (mean +/- SD): MCA, 106 +/- 22 cm/s and 111 +/- 33 cm/s versus 130 +/- 19 cm/s and 134 +/- 26 cm/s; ICA, 90 +/- 14 cm/s and 98 +/- 27 cm/s versus 117 +/- 18 cm/s and 119 +/- 23 cm/s; ACA, 74 +/- 24 cm/s and 88 +/- 25 cm/s versus 105 +/- 23 cm/s and 105 +/- 31 cm/s; and PCA, 84 +/- 27 cm/s and 82 +/- 21 cm/s versus 95 +/- 23 cm/s and 94 +/- 20 cm/s. TCD and angle-corrected TCDI velocities were not statistically different, except for higher angle-corrected TCDI values in the left ACA and right PCA. TCD velocities are significantly higher than TCDI velocities but are not different from the angle-corrected TCDI velocities. TCDI identifies the major intracranial arteries more effectively than TCD.

  2. 42 CFR 488.417 - Denial of payment for all new admissions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... to CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State (for non-State operated NFs against which CMS is imposing no remedies); and (2) CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State...

  3. 42 CFR 488.417 - Denial of payment for all new admissions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... to CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State (for non-State operated NFs against which CMS is imposing no remedies); and (2) CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State...

  4. 42 CFR 488.417 - Denial of payment for all new admissions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... to CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State (for non-State operated NFs against which CMS is imposing no remedies); and (2) CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State...

  5. 42 CFR 488.417 - Denial of payment for all new admissions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... to CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State (for non-State operated NFs against which CMS is imposing no remedies); and (2) CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State...

  6. 42 CFR 488.417 - Denial of payment for all new admissions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... to CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State (for non-State operated NFs against which CMS is imposing no remedies); and (2) CMS (for all facilities except non-State operated NFs against which CMS is imposing no remedies) or the State...

  7. 45 CFR 150.317 - Factors CMS uses to determine the amount of penalty.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Factors CMS uses to determine the amount of... RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement With Respect to Issuers and Non-Federal Governmental Plans-Civil Money Penalties § 150.317 Factors CMS...

  8. 40 CFR Table 1 to Subpart Qqqqq of... - Applicability of General Provisions to Subpart QQQQQ

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... § 63.8(e) CMS Performance Evaluation No Subpart QQQQQ does not require CMS performance evaluations... QQQQQ does not require performance tests or CMS performance evaluations. § 63.9(e) Notification of... CMS No Subpart QQQQQ does not require CMS performance evaluations. § 63.10(a), (b), (d)(1), (d)(4)-(5...

  9. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  10. 42 CFR 422.510 - Termination of contract by CMS.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CONTINUED) MEDICARE PROGRAM MEDICARE ADVANTAGE PROGRAM Application Procedures and Contracts for Medicare Advantage Organizations § 422.510 Termination of contract by CMS. (a) Termination by CMS. CMS may at any...

  11. Challenges of Particle Flow reconstruction in the CMS High-Granularity Calorimeter at the High-Luminosity LHC

    NASA Astrophysics Data System (ADS)

    Chlebana, Frank; CMS Collaboration

    2017-11-01

    The challenges of the High-Luminosity LHC (HL-LHC) are driven by the large number of overlapping proton-proton collisions (pileup) in each bunch-crossing and the extreme radiation dose to detectors at high pseudorapidity. To overcome this challenge, CMS is developing an endcap electromagnetic+hadronic sampling calorimeter employing silicon sensors in the electromagnetic and front hadronic sections, comprising over 6 million channels, and highly segmented plastic scintillators in the rear part of the hadronic section. This High-Granularity Calorimeter (HGCAL) will be the first of its kind used in a colliding-beam experiment. Clustering deposits of energy over many cells and layers is a complex and challenging computational task, particularly in the high-pileup environment of the HL-LHC. Baseline detector performance results are presented for electromagnetic and hadronic objects, and studies demonstrating the advantages of fine longitudinal and transverse segmentation are explored.

  12. A New Event Builder for CMS Run II

    NASA Astrophysics Data System (ADS)

    Albertsson, K.; Andre, J.-M.; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbps Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder. This paper discusses the software design, protocols, and optimizations for exploiting the hardware capabilities. We present performance measurements from small-scale prototypes and from the full-scale production system.

  13. A new event builder for CMS Run II

    DOE PAGES

    Albertsson, K.; Andre, J-M; Andronidis, A.; ...

    2015-12-23

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbps Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder. This paper discusses the software design, protocols, and optimizations for exploiting the hardware capabilities. In conclusion, we present performance measurements from small-scale prototypes and from the full-scale production system.

  14. Model Updating of Complex Structures Using the Combination of Component Mode Synthesis and Kriging Predictor

    PubMed Central

    Li, Yan; Wang, Dejun; Zhang, Shaoyi

    2014-01-01

    Updating the structural model of complex structures is time-consuming due to the large size of the finite element model (FEM). Using conventional methods in these cases is computationally expensive or even impossible. A two-level method, combining the Kriging predictor and the component mode synthesis (CMS) technique, was proposed to ensure the successful implementation of FEM updating for large-scale structures. In the first level, CMS was applied to build a reasonable condensed FEM of the complex structure. In the second level, the Kriging predictor, which serves as a surrogate FEM in structural dynamics, was generated based on the condensed FEM. Some key issues in the application of the metamodel (surrogate FEM) to FEM updating were also discussed. Finally, the effectiveness of the proposed method was demonstrated by updating the FEM of a real arch bridge with the measured modal parameters. PMID:24634612
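
    A Kriging predictor in its simplest form is an exact interpolator built from a spatial correlation kernel. The Python sketch below (plain Gaussian-kernel interpolation on synthetic data standing in for condensed-FEM outputs; the paper's actual correlation model and training design are not reproduced) illustrates the second-level idea:

      import numpy as np

      def kernel(X, Y, theta=5.0):
          """Gaussian correlation between two sets of design points."""
          d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
          return np.exp(-theta * d2)

      def kriging_predict(X_train, y_train, X_new, nugget=1e-10):
          K = kernel(X_train, X_train) + nugget * np.eye(len(X_train))
          w = np.linalg.solve(K, y_train)             # dual weights
          return kernel(X_new, X_train) @ w

      rng = np.random.default_rng(0)
      X = rng.random((30, 2))                         # 30 sampled condensed-FEM runs
      y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2          # stand-in for a modal output
      print(kriging_predict(X, y, np.array([[0.5, 0.5]])))   # cheap surrogate prediction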

  15. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services

    PubMed Central

    Castaño-Díez, Daniel

    2017-01-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistics-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and long-term software maintenance. PMID:28580909

  16. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services.

    PubMed

    Castaño-Díez, Daniel

    2017-06-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistics-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and long-term software maintenance.

  17. The Status of the Cms Experiment

    NASA Astrophysics Data System (ADS)

    Green, Dan

    The CMS experiment was completely assembled in the fall of 2008 after a decade of design, construction and installation. During the last two years, cosmic ray data were taken on a regular basis. These data have enabled CMS to align the detector components, both spatially and temporally. Initial use of muons has also established the relative alignment of the CMS tracking and muon systems. In addition, the CMS calorimetry has been crosschecked with test beam data, thus providing an initial energy calibration of CMS calorimetry to about 5%. The CMS magnet has been powered and field mapped. The trigger and data acquisition systems have been installed and run at full speed. The tiered data analysis system has been exercised at full design bandwidth for Tier0, Tier1 and Tier2 sites. Monte Carlo simulation of the CMS detector has been constructed at a detailed geometric level and has been tuned to test beam and other production data to provide a realistic model of the CMS detector prior to first collisions.

  18. Parallel computation and the Basis system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, G.R.

    1992-12-16

    A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communication costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.

  19. Parallel computation and the basis system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, G.R.

    1993-05-01

    A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communication costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
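
    A toy master-and-slaves domain decomposition in the spirit described above, written against the Python standard library rather than Basis/PVM (all names hypothetical):

      from multiprocessing import Pool

      def slave(subdomain):
          """Each slave advances its own slice of the domain."""
          lo, hi = subdomain
          return sum(x * x for x in range(lo, hi))        # stand-in for real physics

      if __name__ == "__main__":
          n, nslaves = 1_000_000, 4
          edges = [n * k // nslaves for k in range(nslaves + 1)]
          subdomains = list(zip(edges[:-1], edges[1:]))   # partition the work
          with Pool(nslaves) as pool:                     # master farms out subdomains
              partials = pool.map(slave, subdomains)      # message passing underneath
          print(sum(partials))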

  20. Community-driven computational biology with Debian Linux

    PubMed Central

    2010-01-01

    Background The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. Results The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Conclusions Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers. PMID:21210984

  1. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages.

    PubMed

    Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry

    2013-08-01

    Several statistical packages are capable of estimating generalized linear mixed models, and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another, even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performance of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages, SAS GLIMMIX (Laplace) and SuperMix (Gaussian quadrature), perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.
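
    The Gauss-Hermite method the paper compares can be illustrated for the simplest case: a single random intercept u_i ~ N(0, sigma^2) integrated out of a logistic likelihood by quadrature. The Python sketch below uses a hypothetical three-observation cluster; with three or more correlated random effects, as in the paper's models, the integral becomes multidimensional and far more costly.

      import numpy as np
      from scipy.special import expit, roots_hermite

      def cluster_loglik(y, X, beta, sigma, npoints=20):
          """log of integral N(u; 0, sigma^2) * prod_j p(y_j | x_j, beta, u) du."""
          nodes, weights = roots_hermite(npoints)      # Gauss-Hermite rule
          u = np.sqrt(2.0) * sigma * nodes             # change of variables
          eta = (X @ beta)[:, None]                    # fixed-effect linear predictor
          p = expit(eta + u[None, :])                  # success probability at each node
          lik = np.prod(np.where(y[:, None] == 1, p, 1 - p), axis=0)
          return np.log((weights / np.sqrt(np.pi)) @ lik)

      y = np.array([1, 0, 1])
      X = np.array([[1.0, 0.2], [1.0, -1.0], [1.0, 0.5]])
      print(cluster_loglik(y, X, beta=np.array([0.3, 1.1]), sigma=0.8))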

  2. Network Meta-Analysis Using R: A Review of Currently Available Automated Packages

    PubMed Central

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687

  3. Network meta-analysis using R: a review of currently available automated packages.

    PubMed

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.

  4. Cardiotoxicity evaluation using human embryonic stem cells and induced pluripotent stem cell-derived cardiomyocytes.

    PubMed

    Zhao, Qi; Wang, Xijie; Wang, Shuyan; Song, Zheng; Wang, Jiaxian; Ma, Jing

    2017-03-09

    Cardiotoxicity remains an important concern in drug discovery. Human pluripotent stem cell-derived cardiomyocytes (hPSC-CMs) have become an attractive platform to evaluate cardiotoxicity. However, the consistency between human embryonic stem cell-derived cardiomyocytes (hESC-CMs) and human induced pluripotent stem cell-derived cardiomyocytes (hiPSC-CMs) in prediction of cardiotoxicity has yet to be elucidated. Here we screened the toxicities of four representative drugs (E-4031, isoprenaline, quinidine, and haloperidol) using both hESC-CMs and hiPSC-CMs, combined with an impedance-based bioanalytical method. It showed that both hESC-CMs and hiPSC-CMs can recapitulate cardiotoxicity and identify the effects of well-characterized compounds. The combined platform of hPSC-CMs and an impedance-based bioanalytical method could improve preclinical cardiotoxicity screening, holding great potential for increasing drug development accuracy.

  5. Resident intruder paradigm-induced aggression relieves depressive-like behaviors in male rats subjected to chronic mild stress

    PubMed Central

    Wei, Sheng; Ji, Xiao-wei; Wu, Chun-ling; Li, Zi-fa; Sun, Peng; Wang, Jie-qiong; Zhao, Qi-tao; Gao, Jie; Guo, Ying-hui; Sun, Shi-guang; Qiao, Ming-qi

    2014-01-01

    Background Accumulating epidemiological evidence shows that life event stressors are major vulnerability factors for psychiatric diseases such as major depression. It is also well known that the resident intruder paradigm (RIP) results in aggressive behavior in male rats. However, it is not known how resident intruder paradigm-induced aggression affects depressive-like behavior in isolated male rats subjected to chronic mild stress (CMS), which is an animal model of depression. Material/Methods Male Wistar rats were divided into 3 groups: non-stressed controls, isolated rats subjected to the CMS protocol, and resident intruder paradigm-exposed rats subjected to the CMS protocol. Results In the sucrose intake test, ingestion of a 1% sucrose solution by rats in the CMS group was significantly lower than in control and CMS+RIP rats after 3 weeks of stress. In the open-field test, CMS rats had significantly lower open-field scores compared to control rats. Furthermore, the total scores given the CMS group were significantly lower than in the CMS+RIP rats. In the forced swimming test (FST), the immobility times of CMS rats were significantly longer than those of the control or CMS+RIP rats. However, no differences were observed between controls and CMS+RIP rats. Conclusions Our data show that aggressive behavior evoked by the resident intruder paradigm could relieve broad-spectrum depressive-like behaviors in isolated adult male rats subjected to CMS. PMID:24911067

  6. Cyber-Ed.

    ERIC Educational Resources Information Center

    Ruben, Barbara

    1994-01-01

    Reviews a number of interactive environmental computer education networks and software packages. The computer networks include National Geographic Kids Network, Global Lab, and Global Rivers Environmental Education Network. The software packages cover environmental decision making, simulation games, tropical rainforests, the ocean, the greenhouse…

  7. NCCS Regression Test Harness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tharrington, Arnold N.

    2015-09-09

    The NCCS Regression Test Harness is a software package that provides a framework for performing regression and acceptance testing on NCCS High Performance Computers. The package is written in Python, and its only dependency is a Subversion repository to store the regression tests.
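
    At its core, a regression harness of this kind reduces to running a job and comparing its output with a stored baseline. The Python sketch below illustrates that pattern under assumed commands and file paths; it is a hypothetical illustration, not the NCCS package's actual interface.

        import subprocess

        def run_regression_test(command, baseline_path):
            """Run one regression test: execute a command and diff its stdout
            against a stored known-good baseline (hypothetical interface)."""
            result = subprocess.run(command, capture_output=True, text=True, check=True)
            with open(baseline_path) as f:
                baseline = f.read()
            return result.stdout == baseline

        # Hypothetical usage: compare a solver run against its stored baseline.
        # if run_regression_test(["./solver", "--input", "case1.in"], "baselines/case1.out"):
        #     print("PASS")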

  8. Buying CAM.

    ERIC Educational Resources Information Center

    Meloy, Jim; And Others

    1990-01-01

    The relationship between computer-aided design (CAD), computer-aided manufacturing (CAM), and computer numerical control (CNC) computer applications is described. Tips for helping educate the CAM buyer on what to look for and what to avoid when searching for the most appropriate instructional CAM package are provided. (KR)

  9. Research and Development of Fully Automatic Alien Smoke Stack and Packaging System

    NASA Astrophysics Data System (ADS)

    Yang, Xudong; Ge, Qingkuan; Peng, Tao; Zuo, Ping; Dong, Weifu

    2017-12-01

    Manual sorting and packaging at current tobacco distribution centers suffers from low efficiency; to address this problem, a safe, efficient, fully automatic stacking and packaging system for irregularly shaped ('alien') cigarettes was developed. The fully automatic system adopts PLC control technology, servo control technology, robot technology, image recognition technology and human-computer interaction technology. The characteristics, principles, control process and key technologies of the system are discussed in detail. Installation and commissioning showed that the fully automatic alien smoke stacking and packaging system performs well and meets the requirements for handling shaped cigarettes.

  10. 42 CFR 426.415 - CMS' role in the LCD review.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false CMS' role in the LCD review. 426.415 Section 426... Review of an LCD § 426.415 CMS' role in the LCD review. CMS may provide to the ALJ, and all parties to the LCD review, information identifying the person who represents the contractor or CMS, if necessary...

  11. CMS-Wave

    DTIC Science & Technology

    2015-10-30

    Coastal Inlets Research Program CMS-Wave CMS-Wave is a two-dimensional spectral wind-wave generation and transformation model that employs a forward-marching, finite-difference method to solve the wave action conservation equation. Capabilities of CMS-Wave include wave shoaling, refraction... CMS-Wave can be used in either a half- or full-plane mode, with primary waves propagating from the seaward boundary toward shore. It can

  12. A net-shaped multicellular formation facilitates the maturation of hPSC-derived cardiomyocytes through mechanical and electrophysiological stimuli

    PubMed Central

    Liu, Taoyan; Huang, Chengwu; Li, Hongxia; Wu, Fujian; Luo, Jianwen; Lu, Wenjing

    2018-01-01

    The use of human-induced pluripotent stem cell-derived cardiomyocytes (hiPSC-CMs) is limited in drug discovery and cardiac disease mechanism studies due to cell immaturity. Although many approaches have been reported to improve the maturation of hiPSC-CMs, the elucidation of the process of maturation is crucial. We applied a small-molecule-based differentiation method to generate cardiomyocytes (CMs) with multiple aggregation forms. The motion analysis revealed significant physical differences in the differently shaped CMs, and the net-shaped CMs had larger motion amplitudes and faster velocities than the sheet-shaped CMs. The net-shaped CMs displayed accelerated maturation at the transcriptional level and were more similar to CMs cultured for a prolonged time (30 days) than to day-15 sheet-shaped CMs (sheet-d15). Ion channel genes and gap junction proteins were up-regulated in net-shaped CMs, indicating that robust contraction was coupled with enhanced ion channel and connexin expression. The net-shaped CMs also displayed improved myofibril ultrastructure under transmission electron microscopy. In conclusion, different multicellular hPSC-CM structures, such as the net-shaped pattern, are formed using the conditioned induction method, providing a useful tool to improve cardiac maturation. PMID:29661985

  13. MODFLOW-2005 : the U.S. Geological Survey modular ground-water model--the ground-water flow process

    USGS Publications Warehouse

    Harbaugh, Arlen W.

    2005-01-01

    This report presents MODFLOW-2005, which is a new version of the finite-difference ground-water model commonly called MODFLOW. Ground-water flow is simulated using a block-centered finite-difference approach. Layers can be simulated as confined or unconfined. Flow associated with external stresses, such as wells, areal recharge, evapotranspiration, drains, and rivers, also can be simulated. The report includes detailed explanations of physical and mathematical concepts on which the model is based, an explanation of how those concepts are incorporated in the modular structure of the computer program, instructions for using the model, and details of the computer code. The modular structure consists of a MAIN Program and a series of highly independent subroutines. The subroutines are grouped into 'packages.' Each package deals with a specific feature of the hydrologic system that is to be simulated, such as flow from rivers or flow into drains, or with a specific method of solving the set of simultaneous equations resulting from the finite-difference method. Several solution methods are incorporated, including the Preconditioned Conjugate-Gradient method. The division of the program into packages permits the user to examine specific hydrologic features of the model independently. This also facilitates development of additional capabilities because new packages can be added to the program without modifying the existing packages. The input and output systems of the computer program also are designed to permit maximum flexibility. The program is designed to allow other capabilities, such as transport and optimization, to be incorporated, but this report is limited to describing the ground-water flow capability. The program is written in Fortran 90 and will run without modification on most computers that have a Fortran 90 compiler.
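
    As a rough illustration of the block-centered finite-difference idea (not MODFLOW's Fortran 90 code), the Python sketch below relaxes a steady-state head distribution on a small uniform grid; with homogeneous properties and no sources, each interior head converges to the average of its four neighbors.

        import numpy as np

        n = 20
        h = np.zeros((n, n))                            # hydraulic heads (m)
        h[:, 0], h[:, -1] = 10.0, 5.0                   # fixed heads, left/right edges
        h[0, :] = h[-1, :] = np.linspace(10.0, 5.0, n)  # fixed heads, top/bottom edges

        # Jacobi iteration on the interior five-point stencil: at steady state with
        # uniform transmissivity and no sources, each block-centered interior head
        # equals the average of its four neighbors.
        for _ in range(2000):
            h[1:-1, 1:-1] = 0.25 * (h[:-2, 1:-1] + h[2:, 1:-1] +
                                    h[1:-1, :-2] + h[1:-1, 2:])

        print(h.round(2))   # approaches a linear head gradient between the edges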

  14. Human renal adipose tissue induces the invasion and progression of renal cell carcinoma.

    PubMed

    Campo-Verde-Arbocco, Fiorella; López-Laur, José D; Romeo, Leonardo R; Giorlando, Noelia; Bruna, Flavia A; Contador, David E; López-Fontana, Gastón; Santiano, Flavia E; Sasso, Corina V; Zyla, Leila E; López-Fontana, Constanza M; Calvo, Juan C; Carón, Rubén W; Creydt, Virginia Pistone

    2017-11-07

    We evaluated the effects of conditioned media (CMs) of human adipose tissue from renal cell carcinoma located near the tumor (hRATnT) or farther away from the tumor (hRATfT), on proliferation, adhesion and migration of tumor (786-O and ACHN) and non-tumor (HK-2) human renal epithelial cell lines. Human adipose tissues were obtained from patients with renal cell carcinoma (RCC), and CMs were obtained from hRATnT and hRATfT incubation. Proliferation, adhesion and migration were quantified in 786-O, ACHN and HK-2 cell lines incubated with hRATnT-, hRATfT- or control-CMs. We evaluated versican, adiponectin and leptin expression in CMs from hRATnT and hRATfT. We evaluated AdipoR1/2, ObR, pERK, pAkt and pPI3K expression in cell lines incubated with CMs. No differences in the proliferation of the cell lines were found after 24 h of treatment with CMs. All cell lines showed a significant decrease in cell adhesion and increase in cell migration after incubation with hRATnT-CMs vs. hRATfT- or control-CMs. hRATnT-CMs showed increased levels of versican and leptin, compared to hRATfT-CMs. AdipoR2 in 786-O and ACHN cells decreased significantly after incubation with hRATfT- and hRATnT-CMs vs. control-CMs. We observed a decrease in the expression of pAkt in HK-2, 786-O and ACHN cells incubated with hRATnT-CMs. This result could partially explain the observed changes in migration and cell adhesion. We conclude that hRATnT-released factors, such as leptin and versican, could enhance the invasive potential of renal epithelial cell lines and could modulate the progression of the disease.

  15. Down-regulation of Inwardly Rectifying K+ Currents in Astrocytes Derived from Patients with Monge's Disease.

    PubMed

    Wu, Wei; Yao, Hang; Zhao, Helen W; Wang, Juan; Haddad, Gabriel G

    2018-03-15

    Chronic mountain sickness (CMS) or Monge's disease is a disease of highlanders. These patients have a variety of neurologic symptoms such as migraine, mental fatigue, confusion, dizziness, loss of appetite, memory loss and neuronal degeneration. The cellular and molecular mechanisms underlying CMS neuropathology are not understood. In a previous study, we demonstrated that neurons derived from CMS patients' fibroblasts have decreased expression and altered gating properties of the voltage-gated sodium channel. In this study, we further characterize the electrophysiological properties of iPSC-derived astrocytes from CMS patients. We found that the current densities of the inwardly rectifying potassium (Kir) channels in CMS astrocytes (-5.7 ± 2.2 pA/pF at -140 mV) were significantly decreased as compared to non-CMS (-28.4 ± 3.4 pA/pF at -140 mV) and sea-level subjects (-28.3 ± 5.3 pA/pF at -140 mV). We further demonstrated that the reduced Kir current densities in CMS astrocytes were caused by decreased protein expression of Kir4.1 and Kir2.3 channels, while the single-channel properties (i.e., Po, conductance) of Kir channels in CMS astrocytes were not altered. In addition, we found no significant differences in outward potassium currents between CMS and non-CMS astrocytes. As compared to non-CMS and sea-level subjects, the K+ uptake ability of CMS astrocytes was significantly decreased. Taken together, our results suggest that down-regulation of Kir channels and the resulting decreased K+ uptake ability in astrocytes could be one of the major molecular mechanisms underlying the neurologic manifestations in CMS patients. Published by Elsevier Ltd.

  16. From Early Embryonic to Adult Stage: Comparative Study of Action Potentials of Native and Pluripotent Stem Cell-Derived Cardiomyocytes.

    PubMed

    Peinkofer, Gabriel; Burkert, Karsten; Urban, Katja; Krausgrill, Benjamin; Hescheler, Jürgen; Saric, Tomo; Halbach, Marcel

    2016-10-01

    Cardiomyocytes (CMs) derived from induced pluripotent stem cells (iPS-CMs) are promising candidates for cell therapy, drug screening, and developmental studies. It is known that iPS-CMs possess immature electrophysiological properties, but an exact characterization of their developmental stage and subtype differentiation is hampered by a lack of knowledge of electrophysiological properties of native CMs from different developmental stages and origins within the heart. Thus, we sought to systematically investigate action potential (AP) properties of native murine CMs and to establish a database that allows classification of stem cell-derived CMs. Hearts from 129S2PasCrl mice were harvested at days 9-10, 12-14, and 16-18 postcoitum, as well as 1 day, 3-4 days, 1-2 weeks, 3-4 weeks, and 6 weeks postpartum. AP recordings in left and right atria and at apical, medial, and basal left and right ventricles were performed with sharp glass microelectrodes. Measurements revealed significant changes in AP morphology during pre- and postnatal murine development and significant differences between atria and ventricles, enabling a classification of developmental stage and subtype differentiation of stem cell-derived CMs based on their AP properties. For iPS-CMs derived from cell line TiB7.4, a typical ventricular phenotype was demonstrated at later developmental stages, while there were electrophysiological differences from atrial as well as ventricular native CMs at earlier stages. This finding supports that iPS-CMs can develop AP properties similar to native CMs, but points to differences in the maturation process between iPS-CMs and native CMs, which may be explained by dissimilar conditions during in vitro differentiation and in vivo development.

  17. Pharmacokinetics of colistin methanesulfonate (CMS) in healthy Chinese subjects after single and multiple intravenous doses.

    PubMed

    Zhao, Miao; Wu, Xiao-Jie; Fan, Ya-Xin; Zhang, Ying-Yuan; Guo, Bei-Ning; Yu, Ji-Cheng; Cao, Guo-Ying; Chen, Yuan-Cheng; Wu, Ju-Fang; Shi, Yao-Guo; Li, Jian; Zhang, Jing

    2018-05-01

    The high prevalence of extensively drug-resistant Gram-negative pathogens has forced clinicians to use colistin as a last-line therapy. Knowledge on the pharmacokinetics of colistin methanesulfonate (CMS), an inactive prodrug, and colistin has increased substantially; however, the pharmacokinetics in the Chinese population is still unknown due to lack of a CMS product in China. This study aimed to evaluate the pharmacokinetics of a new CMS product developed in China in order to optimise dosing regimens. A total of 24 healthy subjects (12 female, 12 male) were enrolled in single- and multiple-dose pharmacokinetic (PK) studies. Concentrations of CMS and formed colistin in plasma and urine were measured, and PK analysis was conducted using a non-compartmental approach. Following a single CMS dose [2.36 mg colistin base activity (CBA) per kg, 1 h infusion], peak concentrations (Cmax) of CMS and formed colistin were 18.0 mg/L and 0.661 mg/L, respectively. The estimated half-lives (t1/2) of CMS and colistin were 1.38 h and 4.49 h, respectively. Approximately 62.5% of the CMS dose was excreted via urine within 24 h after dosing, whilst only 1.28% was present in the form of colistin. Following multiple CMS doses, colistin reached steady state within 24 h; there was no accumulation of CMS, but colistin accumulated slightly (RAUC = 1.33). This study provides the first PK data in the Chinese population and is essential for designing CMS dosing regimens for use in Chinese hospitals. The urinary PK data strongly support the use of intravenous CMS for serious urinary tract infections. Copyright © 2018 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.

  18. Differences in pharmacokinetics and pharmacodynamics of colistimethate sodium (CMS) and colistin between three different CMS dosage regimens in a critically ill patient infected by a multidrug-resistant Acinetobacter baumannii.

    PubMed

    Luque, Sònia; Grau, Santiago; Valle, Marta; Sorlí, Luisa; Horcajada, Juan Pablo; Segura, Concha; Alvarez-Lerma, Francisco

    2013-08-01

    Use of colistin has re-emerged for the treatment of infections caused by multidrug-resistant (MDR) Gram-negative bacteria, but information on its pharmacokinetics and pharmacodynamics is limited, especially in critically ill patients. Recent data from pharmacokinetic/pharmacodynamic (PK/PD) population studies have suggested that this population could benefit from administration of higher than standard doses of colistimethate sodium (CMS), but the relationship between administration of incremental doses of CMS and corresponding PK/PD parameters as well as its efficacy and toxicity have not yet been investigated in a clinical setting. The objective was to study the PK/PD differences of CMS and colistin between three different CMS dosage regimens in the same critically ill patient. A critically ill patient with nosocomial pneumonia caused by a MDR Acinetobacter baumannii received incremental doses of CMS. During administration of the different CMS dosage regimens, CMS and colistin plasma concentrations were determined and PK/PD indexes were calculated. With administration of the highest CMS dose once daily (720 mg every 24h), the peak plasma concentration of CMS and colistin increased to 40.51 mg/L and 1.81 mg/L, respectively, and the AUC0-24/MIC of colistin was 184.41. This dosage regimen was efficacious, and no nephrotoxicity or neurotoxicity was observed. In conclusion, a higher and extended-interval CMS dosage made it possible to increase the exposure of CMS and colistin in a critically ill patient infected by a MDR A. baumannii and allowed a clinical and microbiological optimal response to be achieved without evidence of toxicity. Copyright © 2013 Elsevier B.V. and the International Society of Chemotherapy. All rights reserved.

  19. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    ERIC Educational Resources Information Center

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  20. Promoting Intrinsic and Extrinsic Motivation among Chemistry Students Using Computer-Assisted Instruction

    ERIC Educational Resources Information Center

    Gambari, Isiaka A.; Gbodi, Bimpe E.; Olakanmi, Eyitao U.; Abalaka, Eneojo N.

    2016-01-01

    The role of computer-assisted instruction in promoting intrinsic and extrinsic motivation among Nigerian secondary school chemistry students was investigated in this study. The study employed two modes of computer-assisted instruction (computer simulation instruction and computer tutorial instructional packages) and two levels of gender (male and…

  1. Multi-core processing and scheduling performance in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, J. M.; Evans, D.; Foulkes, S.

    2012-01-01

    Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to utilize the multi-core architecture effectively. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model of computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs control over a larger quantum of resource, since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g. I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues, compared to the standard single-core processing workflows.
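
    The memory saving described here comes from workers sharing read-only data rather than each loading a private copy. The Python sketch below illustrates the principle with fork-based copy-on-write sharing; it is only an analogy for what the CMS C++ framework does, with assumed data and sizes.

        import multiprocessing as mp
        import numpy as np

        # Large read-only "conditions" data, loaded once by the parent process.
        # With fork-based multiprocessing (POSIX), child workers share these pages
        # copy-on-write, so N workers do not pay N times the memory cost.
        CONDITIONS = np.random.default_rng(0).random(10_000_000)   # ~80 MB

        def process_event(event_id):
            # Workers only read the shared array, so no pages are duplicated.
            return float(CONDITIONS[event_id % CONDITIONS.size])

        if __name__ == "__main__":
            with mp.get_context("fork").Pool(processes=4) as pool:
                print(pool.map(process_event, range(8)))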

  2. Structural and Functional Maturation of Cardiomyocytes Derived from Human Pluripotent Stem Cells

    PubMed Central

    Lundy, Scott D.; Zhu, Wei-Zhong

    2013-01-01

    Despite preclinical studies demonstrating the functional benefit of transplanting human pluripotent stem cell-derived cardiomyocytes (PSC-CMs) into damaged myocardium, the ability of these immature cells to adopt a more adult-like cardiomyocyte (CM) phenotype remains uncertain. To address this issue, we tested the hypothesis that prolonged in vitro culture of human embryonic stem cell (hESC)- and human induced pluripotent stem cell (hiPSC)-derived CMs would result in the maturation of their structural and contractile properties to a more adult-like phenotype. Compared to their early-stage counterparts (PSC-CMs after 20–40 days of in vitro differentiation and culture), late-stage hESC-CMs and hiPSC-CMs (80–120 days) showed dramatic differences in morphology, including increased cell size and anisotropy, greater myofibril density and alignment, sarcomeres visible by bright-field microscopy, and a 10-fold increase in the fraction of multinucleated CMs. Ultrastructural analysis confirmed improvements in the myofibrillar density, alignment, and morphology. We measured the contractile performance of late-stage hESC-CMs and hiPSC-CMs and noted a doubling in shortening magnitude with slowed contraction kinetics compared to the early-stage cells. We then examined changes in the calcium-handling properties of these matured CMs and found an increase in calcium release and reuptake rates with no change in the maximum amplitude. Finally, we performed electrophysiological assessments in hESC-CMs and found that late-stage myocytes have hyperpolarized maximum diastolic potentials, increased action potential amplitudes, and faster upstroke velocities. To correlate these functional changes with gene expression, we performed qPCR and found a robust induction of the key cardiac structural markers, including β-myosin heavy chain and connexin-43, in late-stage hESC-CMs and hiPSC-CMs. These findings suggest that PSC-CMs are capable of slowly maturing to more closely resemble the phenotype of adult CMs and may eventually possess the potential to regenerate the lost myocardium with robust de novo force-producing tissue. PMID:23461462

  3. Ogura-CMS in Chinese cabbage (Brassica rapa ssp. pekinensis) causes delayed expression of many nuclear genes.

    PubMed

    Dong, Xiangshu; Kim, Wan Kyu; Lim, Yong-Pyo; Kim, Yeon-Ki; Hur, Yoonkang

    2013-02-01

    We investigated the mechanism regulating cytoplasmic male sterility (CMS) in Brassica rapa ssp. pekinensis using floral bud transcriptome analyses of Ogura-CMS Chinese cabbage and its maintainer line on B. rapa 300K oligomeric probe (Br300K) microarrays. Ogura-CMS Chinese cabbage produced few and infertile pollen grains on indehiscent anthers. Compared to the maintainer line, CMS plants had shorter filaments, reduced plant growth, and delayed flowering and pollen development. In microarray analysis, 4646 genes showed different expression, depending on floral bud size, between Ogura-CMS and its maintainer line. We found 108 and 62 genes specifically expressed in Ogura-CMS and its maintainer line, respectively. Ogura-CMS line-specific genes included stress-related, redox-related, and B. rapa novel genes. In the maintainer line, genes related to pollen coat and germination were specifically expressed in floral buds longer than 3 mm, suggesting that insufficient expression of these genes in Ogura-CMS is directly related to dysfunctional pollen. In addition, many nuclear genes associated with auxin response, ATP synthesis, pollen development and stress response had delayed expression in Ogura-CMS plants compared to the maintainer line, which is consistent with the delay in growth and development of Ogura-CMS plants. Delayed expression may reduce pollen grain production and/or cause sterility, implying that mitochondrial retrograde signaling delays nuclear gene expression. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  4. Phospholipase A2 activity-dependent and -independent fusogenic activity of Naja nigricollis CMS-9 on zwitterionic and anionic phospholipid vesicles.

    PubMed

    Chiou, Yi-Ling; Chen, Ying-Jung; Lin, Shinne-Ren; Chang, Long-Sen

    2011-11-01

    CMS-9, a phospholipase A(2) (PLA(2)) from Naja nigricollis venom, induced the death of human breast cancer MCF-7 cells, accompanied by the formation of cell clumps without clear boundaries between cells. Annexin V-FITC staining indicated that abundant phosphatidylserine appeared on the outer membrane of MCF-7 cell clumps, suggesting that CMS-9 may promote membrane fusion via anionic phospholipids. To validate this proposition, the fusogenic activity of CMS-9 on vesicles composed of zwitterionic phospholipid alone or a combination of zwitterionic and anionic phospholipids was examined. Although CMS-9-induced fusion of zwitterionic phospholipid vesicles depended on PLA(2) activity, CMS-9-induced fusion of vesicles containing anionic phospholipids could occur without the involvement of PLA(2) activity. The membrane-damaging activity of CMS-9 was associated with its fusogenicity. Moreover, CMS-9 induced membrane leakage and membrane fusion differently in vesicles of different compositions. Membrane fluidity and binding capability with phospholipid vesicles were not related to the fusogenicity of CMS-9. However, the membrane-bound conformation and mode of CMS-9 depended on phospholipid composition. Collectively, our data suggest that the PLA(2) activity-dependent and -independent fusogenicity of CMS-9 is closely related to its membrane-bound modes and targeted membrane compositions. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. 42 CFR 411.382 - CMS's right to rescind advisory opinions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... rescind advisory opinions. Any advice CMS gives in an opinion does not prejudice its right to reconsider... faith reliance upon CMS's advice under this part, provided— (a) The requestor presented to CMS a full...

  6. 42 CFR 411.382 - CMS's right to rescind advisory opinions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... rescind advisory opinions. Any advice CMS gives in an opinion does not prejudice its right to reconsider... faith reliance upon CMS's advice under this part, provided— (a) The requestor presented to CMS a full...

  7. Computer Aided Design Parameters for Forward Basing

    DTIC Science & Technology

    1988-12-01

    21 meters. Systematic errors within limits stated for absolute accuracy are tolerated at this level. DEM data acquired photogrammetrically using manual ...This is a professional drawing package, capable of the manipulation required for this project. With the AutoLISP programming language (a variation on...Table 2). Data Conversion Package II GWN System's Digital Terrain Modeling (DTM) package was used. This AutoLISP-based third party software is

  8. smwrGraphs—An R package for graphing hydrologic data, version 1.1.2

    USGS Publications Warehouse

    Lorenz, David L.; Diekoff, Aliesha L.

    2017-01-31

    This report describes an R package called smwrGraphs, which consists of a collection of graphing functions for hydrologic data within R, a programming language and software environment for statistical computing. The functions in the package have been developed by the U.S. Geological Survey to create high-quality graphs for publication or presentation of hydrologic data that meet U.S. Geological Survey graphics guidelines.

  9. Ultra high speed image processing techniques. [electronic packaging techniques

    NASA Technical Reports Server (NTRS)

    Anthony, T.; Hoeschele, D. F.; Connery, R.; Ehland, J.; Billings, J.

    1981-01-01

    Packaging techniques for ultra high speed image processing were developed. These techniques center on a signal feedthrough technique through LSI/VLSI sapphire substrates, which allows the stacking of LSI/VLSI circuit substrates in a three-dimensional package with greatly reduced lengths of interconnecting lines between the LSI/VLSI circuits. The reduced parasitic capacitances result in higher LSI/VLSI computational speeds at significantly reduced power consumption levels.

  10. Documentation of a computer program to simulate aquifer-system compaction using the modular finite-difference ground-water flow model

    USGS Publications Warehouse

    Leake, S.A.; Prudic, David E.

    1988-01-01

    The process of permanent compaction is not routinely included in simulations of groundwater flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U. S. Geological Survey modular finite-difference groundwater flow model. The new program is called the Interbed-Storage Package. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to change in head. The constant of proportionality is the product of skeletal component of elastic specific storage and thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the groundwater flow model by adding an additional term to the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of simulated head to the previous minimum head. Another package that allows for a time-varying specified-head boundary is also documented. This package was written to reduce the data requirements for test simulations of the Interbed-Storage Package. (USGS)
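
    The apportionment rule lends itself to a worked example. The Python sketch below (an illustration of the rule as described here, not the published USGS Fortran code) treats head decline down to the previous minimum as elastic, decline below it as inelastic, and updates the minimum accordingly.

        def compaction_step(h_old, h_new, h_min, sske, sskv, b):
            """One step of interbed compaction (positive values = compaction).

            sske, sskv: skeletal elastic / inelastic specific storage (1/m)
            b: interbed thickness (m); h_min: lowest head seen so far (m).
            """
            if h_new >= h_min:                        # purely elastic (recoverable)
                elastic, inelastic = sske * b * (h_old - h_new), 0.0
            else:                                     # decline below previous minimum
                elastic = sske * b * (h_old - h_min)
                inelastic = sskv * b * (h_min - h_new)
                h_min = h_new
            return elastic, inelastic, h_min

        heads = [100.0, 95.0, 97.0, 90.0, 92.0]       # simulated heads (m)
        h_min = h_old = heads[0]
        total = 0.0
        for h_new in heads[1:]:
            e, i, h_min = compaction_step(h_old, h_new, h_min, 1e-6, 1e-4, b=10.0)
            total += e + i
            h_old = h_new
        print(f"cumulative compaction: {total:.5f} m")   # ~0.00998 m, mostly inelastic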

  11. Detection And Mapping (DAM) package. Volume 4A: Software System Manual, part 1

    NASA Technical Reports Server (NTRS)

    Schlosser, E. H.

    1980-01-01

    The package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered and formatted maps from digital LANDSAT multispectral scanner (MSS) data. The software can be readily implemented on any Univac 1100 series computer with standard peripheral equipment. This version of the software includes predefined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3. Tape formats supported include X, AM, and PM.

  12. Determinant Computation on the GPU using the Condensation Method

    NASA Astrophysics Data System (ADS)

    Anisul Haque, Sardar; Moreno Maza, Marc

    2012-02-01

    We report on a GPU implementation of the condensation method designed by Abdelmalek Salem and Kouachi Said for computing the determinant of a matrix. We consider two types of coefficients: modular integers and floating point numbers. We evaluate the performance of our code by measuring its effective bandwidth and argue that it is numerically stable in the floating point case. In addition, we compare our code with serial implementations of determinant computation from well-known mathematical packages. Our results suggest that a GPU implementation of the condensation method has large potential for improving those packages in terms of running time and numerical stability.
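
    For reference, the classical condensation step (Chio's condensation, on which such methods build) reduces an n x n determinant to an (n-1) x (n-1) one and divides by the pivot raised to the power n-2. The serial Python sketch below implements that idea; it is a reference illustration, not the authors' GPU code.

        import numpy as np

        def det_condensation(a):
            """Determinant via Chio condensation: repeatedly condense an n x n
            matrix to (n-1) x (n-1), dividing by pivot**(n-2) at each step."""
            a = np.array(a, dtype=float)
            sign, det_scale = 1.0, 1.0
            while a.shape[0] > 1:
                n = a.shape[0]
                if a[0, 0] == 0.0:                    # pivot: swap in a nonzero leader
                    nz = np.flatnonzero(a[:, 0])
                    if nz.size == 0:
                        return 0.0                    # zero first column => singular
                    a[[0, nz[0]]] = a[[nz[0], 0]]
                    sign = -sign
                pivot = a[0, 0]
                # b[i, j] = a[0, 0] * a[i+1, j+1] - a[i+1, 0] * a[0, j+1]
                b = pivot * a[1:, 1:] - np.outer(a[1:, 0], a[0, 1:])
                if n > 2:
                    det_scale *= pivot ** (n - 2)
                a = b
            return sign * a[0, 0] / det_scale

        m = [[2.0, 1.0, 3.0], [0.0, 4.0, 1.0], [5.0, 2.0, 6.0]]
        print(det_condensation(m), np.linalg.det(np.array(m)))   # both -11.0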

  13. Comparative analysis of mitochondrial genomes between the hau cytoplasmic male sterility (CMS) line and its iso-nuclear maintainer line in Brassica juncea to reveal the origin of the CMS-associated gene orf288.

    PubMed

    Heng, Shuangping; Wei, Chao; Jing, Bing; Wan, Zhengjie; Wen, Jing; Yi, Bin; Ma, Chaozhi; Tu, Jinxing; Fu, Tingdong; Shen, Jinxiong

    2014-04-30

    Cytoplasmic male sterility (CMS) is not only important for exploiting heterosis in crop plants, but also serves as a model for investigating nuclear-cytoplasmic interaction. CMS may be caused by mutations, rearrangement or recombination in the mitochondrial genome. Understanding the mitochondrial genome is often the first and key step in unraveling the molecular and genetic basis of CMS in plants. Comparative analysis of the mitochondrial genomes of the hau CMS line and its maintainer line in Brassica juncea may help reveal the origin of the CMS-associated gene orf288. Through next-generation sequencing, the B. juncea hau CMS mitochondrial genome was assembled into a single, circular-mapping molecule that is 247,903 bp in size and 45.08% in GC content. In addition to the CMS-associated gene orf288, the genome contains 35 protein-encoding genes, 3 rRNAs, 25 tRNA genes and 29 ORFs of unknown function. The mitochondrial genomes of the maintainer line and another normal type line, "J163-4", are both 219,863 bp in size with GC content of 45.23%. The maintainer line has 36 genes with protein products, 3 rRNAs, 22 tRNA genes and 31 unidentified ORFs. Comparative analysis of the mitochondrial genomes of the hau CMS line and its maintainer line allowed us to develop specific markers to separate the two lines at the seedling stage. We also confirmed that different mitotypes coexist substoichiometrically in hau CMS lines and maintainer lines in B. juncea. The number of repeats larger than 100 bp in the hau CMS line (16 repeats) is nearly twice that in the maintainer line (9 repeats). Phylogenetic analysis of the CMS-associated gene orf288 and four other homologous sequences in Brassicaceae shows that orf288 is clearly different from orf263 in Brassica tournefortii despite strong similarity. The hau CMS mitochondrial genome is highly rearranged compared with the mitochondrial genome of its iso-nuclear maintainer line. This study may be useful for studying the mechanism of natural CMS in B. juncea, performing comparative analysis of sequenced mitochondrial genomes in Brassicas, and uncovering the origin of the hau CMS mitotype and the structural and evolutionary differences between mitotypes.

  14. Comparison of computer based instruction to behavior skills training for teaching staff implementation of discrete-trial instruction with an adult with autism.

    PubMed

    Nosik, Melissa R; Williams, W Larry; Garrido, Natalia; Lee, Sarah

    2013-01-01

    In the current study, behavior skills training (BST) was compared to a computer-based training package for teaching discrete-trial instruction to staff teaching an adult with autism. The computer-based training package consisted of instructions, video modeling and feedback. BST consisted of instructions, modeling, rehearsal and feedback. Following training, participants were evaluated on their accuracy in completing the critical skills for running a discrete-trial program. Six participants completed training; three received behavior skills training and three received the computer-based training. Participants in the BST group performed better overall after training and during six-week probes than those in the computer-based training group. There were differences across both groups between research-assistant and natural-environment competency levels. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. 42 CFR 405.1012 - When CMS or its contractors may be a party to a hearing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false When CMS or its contractors may be a party to a... Hearings § 405.1012 When CMS or its contractors may be a party to a hearing. (a) CMS and/or one or more of... unrepresented beneficiary. (b) CMS and/or the contractor(s) advises the ALJ, appellant, and all other parties...

  16. Development of Cytoplasmic Male Sterile IR24 and IR64 Using CW-CMS/Rf17 System.

    PubMed

    Toriyama, Kinya; Kazama, Tomohiko

    2016-12-01

    A wild-abortive-type (WA) cytoplasmic male sterility (CMS) has been used almost exclusively for breeding three-line hybrid rice. Many indica cultivars are known to carry restorer genes for WA-CMS lines and cannot be used as maintainer lines. In particular, the elite indica cultivars IR24 and IR64 are known restorer lines for WA-CMS lines and are used as male parents for hybrid seed production. If CMS IR24 and CMS IR64 can be developed, the combinations of F1 pairs in hybrid rice breeding programs will be greatly broadened. To produce CMS lines and restorer lines of IR24 and IR64, we employed the Chinese wild rice (CW)-type CMS/Restorer of fertility 17 (Rf17) system, in which fertility is restored by a single nuclear gene, Rf17. Successive backcrossing and marker-assisted selection of Rf17 succeeded in producing completely male-sterile CMS lines and fully restored restorer lines of IR24 and IR64. CW-cytoplasm did not affect agronomic characteristics. Since IR64 is one of the most popular mega-varieties and has been used in breeding many modern varieties, the CW-CMS line of IR64 will be useful for hybrid rice breeding.

  17. PyBoolNet: a python package for the generation, analysis and visualization of boolean networks.

    PubMed

    Klarner, Hannes; Streck, Adam; Siebert, Heike

    2017-03-01

    The goal of this project is to provide a simple interface to working with Boolean networks. Emphasis is put on easy access to a large number of common tasks, including the generation and manipulation of networks, attractor and basin computation, model checking and trap space computation, execution of established graph algorithms, as well as graph drawing and layouts. PyBoolNet is a Python package for working with Boolean networks that supports simple access to model checking via NuSMV, standard graph algorithms via NetworkX and visualization via dot. In addition, state-of-the-art attractor computation exploiting Potassco ASP is implemented. The package is function-based and uses only native Python and NetworkX data types. https://github.com/hklarner/PyBoolNet. hannes.klarner@fu-berlin.de. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
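
    To make the objects concrete, the toy Python sketch below enumerates the attractors of a small Boolean network under synchronous update by brute force. It illustrates the concept only and deliberately avoids guessing at PyBoolNet's own function names.

        from itertools import product

        # Toy 3-node Boolean network, synchronous update. Each rule maps the
        # current state tuple (x0, x1, x2) to one node's next value.
        rules = [
            lambda s: s[1],                 # x0' = x1
            lambda s: s[0] and not s[2],    # x1' = x0 AND NOT x2
            lambda s: s[0] or s[1],         # x2' = x0 OR x1
        ]

        def step(state):
            return tuple(int(f(state)) for f in rules)

        def attractors():
            """Follow each of the 2^n states until a state repeats; the cycle
            reached is an attractor (a steady state if its length is 1)."""
            found = set()
            for state in product((0, 1), repeat=len(rules)):
                seen = {}
                while state not in seen:
                    seen[state] = len(seen)
                    state = step(state)
                cycle_start = seen[state]
                cycle = tuple(sorted(s for s, i in seen.items() if i >= cycle_start))
                found.add(cycle)
            return found

        print(attractors())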

  18. NORAD LOOK ANGLES AND PIO SATELLITE PACKAGE

    NASA Technical Reports Server (NTRS)

    ANONYMOUS

    1994-01-01

    This program package consists of two programs. First is the NORAD Look Angles Program, which computes satellite look angles (azimuth, elevation, and range) as well as the subsatellite points (latitude, longitude, and height). The second program in this package is the PIO Satellite Program, which computes sighting directions, visibility times, and the maximum elevation angle attained during each pass of an earth-orbiting satellite. Computations take into consideration the observing location and the effect of the earth's shadow on the satellite visibility. Input consists of a magnetic tape prepared by the NORAD Look Angles Program and punched cards containing reference Julian date, right ascension, declination, mean sidereal time at zero hours universal time of the reference date, and daily changes of these quantities. Output consists of a tabulated listing of the satellite's rise and set times, direction, and the maximum elevation angle visible from each observing location. This program has been implemented on the GE 635. The program Assembler code can easily be replaced by FORTRAN statements.
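
    The look-angle geometry itself is compact. The Python sketch below shows the standard topocentric conversion from Earth-centered positions to azimuth, elevation, and range; it is a modern illustration of the calculation, not the GE 635 program, and the sample positions are illustrative.

        import numpy as np

        def look_angles(obs_ecef, sat_ecef, lat_deg, lon_deg):
            """Azimuth/elevation/range of a satellite from an observer.

            obs_ecef, sat_ecef : Earth-centered, Earth-fixed positions (km).
            lat_deg, lon_deg   : observer latitude/longitude, used to rotate the
                                 slant vector into the local east-north-up frame.
            """
            lat, lon = np.radians(lat_deg), np.radians(lon_deg)
            d = np.asarray(sat_ecef, float) - np.asarray(obs_ecef, float)
            east  = np.array([-np.sin(lon), np.cos(lon), 0.0])
            north = np.array([-np.sin(lat) * np.cos(lon),
                              -np.sin(lat) * np.sin(lon), np.cos(lat)])
            up    = np.array([ np.cos(lat) * np.cos(lon),
                               np.cos(lat) * np.sin(lon), np.sin(lat)])
            e, n, u = d @ east, d @ north, d @ up
            rng = np.linalg.norm(d)
            az = np.degrees(np.arctan2(e, n)) % 360.0
            el = np.degrees(np.arcsin(u / rng))
            return az, el, rng

        # A geostationary satellite directly overhead gives 90 degrees elevation.
        az, el, rng = look_angles([6378, 0, 0], [42164, 0, 0], 0.0, 0.0)
        print(f"az={az:.1f} deg  el={el:.1f} deg  range={rng:.0f} km")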

  19. ChemoPy: freely available python package for computational biology and chemoinformatics.

    PubMed

    Cao, Dong-Sheng; Xu, Qing-Song; Hu, Qian-Nan; Liang, Yi-Zeng

    2013-04-15

    Molecular representation for small molecules has been routinely used in QSAR/SAR, virtual screening, database search, ranking, drug ADME/T prediction and other drug discovery processes. To facilitate extensive studies of drug molecules, we developed a freely available, open-source Python package called chemoinformatics in python (ChemoPy) for calculating the commonly used structural and physicochemical features. It computes 16 drug feature groups composed of 19 descriptors that include 1135 descriptor values. In addition, it provides seven types of molecular fingerprint systems for drug molecules, including topological fingerprints, electro-topological state (E-state) fingerprints, MACCS keys, FP4 keys, atom pairs fingerprints, topological torsion fingerprints and Morgan/circular fingerprints. By applying the semi-empirical quantum chemistry program MOPAC, ChemoPy can also compute a large number of 3D molecular descriptors conveniently. The Python package ChemoPy is freely available via http://code.google.com/p/pychem/downloads/list, and it runs on Linux and MS-Windows. Supplementary data are available at Bioinformatics online.
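
    Fingerprints of the kinds listed here are typically compared with the Tanimoto coefficient. The short Python sketch below shows that calculation on generic bit sets; it is an illustration of the idea and does not use ChemoPy's API.

        def tanimoto(fp_a, fp_b):
            """Tanimoto (Jaccard) similarity between two fingerprints,
            represented as sets of 'on' bit positions."""
            union = len(fp_a | fp_b)
            return len(fp_a & fp_b) / union if union else 0.0

        # Toy fingerprints: sets of on-bit indices for two molecules.
        fp1 = {1, 4, 7, 9, 12}
        fp2 = {1, 4, 8, 9, 13}
        print(f"Tanimoto = {tanimoto(fp1, fp2):.2f}")   # 3 shared / 7 total = 0.43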

  20. Computerizing the Accounting Curriculum.

    ERIC Educational Resources Information Center

    Nash, John F.; England, Thomas G.

    1986-01-01

    Discusses the use of computers in college accounting courses. Argues that the success of new efforts in using computers in teaching accounting is dependent upon increasing instructors' computer skills, and choosing appropriate hardware and software, including commercially available business software packages. (TW)

  1. Social Mathematics and Media: Using Pictures, Maps, Charts, and Graphs. Media Corner.

    ERIC Educational Resources Information Center

    Braun, Joseph A., Jr., Ed.

    1993-01-01

    Asserts that integrating disciplines is a goal of elementary social studies education. Presents a bibliographic essay describing instructional materials that can be used to integrate mathematics and social studies. Includes recommended photograph packages, computer databases, and data interpretation packages. (CFR)

  2. Evaluation of Five Microcomputer CAD Packages.

    ERIC Educational Resources Information Center

    Leach, James A.

    1987-01-01

    Discusses the similarities, differences, advanced features, applications and number of users of five microcomputer computer-aided design (CAD) packages. Included are: "AutoCAD (V.2.17)"; "CADKEY (V.2.0)"; "CADVANCE (V.1.0)"; "Super MicroCAD"; and "VersaCAD Advanced (V.4.00)." Describes the…

  3. graphkernels: R and Python packages for graph comparison

    PubMed Central

    Ghisu, M Elisabetta; Llinares-López, Felipe; Borgwardt, Karsten

    2018-01-01

    Summary Measuring the similarity of graphs is a fundamental step in the analysis of graph-structured data, which is omnipresent in computational biology. Graph kernels have been proposed as a powerful and efficient approach to this problem of graph comparison. Here we provide graphkernels, the first R and Python graph kernel libraries including baseline kernels such as label histogram based kernels, classic graph kernels such as random walk based kernels, and the state-of-the-art Weisfeiler-Lehman graph kernel. The core of all graph kernels is implemented in C++ for efficiency. Using the kernel matrices computed by the package, we can easily perform tasks such as classification, regression and clustering on graph-structured samples. Availability and implementation The R and Python packages including source code are available at https://CRAN.R-project.org/package=graphkernels and https://pypi.python.org/pypi/graphkernels. Contact mahito@nii.ac.jp or elisabetta.ghisu@bsse.ethz.ch Supplementary information Supplementary data are available online at Bioinformatics. PMID:29028902

  4. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.

  5. graphkernels: R and Python packages for graph comparison.

    PubMed

    Sugiyama, Mahito; Ghisu, M Elisabetta; Llinares-López, Felipe; Borgwardt, Karsten

    2018-02-01

    Measuring the similarity of graphs is a fundamental step in the analysis of graph-structured data, which is omnipresent in computational biology. Graph kernels have been proposed as a powerful and efficient approach to this problem of graph comparison. Here we provide graphkernels, the first R and Python graph kernel libraries including baseline kernels such as label histogram based kernels, classic graph kernels such as random walk based kernels, and the state-of-the-art Weisfeiler-Lehman graph kernel. The core of all graph kernels is implemented in C++ for efficiency. Using the kernel matrices computed by the package, we can easily perform tasks such as classification, regression and clustering on graph-structured samples. The R and Python packages including source code are available at https://CRAN.R-project.org/package=graphkernels and https://pypi.python.org/pypi/graphkernels. mahito@nii.ac.jp or elisabetta.ghisu@bsse.ethz.ch. Supplementary data are available online at Bioinformatics. © The Author(s) 2017. Published by Oxford University Press.
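
    The simplest member of this family, the vertex label histogram kernel, is easy to write out by hand. The Python sketch below computes it directly on toy labeled graphs; this is a generic illustration, not the graphkernels API, and the resulting matrix can be passed, for example, to scikit-learn's SVC(kernel="precomputed").

        import numpy as np
        from collections import Counter

        def label_histogram_kernel(graphs, alphabet):
            """k(G, G') = inner product of the graphs' vertex-label count vectors.
            `graphs` is a list of node-label lists (edges are ignored by this kernel)."""
            H = np.array([[Counter(g)[a] for a in alphabet] for g in graphs], float)
            return H @ H.T

        graphs = [list("AABC"), list("ABBB"), list("CCCA")]
        K = label_histogram_kernel(graphs, alphabet="ABC")
        print(K)   # 3 x 3 kernel matrix over the three toy graphs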

  6. Human embryonic and induced pluripotent stem cell-derived cardiomyocytes exhibit beat rate variability and power-law behavior.

    PubMed

    Mandel, Yael; Weissman, Amir; Schick, Revital; Barad, Lili; Novak, Atara; Meiry, Gideon; Goldberg, Stanislav; Lorber, Avraham; Rosen, Michael R; Itskovitz-Eldor, Joseph; Binah, Ofer

    2012-02-21

    The sinoatrial node is the main impulse-generating tissue in the heart. Atrioventricular conduction block and arrhythmias caused by sinoatrial node dysfunction are clinically important and generally treated with electronic pacemakers. Although an excellent solution, electronic pacemakers incorporate limitations that have stimulated research on biological pacing. To assess the suitability of potential biological pacemakers, we tested the hypothesis that the spontaneous electric activity of human embryonic stem cell-derived cardiomyocytes (hESC-CMs) and induced pluripotent stem cell-derived cardiomyocytes (iPSC-CMs) exhibit beat rate variability and power-law behavior comparable to those of human sinoatrial node. We recorded extracellular electrograms from hESC-CMs and iPSC-CMs under stable conditions for up to 15 days. The beat rate time series of the spontaneous activity were examined in terms of their power spectral density and additional methods derived from nonlinear dynamics. The major findings were that the mean beat rate of hESC-CMs and iPSC-CMs was stable throughout the 15-day follow-up period and was similar in both cell types, that hESC-CMs and iPSC-CMs exhibited intrinsic beat rate variability and fractal behavior, and that isoproterenol increased and carbamylcholine decreased the beating rate in both hESC-CMs and iPSC-CMs. This is the first study demonstrating that hESC-CMs and iPSC-CMs exhibit beat rate variability and power-law behavior as in humans, thus supporting the potential capability of these cell sources to serve as biological pacemakers. Our ability to generate sinoatrial-compatible spontaneous cardiomyocytes from the patient's own hair (via keratinocyte-derived iPSCs), thus eliminating the critical need for immunosuppression, renders these myocytes an attractive cell source as biological pacemakers.

  7. Temperature Calculations in the Coastal Modeling System

    DTIC Science & Technology

    2017-04-01

    tide) and river discharge at model boundaries, wave radiation stress, and wind forcing over a model computational domain. Physical processes calculated...calculated in the CMS using the following meteorological parameters: solar radiation, cloud cover, air temperature, wind speed, and surface water temperature...during a clear (i.e., cloudless) sky (Wm-2); CLDC is the cloud cover fraction (0-1.0); SWR is the surface reflection coefficient; and SHDf is the

  8. The ENSDF Java Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sonzogni, A.A.

    2005-05-24

    A package of computer codes has been developed to process and display nuclear structure and decay data stored in the ENSDF (Evaluated Nuclear Structure Data File) library. The codes were written in an object-oriented fashion using the Java language. This allows for easy implementation across multiple platforms as well as deployment on web pages. The structure of the different Java classes that make up the package is discussed, as well as several different implementations.

  9. Wide-field Imaging System and Rapid Direction of Optical Zoom (WOZ)

    DTIC Science & Technology

    2010-09-25

    commercial software packages: SolidWorks, COMSOL Multiphysics, and ZEMAX optical design. SolidWorks is a computer aided design package, which has a live...interface to COMSOL. COMSOL is a finite element analysis/partial differential equation solver. ZEMAX is an optical design package. Both COMSOL and... ZEMAX have live interfaces to MATLAB. Our initial investigations have enabled a model in SolidWorks to be updated in COMSOL, an FEA calculation

  10. The LARSYS Educational Package: Instructor's Notes for Use with the Data 100

    NASA Technical Reports Server (NTRS)

    Lindenlaub, J. C.; Russell, J. D.

    1977-01-01

    The LARSYS Educational Package is a set of instructional materials developed to train people to analyze remotely sensed multispectral data using LARSYS, a computer software system. The materials included in this volume have been designed to assist LARSYS instructors as they guide students through the LARSYS Educational Package. All of the materials have been updated from the previous version to reflect the use of a Data 100 Remote Terminal.

  11. Muscle Velocity and Inertial Force from Phase Contrast Magnetic Resonance Imaging

    PubMed Central

    Wentland, Andrew L.; McWalter, Emily J.; Pal, Saikat; Delp, Scott L.; Gold, Garry E.

    2014-01-01

    Purpose To evaluate velocity waveforms in muscle and to create a tool and algorithm for computing and analyzing muscle inertial forces derived from 2D phase contrast (PC) MRI. Materials and Methods PC MRI was performed in the forearm of four healthy volunteers during 1 Hz cycles of wrist flexion-extension as well as in the lower leg of six healthy volunteers during 1 Hz cycles of plantarflexion-dorsiflexion. Inertial forces (F) were derived via the equation F = ma. The mass, m, was derived by multiplying voxel volume by voxel-by-voxel estimates of density via fat-water separation techniques. Acceleration, a, was obtained via the derivative of the PC MRI velocity waveform. Results Mean velocities in the flexors of the forearm and lower leg were 1.94 ± 0.97 cm/s and 5.57 ± 2.72 cm/s, respectively, as averaged across all subjects; the inertial forces in the flexors of the forearm and lower leg were 1.9 × 10⁻³ ± 1.3 × 10⁻³ N and 1.1 × 10⁻² ± 6.1 × 10⁻³ N, respectively, as averaged across all subjects. Conclusion PC MRI provided a promising means of computing muscle velocities and inertial forces—providing the first method for quantifying inertial forces. PMID:25425185
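
    The computation reduces to one derivative and one multiplication. The Python sketch below reproduces it per voxel on a synthetic velocity waveform; the values are illustrative (peak velocity in the reported forearm range, approximate muscle density), not the study's data.

        import numpy as np

        # Synthetic muscle velocity waveform over one 1 Hz flexion-extension cycle.
        t = np.linspace(0.0, 1.0, 50)          # time (s)
        v = 0.02 * np.sin(2 * np.pi * t)       # velocity (m/s), ~2 cm/s peak

        # Mass from voxel volume times tissue density (one 1 mm^3 voxel here).
        density = 1060.0                       # kg/m^3, approximate muscle density
        voxel_volume = 1e-9                    # m^3
        m = density * voxel_volume             # kg

        a = np.gradient(v, t)                  # acceleration a = dv/dt
        F = m * a                              # Newton's second law, F = m a
        print(f"peak inertial force per voxel: {np.max(np.abs(F)):.2e} N")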

  12. Software.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 1989

    1989-01-01

    Presented are reviews of two computer software packages for Apple II computers; "Organic Spectroscopy," and "Videodisc Display Program" for use with "The Periodic Table Videodisc." A sample spectrograph from "Organic Spectroscopy" is included. (CW)

  13. NLM microcomputer-based tutorials (for microcomputers). Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkins, M.

    1990-04-01

    The package consists of TOXLEARN--a microcomputer-based training package for TOXLINE (Toxicology Information Online), CHEMLEARN-a microcomputer-based training package for CHEMLINE (Chemical Information Online), MEDTUTOR--a microcomputer-based training package for MEDLINE (Medical Information Online), and ELHILL LEARN--a microcomputer-based training package for the ELHILL search and retrieval software that supports the above-mentioned databases...Software Description: The programs were developed under PILOTplus using the NLM LEARN Programmer. They run on IBM-PC, XT, AT, PS/2, and fully compatible computers. The programs require 512K RAM memory, one disk drive, and DOS 2.0 or higher. The software supports most monochrome, color graphics, enhanced color graphics, or visual graphics displays.

  14. Human renal adipose tissue induces the invasion and progression of renal cell carcinoma

    PubMed Central

    Campo-Verde-Arbocco, Fiorella; López-Laur, José D.; Romeo, Leonardo R.; Giorlando, Noelia; Bruna, Flavia A.; Contador, David E.; López-Fontana, Gastón; Santiano, Flavia E.; Sasso, Corina V.; Zyla, Leila E.; López-Fontana, Constanza M.; Calvo, Juan C.; Carón, Rubén W.; Creydt, Virginia Pistone

    2017-01-01

    We evaluated the effects of conditioned media (CMs) of human adipose tissue from renal cell carcinoma located near the tumor (hRATnT) or farther away from the tumor (hRATfT), on proliferation, adhesion and migration of tumor (786-O and ACHN) and non-tumor (HK-2) human renal epithelial cell lines. Human adipose tissues were obtained from patients with renal cell carcinoma (RCC), and CMs were obtained from hRATnT and hRATfT incubation. Proliferation, adhesion and migration were quantified in 786-O, ACHN and HK-2 cell lines incubated with hRATnT-, hRATfT- or control-CMs. We evaluated versican, adiponectin and leptin expression in CMs from hRATnT and hRATfT. We evaluated AdipoR1/2, ObR, pERK, pAkt and pPI3K expression in cell lines incubated with CMs. No differences in the proliferation of the cell lines were found after 24 h of treatment with CMs. All cell lines showed a significant decrease in cell adhesion and increase in cell migration after incubation with hRATnT-CMs vs. hRATfT- or control-CMs. hRATnT-CMs showed increased levels of versican and leptin, compared to hRATfT-CMs. AdipoR2 in 786-O and ACHN cells decreased significantly after incubation with hRATfT- and hRATnT-CMs vs. control-CMs. We observed a decrease in the expression of pAkt in HK-2, 786-O and ACHN cells incubated with hRATnT-CMs. This result could partially explain the observed changes in migration and cell adhesion. We conclude that hRATnT-released factors, such as leptin and versican, could enhance the invasive potential of renal epithelial cell lines and could modulate the progression of the disease. PMID:29212223

  15. Mitochondrial nad2 gene is co-transcripted with CMS-associated orfB gene in cytoplasmic male-sterile stem mustard (Brassica juncea).

    PubMed

    Yang, Jing-Hua; Zhang, Ming-Fang; Yu, Jing-Quan

    2009-02-01

    The transcriptional patterns of mitochondrial respiratory-related genes were investigated in cytoplasmic male-sterile and fertile maintainer lines of stem mustard, Brassica juncea. There were numerous differences in nad2 (subunit 2 of NADH dehydrogenase) between stem mustard CMS and its maintainer line. One novel open reading frame, hereafter named orfB, was located downstream of the mitochondrial nad2 gene in the CMS line. The novel orfB gene had high similarity with the YMF19 family protein, orfB in Raphanus sativus, Helianthus annuus, Nicotiana tabacum and Beta vulgaris, orfB-CMS in Daucus carota, the atp8 gene in Arabidopsis thaliana, the 5' flanking region of orf224 in B. napus (nap CMS) and the 5' flanking region of the orf220 gene in CMS Brassica juncea. In Southern blots of HindIII-digested DNA probed with a specific fragment (amplified with primers nad2F and nad2R from the CMS line), three copies were found in the CMS line but only a single copy in its maintainer line. In Northern blots with the same probe, two transcripts were detected in the CMS line while only one transcript was detected in the maintainer line. Moreover, expression of the nad2 gene was reduced in CMS buds compared to the maintainer line. We thus suggest that the nad2 gene may be co-transcribed with the CMS-associated orfB gene in the CMS line. In addition, the specific fragment amplified with primers nad2F and nad2R spanned partial sequences of both the nad2 and orfB genes. Such alterations in the nad2 gene would impact the activity of NADH dehydrogenase and, subsequently, signaling, inducing the expression of nuclear genes involved in male sterility in this type of cytoplasmic male sterility.

  16. ATLAS and LHC computing on CRAY

    NASA Astrophysics Data System (ADS)

    Sciacca, F. G.; Haug, S.; ATLAS Collaboration

    2017-10-01

    Access to and exploitation of large-scale computing resources, such as those offered by general-purpose HPC centres, is one important measure for ATLAS and the other Large Hadron Collider experiments to meet the challenge posed by the full exploitation of future data within the constraints of flat budgets. We report on the effort of moving the Swiss WLCG T2 computing, serving ATLAS, CMS and LHCb, from a dedicated cluster to the large Cray systems at the Swiss National Supercomputing Centre CSCS. These systems not only offer very efficient hardware, cooling and highly competent operators, but also large backfill potential, owing to their size and multidisciplinary usage, and potential gains from economies of scale. Technical solutions, performance, expected return and future plans are discussed.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salama, A.; Mikhail, M.

    Comprehensive software packages have been developed at the Western Research Centre (WRC) as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation; (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, and evaluation of density and size partition characteristics and attrition curves; and (3) generation of graphics output. The Separation ChARacteristics Estimation software packages (SCARE) were developed to balance raw density or size separation data; both density and size separation data are handled. The generated balanced data can take balanced or normalized forms, and the scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The software packages described in this paper are valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry, and they are designed to run on a personal computer (PC).
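
    As a flavour of the arithmetic behind such partition-curve evaluation, here is a minimal sketch in Python. All array values are invented and this is not the COPS/SCARE code; the cut-point interpolation is the standard sink-float calculation:

        import numpy as np

        density = np.array([1.30, 1.40, 1.50, 1.60, 1.80])   # fraction midpoints [g/cm^3]
        feed    = np.array([40.0, 25.0, 15.0, 10.0, 10.0])   # mass in each density fraction [t/h]
        clean   = np.array([39.0, 22.0,  9.0,  3.0,  0.5])   # mass reporting to clean coal [t/h]

        # Partition number: percentage of each fraction reporting to clean coal.
        partition = 100.0 * clean / feed

        # The partition curve is partition vs. density; the cut point d50 is the
        # density at which 50% reports to clean coal (arrays reversed so the
        # abscissa passed to np.interp is increasing).
        d50 = np.interp(50.0, partition[::-1], density[::-1])
        print(partition, d50)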

  18. Responding to GPs' information resource needs: implementation and evaluation of a complementary medicines information resource in Queensland general practice

    PubMed Central

    2011-01-01

    Background Australian General Practitioners (GPs) are at the forefront of primary health care and in an excellent position to communicate with their patients and educate them about Complementary Medicines (CMs) use. However, previous studies have demonstrated that GPs lack the knowledge about CMs required to communicate effectively with patients about their CMs use, and they perceive a need for CMs information resources to use in their clinical practice. This study aimed to develop, implement, and evaluate a CMs information resource in Queensland (Qld) general practice. Methods The results of a needs assessment survey of Qld GPs informed the development of a CMs information resource, which was then put through an implementation and evaluation cycle in Qld general practice. The CMs information resource was a set of evidence-based herbal medicine fact sheets. This resource was utilised by 100 Qld GPs in their clinical practice for four weeks and was then evaluated. The evaluation assessed GPs' (1) utilisation of the resource, (2) perceived quality, usefulness and satisfaction with the resource, and (3) perceived impact of the resource on their knowledge, attitudes, and practice of CMs. Results Ninety-two of the 100 GPs completed the four-week evaluation of the fact sheets and returned the post-intervention survey. The herbal medicine fact sheets produced by this study were well accepted and utilised by Qld GPs. The majority of GPs perceived that the fact sheets were a useful resource for their clinical practice. The fact sheets improved GPs' attitudes towards CMs, increased their knowledge of those herbal medicines and improved their communication with their patients about those specific herbs. Eighty-six percent of GPs agreed that if they had adequate resources on CMs, like the herbal medicine fact sheets, they would communicate more with their patients about their use of CMs. Conclusion Further educational interventions on CMs need to be provided to GPs to increase their knowledge of CMs and to improve their communication with patients about their CMs use. PMID:21933434

  19. Repeated asenapine treatment does not participate in the mild stress induced FosB/ΔFosB expression in the rat hypothalamic paraventricular nucleus neurons.

    PubMed

    Kiss, Alexander; Majercikova, Zuzana

    2017-02-01

    The effect of repeated asenapine (ASE) treatment on FosB/ΔFosB expression was studied in the hypothalamic paraventricular nucleus (PVN) of male rats exposed to chronic mild stress (CMS) for 21 days. Our intention was to find out whether repeated ASE treatment for 14 days may: 1) induce FosB/ΔFosB expression in the PVN; 2) activate selected PVN neuronal phenotypes synthesizing oxytocin (OXY), vasopressin (AVP), corticoliberin (CRH) or tyrosine hydroxylase (TH); and 3) interfere with the impact of CMS. Control, ASE, CMS, and CMS+ASE treated groups were used. CMS included restraint, social isolation, crowding, swimming, and cold. From the 7th day of CMS, rats received ASE (0.3 mg/kg) or saline (300 μl/rat) subcutaneously, twice a day for 14 days. They were sacrificed on day 22 (16-18 h after the last treatment). FosB/ΔFosB was visualized with an avidin-biotin peroxidase complex, and OXY, AVP, CRH or TH with antibodies labeled by fluorescent dyes. Saline and ASE did not promote FosB/ΔFosB expression in the PVN. CMS and CMS+ASE elicited FosB/ΔFosB expression in the PVN, whereas ASE neither augmented nor attenuated the FosB/ΔFosB induction elicited by CMS. FosB/ΔFosB-CRH colocalization occurred after CMS and CMS+ASE treatments in the PVN middle sector, while FosB/ΔFosB-AVP and FosB/ΔFosB-OXY colocalizations occurred after CMS and CMS+ASE treatments in the PVN posterior sector. FosB/ΔFosB-TH colocalization was rare. Larger FosB/ΔFosB profiles, running above the PVN, did not show any colocalizations. The study provides anatomical/functional evidence that prolonged ASE treatment has a neutral effect at the level of the PVN and excludes a positive or negative interplay between ASE and the CMS effect. The data indicate that long-lasting ASE treatment might not act as a stressor at the PVN level. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. ASP: Automated symbolic computation of approximate symmetries of differential equations

    NASA Astrophysics Data System (ADS)

    Jefferson, G. F.; Carminati, J.

    2013-03-01

    A recent paper (Pakdemirli et al. (2004) [12]) compared three methods of determining approximate symmetries of differential equations. Two of these methods are well known and involve either a perturbation of the classical Lie symmetry generator of the differential system (Baikov, Gazizov and Ibragimov (1988) [7], Ibragimov (1996) [6]) or a perturbation of the dependent variable/s and subsequent determination of the classical Lie point symmetries of the resulting coupled system (Fushchych and Shtelen (1989) [11]), both up to a specified order in the perturbation parameter. The third method, proposed by Pakdemirli, Yürüsoy and Dolapçi (2004) [12], simplifies the calculations required by Fushchych and Shtelen's method through the assignment of arbitrary functions to the non-linear components prior to computing symmetries. All three methods have been implemented in the new MAPLE package ASP (Automated Symmetry Package), which is an add-on to the MAPLE symmetry package DESOLVII (Vu, Jefferson and Carminati (2012) [25]). To our knowledge, this is the first computer package to automate all three methods of determining approximate symmetries for differential systems. Extensions to the theory have also been suggested for the third method, which generalise the first method to systems of differential equations. Finally, a number of approximate symmetries and corresponding solutions are compared with results in the literature.
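
    For orientation, the first (generator-perturbation) method expands both the equation and the symmetry generator to first order in the small parameter; the schematic below is a standard paraphrase of that theory, not an excerpt from the paper:

        \[
          F = F_0 + \varepsilon F_1 , \qquad
          X = X_0 + \varepsilon X_1 ,
        \]
        \[
          \left. X\,F \right|_{F=0} = O(\varepsilon^{2})
          \quad\Longrightarrow\quad
          \left. X_0 F_0 \right|_{F_0=0} = 0 ,
        \]

    so X_0 must be an exact (classical) symmetry generator of the unperturbed equation F_0 = 0, with the correction X_1 determined by the remaining first-order condition (both generators suitably prolonged to the order of the equation).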

  1. MELCOR computer code manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  2. ppcor: An R Package for a Fast Calculation to Semi-partial Correlation Coefficients.

    PubMed

    Kim, Seongho

    2015-11-01

    Lack of a general matrix formula hampers implementation of the semi-partial correlation, also known as part correlation, at higher orders. This is because calculating a higher-order semi-partial correlation with a recursive formula requires an enormous number of recursive calculations to obtain the correlation coefficients. To resolve this difficulty, we derive a general matrix formula of the semi-partial correlation for fast computation. The semi-partial correlations are then implemented in the R package ppcor along with the partial correlation. Owing to the general matrix formulas, users can readily calculate the coefficients of both partial and semi-partial correlations without computational burden. The package ppcor further provides the level of statistical significance along with the test statistic.
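
    ppcor itself is an R package built on the closed-form matrix expression derived in the paper; the Python sketch below only illustrates the definition that formula accelerates, computing one semi-partial (part) correlation by residualizing a single variable. The function name and the residual-based approach are illustrative, not the package's implementation:

        import numpy as np

        def semipartial_corr(x, y, Z):
            # Correlation between x and the part of y not explained by Z,
            # computed directly from the definition via least squares.
            Z1 = np.column_stack([np.ones(len(Z)), Z])     # add intercept
            beta, *_ = np.linalg.lstsq(Z1, y, rcond=None)  # fit y ~ Z
            y_resid = y - Z1 @ beta                        # residualize y
            return np.corrcoef(x, y_resid)[0, 1]

        rng = np.random.default_rng(0)
        Z = rng.normal(size=(200, 3))                      # three controls
        y = Z @ [0.5, -0.2, 0.1] + rng.normal(size=200)
        x = 0.7 * y + rng.normal(size=200)
        print(semipartial_corr(x, y, Z))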

  3. Xarray: multi-dimensional data analysis in Python

    NASA Astrophysics Data System (ADS)

    Hoyer, Stephan; Hamman, Joe; Maussion, Fabien

    2017-04-01

    xarray (http://xarray.pydata.org) is an open source project and Python package that provides a toolkit and data structures for N-dimensional labeled arrays, which are the bread and butter of modern geoscientific data analysis. Key features of the package include label-based indexing and arithmetic, interoperability with the core scientific Python packages (e.g., pandas, NumPy, Matplotlib, Cartopy), out-of-core computation on datasets that don't fit into memory, a wide range of input/output options, and advanced multi-dimensional data manipulation tools such as group-by and resampling. In this contribution we will present the key features of the library and demonstrate its great potential for a wide range of applications, from (big-)data processing on supercomputers to data exploration in front of a classroom.
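
    A minimal sketch of the label-based API described above (data values and coordinates invented for illustration):

        import numpy as np
        import pandas as pd
        import xarray as xr

        time = pd.date_range("2016-01-01", "2016-12-31", freq="D")
        da = xr.DataArray(
            np.random.rand(len(time), 2, 3),
            dims=("time", "lat", "lon"),
            coords={"time": time, "lat": [10.0, 20.0],
                    "lon": [0.0, 120.0, 240.0]},
            name="temperature",
        )

        monthly = da.groupby("time.month").mean()  # group-by on a time label
        june = da.sel(time="2016-06", lat=20.0)    # label-based indexing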

  4. Enhanced Electric Power Transmission by Hybrid Compensation Technique

    NASA Astrophysics Data System (ADS)

    Palanichamy, C.; Kiu, G. Q.

    2015-04-01

    In today's competitive environment, new power system engineers are expected to contribute immediately to the task, without years of seasoning via on-the-job training, mentoring, and rotation assignments. At the same time, it is becoming obligatory to train power system engineering graduates for an increasingly quality-minded corporate environment. To achieve this, better-quality tools are needed for educating and training power system engineering students and in-service system engineers alike. As a result of the swift advances in computer hardware and software, many Windows-based software packages have been developed for education and training. In line with those packages, a simulation package called Hybrid Series-Shunt Compensators (HSSC) has been developed and is presented in this paper for educational purposes.

  5. Development of a change management system

    NASA Technical Reports Server (NTRS)

    Parks, Cathy Bonifas

    1993-01-01

    The complexity and interdependence of software on a computer system can create a situation where a solution to one problem causes failures in dependent software. In the computer industry, software problems arise and are often solved with 'quick and dirty' solutions. But in implementing these solutions, documentation about the solution or user notification of changes is often overlooked, and new problems are frequently introduced because of insufficient review or testing. These problems increase when numerous heterogeneous systems are involved. Because of this situation, a change management system plays an integral part in the maintenance of any multisystem computing environment. At the NASA Ames Advanced Computational Facility (ACF), the Online Change Management System (OCMS) was designed and developed to manage the changes being applied to its multivendor computing environment. This paper documents the research, design, and modifications that went into the development of this change management system (CMS).

  6. The elution of colistimethate sodium from polymethylmethacrylate and calcium phosphate cement beads.

    PubMed

    Waterman, Paige; Barber, Melissa; Weintrob, Amy C; VanBrakle, Regina; Howard, Robin; Kozar, Michael P; Andersen, Romney; Wortmann, Glenn

    2012-06-01

    Resistance of Gram-negative bacilli to all antibiotics except colistimethate sodium (CMS) is an emerging healthcare concern. Incorporating CMS into orthopedic cement to treat bone and soft-tissue infections due to these bacteria is attractive, but the data regarding the elution of CMS from cement are conflicting. An in vitro analysis of the elution of CMS from polymethylmethacrylate (PMMA) and calcium phosphate (CP) cement beads is reported. PMMA and CP beads containing CMS were incubated in phosphate-buffered saline, and the eluate was sampled at sequential time points. Inhibition of the growth of a strain of Acinetobacter baumannii complex by the eluate was measured by disk diffusion and microbroth dilution assays, and the presence of CMS in the eluate was measured by mass spectroscopy. Bacterial growth was inhibited by the eluate from both PMMA and CP beads. Mass spectroscopy demonstrated greater elution of CMS from CP beads than from PMMA beads. The dose of CMS in PMMA beads was limited by failure of bead integrity. CMS elutes from both CP and PMMA beads in amounts sufficient to inhibit bacterial growth in vitro. The clinical implications of these findings require further study.

  7. Software Reviews. Programs Worth a Second Look.

    ERIC Educational Resources Information Center

    Schneider, Roxanne; Eiser, Leslie

    1989-01-01

    Reviewed are three computer software packages for use in middle/high school classrooms. Included are "MacWrite II," a word-processing program for Macintosh computers; "Super Story Tree," a word-processing program for Apple and IBM computers; and "Math Blaster Mystery," for IBM, Apple, and Tandy computers. (CW)

  8. Computer-Aided Engineering Education at the K.U. Leuven.

    ERIC Educational Resources Information Center

    Snoeys, R.; Gobin, R.

    1987-01-01

    Describes some recent initiatives and developments in the computer-aided design program in the engineering faculty of the Katholieke Universiteit Leuven (Belgium). Provides a survey of the engineering curriculum, the computer facilities, and the main software packages available. (TW)

  9. Software Reviews.

    ERIC Educational Resources Information Center

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  10. 42 CFR 423.890 - Appeals.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... sponsor or by CMS before notice of the reconsidered determination is made. (6) Decision of the informal written reconsideration. CMS informs the sponsor of the decision orally or through electronic mail. CMS sends a written decision to the sponsor on the sponsor's request. (7) Effect of CMS informal written...

  11. 45 CFR 150.221 - Transition to State enforcement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement Processes for... State enforcement. (a) If CMS determines that a State for which it has assumed enforcement authority has... appropriate to return enforcement authority to the State, CMS will enter into discussions with State officials...

  12. 45 CFR 150.213 - Form and content of notice.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement Processes for Determining... consequence of a State's failure to substantially enforce HIPAA requirements is that CMS enforces them. (d... information that the State wishes CMS to consider in making the preliminary determination described in § 150...

  13. 45 CFR 150.321 - Determining the amount of penalty-aggravating circumstances.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement..., if there are substantial or several aggravating circumstances, CMS sets the aggregate amount of the.... CMS considers the following circumstances to be aggravating circumstances: (a) The frequency of...

  14. 45 CFR 150.343 - Notice of proposed penalty.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement With Respect to Issuers and Non-Federal Governmental Plans-Civil Money Penalties § 150.343 Notice of proposed penalty. If CMS... penalty. The notice includes the following: (a) A description of the HIPAA requirements that CMS has...

  15. Computer Aided Management for Information Processing Projects.

    ERIC Educational Resources Information Center

    Akman, Ibrahim; Kocamustafaogullari, Kemal

    1995-01-01

    Outlines the nature of information processing projects and discusses some project management programming packages. Describes an in-house interface program developed to utilize a selected project management package (TIMELINE) by using Oracle Data Base Management System tools and Pascal programming language for the management of information system…

  16. International Inventory of Software Packages in the Information Field.

    ERIC Educational Resources Information Center

    Keren, Carl, Ed.; Sered, Irina, Ed.

    Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…

  17. A Curriculum Review: The Voyage of the Mimi.

    ERIC Educational Resources Information Center

    Johns, Kenneth W.

    1988-01-01

    The curriculum package, "The Voyage of the Mimi," uses computer, videocassette, student text, and workbook for integrated study of the great whales and the impact of social actions on society and the environment. This review suggests that the package also offers many ancillary teaching opportunities. (CB)

  18. CMS-G from Beta vulgaris ssp. maritima is maintained in natural populations despite containing an atypical cytochrome c oxidase.

    PubMed

    Meyer, Etienne H; Lehmann, Caroline; Boivin, Stéphane; Brings, Lea; De Cauwer, Isabelle; Bock, Ralph; Kühn, Kristina; Touzet, Pascal

    2018-02-23

    While mitochondrial mutants of the respiratory machinery are rare and often lethal, cytoplasmic male sterility (CMS), a mitochondrially inherited trait that results in pollen abortion, is frequently encountered in wild populations. It generates a breeding system called gynodioecy. In Beta vulgaris ssp. maritima, a gynodioecious species, we found CMS-G to be widespread across the distribution range of the species. Despite the sequencing of the mitochondrial genome of CMS-G, the mitochondrial sterilizing factor causing CMS-G is still unknown. By biochemically characterizing CMS-G, we found that the expression of several mitochondrial proteins is altered in CMS-G plants. In particular, Cox1, a core subunit of the cytochrome c oxidase (complex IV), is larger but can still assemble into complex IV. However, the CMS-G-specific complex IV was only detected as a stabilized dimer. We did not observe any alteration of the affinity of complex IV for cytochrome c; however, in CMS-G, complex IV capacity is reduced. Our results show that CMS-G is maintained in many natural populations despite being associated with an atypical complex IV. We suggest that the modified complex IV could incur the associated cost predicted by theoretical models to maintain gynodioecy in wild populations. © 2018 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.

  19. Cardiometabolic syndrome and its association with education, smoking, diet, physical activity, and social support: findings from the Pennsylvania 2007 BRFSS Survey.

    PubMed

    Liu, Longjian; Núñez, Ana E

    2010-07-01

    The authors aimed to examine the prevalence of cardiometabolic syndrome (CMS) and its association with education, smoking, diet, physical activity, and social support among white, black, and Hispanic adults using data from the 2007 Pennsylvania Behavior Risk Factor Surveillance System (BRFSS) survey, the largest population-based survey in the state. The authors examined associations between CMS and associated factors cross-sectionally using univariate and multivariate methods. The study included a representative sample of 12,629 noninstitutionalized Pennsylvanians aged ≥18. Components of CMS included obesity, hypercholesterolemia, angina (as a surrogate for decreased high-density lipoprotein), prehypertension or hypertension, and prediabetes or diabetes. CMS was identified as the presence of ≥3 CMS components. The results show that the prevalence of CMS was 20.48% in blacks, followed by Hispanics (19.14%) and whites (12.26%) (P<.01). Multivariate logistic regression analyses indicated that physical inactivity, lower educational levels, smoking, daily consumption of vegetables and/or fruits <3 servings, and lack of social support were significantly associated with the odds of having CMS. In conclusion, black and Hispanic adults have a significantly higher prevalence of CMS than whites. The significant association between CMS and risk factors provides new insights in the direction of health promotion to prevent and control CMS in those who are at high risk.

  20. The mitochondrial gene orfH79 plays a critical role in impairing both male gametophyte development and root growth in CMS-Honglian rice.

    PubMed

    Peng, Xiaojue; Wang, Kun; Hu, Chaofeng; Zhu, Youlin; Wang, Ting; Yang, Jing; Tong, Jiping; Li, Shaoqing; Zhu, Yingguo

    2010-06-24

    Cytoplasmic male sterility (CMS) has often been associated with abnormal mitochondrial open reading frames. The mitochondrial gene orfH79 is a candidate gene for causing the CMS trait in CMS-Honglian (CMS-HL) rice. However, whether the orfH79 expression can actually induce CMS in rice remains unclear. Western blot analysis revealed that the ORFH79 protein is mainly present in mitochondria of CMS-HL rice and is absent in the fertile line. To investigate the function of ORFH79 protein in mitochondria, this gene was fused to a mitochondrial transit peptide sequence and used to transform wild type rice, where its expression induced the gametophytic male sterile phenotype. In addition, excessive accumulation of reactive oxygen species (ROS) in the microspore, a reduced ATP/ADP ratio, decreased mitochondrial membrane potential and a lower respiration rate in the transgenic plants were found to be similar to those in CMS-HL rice. Moreover, retarded growth of primary and lateral roots accompanied by abnormal accumulation of ROS in the root tip was observed in both transgenic rice and CMS-HL rice (YTA). These results suggest that the expression of orfH79 in mitochondria impairs mitochondrial function, which affects the development of both male gametophytes and the roots of CMS-HL rice.

  1. CDX2 prognostic value in stage II/III resected colon cancer is related to CMS classification.

    PubMed

    Pilati, C; Taieb, J; Balogoun, R; Marisa, L; de Reyniès, A; Laurent-Puig, P

    2017-05-01

    Caudal-type homeobox transcription factor 2 (CDX2) is involved in colon cancer (CC) oncogenesis and has been proposed as a prognostic biomarker in patients with stage II or III CC. We analyzed CDX2 expression in a series of 469 CC typed for the new international consensus molecular subtype (CMS) classification, and we confirmed the results in a series of 90 CC. Here, we show that lack of CDX2 expression is present only in the mesenchymal subgroup (CMS4) and in MSI-immune tumors (CMS1), and not in CMS2 and CMS3 colon cancer. Although CDX2 expression was globally an independent prognostic factor, loss of CDX2 expression was not associated with a worse prognosis in the CMS1 group, but was highly prognostic in CMS4 patients for both relapse-free and overall survival. Similarly, lack of CDX2 expression was a bad prognostic factor in MSS patients, but not in MSI patients. Our work suggests that the combination of the consensus CMS classification and lack of CDX2 expression could be a useful marker to identify CMS4/CDX2-negative patients with a very poor prognosis. © The Author 2017. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  2. Senp1 drives hypoxia-induced polycythemia via GATA1 and Bcl-xL in subjects with Monge’s disease

    PubMed Central

    Azad, Priti; Zhao, Huiwen W.; Ronen, Roy; Zhou, Dan; Poulsen, Orit; Hsiao, Yu Hsin; Bafna, Vineet

    2016-01-01

    In this study, because excessive polycythemia is a predominant trait in some high-altitude dwellers (chronic mountain sickness [CMS] or Monge’s disease) but not others living at the same altitude in the Andes, we took advantage of this human experiment of nature and used a combination of induced pluripotent stem cell technology, genomics, and molecular biology in this unique population to understand the molecular basis for hypoxia-induced excessive polycythemia. As compared with sea-level controls and non-CMS subjects who responded to hypoxia by increasing their RBCs modestly or not at all, respectively, CMS cells increased theirs remarkably (up to 60-fold). Although there was a switch from fetal to adult HgbA0 in all populations and a concomitant shift in oxygen binding, we found that CMS cells matured faster and had a higher efficiency and proliferative potential than non-CMS cells. We also established that SENP1 plays a critical role in the differential erythropoietic response of CMS and non-CMS subjects: we can convert the CMS phenotype into that of non-CMS and vice versa by altering SENP1 levels. We also demonstrated that GATA1 is an essential downstream target of SENP1 and that the differential expression and response of GATA1 and Bcl-xL are a key mechanism underlying CMS pathology. PMID:27821551

  3. Senp1 drives hypoxia-induced polycythemia via GATA1 and Bcl-xL in subjects with Monge's disease.

    PubMed

    Azad, Priti; Zhao, Huiwen W; Cabrales, Pedro J; Ronen, Roy; Zhou, Dan; Poulsen, Orit; Appenzeller, Otto; Hsiao, Yu Hsin; Bafna, Vineet; Haddad, Gabriel G

    2016-11-14

    In this study, because excessive polycythemia is a predominant trait in some high-altitude dwellers (chronic mountain sickness [CMS] or Monge's disease) but not others living at the same altitude in the Andes, we took advantage of this human experiment of nature and used a combination of induced pluripotent stem cell technology, genomics, and molecular biology in this unique population to understand the molecular basis for hypoxia-induced excessive polycythemia. As compared with sea-level controls and non-CMS subjects who responded to hypoxia by increasing their RBCs modestly or not at all, respectively, CMS cells increased theirs remarkably (up to 60-fold). Although there was a switch from fetal to adult HgbA0 in all populations and a concomitant shift in oxygen binding, we found that CMS cells matured faster and had a higher efficiency and proliferative potential than non-CMS cells. We also established that SENP1 plays a critical role in the differential erythropoietic response of CMS and non-CMS subjects: we can convert the CMS phenotype into that of non-CMS and vice versa by altering SENP1 levels. We also demonstrated that GATA1 is an essential downstream target of SENP1 and that the differential expression and response of GATA1 and Bcl-xL are a key mechanism underlying CMS pathology. © 2016 Azad et al.

  4. Transient Heat Conduction Simulation around Microprocessor Die

    NASA Astrophysics Data System (ADS)

    Nishi, Koji

    This paper explains the fundamental formula for calculating the power consumption of CMOS (Complementary Metal-Oxide-Semiconductor) devices and its voltage and temperature dependency, then introduces an equation for estimating the power consumption of a microprocessor for notebook PCs (Personal Computers). The equation is applied to heat conduction simulation with a simplified thermal model and evaluated with sub-millisecond time-step calculations. The microprocessor has two major heat conduction paths: one from the top of the silicon die via the thermal solution, and the other from the package substrate and pins via the PGA (Pin Grid Array) socket. Even though the former path is the dominant factor in heat conduction, the latter path, from the package substrate and pins, plays an important role in transient heat conduction behavior. This paper therefore focuses on the path from the package substrate and pins, and investigates a more accurate method of estimating the heat conduction paths of the microprocessor. The expression of heatsink-fan cooling performance is also a key point for assuring results of practical accuracy, but a finer expression requires more computation resources and hence longer computation time; the paper discusses an expression that minimizes the computation workload while retaining practical accuracy of the result.
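
    The abstract does not reproduce the formula itself; the sketch below is a standard first-order CMOS power model of the kind it refers to, with every coefficient value, and the exponential leakage temperature slope, assumed purely for illustration:

        import math

        def cmos_power(alpha, C_eff, V, f, I_leak0, T, T0=300.0, kT=40.0):
            # Total power = dynamic switching power + static leakage power.
            # alpha  : activity factor (fraction of gates switching per cycle)
            # C_eff  : effective switched capacitance [F]
            # V      : supply voltage [V], f: clock frequency [Hz]
            # I_leak0: leakage current at reference temperature T0 [A]
            # kT     : empirical temperature slope [K] (hypothetical value)
            p_dyn = alpha * C_eff * V**2 * f            # ~V^2, linear in f
            i_leak = I_leak0 * math.exp((T - T0) / kT)  # leakage grows with T
            return p_dyn + V * i_leak

        # Example: 20 W-class mobile CPU operating point (numbers invented).
        print(cmos_power(0.15, 20e-9, 1.1, 2.0e9, 2.0, 350.0))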

  5. [Recognition of psychiatric disorders with a religious content by members of the clergy of different denominations in the Netherlands].

    PubMed

    Noort, A; Braam, A W; van Gool, A R; Verhagen, P J; Beekman, A T F

    2012-01-01

    Clergy members (CMS) frequently provide support and counselling for people with psychological and psychiatric disorders. There is evidence in the literature that CMS consider themselves to be inadequately trained to recognise psychiatric disorders. We investigated to what extent CMS are able to recognise psychiatric symptoms. CMS were recruited in the south-west of the Netherlands among various denominations (Roman Catholic, strict (orthodox) Protestant, moderate Protestant and Evangelical; n = 143) by means of a regional sampling method. The participating CMS (n = 143) and a control group consisting of mental health care professionals (MHPS; n = 73) evaluated four vignettes of psychiatric problems with a religious content: two were about a psychiatric disorder (a psychotic state and a psychotic depression/melancholic state), and two concerned non-psychiatric states (a spiritual/religious experience and a mourning reaction with a religious dilemma). For each vignette the respondents scored the suitability of psychiatric medication, the desirability of mental health care, the severity of the disorder, and whether there was a religious or spiritual aetiology. Some CMS were able to recognise psychiatric problems almost as well as the MHPS, but among the CMS the degree of recognition varied according to the denomination. Recognition was relatively poor among Evangelical CMS, but was best among the strict Protestant CMS. Evangelical pastors and strict Protestant CMS tended to interpret the non-psychiatric states as pathological. The findings of this study emphasise the need for collaboration between MHPS and CMS and stress the importance of consultation.

  6. 78 FR 56710 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-13

    ... the following transmissions: OMB, Office of Information and Regulatory Affairs Attention: CMS Desk... Identifiers: CMS-10199 and CMS-10266] Agency Information Collection Activities: Submission for OMB Review... an opportunity for the public to comment on CMS' intention to collect information from the public...

  7. 42 CFR 422.510 - Termination of contract by CMS.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...: (1) Termination of contract by CMS. (i) CMS notifies the MA organization in writing 90 days before... organization; or (B) The MA organization experiences financial difficulties so severe that its ability to make...) of this section. (ii) CMS notifies the MA organization in writing that its contract will be...

  8. 42 CFR 422.510 - Termination of contract by CMS.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...: (1) Termination of contract by CMS. (i) CMS notifies the MA organization in writing 90 days before... organization; or (B) The MA organization experiences financial difficulties so severe that its ability to make...) of this section. (ii) CMS notifies the MA organization in writing that its contract will be...

  9. 42 CFR 422.510 - Termination of contract by CMS.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) Termination of contract by CMS. (i) CMS notifies the MA organization in writing 90 days before the intended...; or (B) The MA organization experiences financial difficulties so severe that its ability to make...) of this section. (ii) CMS notifies the MA organization in writing that its contract will be...

  10. 42 CFR 401.108 - CMS rulings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false CMS rulings. 401.108 Section 401.108 Public Health... GENERAL ADMINISTRATIVE REQUIREMENTS Confidentiality and Disclosure § 401.108 CMS rulings. (a) After... regulations, but which has been adopted by CMS as having precedent, may be published in the Federal Register...

  11. 45 CFR 150.319 - Determining the amount of the penalty-mitigating circumstances.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement... guidelines for taking into account the factors listed in § 150.317, CMS considers the following: (a) Record... noncompliance without notice from CMS and voluntarily reported that noncompliance, provided that the responsible...

  12. 42 CFR 401.625 - Effect of CMS claims collection decisions on appeals.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Effect of CMS claims collection decisions on... Compromise § 401.625 Effect of CMS claims collection decisions on appeals. Any action taken under this..., is not an initial determination for purposes of CMS appeal procedures. ...

  13. 42 CFR 403.248 - Administrative review of CMS determinations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Administrative review of CMS determinations. 403... Certification Program: General Provisions § 403.248 Administrative review of CMS determinations. (a) This section provides for administrative review if CMS determines— (1) Not to certify a policy; or (2) That a...

  14. 78 FR 67149 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-08

    ... Identifier: CMS-R-216] Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY... & Medicaid Services (CMS) is announcing an opportunity for the public to comment on CMS' intention to collect... accepting comments. 2. By regular mail. You may mail written comments to the following address: CMS, Office...

  15. How Patronage Politics Undermines Parental Participation and Accountability: Community-Managed Schools in Honduras and Guatemala

    ERIC Educational Resources Information Center

    Altschuler, Daniel

    2013-01-01

    This article shows how patronage politics affects a popular international education model: community-managed schools (CMS). Focusing on Honduras's CMS initiative, PROHECO (Programa Hondureno de Educacion Comunitaria), I demonstrate how patronage can undermine CMS accountability. Whereas supporters argue that CMS increases accountability, partisan…

  16. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... validation inspection results. 493.571 Section 493.571 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... Program § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a... licensure program, in accordance with State law. (c) CMS validation inspection results. CMS may disclose the...

  17. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... validation inspection results. 493.571 Section 493.571 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... Program § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a... licensure program, in accordance with State law. (c) CMS validation inspection results. CMS may disclose the...

  18. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... validation inspection results. 493.571 Section 493.571 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... Program § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a... licensure program, in accordance with State law. (c) CMS validation inspection results. CMS may disclose the...

  19. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... validation inspection results. 493.571 Section 493.571 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... Program § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a... licensure program, in accordance with State law. (c) CMS validation inspection results. CMS may disclose the...

  20. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... validation inspection results. 493.571 Section 493.571 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... Program § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a... licensure program, in accordance with State law. (c) CMS validation inspection results. CMS may disclose the...

  1. Solving a mathematical model integrating unequal-area facilities layout and part scheduling in a cellular manufacturing system by a genetic algorithm.

    PubMed

    Ebrahimi, Ahmad; Kia, Reza; Komijan, Alireza Rashidi

    2016-01-01

    In this article, a novel integrated mixed-integer nonlinear programming model is presented for designing a cellular manufacturing system (CMS), considering machine layout and part scheduling problems simultaneously as interrelated decisions. The integrated CMS model is formulated to incorporate several design features including part due dates, material handling time, operation sequences, processing times, an intra-cell layout of unequal-area facilities, and part scheduling. The objective function is to minimize makespan, tardiness penalties, and the material handling costs of inter-cell and intra-cell movements. Two numerical examples are solved with the Lingo software to illustrate the results obtained by the incorporated features. In order to assess the effects and importance of integrating machine layout and part scheduling in designing a CMS, sequential and concurrent approaches are investigated, and the improvement resulting from the concurrent approach is revealed. Also, due to the NP-hardness of the integrated model, an efficient genetic algorithm (GA) is designed. Computational results indicate that the best solutions found by the GA are better than those found by branch-and-bound (B&B) in much less time for both the sequential and concurrent approaches. Moreover, comparisons between the objective function values (OFVs) obtained by the sequential and concurrent approaches demonstrate that the OFV improvement averages around 17% with the GA and 14% with B&B.
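
    To make the metaheuristic concrete, here is a minimal permutation-encoded genetic algorithm skeleton in Python; the placeholder objective, the operators, and all parameter values are illustrative assumptions, not the authors' algorithm:

        import random

        def evaluate(perm):
            # Placeholder cost: replace with the real CMS objective
            # (makespan + tardiness penalties + handling costs).
            return sum(abs(p - i) for i, p in enumerate(perm))

        def order_crossover(a, b):
            # OX crossover: keep a slice of parent a, fill the rest in b's order.
            i, j = sorted(random.sample(range(len(a)), 2))
            child = [None] * len(a)
            child[i:j] = a[i:j]
            fill = [g for g in b if g not in child]
            for k in range(len(a)):
                if child[k] is None:
                    child[k] = fill.pop(0)
            return child

        def mutate(perm, rate=0.1):
            if random.random() < rate:                 # swap mutation
                i, j = random.sample(range(len(perm)), 2)
                perm[i], perm[j] = perm[j], perm[i]
            return perm

        def ga(n=10, pop_size=30, generations=200):
            pop = [random.sample(range(n), n) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=evaluate)
                elite = pop[: pop_size // 2]           # truncation selection
                pop = elite + [mutate(order_crossover(*random.sample(elite, 2)))
                               for _ in range(pop_size - len(elite))]
            return min(pop, key=evaluate)

        print(ga())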

  2. Registration of cytoplasmic male-sterile oilseed sunflower genetic stocks CMS GIG2 and CMS GIG2-RV, and fertility restoration lines RF GIG2-MAX 1631 and RF GIG2-MAX 1631-RV

    USDA-ARS?s Scientific Manuscript database

    Two cytoplasmic male-sterile (CMS) oilseed sunflower (Helianthus annuus L.) genetic stocks, CMS GIG2 (Reg. No. xxx, PI xxxx), and CMS GIG2-RV (Reg. No. xxx, PI xxxx), and corresponding fertility restoration lines RF GIG2-MAX 1631 (Reg. No. xxx, PI xxxx) and RF GIG2-MAX 1631-RV (Reg. No. xxx, PI xxx...

  3. CMS Nonpayment Policy, Quality Improvement, and Hospital-Acquired Conditions: An Integrative Review.

    PubMed

    Bae, Sung-Heui

    This integrative review synthesized evidence on the consequences of the Centers for Medicare & Medicaid Services (CMS) nonpayment policy on quality improvement initiatives and hospital-acquired conditions. Fourteen articles were included. This review presents strong evidence that the CMS policy has spurred quality improvement initiatives; however, the relationships between the CMS policy and hospital-acquired conditions are inconclusive. In future research, a comprehensive model of implementation of the CMS nonpayment policy would help us understand the effectiveness of this policy.

  4. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages

    PubMed Central

    Kim, Yoonsang; Emery, Sherry

    2013-01-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415
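
    To make the simulation setting concrete, here is a sketch of the simplest such data-generating process, a two-level random-intercept logistic model, in Python; all parameter values are invented, and the study's actual designs involve three or more correlated random effects:

        import numpy as np

        rng = np.random.default_rng(0)
        n_groups, n_per = 100, 20
        u = rng.normal(0.0, 1.0, n_groups)          # random intercepts, SD = 1
        x = rng.normal(size=(n_groups, n_per))      # subject-level covariate
        eta = -0.5 + 0.8 * x + u[:, None]           # linear predictor
        p = 1.0 / (1.0 + np.exp(-eta))              # inverse-logit link
        y = rng.binomial(1, p)                      # binary outcomes per subject

    Fitting y back with each package and estimation method, and comparing the recovered fixed effects and variance components against the known truth, is the essence of the comparison the study performs.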

  5. XDesign: an open-source software package for designing X-ray imaging phantoms and experiments.

    PubMed

    Ching, Daniel J; Gürsoy, Dogˇa

    2017-03-01

    The development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.

  6. XDesign: An open-source software package for designing X-ray imaging phantoms and experiments

    DOE PAGES

    Ching, Daniel J.; Gursoy, Dogˇa

    2017-02-21

    Here, the development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.

  7. Micro Computer Tomography for medical device and pharmaceutical packaging analysis.

    PubMed

    Hindelang, Florine; Zurbach, Raphael; Roggo, Yves

    2015-04-10

    Biomedical device and medicinal product manufacturing are long processes facing global competition. As technology evolves with time, the level of quality, safety and reliability increases simultaneously. Micro Computer Tomography (Micro CT) is a tool allowing a deep investigation of products: it can contribute to quality improvement. This article presents the numerous applications of Micro CT for medical device and pharmaceutical packaging analysis. The samples investigated confirmed the suitability of Micro CT for verifying integrity, taking measurements, and detecting defects in a non-destructive manner. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Development of a User Support Package for CPESIM II (A Computer Simulation for CPE (Computer Performance Evaluation) Use.

    DTIC Science & Technology

    1984-12-01

    The package includes a CPESIM II Student Manual (Appendix D), an Instructor Manual (Appendix E), and an Abridged Report (Appendix F). A student and instructor user's manual is provided for the operating system the package is implemented on. Student changes should be collected into a database to ease the instructor workload and to provide a "history" of the evolution of the system.

  9. Case Study: Audio-Guided Learning, with Computer Graphics.

    ERIC Educational Resources Information Center

    Koumi, Jack; Daniels, Judith

    1994-01-01

    Describes teaching packages which involve the use of audiotape recordings with personal computers in Open University (United Kingdom) mathematics courses. Topics addressed include software development; computer graphics; pedagogic principles for distance education; feedback, including course evaluations and student surveys; and future plans.…

  10. Computer Applications in Teaching and Learning.

    ERIC Educational Resources Information Center

    Halley, Fred S.; And Others

    Some examples of the usage of computers in teaching and learning are examination generation, automatic exam grading, student tracking, problem generation, computational examination generators, program packages, simulation, and programing skills for problem solving. These applications are non-trivial and do fulfill the basic assumptions necessary…

  11. Software for Computing, Archiving, and Querying Semisimple Braided Monoidal Category Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software package collects various open source and freely available codes and algorithms to compute and archive the categorical data for certain semisimple braided monoidal categories. In particular, it computes the data for group-theoretical categories for academic research.

  12. Hypercard Another Computer Tool.

    ERIC Educational Resources Information Center

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  13. An Object-Oriented Serial DSMC Simulation Package

    NASA Astrophysics Data System (ADS)

    Liu, Hongli; Cai, Chunpei

    2011-05-01

    A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. The package utilizes the concept of a simulation engine, many C++ features, and software design patterns, and has an open architecture that benefits further development and maintenance of the code. In order to reduce the engineering time for three-dimensional models, a hybrid grid scheme combined with a flexible data structure implemented in C++ is employed. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. The data structure allows the DSMC algorithm to be very efficiently parallelized with domain decomposition, and it provides much flexibility in terms of grid types. The package can utilize traditional structured, unstructured, or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that the package has satisfactory accuracy for complex rarefied gas flows.

  14. A Software Development Approach for Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Cushion, Steve

    2005-01-01

    Over the last 5 years we have developed, produced, tested, and evaluated an authoring software package to produce web-based, interactive, audio-enhanced language-learning material. That authoring package has been used to produce language-learning material in French, Spanish, German, Arabic, and Tamil. We are currently working on increasing…

  15. A CAI (Computer-Assisted Instruction) Course on Constructing PLANIT lessons: Development, Content, and Evaluation

    DTIC Science & Technology

    1980-06-01

    The report describes a courseware package on how to program lessons for an automated system. Since PLANIT (Programming Language for Interactive Teaching) is the student/author language for computer-assisted instruction (CAI), the objectives were to teach how to program PLANIT lessons and to evaluate the effectiveness of the package for select Army users.

  16. Effectiveness of Simulation in a Hybrid and Online Networking Course.

    ERIC Educational Resources Information Center

    Cameron, Brian H.

    2003-01-01

    Reports on a study that compares the performance of students enrolled in two sections of a Web-based computer networking course: one utilizing a simulation package and the second utilizing a static, graphical software package. Analysis shows statistically significant improvements in performance in the simulation group compared to the…

  17. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  18. Relations between the Test of Variables of Attention (TOVA) and the Children's Memory Scale (CMS).

    PubMed

    Riccio, Cynthia A; Garland, Beth H; Cohen, Morris J

    2007-09-01

    There is considerable overlap in the constructs of attention and memory. The objective of this study was to examine the relationship between the Test of Variables of Attention (TOVA), a measure of attention, and components of memory and learning as measured by the Children's Memory Scale (CMS). Participants (N = 105) were consecutive referrals to an out-patient facility, generally for learning or behavior problems, who were administered both the TOVA and the CMS. Significant correlations were found between the omissions score on the TOVA and subscales of the CMS. TOVA variability and TOVA reaction time also correlated significantly with subscales of the CMS. TOVA commission errors did not correlate significantly with any CMS Index. Although significant, the correlation coefficients indicate that the CMS and TOVA are measuring either different constructs, or similar constructs in different ways. As such, both measures may be useful in distinguishing memory problems from attention problems.

  19. Choice of Tuning Parameters on 3D IC Engine Simulations Using G-Equation

    DOE PAGES

    Liu, Jinlong; Szybist, James; Dumitrescu, Cosmin

    2018-04-03

    3D CFD spark-ignition IC engine simulations are extremely complex for the regular user. Truly predictive CFD simulations of turbulent flame combustion that solve fully coupled transport/chemistry equations may require computational capabilities unavailable to regular CFD users. A solution is to use a simpler phenomenological model such as the G-equation, which decouples the transport and chemistry solutions. Such a simulation can still provide acceptable and faster results at the expense of predictive capability. While the G-equation is well understood within the experienced modeling community, the goal of this paper is to document some of its tuning requirements for the novice or less experienced CFD user, who may not be aware that phenomenological models of turbulent flame combustion usually require heavy tuning and calibration to mimic experimental observations. This study used ANSYS® Forte, Version 17.2, and the built-in G-equation model to investigate two tuning constants that influence flame propagation in 3D CFD SI engine simulations: the stretch factor coefficient, Cms, and the flame development coefficient, Cm2. After identifying several Cm2-Cms pairs that matched experimental data at one operating condition, simulation results showed that engine models using different Cm2-Cms sets predicted similar combustion performance when the spark timing, engine load, and engine speed were changed from the operating condition used to validate the CFD simulation. A dramatic shift was observed when engine speed was doubled, which suggested that the flame stretch coefficient, Cms, had a much larger influence at higher engine speeds than the flame development coefficient, Cm2. Therefore, the Cm2-Cms sets that predicted a higher turbulent flame speed under higher in-cylinder pressure and temperature increased the peak pressure and efficiency. This suggests that the choice of the Cm2-Cms set will affect the accuracy of a G-equation-based simulation when engine speed increases beyond the one used to validate the model. As a result, for the less experienced CFD user, and in the absence of enough experimental data to retune the parameters at various operating conditions, the purpose of a good G-equation-based 3D engine simulation is to guide and/or complement experimental investigations, not the other way around. Only a truly predictive simulation that fully couples the turbulence/chemistry equations can help reduce the amount of experimental work.
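
    The calibration step the abstract describes can be expressed as a generic parameter sweep; in the sketch below, run_case is a hypothetical stand-in for launching a CFD case and extracting peak cylinder pressure, and every number is invented (this is not an ANSYS Forte API):

        import itertools

        def run_case(cm2, cms):
            # Toy surrogate standing in for a full CFD run: a faster-developing
            # flame (higher Cm2) raises peak pressure, stronger stretch damping
            # (higher Cms) lowers it. The formula and numbers are invented.
            return 50.0 + 6.0 * cm2 - 4.0 * cms     # "peak pressure" [bar]

        p_exp = 62.0                                # hypothetical measurement [bar]
        matches = [(cm2, cms)
                   for cm2, cms in itertools.product([1.0, 1.5, 2.0, 2.5],
                                                     [0.5, 1.0, 1.5, 2.0])
                   if abs(run_case(cm2, cms) - p_exp) / p_exp < 0.02]
        print(matches)  # more than one (Cm2, Cms) pair can match a single point

    The non-uniqueness of the matching pairs at one operating point is exactly why, as the abstract notes, the constants must be revalidated when speed or load moves away from the calibration condition.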

  20. Choice of Tuning Parameters on 3D IC Engine Simulations Using G-Equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Jinlong; Szybist, James; Dumitrescu, Cosmin

    3D CFD spark-ignition IC engine simulations are extremely complex for the regular user. Truly predictive CFD simulations of turbulent flame combustion that solve fully coupled transport/chemistry equations may require computational capabilities unavailable to regular CFD users. A solution is to use a simpler phenomenological model such as the G-equation, which decouples the transport and chemistry solutions. Such a simulation can still provide acceptable and faster results at the expense of predictive capability. While the G-equation is well understood within the experienced modeling community, the goal of this paper is to document some of its tuning requirements for the novice or less experienced CFD user, who may not be aware that phenomenological models of turbulent flame combustion usually require heavy tuning and calibration to mimic experimental observations. This study used ANSYS® Forte, Version 17.2, and the built-in G-equation model to investigate two tuning constants that influence flame propagation in 3D CFD SI engine simulations: the stretch factor coefficient, Cms, and the flame development coefficient, Cm2. After identifying several Cm2-Cms pairs that matched experimental data at one operating condition, simulation results showed that engine models using different Cm2-Cms sets predicted similar combustion performance when the spark timing, engine load, and engine speed were changed from the operating condition used to validate the CFD simulation. A dramatic shift was observed when engine speed was doubled, which suggested that the flame stretch coefficient, Cms, had a much larger influence at higher engine speeds than the flame development coefficient, Cm2. Therefore, the Cm2-Cms sets that predicted a higher turbulent flame speed under higher in-cylinder pressure and temperature increased the peak pressure and efficiency. This suggests that the choice of the Cm2-Cms set will affect the accuracy of a G-equation-based simulation when engine speed increases beyond the one used to validate the model. As a result, for the less experienced CFD user, and in the absence of enough experimental data to retune the parameters at various operating conditions, the purpose of a good G-equation-based 3D engine simulation is to guide and/or complement experimental investigations, not the other way around. Only a truly predictive simulation that fully couples the turbulence/chemistry equations can help reduce the amount of experimental work.
