Sample records for modern coding practices

  1. Code of Ethics for Electrical Engineers

    NASA Astrophysics Data System (ADS)

    Matsuki, Junya

    The Institute of Electrical Engineers of Japan (IEEJ) recently established rules of practice for its members, based on its code of ethics enacted in 1998. In this paper, first, the characteristics of the IEEJ 1998 ethical code are explained in detail in comparison with the ethical codes of other fields of engineering. Secondly, the contents that should be included in a modern code of ethics for electrical engineers are discussed. Thirdly, the newly established rules of practice and the modified code of ethics are presented. Finally, results are shown from questionnaires on the new ethical code and rules, answered on May 23, 2007, by 51 electrical and electronic engineering students of the University of Fukui.

  2. Billing, coding, and documentation in the critical care environment.

    PubMed

    Fakhry, S M

    2000-06-01

    Optimal conduct of modern-day physician practices involves a thorough understanding and application of the principles of documentation, coding, and billing. Physicians' role in these activities can no longer be secondary. Surgeons practicing critical care must be well versed in these concepts and their effective application to ensure that they are competitive in an increasingly difficult and demanding environment. Health care policies and regulations continue to evolve, mandating constant education of practicing physicians and their staffs and surgical residents who also will have to function in this environment. Close, collaborative relationships between physicians and individuals well versed in the concepts of documentation, coding, and billing are indispensable. Similarly, ongoing educational and review processes (whether internal or consultative from outside sources) not only can decrease the possibility of unfavorable outcomes from audit but also will likely enhance practice efficiency and cash flow. A financially viable practice is certainly a prerequisite for a surgical critical care practice to achieve its primary goal of excellence in patient care.

  3. LRFD software for design and actual ultimate capacity of confined rectangular columns.

    DOT National Transportation Integrated Search

    2013-04-01

    The analysis of concrete columns using unconfined concrete models is a well-established practice. On the other hand, prediction of the actual ultimate capacity of confined concrete columns requires specialized nonlinear analysis. Modern codes and...

  4. A Roadmap to Continuous Integration for ATLAS Software Development

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS software infrastructure supports the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing requests for new package versions, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, verifying patches to existing software, and migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open-source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and gives developers improved feedback and the means to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.
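
    The fail-fast staging that CI systems such as Jenkins and GitLab provide can be shown in miniature. The sketch below is not ATLAS's actual configuration; the stage names and commands are hypothetical, and a real Nightly System dispatches builds to dedicated hosts rather than running subprocesses locally. It only illustrates the gate discipline: each stage must pass before the next runs.

    ```python
    import subprocess
    import sys

    def run_step(name, cmd):
        """Run one pipeline stage as a subprocess and record pass/fail."""
        result = subprocess.run(cmd, capture_output=True)
        return {"step": name, "ok": result.returncode == 0}

    def nightly_pipeline(steps):
        """Build early and often: execute stages in order, stop at the first failure."""
        report = []
        for name, cmd in steps:
            outcome = run_step(name, cmd)
            report.append(outcome)
            if not outcome["ok"]:
                break  # a broken stage blocks later stages, as in a CI gate
        return report

    # Hypothetical stages standing in for compile / unit test / integration phases
    STEPS = [
        ("unit tests", [sys.executable, "-c", "assert 1 + 1 == 2"]),
        ("integration tests", [sys.executable, "-c", "import json, math"]),
    ]
    ```

    Running `nightly_pipeline(STEPS)` yields one pass/fail entry per stage; a failing first stage short-circuits the rest, which is the behavior that lets developers fix broken builds promptly.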

  5. The Modern Research Data Portal: A Design Pattern for Networked, Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chard, Kyle; Dart, Eli; Foster, Ian

    Here we describe best practices for providing convenient, high-speed, secure access to large data via research data portals. We capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs. We introduce the design pattern; explain how it leverages high-performance Science DMZs and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals.
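
    The pattern's central move, separating control logic from data movement, can be sketched with the standard library alone. This is not the Globus API the paper uses; the signing scheme, field names, and shared key below are assumptions for illustration. The portal authorizes a transfer by signing a request, and a separate data-service process (conceptually, on the Science DMZ) verifies the authorization before touching any data.

    ```python
    import hashlib
    import hmac
    import json

    SHARED_KEY = b"portal-dmz-demo-key"  # hypothetical secret shared by both services

    def issue_transfer_request(key, source_path, dest_endpoint):
        """Portal control logic: authorize a transfer without moving the data itself."""
        payload = json.dumps({"source": source_path, "dest": dest_endpoint},
                             sort_keys=True).encode()
        sig = hmac.new(key, payload, hashlib.sha256).hexdigest()
        return {"payload": payload.decode(), "signature": sig}

    def execute_transfer(key, request):
        """Data service: verify the portal's authorization, then act on the job."""
        expected = hmac.new(key, request["payload"].encode(),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, request["signature"]):
            raise PermissionError("invalid transfer authorization")
        return json.loads(request["payload"])  # here we just return the verified job
    ```

    The point of the split is that the web-facing portal never sits in the data path; it only mints authorizations, which is what allows the data service to run on high-throughput hardware.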

  6. The Modern Research Data Portal: a design pattern for networked, data-intensive science

    DOE PAGES

    Chard, Kyle; Dart, Eli; Foster, Ian; ...

    2018-01-15

    We describe best practices for providing convenient, high-speed, secure access to large data via research data portals. Here, we capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs. We introduce the design pattern; explain how it leverages high-performance data enclaves and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals.

  7. The Modern Research Data Portal: a design pattern for networked, data-intensive science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chard, Kyle; Dart, Eli; Foster, Ian

    We describe best practices for providing convenient, high-speed, secure access to large data via research data portals. Here, we capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs. We introduce the design pattern; explain how it leverages high-performance data enclaves and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals.

  8. IPEM guidelines on dosimeter systems for use as transfer instruments between the UK primary dosimetry standards laboratory (NPL) and radiotherapy centres1

    NASA Astrophysics Data System (ADS)

    Morgan, A. M.; Aird, E. G. A.; Aukett, R. J.; Duane, S.; Jenkins, N. H.; Mayles, W. P. M.; Moretti, C.; Thwaites, D. I.

    2000-09-01

    United Kingdom dosimetry codes of practice have traditionally specified one electrometer for use as a secondary standard, namely the Nuclear Enterprises (NE) 2560 NPL secondary standard therapy level exposure meter. The NE2560 will become obsolete in the foreseeable future. This report provides guidelines to assist physicists following the United Kingdom dosimetry codes of practice in selecting an electrometer to replace the NE2560 when necessary. Using an internationally accepted standard (BS EN 60731:1997) as a basis, estimated error analyses demonstrate that the uncertainty (one standard deviation) in a charge measurement associated with the NE2560 alone is approximately 0.3% under specified conditions. Following a review of manufacturers' literature, it is considered that modern electrometers should be capable of equalling this performance. Additional constructional and operational requirements not specified in the international standard but considered essential in a modern electrometer to be used as a secondary standard are presented.
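
    Error budgets of this kind conventionally combine independent relative uncertainty components in quadrature (root-sum-square). The component values below are hypothetical, chosen only so that the combined figure lands near the 0.3% quoted in the report; the report's actual budget is not reproduced here.

    ```python
    import math

    def combined_standard_uncertainty(components):
        """Root-sum-square of independent relative uncertainties (each 1 s.d., in %)."""
        return math.sqrt(sum(u * u for u in components))

    # Hypothetical component budget for a charge measurement (percent, 1 s.d.),
    # e.g. calibration, leakage, and resolution terms
    BUDGET = [0.2, 0.2, 0.1]
    ```

    With these illustrative components, the combined standard uncertainty is 0.3%, the order of magnitude the guidelines expect a replacement electrometer to match.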

  9. Code Mixing and Modernization across Cultures.

    ERIC Educational Resources Information Center

    Kamwangamalu, Nkonko M.

    A review of recent studies addressed the functional uses of code mixing across cultures. Expressions of code mixing (CM) are not random; in fact, a number of functions of code mixing can easily be delineated, for example, the concept of "modernization." "Modernization" is viewed with respect to how bilingual code mixers perceive…

  10. ForTrilinos Design Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Mitchell T.; Johnson, Seth R.; Prokopenko, Andrey V.

    With the development of a Fortran interface to Trilinos, ForTrilinos, modelers using modern Fortran will be able to give their codes the capability to use solvers and other capabilities on exascale machines via a straightforward infrastructure that accesses Trilinos. This document outlines what ForTrilinos does and explains briefly how it works. We show that it provides general access to packages via an entry point and uses an XML file from Fortran code. With the first release, ForTrilinos will enable Teuchos to take XML parameter lists from Fortran code and set up data structures, and it will provide access to linear solvers and eigensolvers. Several examples are provided to illustrate the capabilities in practice. We explain what the user should already have in their code, and what Trilinos provides and returns to the Fortran code. We also provide information about the build process for ForTrilinos, with a practical example. In future releases, nonlinear solvers, time iteration, advanced preconditioning techniques, and inversion of control (IoC), to enable callbacks to Fortran routines, will be available.
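
    The XML parameter-list handoff described above can be sketched for illustration. The element names below follow the Teuchos ParameterList XML convention (`ParameterList` containing typed `Parameter` entries), but the sample document and solver settings are invented, and this Python reader is of course not part of ForTrilinos itself; it only shows the shape of the data being passed from user code into the solver stack.

    ```python
    import xml.etree.ElementTree as ET

    # Invented sample in the Teuchos-style ParameterList XML format
    SAMPLE = """\
    <ParameterList name="Linear Solver">
      <Parameter name="Solver Type" type="string" value="GMRES"/>
      <Parameter name="Max Iterations" type="int" value="200"/>
      <Parameter name="Tolerance" type="double" value="1e-8"/>
    </ParameterList>"""

    _CAST = {"int": int, "double": float, "string": str,
             "bool": lambda v: v == "true"}

    def read_parameter_list(xml_text):
        """Flatten a Teuchos-style ParameterList into a plain dict of typed values."""
        root = ET.fromstring(xml_text)
        return {p.get("name"): _CAST[p.get("type")](p.get("value"))
                for p in root.iter("Parameter")}
    ```

    A Fortran code using ForTrilinos would hand a file like `SAMPLE` to Teuchos, which performs the equivalent parsing and populates its own data structures before constructing the requested solver.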

  11. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    2015-02-01

    The Department of Energy (DOE) has made significant progress in developing simulation tools that predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and the multiple length and time scales involved. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification, and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users; an unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the needs of an actual user).
    To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center (NEKVaC, or the 'Center') to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Center will be a resource for the validation efforts of industry, DOE programs, and academia.

  12. Right Brain: The E-lephant in the room: One resident's challenge in transitioning to modern electronic medicine.

    PubMed

    Strowd, Roy E

    2014-09-23

    The electronic medical record (EMR) is changing the landscape of medical practice in the modern age. Increasing emphasis on quality-metric reporting, data-driven documentation, and timely coding and billing is pressuring institutions across the country to adopt the latest EMR technology. The impact of these systems on the patient-physician relationship is profound. One year after the latest EMR transition, one resident reviews his experience and offers a personal perspective on the impact of the EMR on patient-physician communication. © 2014 American Academy of Neurology.

  13. 12 CFR 201.110 - Goods held by persons employed by owner.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Board has taken into consideration the changes that have occurred in commercial law and practice since 1933. Modern commercial law, embodied in the Uniform Commercial Code, refers to “perfecting security interests” rather than “securing title” to goods. The Board believes that if, under State law, the issuance...

  14. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    NASA Astrophysics Data System (ADS)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.

    2017-02-01

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is the culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach and up to 50% improvement in total simulated time with the latter for the demonstration cases and target HPC systems employed.
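
    A "bottom-up" pass typically restructures hot loops without changing their results, so that the modernized kernel can be verified bit-for-bit against the legacy one before any speedup is claimed. The toy stencil below is not an MFIX kernel; it is a minimal sketch of that discipline, using loop-invariant hoisting as the example transformation.

    ```python
    def kernel_legacy(phi, coeff):
        """Legacy-style kernel: recomputes an invariant factor on every pass."""
        out = []
        for i in range(1, len(phi) - 1):
            scale = sum(coeff) / len(coeff)      # loop-invariant, recomputed each time
            out.append(scale * (phi[i - 1] - 2.0 * phi[i] + phi[i + 1]))
        return out

    def kernel_modern(phi, coeff):
        """Restructured kernel: hoist the invariant and use a comprehension."""
        scale = sum(coeff) / len(coeff)          # computed once, outside the loop
        return [scale * (phi[i - 1] - 2.0 * phi[i] + phi[i + 1])
                for i in range(1, len(phi) - 1)]
    ```

    Because the arithmetic is performed in the same order, the two kernels agree exactly, which is the property an incremental modernization effort checks after each transformation.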

  15. Measurement of flows around modern commercial ship models

    NASA Astrophysics Data System (ADS)

    Kim, W. J.; Van, S. H.; Kim, D. H.

    To document the details of flow characteristics around modern commercial ships, global force, wave pattern, and local mean velocity components were measured in a towing tank. Three modern commercial hull models were selected for the test: a container ship (the KRISO container ship, KCS) and two very large crude-oil carriers (VLCCs) with the same forebody and slightly different afterbodies (KVLCC and KVLCC2), all having bow and stern bulbs. Uncertainty analysis was performed for the measured data using the procedure recommended by the ITTC. The experimental data obtained provide a good opportunity to explore integrated flow phenomena around practical hull forms of today, and can also be used as validation data for computational fluid dynamics (CFD) codes for both inviscid and viscous flow calculations.

  16. How a modified approach to dental coding can benefit personal and professional development with improved clinical outcomes.

    PubMed

    Lam, Raymond; Kruger, Estie; Tennant, Marc

    2014-12-01

    One disadvantage of the remarkable achievements in dentistry is that treatment options have never been more varied or confusing. This has made the concept of evidence-based dentistry more applicable to modern dental practice. Despite the merit of a concept whereby clinical decisions are guided by scientific evidence, there are problems with establishing a scientific base, nowhere more so than in modern dentistry, where the gap between rapidly developing products and procedures and their evidence base is widening. Furthermore, the burden of oral disease remains high at the population level. These problems have prompted new approaches to enhancing research. The aim of this paper is to outline how a modified approach to dental coding may benefit clinical and population-level research. Using publicly accessible data obtained from the Australian Chronic Disease Dental Scheme (CDDS) and item codes contained within the Australian Schedule of Dental Services and Glossary, a suggested approach to dental informatics is illustrated. A selection of item codes has been expanded with the addition of suffixes. These suffixes provide circumstantial information that assists in assessing clinical outcomes such as success rates and prognosis. The use of item codes in administering the CDDS yielded a large database of item codes. These codes are amenable to dental informatics, which has been shown to enhance research at both the clinical and population level. This is a cost-effective method to supplement existing research methods. Copyright © 2014 Elsevier Inc. All rights reserved.
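
    The suffix idea can be sketched in a few lines. The item numbers and the one-letter outcome suffixes below are invented for illustration (assumed scheme: S = success, F = failure, R = redo); the paper's actual suffix definitions against the Australian Schedule are not reproduced here. The point is only that a suffixed code stays machine-readable, so outcome rates fall out of the billing database directly.

    ```python
    from collections import Counter

    # Hypothetical suffixed item codes: base item number plus an outcome letter
    RECORDS = ["531S", "531S", "531F", "722S", "722R"]

    def parse(code):
        """Split a suffixed item code into (base item number, outcome suffix)."""
        return code[:-1], code[-1]

    def success_rate(records, item):
        """Fraction of records for one item carrying the 'S' (success) suffix."""
        outcomes = Counter(suffix for base, suffix in map(parse, records)
                           if base == item)
        total = sum(outcomes.values())
        return outcomes["S"] / total if total else None
    ```

    With this toy data, item 531 has a 2/3 success rate, the kind of clinical-outcome figure the paper argues a suffixed coding scheme would make available at population scale.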

  17. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    DOE PAGES

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...

    2017-03-20

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is the culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, ‘bottom-up’ and ‘top-down’, are illustrated. Here, preliminary results show up to 8.5x improvement at the selected kernel level with the first approach and up to 50% improvement in total simulated time with the latter for the demonstration cases and target HPC systems employed.

  18. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is the culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, ‘bottom-up’ and ‘top-down’, are illustrated. Here, preliminary results show up to 8.5x improvement at the selected kernel level with the first approach and up to 50% improvement in total simulated time with the latter for the demonstration cases and target HPC systems employed.

  19. Code-Switching Functions in Modern Hebrew Teaching and Learning

    ERIC Educational Resources Information Center

    Gilead, Yona

    2016-01-01

    The teaching and learning of Modern Hebrew outside of Israel is essential to Jewish education and identity. One of the most contested issues in Modern Hebrew pedagogy is the use of code-switching between Modern Hebrew and learners' first language. Moreover, this is one of the longest running disputes in the broader field of second language…

  20. Propagation of Computational Uncertainty Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2007-01-01

    This paper describes the use of formally designed experiments to aid in the error analysis of a computational experiment. A method is described by which the underlying code is approximated with relatively low-order polynomial graduating functions represented by truncated Taylor series approximations to the true underlying response function. A resource-minimal approach is outlined by which such graduating functions can be estimated from a minimum number of case runs of the underlying computational code. Certain practical considerations are discussed, including ways and means of coping with high-order response functions. The distributional properties of prediction residuals are presented and discussed. A practical method is presented for quantifying that component of the prediction uncertainty of a computational code that can be attributed to imperfect knowledge of independent variable levels. This method is illustrated with a recent assessment of uncertainty in computational estimates of Space Shuttle thermal and structural reentry loads attributable to ice and foam debris impact on ascent.
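
    The approach above, approximating the code with a low-order graduating function estimated from a minimum number of case runs and then propagating input uncertainty through the cheap surrogate, can be sketched as follows. The "expensive code" and the input distribution are hypothetical stand-ins; a quadratic surrogate built from three case runs happens to reproduce this particular response exactly, which keeps the check meaningful without implying real codes behave so neatly.

    ```python
    import random
    import statistics

    def expensive_code(x):
        """Stand-in for the underlying computational code (hypothetical response)."""
        return 3.0 + 2.0 * x + 0.5 * x * x

    def surrogate(x, xs, ys):
        """Quadratic graduating function: Lagrange interpolation of 3 case runs."""
        total = 0.0
        for j, (xj, yj) in enumerate(zip(xs, ys)):
            basis = 1.0
            for m, xm in enumerate(xs):
                if m != j:
                    basis *= (x - xm) / (xj - xm)
            total += yj * basis
        return total

    # Minimal designed experiment: three case runs of the underlying code
    xs = [0.0, 1.0, 2.0]
    ys = [expensive_code(x) for x in xs]

    def propagate(x_mean, x_sd, n=20000, seed=1):
        """Monte Carlo on the cheap surrogate instead of the code itself."""
        rng = random.Random(seed)
        samples = [surrogate(rng.gauss(x_mean, x_sd), xs, ys) for _ in range(n)]
        return statistics.mean(samples), statistics.stdev(samples)
    ```

    With an input of 1.0 ± 0.1, the local slope of the response is 3.0, so the propagated output standard deviation comes out near 0.3; the surrogate makes the 20000-sample Monte Carlo essentially free, which is the resource argument the paper makes.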

  1. The seven sins in academic behavior in the natural sciences.

    PubMed

    van Gunsteren, Wilfred F

    2013-01-02

    "Seven deadly sins" in modern academic research and publishing can be condensed into a list ranging from poorly described experimental or computational setups to falsification of data. This Essay describes these sins and their ramifications, and serves as a code of best practice for researchers in their quest for scientific truth. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, efficiency of memory usage, and the capability to handle modern Fortran 90-95 elements such as structures and pointers, which respectively combine groups of variables and provide aliases to memory addresses. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled. However, they can only be applied to numerical models written in earlier generations of programming languages. 
Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
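
    The operator-overloading approach mentioned above is easy to show in miniature. The sketch below is forward-mode AD with dual numbers in Python rather than the adjoint (reverse-mode) Fortran tooling the study surveys, and it implements only addition and multiplication; it illustrates the mechanism (arithmetic operators overloaded to carry derivatives alongside values), not any of the four tools.

    ```python
    class Dual:
        """Minimal forward-mode AD value: carries f and df/dx together."""

        def __init__(self, val, der=0.0):
            self.val, self.der = val, der

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.der + other.der)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val * other.val,
                        self.der * other.val + self.val * other.der)  # product rule

        __rmul__ = __mul__

    def derivative(f, x):
        """Differentiate f at x by seeding the dual (derivative) part with 1."""
        return f(Dual(x, 1.0)).der
    ```

    The cost noted in the study comes from exactly this design: every arithmetic operation dispatches through an overloaded method and allocates a new object, which source-transformation tools avoid by emitting plain derivative code.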

  3. The Nuclear Energy Knowledge and Validation Center Summary of Activities Conducted in FY16

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans David

    The Nuclear Energy Knowledge and Validation Center (NEKVaC) is a new initiative by the Department of Energy (DOE) and Idaho National Laboratory (INL) to coordinate and focus the resources and expertise that exist within the DOE toward solving issues in modern nuclear code validation and knowledge management. In time, code owners, users, and developers will view the NEKVaC as a partner and essential resource for acquiring the best practices and latest techniques for validating codes, providing guidance in planning and executing experiments, facilitating access to and maximizing the usefulness of existing data, and preserving knowledge for continual use by nuclear professionals and organizations for their own validation needs. The scope of the NEKVaC covers many interrelated activities that will need to be cultivated carefully in the near term and managed properly once the NEKVaC is fully functional. Three areas comprise the principal mission: (1) identify and prioritize projects that extend the field of validation science and its application to modern codes, (2) develop and disseminate best practices and guidelines for high-fidelity multiphysics/multiscale analysis code development and associated experiment design, and (3) define protocols for data acquisition and knowledge preservation and provide a portal for access to databases currently scattered among numerous organizations. These mission areas, while each having a unique focus, are interdependent and complementary. Likewise, all activities supported by the NEKVaC, both near term and long term, must possess elements supporting all three areas. This cross-cutting nature is essential to ensuring that activities and supporting personnel do not become "stove-piped" (i.e., so focused on a specific function that the activity itself becomes the objective rather than achieving the larger vision).
    This report begins with a description of the mission areas; specifically, the role played by each major committee and the types of activities for which they are responsible. It then lists and describes the proposed near-term tasks upon which future efforts can build.

  4. Ecologically sound management: aspects of modern sustainable deer farming systems.

    PubMed

    Pearse, A J; Drew, K R

    1998-01-01

    Modern deer farming systems have become increasingly intensive, allowing strategic feeding for production and genetic improvement programmes. Meeting feeding standards that account for changing nutritional demands related to seasonality and reproductive state is critical. As the industry matures, there is a growing awareness of the balance between retaining natural behaviour in producing breeding stock on larger extensive holdings and intensification systems for performance in young stock. Stocking rates are critical determinants of success as land use and capability needs are matched with an increasing stratification of stock type and purpose. Food product safety and welfare considerations for farmed deer are being driven by consumer demands. Farm quality assurance schemes and codes of practice are developing to ensure that deer farming meets and exceeds international expectations of land use and deer welfare in modern deer farming systems.

  5. QGene 4.0, an extensible Java QTL-analysis platform.

    PubMed

    Joehanes, Roby; Nelson, James C

    2008-12-01

    Of the many statistical methods developed to date for quantitative trait locus (QTL) analysis, only a limited subset are available in public software allowing their exploration, comparison and practical application by researchers. We have developed QGene 4.0, a plug-in platform that allows execution and comparison of a variety of modern QTL-mapping methods and supports third-party addition of new ones. The software accommodates line-cross mating designs consisting of any arbitrary sequence of selfing, backcrossing, intercrossing and haploid-doubling steps; includes map, population, and trait simulators; and is scriptable. Software and documentation are available at http://coding.plantpath.ksu.edu/qgene. Source code is available on request.

  6. Socio-cultural inhibitors to use of modern contraceptive techniques in rural Uganda: a qualitative study.

    PubMed

    Kabagenyi, Allen; Reid, Alice; Ntozi, James; Atuyambe, Lynn

    2016-01-01

    Family planning is one of the more cost-effective strategies for reducing maternal and child morbidity and mortality rates. Yet in Uganda, the contraceptive prevalence rate is only 30% among married women, in conjunction with a persistently high fertility rate of 6.2 children per woman. These demographic indicators have contributed to a high population growth rate of over 3.2% annually. This study examines the role of socio-cultural inhibitions in the use of modern contraceptives in rural Uganda. It was a qualitative study conducted in 2012 among men aged 15-64 and women aged 15-49 in the districts of Mpigi and Bugiri in rural Uganda. Eighteen focus group discussions (FGDs), each internally homogeneous, and eight in-depth interviews (IDIs) were conducted among men and women. Data were collected on sociocultural beliefs and practices, barriers to modern contraceptive use, and perceptions of and attitudes toward contraceptive use. All interviews were tape recorded, translated, and transcribed verbatim. All transcripts were coded, arranged into categories, and analyzed using a latent content analysis approach with the support of ATLAS.ti qualitative software. Suitable quotations were used to provide in-depth explanations of the findings. Three themes central to hindering the uptake of modern contraceptives emerged: (i) persistence of socio-cultural beliefs and practices promoting births (such as polygamy, extending family lineage, replacement of the dead, gender-based violence, power relations, and twin myths); (ii) continued reliance on traditional family planning practices; and (iii) misconceptions and fears about modern contraception. Sociocultural expectations and values attached to marriage, women, and childbearing remain an impediment to using family planning methods.
    The study suggests a need to eradicate the cultural beliefs and practices that hinder people from using contraceptives, as well as a need to scale up family planning services and sensitization at the grassroots level.

  7. Socio-cultural inhibitors to use of modern contraceptive techniques in rural Uganda: a qualitative study

    PubMed Central

    Kabagenyi, Allen; Reid, Alice; Ntozi, James; Atuyambe, Lynn

    2016-01-01

    Introduction Family planning is one of the most cost-effective strategies for reducing maternal and child morbidity and mortality. Yet in Uganda the contraceptive prevalence rate is only 30% among married women, in conjunction with a persistently high fertility rate of 6.2 children per woman. These demographic indicators have contributed to a high population growth rate of over 3.2% annually. This study examines the role of socio-cultural inhibitions in the use of modern contraceptives in rural Uganda. Methods This was a qualitative study conducted in 2012 among men aged 15-64 and women aged 15-49 in the districts of Mpigi and Bugiri in rural Uganda. Eighteen focus group discussions (FGDs), each internally homogeneous, and eight in-depth interviews (IDIs) were conducted among men and women. Data were collected on socio-cultural beliefs and practices, barriers to modern contraceptive use, and perceptions of and attitudes to contraceptive use. All interviews were tape recorded, translated and transcribed verbatim. All transcripts were coded, arranged into categories and analyzed using a latent content analysis approach with the support of ATLAS.ti qualitative software. Suitable quotations were used to provide in-depth explanations of the findings. Results Three central themes hindering the uptake of modern contraceptives emerged: (i) persistence of socio-cultural beliefs and practices promoting births (such as polygamy, extending the family lineage, replacement of the dead, gender-based violence, power relations and twin myths); (ii) continued reliance on traditional family planning practices; and (iii) misconceptions and fears about modern contraception. Conclusion Socio-cultural expectations and the values attached to marriage, women and childbearing remain an impediment to using family planning methods. The study suggests a need to eradicate the cultural beliefs and practices that hinder people from using contraceptives, as well as a need to scale up family planning services and sensitization at the grassroots. PMID:28292041

  8. Modernization of the graphics post-processors of the Hamburg German Climate Computer Center Carbon Cycle Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, E.J.; McNeilly, G.S.

    The existing National Center for Atmospheric Research (NCAR) graphics code in the Hamburg Oceanic Carbon Cycle Circulation Model and the Hamburg Large-Scale Geostrophic Ocean General Circulation Model was modernized and reduced in size while still producing an equivalent end result. A reduction from more than 50,000 lines in the existing code to approximately 7,500 lines in the new code has made the new code much easier to maintain. The existing code in the Hamburg models uses legacy NCAR graphics (including even emulated CALCOMP subroutines) to display graphical output. The new code uses only current (version 3.1) NCAR subroutines.

  9. The relevance of the Hippocratic Oath to the ethical and moral values of contemporary medicine. Part I: The Hippocratic Oath from antiquity to modern times.

    PubMed

    Askitopoulou, Helen; Vgontzas, Antonios N

    2017-10-27

    The present paper discusses the relevance and significance of the Hippocratic Oath to contemporary medical ethical and moral values. It attempts to answer questions about some controversial issues related to the Oath. The text is divided into two parts. Part I discusses the general attributes and ethical values of the Oath, while Part II presents a detailed analysis of each passage of the Oath with regard to perennial ethical principles and moral values. Part I starts with the contribution of Hippocrates and his School of Cos to medicine. It continues by examining the moral dilemmas concerning physicians and patients in Classical times and in the modern world. It also investigates how the Hippocratic Oath stands nowadays, with regard to the remarkable and often revolutionary advancements in medical practice and the significant evolution in medical ethics. Further, it presents the debate and the criticism about the relevance of the general attributes and ethical values of the Oath to those of modern societies. Finally, it discusses the endurance of the ethical values of the Hippocratic Oath over the centuries until today with respect to physicians' commitment to the practice of patient-oriented medicine. Part I concludes with the Oath's historic input to the Judgment delivered at the close of the Nuremberg "Doctors' Trial"; this Judgment has become legally binding for the discipline in the Western world and was the basis of the Nuremberg Code. The ethical code of the Oath turned out to be a fundamental part of Western law, not only on medical ethics but also on patients' rights regarding research.

  10. Method and computer program product for maintenance and modernization backlogging

    DOEpatents

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
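    The claimed calculation reduces to a sum of three period-specific terms. A minimal sketch of that arithmetic follows; the function name and all figures are illustrative, not taken from the patent:

```python
def future_facility_conditions(maintenance_cost, modernization_factor, backlog_factor):
    """Future facility conditions for one time period, per the abstract:
    the time period specific maintenance cost plus the time period specific
    modernization factor plus the time period specific backlog factor."""
    return maintenance_cost + modernization_factor + backlog_factor

# Illustrative figures only: project conditions over two periods.
periods = [
    {"maintenance": 120_000.0, "modernization": 45_000.0, "backlog": 30_000.0},
    {"maintenance": 125_000.0, "modernization": 40_000.0, "backlog": 36_000.0},
]
projection = [
    future_facility_conditions(p["maintenance"], p["modernization"], p["backlog"])
    for p in periods
]
print(projection)  # [195000.0, 201000.0]
```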

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kornfeldt, H.; Bjoerk, K.O.; Ekstroem, P.

    The protection against dynamic effects in connection with potential pipe breaks has been implemented in different ways over the development of BWR reactor designs. First-generation plant designs reflect the code requirements in effect at the time, which means that no piping restraint systems were designed and built into those plants. Modern designs have, in contrast, implemented full protection against damage in connection with postulated pipe breaks, as required by current codes and regulations. Modern standards and current regulatory demands can be met for the older plants by backfitting pipe whip restraint hardware. This could lead to several practical difficulties, as these installations were not anticipated in the original plant design and layout. Meeting the new demands by analysis would in this situation have great advantages. Application of leak-before-break criteria gives an alternative opportunity of meeting modern standards in reactor safety design. The analysis takes into account data specific to BWR primary system operation, actual pipe material properties, piping loads and leak detection capability. Special attention must be given to ensure that the data used reflect actual plant conditions.

  12. Software engineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan

    1993-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.

  13. The Nuremberg Code: its history and implications.

    PubMed

    Kious, B M

    2001-01-01

    The Nuremberg Code is a foundational document in the ethics of medical research and human experimentation; the principles its authors espoused in 1946 have provided the framework for modern codes that address the same issues, and they have received little challenge and only slight modification in the decades since. By analyzing the Code's tragic genesis and its normative implications, it is possible to understand some of the essence of modern experimental ethics, as well as certain outstanding controversies that still plague medical science.

  14. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    NASA Astrophysics Data System (ADS)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models, particularly on high performance computers, has long been accomplished with basic software tools: typically, command-line compilers, debuggers, and performance tools that have not changed substantially since the days of serial and early vector computers. With multicore processors now ubiquitous, however, model complexity, including the complexity added by modern message passing libraries such as MPI and the need for hybrid code models (such as OpenMP plus MPI) to take full advantage of high performance computers with an increasing core count per shared-memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform (PTP), an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view of improving PTP. We are using a set of scientific applications, each with a variety of challenges, both to drive improvements to the applications themselves and to uncover shortcomings in Eclipse PTP from an application developer's perspective, which in turn drives the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher quality performance tool integration. 
    We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into computational science and engineering codes. Finally, we are partnering with the lead PTP developers at IBM to ensure we are as effective as possible within the Eclipse development community. We are also conducting training and outreach to our user community, including conference BOF sessions, monthly user calls, and an annual user meeting, so that we can best inform the improvements we make to Eclipse PTP. With these activities we endeavor to encourage the use of modern software engineering practices, as enabled through the Eclipse IDE, in computational science and engineering applications. These practices include proper use of source code repositories; tracking and rectifying issues; measuring and monitoring code performance changes against both optimizations and ever-changing software stacks and configurations on HPC systems; and, ultimately, developing and maintaining testing suites -- things that have become commonplace in many software endeavors but have lagged in the development of science applications. We believe that the increased complexity of both HPC systems and science applications demands better software engineering methods, preferably enabled by modern tools such as Eclipse PTP, to help the computational science community thrive as we evolve the HPC landscape.

  15. PlasmaPy: beginning a community developed Python package for plasma physics

    NASA Astrophysics Data System (ADS)

    Murphy, Nicholas A.; Huang, Yi-Min; PlasmaPy Collaboration

    2016-10-01

    In recent years, researchers in several disciplines have collaborated on community-developed open source Python packages such as Astropy, SunPy, and SpacePy. These packages provide core functionality, common frameworks for data analysis and visualization, and educational tools. We propose that our community begin the development of PlasmaPy: a new open source core Python package for plasma physics. PlasmaPy could include commonly used functions in plasma physics, easy-to-use plasma simulation codes, Grad-Shafranov solvers, eigenmode solvers, and tools to analyze both simulations and experiments. The development will include modern programming practices such as version control, embedding documentation in the code, unit tests, and avoiding premature optimization. We will describe early code development on PlasmaPy, and discuss plans moving forward. The success of PlasmaPy depends on active community involvement and a welcoming and inclusive environment, so anyone interested in joining this collaboration should contact the authors.
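    Two of the practices the abstract names, documentation embedded in the code and unit tests, can be illustrated with a short sketch. The function below is a generic electron plasma frequency calculation written for illustration only; it is not actual PlasmaPy code:

```python
import math

# SI constants (CODATA values, rounded).
ELEMENTARY_CHARGE = 1.602176634e-19  # C
ELECTRON_MASS = 9.1093837015e-31     # kg
EPSILON_0 = 8.8541878128e-12         # F/m

def electron_plasma_frequency(n_e):
    """Angular electron plasma frequency in rad/s.

    Parameters
    ----------
    n_e : float
        Electron number density in m^-3 (must be non-negative).
    """
    if n_e < 0:
        raise ValueError("density must be non-negative")
    return math.sqrt(n_e * ELEMENTARY_CHARGE**2 / (EPSILON_0 * ELECTRON_MASS))

def test_electron_plasma_frequency():
    # A density of 1e19 m^-3 gives a frequency of roughly 1.8e11 rad/s.
    wpe = electron_plasma_frequency(1e19)
    assert 1.7e11 < wpe < 1.9e11
    # Zero density gives zero frequency.
    assert electron_plasma_frequency(0.0) == 0.0
```

The embedded docstring doubles as reference documentation, and the test function is the kind of unit test a framework such as pytest would collect and run automatically.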

  16. Revisiting Yasinsky and Henry's benchmark using modern nodal codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feltus, M.A.; Becker, M.W.

    1995-12-31

    The numerical experiments analyzed by Yasinsky and Henry are quite trivial by comparison with today's standards because they used the finite difference code WIGLE for their benchmark. Also, this problem is a simple slab (one-dimensional) case with no feedback mechanisms. This research attempts to obtain STAR (Ref. 2) and NEM (Ref. 3) code results in order to produce a more modern kinetics benchmark with results comparable to WIGLE.

  17. Best practice & research in anaesthesiology issue on new approaches in clinical research ethics in clinical research.

    PubMed

    Schwenzer, Karen J

    2011-12-01

    The history of ethics in clinical research parallels the history of abuse of human beings. The Nuremberg Code, Declaration of Helsinki, and the Belmont Report laid the foundations for modern research ethics. In the United States, the OHRP and the FDA provide guidelines for the ethical conduct of research. Investigators should be familiar with regulations concerning informed consent, doing research in vulnerable populations, and protection of privacy. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Characteristic-based algorithms for flows in thermo-chemical nonequilibrium

    NASA Technical Reports Server (NTRS)

    Walters, Robert W.; Cinnella, Pasquale; Slack, David C.; Halt, David

    1990-01-01

    A generalized finite-rate chemistry algorithm with Steger-Warming, Van Leer, and Roe characteristic-based flux splittings is presented in three-dimensional generalized coordinates for the Navier-Stokes equations. Attention is placed on convergence to steady-state solutions with fully coupled chemistry. Time integration schemes including explicit m-stage Runge-Kutta, implicit approximate-factorization, relaxation and LU decomposition are investigated and compared in terms of residual reduction per unit of CPU time. Practical issues such as code vectorization and memory usage on modern supercomputers are discussed.
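    Of the time-integration options compared, the explicit m-stage Runge-Kutta scheme is the simplest to sketch. The low-storage form below, applied to a scalar decay model standing in for the flux residual, is illustrative only and not the authors' implementation:

```python
import math

def m_stage_rk_step(u, dt, residual, m=4):
    """One low-storage m-stage Runge-Kutta step (Jameson-style):
    w^(k) = u^n + (dt/k) * R(w^(k+1)), applied for k = m, m-1, ..., 1."""
    w = u
    for k in range(m, 0, -1):
        w = u + (dt / k) * residual(w)
    return w

# Model problem du/dt = -u with exact solution exp(-t).
u, t, dt = 1.0, 0.0, 0.01
while t < 1.0 - 1e-12:
    u = m_stage_rk_step(u, dt, lambda x: -x)
    t += dt
print(abs(u - math.exp(-1.0)))  # small: the 4-stage form matches RK4 on linear problems
```

Only a single extra state vector is needed per step, which is why schemes of this family were popular on the memory-constrained supercomputers the abstract alludes to.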

  19. Code of Conduct on Biosecurity for Biological Resource Centres: procedural implementation.

    PubMed

    Rohde, Christine; Smith, David; Martin, Dunja; Fritze, Dagmar; Stalpers, Joost

    2013-07-01

    A globally applicable code of conduct specifically dedicated to biosecurity has been developed together with guidance for its procedural implementation. This is to address the regulations governing potential dual-use of biological materials, associated information and technologies, and reduce the potential for their malicious use. Scientists researching and exchanging micro-organisms have a responsibility to prevent misuse of the inherently dangerous ones, that is, those possessing characters such as pathogenicity or toxin production. The code of conduct presented here is based on best practice principles for scientists and their institutions working with biological resources with a specific focus on micro-organisms. It aims to raise awareness of regulatory needs and to protect researchers, their facilities and stakeholders. It reflects global activities in this area in response to legislation such as that in the USA, the PATRIOT Act of 2001, Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act of 2001; the Anti-Terrorism Crime and Security Act 2001 and subsequent amendments in the UK; the EU Dual-Use Regulation; and the recommendations of the Organization for Economic Co-operation and Development (OECD), under their Biological Resource Centre (BRC) Initiative at the beginning of the millennium (OECD, 2001). Two project consortia with international partners came together with experts in the field to draw up a Code of Conduct on Biosecurity for BRCs to ensure that culture collections and microbiologists in general worked in a way that met the requirements of such legislation. A BRC is the modern day culture collection that adds value to its holdings and implements common best practice in the collection and supply of strains for research and development. 
This code of conduct specifically addresses the work of public service culture collections and describes the issues of importance and the controls or practices that should be in place. However, these best practices are equally applicable to all other microbiology laboratories holding, using and sharing microbial resources. The code was introduced to the Seventh Review Conference to the Biological and Toxin Weapons Convention (BTWC), United Nations, Geneva, 2011; the delegates to the States' parties recommended that this code of conduct be broadly applied in the life sciences and disseminated amongst microbiologists, hence the publishing of it here along with practical implementation guidance. This paper considers the regulatory and working environment for microbiology, defines responsibilities and provides practical advice on the implementation of best practice in handling the organism itself, associated data and technical know-how.

  20. [Genetic diversity of modern Russian durum wheat cultivars at the gliadin-coding loci].

    PubMed

    Kudriavtsev, A M; Dedova, L V; Mel'nik, V A; Shishkina, A A; Upelniek, V P; Novosel'skaia-Dragovich, A Iu

    2014-05-01

    The allelic diversity at four gliadin-coding loci was examined in modern cultivars of the spring and winter durum wheat Triticum durum Desf. Comparative analysis of the allelic diversity showed that the gene pools of these two types of durum wheat, having different life styles, were considerably different. For the modern spring durum wheat cultivars, a certain reduction of the genetic diversity was observed compared to the cultivars bred in the 20th century.

  1. The novel high-performance 3-D MT inverse solver

    NASA Astrophysics Data System (ADS)

    Kruglyakov, Mikhail; Geraskin, Alexey; Kuvshinov, Alexey

    2016-04-01

    We present a novel, robust, scalable, and fast 3-D magnetotelluric (MT) inverse solver. The solver is written in a multi-language paradigm to make it as efficient, readable and maintainable as possible. The concepts of separation of concerns and single responsibility run through the implementation of the solver. As a forward modelling engine, a modern scalable solver, extrEMe, based on the contracting integral equation approach, is used. An iterative gradient-type (quasi-Newton) optimization scheme is invoked to search for the (regularized) inverse problem solution, and an adjoint-source approach is used to calculate the gradient of the misfit efficiently. The inverse solver is able to deal with highly detailed and contrasting models, allows for working (separately or jointly) with any type of MT response, and supports massive parallelization. Moreover, the different parallelization strategies implemented in the code allow optimal usage of the available computational resources for a given problem statement. To parameterize the inverse domain, a so-called mask parameterization is implemented, meaning that one can merge any subset of forward-modelling cells in order to account for the (usually) irregular distribution of observation sites. We report results of 3-D numerical experiments aimed at analysing the robustness, performance and scalability of the code. In particular, our computational experiments, carried out on platforms ranging from modern laptops to the HPC system Piz Daint (the 6th-ranked supercomputer in the world), demonstrate practically linear scalability of the code up to thousands of nodes.
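    The inversion loop the abstract describes, gradient-type minimization of a regularized misfit with the gradient supplied cheaply rather than by finite differences, can be caricatured in a few lines. This toy replaces the 3-D MT forward engine with a small linear operator and uses plain gradient descent instead of quasi-Newton, purely to show the structure; nothing here is from the actual solver:

```python
# Toy regularized inversion: minimize ||G m - d||^2 + lam * ||m||^2,
# where G stands in for the forward operator and the analytic gradient
# stands in for the adjoint-source computation. All numbers illustrative.

def forward(G, m):
    return [sum(Gij * mj for Gij, mj in zip(row, m)) for row in G]

def misfit_gradient(G, m, d, lam):
    r = [fi - di for fi, di in zip(forward(G, m), d)]  # data residual
    # Gradient of the objective: 2 G^T r + 2 lam m (the "adjoint" step).
    return [2 * sum(G[i][j] * r[i] for i in range(len(r))) + 2 * lam * mj
            for j, mj in enumerate(m)]

def invert(G, d, lam=1e-3, step=0.1, iters=500):
    m = [0.0] * len(G[0])                       # starting model
    for _ in range(iters):
        g = misfit_gradient(G, m, d, lam)
        m = [mj - step * gj for mj, gj in zip(m, g)]
    return m

G = [[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]]
m_true = [1.0, -0.5]
d = forward(G, m_true)                          # noise-free synthetic data
m_est = invert(G, d)
print(m_est)  # close to [1.0, -0.5] for this small regularization
```

A production solver would replace the inner loop with a quasi-Newton update (e.g. L-BFGS) and parallelize the forward and adjoint computations, but the misfit/gradient/update structure is the same.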

  2. Comparison of IPSM 1990 photon dosimetry code of practice with IAEA TRS‐398 and AAPM TG‐51.

    PubMed Central

    Henríquez, Francisco Cutanda

    2009-01-01

    Several codes of practice for photon dosimetry are currently used around the world, supported by different organizations. A comparison of IPSM 1990 with both IAEA TRS‐398 and AAPM TG‐51 has been performed. All three protocols are based on the calibration of ionization chambers in terms of standards of absorbed dose to water, as is the case with other modern codes of practice. This comparison was carried out for photon beams of nominal energies 4 MV, 6 MV, 8 MV, 10 MV and 18 MV. An NE 2571 graphite ionization chamber was used in this study, cross‐calibrated against an NE 2611A Secondary Standard calibrated at the National Physical Laboratory (NPL). Absolute dose in reference conditions was obtained using each of these three protocols, including beam quality indices, beam quality conversion factors (both theoretical and NPL experimental ones), correction factors for influence quantities, and absolute dose measurements. Each protocol's recommendations have been strictly followed. Uncertainties have been obtained according to the ISO Guide to the Expression of Uncertainty in Measurement. Absorbed dose obtained according to all three protocols agrees within experimental uncertainty. The largest difference between absolute dose results for two protocols is obtained for the highest energy: 0.7% between IPSM 1990 and IAEA TRS‐398 using theoretical beam quality conversion factors. PACS number: 87.55.tm

  3. Advanced Digital Imaging Laboratory Using MATLAB® (Second edition)

    NASA Astrophysics Data System (ADS)

    Yaroslavsky, Leonid P.

    2016-09-01

    The first edition of this text book focussed on providing practical hands-on experience in digital imaging techniques for graduate students and practitioners keeping to a minimum any detailed discussion on the underlying theory. In this new extended edition, the author builds on the strength of the original edition by expanding the coverage to include formulation of the major theoretical results that underlie the exercises as well as introducing numerous modern concepts and new techniques. Whether you are studying or already using digital imaging techniques, developing proficiency in the subject is not possible without mastering practical skills. Including more than 100 MATLAB® exercises, this book delivers a complete applied course in digital imaging theory and practice. Part of IOP Series in Imaging Engineering Supplementary MATLAB codes and data files are available within Book Information.

  4. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  5. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    NASA Astrophysics Data System (ADS)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.

  6. Lifting the veil. Reform vs tradition in Turkey. An interview with Nilofer Gole.

    PubMed

    1997-01-01

    This article is based on an interview with a Turkish sociologist, Nilofer Gole, concerning the practice of veiling of women in Turkey. Turks have argued about the permissibility of veiling for over a decade. The issue of veiling marks the intersection between modernization and the Islamic movement and is a symbolic marker of the meaning of women's bodies. Turkish women wear a traditional long head scarf that has nothing to do with a modern Islamic head scarf. Turkish and Indian women are pressured by fathers, husbands, and brothers to completely cover themselves. Turkish women who put on the veil come from modest social origins, peripheral cities, and small towns with a conservative background. Turkish women wearing the veil do so when they move to urban areas and pass the university entrance exams. Rural women maintain the classical head scarf. Urban women wearing veils are leaving the domestic space and private sphere and distancing themselves from traditional women's roles. These Islamic radical women are a small minority of about 10% of the population. Most of the population follow a modern way of life with a different ideology. Modernization began in Turkey in 1876, and the formation of a secular, republican nation-state began with Mustafa Attaturk in 1923. Attaturk based the Turkish civil code on the Swiss civil code, which was the most progressive at that time and promoted the emancipation of women. Turkey, unlike Iran, had a "very rich democratic life, the defeat of a monarchy, and the growth of the middle class." Religious law and marriage were abolished. The veil was not allowed in public places and its use discouraged.

  7. Management methodology for pressure equipment

    NASA Astrophysics Data System (ADS)

    Bletchly, P. J.

    Pressure equipment constitutes a significant investment in capital and a major proportion of potential high-risk plant in many operations and this is particularly so in an alumina refinery. In many jurisdictions pressure equipment is also subject to statutory regulation that imposes obligations on Owners of the equipment with respect to workplace safety. Most modern technical standards and industry codes of practice employ a risk-based approach to support better decision making with respect to pressure equipment. For a management system to be effective it must demonstrate that risk is being managed within acceptable limits.

  8. Plug Into "The Modernizing Machine"! Danish University Reform and Its Transformable Academic Subjectivities

    ERIC Educational Resources Information Center

    Krejsler, John Benedicto

    2013-01-01

    "The modernizing machine" codes individual bodies, things, and symbols with images from New Public Management, neo-liberal, and Knowledge Economy discourses. Drawing on Deleuze and Guattari's concept of machines, this article explores how "the modernizing machine" produces neo-liberal modernization of the public sector. Taking…

  9. Cloudy's Journey from FORTRAN to C, Why and How

    NASA Astrophysics Data System (ADS)

    Ferland, G. J.

    Cloudy is a large-scale plasma simulation code that is widely used across the astronomical community as an aid in the interpretation of spectroscopic data. The cover of the ADAS VI book featured predictions of the code. The FORTRAN 77 source code has always been freely available on the Internet, contributing to its widespread use. The coming of PCs and Linux has fundamentally changed the computing environment. Modern Fortran compilers (F90 and F95) are not freely available. A common-use code must be written in either FORTRAN 77 or C to be Open Source/GNU/Linux friendly. F77 has serious drawbacks - modern language constructs cannot be used, students do not have skills in this language, and it does not contribute to their future employability. It became clear that the code would have to be ported to C to have a viable future. I describe the approach I used to convert Cloudy from FORTRAN 77 with MILSPEC extensions to ANSI/ISO 89 C. Cloudy is now openly available as a C code, and will evolve to C++ as gcc and standard C++ mature. Cloudy looks to a bright future with a modern language.

  10. Guide to the TANDEM System for the Modern Languages Department Tape Library: A Non-Technical Guide for Teachers.

    ERIC Educational Resources Information Center

    Hounsell, D.; And Others

    This guide for teachers to the tape indexing system (TANDEM) in use at the Modern Languages Department at Portsmouth Polytechnic focuses on tape classification, numbering, labeling, and shelving system procedures. The appendixes contain information on: (1) the classification system and related codes, (2) color and letter codes, (3) marking of tape…

  11. Staggered-grid finite-difference acoustic modeling with the Time-Domain Atmospheric Acoustic Propagation Suite (TDAAPS).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldridge, David Franklin; Collier, Sandra L.; Marlin, David H.

    2005-05-01

This document is intended to serve as a user's guide for the time-domain atmospheric acoustic propagation suite (TDAAPS) program developed as part of the Department of Defense High-Performance Computing Modernization Program (HPCMP) Common High-Performance Computing Scalable Software Initiative (CHSSI). TDAAPS performs staggered-grid finite-difference modeling of the acoustic velocity-pressure system with the incorporation of spatially inhomogeneous winds. Wherever practical, the control structure of the codes is written in C++ using an object-oriented design. Sections of code where a large number of calculations are required are written in C or F77 in order to enable better compiler optimization of these sections. The TDAAPS program conforms to a UNIX-style calling interface. Most of the actions of the codes are controlled by adding flags to the invoking command line. This document presents a large number of examples and provides new users with the necessary background to perform acoustic modeling with TDAAPS.

  12. Codes of medical ethics: traditional foundations and contemporary practice.

    PubMed

    Sohl, P; Bassford, H A

    1986-01-01

The Hippocratic Corpus recognized the interaction of 'business' and patient-health moral considerations, and urged that the former be subordinated to the latter. During the 1800s, with the growth of complexity in both scientific knowledge and the organization of health services, the medical ethical codes addressed themselves to elaborate rules of conduct to be followed by the members of the newly emerging national medical associations. After World War II the World Medical Association was established as an international forum where national medical associations could debate the ethical problems presented by modern medicine. The International Code of Medical Ethics and the Declaration of Geneva were written as 20th-century restatements of the medical profession's commitment to the sovereignty of the patient-care norm. Many ethical statements have been issued by the World Medical Association in the past 35 years; they show the variety and difficulties of contemporary medical practice. The newest revisions were approved by the General Assembly of the World Medical Association in Venice, Italy, in October 1983. Their content is examined and concern is voiced about the danger of falling into cultural relativism when questions about the methods of financing medical services are the subject of an ethical declaration arrived at by consensus in the W.M.A.

  13. Honoring Our Ethical Origins: Scientific Integrity and Geoethics, Past, Present, and Future

    NASA Astrophysics Data System (ADS)

    Gundersen, L. C.

    2017-12-01

Current ethics policy owes much of its origins to Aristotle and his writings on virtue - including the idea that if we understand and rationally practice virtue and excellence, we will be our best selves. From this humble beginning emerged a number of more complex, ever evolving, ethical theories. The Hippocratic Oath and atrocities of World War II resulted in the roots of scientific integrity through the Nuremberg Code and the Belmont Report, which set ethical rules for human experimentation, including respect, beneficence, and justice. These roots produced bioethics, medical ethics, environmental ethics, and geoethics. Geoethics has its origins in Europe and is being embraced in the U.S.A. It needs a respected place in the geoscience curriculum, especially as we face the global challenges of climate change and sustainability. Modern scientific integrity in the U.S.A., where research misconduct is defined as fabrication, falsification, and plagiarism, was derived from efforts of the 1980s through 1990s by the Nat'l Institutes of Health and Nat'l Academy of Sciences (NAS). This definition of misconduct has remained an immovable standard, excluding anything not of the scientific process, such as personal behaviors within the research environment. Modern scientific integrity codes and reports such as the Singapore Statement, the NAS' Fostering Integrity in Research, and current federal agency policies, provide standards of behavior to aspire to, and acknowledge the deleterious effects of certain behaviors and practices, but still hesitate to include them in formal definitions of research misconduct. Modern media is holding a mirror to what is happening in the research environment. There are conflicts of interest, misrepresentations of data and uncertainty, discrimination, harassment, bullying, misuse of funds, withholding of data and code, intellectual theft, and a host of others, that are having a serious detrimental effect on science.
For science to have its best future, we as scientists need to nurture and encourage the best in ourselves and others, taking a cue from the ancient Greeks. We need to address conduct as a part of misconduct. Recent policies, including the AGU's, are bravely moving forward in this direction. It is new and difficult ground, but we are scientists, and this is an experiment we need to do.

14. Remote control system for high-performance computer simulation of crystal growth by the PFC method

    NASA Astrophysics Data System (ADS)

    Pavlyuk, Evgeny; Starodumov, Ilya; Osipov, Sergei

    2017-04-01

Modeling of the crystallization process by the phase field crystal (PFC) method is one of the important directions of modern computational materials science. In this paper, the practical side of computer simulation of the crystallization process by the PFC method is investigated. To solve problems using this method, it is necessary to use high-performance computing clusters, data storage systems and other complex, often expensive computer systems. Access to such resources is often limited, unstable and accompanied by various administrative problems. In addition, the variety of software and settings across computing clusters sometimes does not allow researchers to use unified program code; the code must be adapted for each configuration of the computer complex. The practical experience of the authors has shown that a special control system for computations, with the possibility of remote use, can greatly simplify the implementation of simulations and increase the productivity of scientific research. In the current paper we present the principal idea of such a system and justify its efficiency.

  15. The Helicopter Antenna Radiation Prediction Code (HARP)

    NASA Technical Reports Server (NTRS)

    Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.

    1990-01-01

    The first nine months effort in the development of a user oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user friendly interface, employing modern computer graphics, to aid the user to describe the helicopter geometry, select the method of computation, construct the desired high or low frequency model, and display the results.

  16. Key Reconciliation for High Performance Quantum Key Distribution

    PubMed Central

    Martinez-Mateo, Jesus; Elkouss, David; Martin, Vicente

    2013-01-01

Quantum Key Distribution is carving its place among the tools used to secure communications. While a difficult technology, it enjoys benefits that set it apart from the rest, the most prominent being its provable security based on the laws of physics. QKD requires not only the mastering of signals at the quantum level, but also classical processing to extract a secret key from them. This postprocessing has customarily been studied in terms of efficiency, a figure of merit that offers a biased view of the performance of real devices. Here we argue that it is the throughput that is the significant magnitude in practical QKD, especially in the case of high-speed devices, where the differences are more marked, and give some examples contrasting the usual postprocessing schemes with new ones from modern coding theory. A good understanding of its implications is very important for the design of modern QKD devices. PMID:23546440

  17. Code Modernization of VPIC

    NASA Astrophysics Data System (ADS)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, portable performance, and cache-oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsic-based vectorisation with compiler-generated auto-vectorisation to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learnt. Work performed under the auspices of the U.S. Dept. of Energy by the Los Alamos National Security, LLC Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  18. Clinical implementation of total skin electron irradiation treatment with a 6 MeV electron beam in high-dose total skin electron mode

    NASA Astrophysics Data System (ADS)

    Lucero, J. F.; Rojas, J. I.

    2016-07-01

Total skin electron irradiation (TSEI) is a special treatment technique offered by modern radiation oncology facilities, given for the treatment of mycosis fungoides, a rare skin disease which is a type of cutaneous T-cell lymphoma [1]. During treatment the patient's entire skin is irradiated with a uniform dose. The aim of this work is to present the implementation of total skin electron irradiation treatment using the IAEA TRS-398 code of practice for absolute dosimetry and taking advantage of the use of radiochromic films.

  19. Clinical implementation of total skin electron irradiation treatment with a 6 MeV electron beam in high-dose total skin electron mode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucero, J. F., E-mail: fernando.lucero@hoperadiotherapy.com.gt; Hope International, Guatemala; Rojas, J. I., E-mail: isaac.rojas@siglo21.cr

Total skin electron irradiation (TSEI) is a special treatment technique offered by modern radiation oncology facilities, given for the treatment of mycosis fungoides, a rare skin disease which is a type of cutaneous T-cell lymphoma [1]. During treatment the patient's entire skin is irradiated with a uniform dose. The aim of this work is to present the implementation of total skin electron irradiation treatment using the IAEA TRS-398 code of practice for absolute dosimetry and taking advantage of the use of radiochromic films.

  20. Practical Development of Modern Mass Media Education in Poland

    ERIC Educational Resources Information Center

    Fedorov, Alexander

    2012-01-01

The paper analyzes the main ways of the practical development of modern media education in Poland (1992-2012): basic technologies, main events, etc.

  1. Analysis of view synthesis prediction architectures in modern coding standards

    NASA Astrophysics Data System (ADS)

    Tian, Dong; Zou, Feng; Lee, Chris; Vetro, Anthony; Sun, Huifang

    2013-09-01

Depth-based 3D formats are currently being developed as extensions to both the AVC and HEVC standards. The availability of depth information facilitates the generation of intermediate views for advanced 3D applications and displays, and also enables more efficient coding of the multiview input data through view synthesis prediction techniques. This paper outlines several approaches that have been explored to realize view synthesis prediction in modern video coding standards such as AVC and HEVC. The benefits and drawbacks of various architectures are analyzed in terms of performance, complexity, and other design considerations. It is hence concluded that block-based VSP prediction for multiview video signals provides attractive coding gains with complexity comparable to traditional motion/disparity compensation.

  2. Medical image classification based on multi-scale non-negative sparse coding.

    PubMed

    Zhang, Ruijie; Shen, Jian; Wei, Fushan; Li, Xiong; Sangaiah, Arun Kumar

    2017-11-01

With the rapid development of modern medical imaging technology, medical image classification has become more and more important in medical diagnosis and clinical practice. Conventional medical image classification algorithms usually neglect the semantic gap between low-level features and high-level image semantics, which largely degrades classification performance. To solve this problem, we propose a multi-scale non-negative sparse coding based medical image classification algorithm. First, medical images are decomposed into multiple scale layers, so that diverse visual details can be extracted from different scale layers. Second, for each scale layer, a non-negative sparse coding model with Fisher discriminative analysis is constructed to obtain the discriminative sparse representation of medical images. Then, the obtained multi-scale non-negative sparse coding features are combined to form a multi-scale feature histogram as the final representation of a medical image. Finally, an SVM classifier is used to conduct medical image classification. The experimental results demonstrate that our proposed algorithm can effectively utilize multi-scale and contextual spatial information of medical images, reduce the semantic gap to a large degree and improve medical image classification performance. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Analysis of Defenses Against Code Reuse Attacks on Modern and New Architectures

    DTIC Science & Technology

    2015-09-01

Analysis of Defenses Against Code Reuse Attacks on Modern and New Architectures, by Isaac Noah Evans. Submitted to the Department of Electrical Engineering and Computer Science in partial fulfillment of the requirements for the degree of Master of Engineering in Electrical Engineering and Computer Science. From the abstract: such an analysis must balance soundness or completeness. An incomplete analysis will produce extra edges in the CFG that might allow an attacker to slip through. An unsound analysis...

  4. Using Modern C++ Idiom for the Discretisation of Sets of Coupled Transport Equations in Numerical Plasma Physics

    NASA Astrophysics Data System (ADS)

    van Dijk, Jan; Hartgers, Bart; van der Mullen, Joost

    2006-10-01

Self-consistent modelling of plasma sources requires a simultaneous treatment of multiple physical phenomena. As a result plasma codes have a high degree of complexity. And with the growing interest in time-dependent modelling of non-equilibrium plasma in three dimensions, codes tend to become increasingly hard to explain and maintain. As a result of these trends there has been an increased interest in the software-engineering and implementation aspects of plasma modelling in our group at Eindhoven University of Technology. In this contribution we will present modern object-oriented techniques in C++ to solve an old problem: that of the discretisation of coupled linear(ized) equations involving multiple field variables on ortho-curvilinear meshes. The LinSys code has been tailored to the transport equations of plasma physics. The implementation has been made both efficient and user-friendly by using modern idiom like expression templates and template meta-programming. Live demonstrations will be given. The code is available to interested parties; please visit www.dischargemodelling.org.

  5. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.

  6. Ada software productivity prototypes: A case study

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Habib-Agahi, Hamid; Malhotra, Shan

    1988-01-01

    A case study of the impact of Ada on a Command and Control project completed at the Jet Propulsion Laboratory (JPL) is given. The data for this study was collected as part of a general survey of software costs and productivity at JPL and other NASA sites. The task analyzed is a successful example of the use of rapid prototyping as applied to command and control for the U.S. Air Force and provides the U.S. Air Force Military Airlift Command with the ability to track aircraft, air crews and payloads worldwide. The task consists of a replicated database at several globally distributed sites. The local databases at each site can be updated within seconds after changes are entered at any one site. The system must be able to handle up to 400,000 activities per day. There are currently seven sites, each with a local area network of computers and a variety of user displays; the local area networks are tied together into a single wide area network. Using data obtained for eight modules, totaling approximately 500,000 source lines of code, researchers analyze the differences in productivities between subtasks. Factors considered are percentage of Ada used in coding, years of programmer experience, and the use of Ada tools and modern programming practices. The principle findings are the following. Productivity is very sensitive to programmer experience. The use of Ada software tools and the use of modern programming practices are important; without such use Ada is just a large complex language which can cause productivity to decrease. The impact of Ada on development effort phases is consistent with earlier reports at the project level but not at the module level.

  7. Improving robustness and computational efficiency using modern C++

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paterno, M.; Kowalkowski, J.; Green, C.

    2014-01-01

For nearly two decades, the C++ programming language has been the dominant programming language for experimental HEP. The publication of ISO/IEC 14882:2011, the current version of the international standard for the C++ programming language, makes available a variety of language and library facilities for improving the robustness, expressiveness, and computational efficiency of C++ code. However, much of the C++ written by the experimental HEP community does not take advantage of the features of the language to obtain these benefits, either due to lack of familiarity with these features or concern that these features must somehow be computationally inefficient. In this paper, we address some of the features of modern C++, and show how they can be used to make programs that are both robust and computationally efficient. We compare and contrast simple yet realistic examples of some common implementation patterns in C, currently-typical C++, and modern C++, and show (when necessary, down to the level of generated assembly language code) the quality of the executable code produced by recent C++ compilers, with the aim of allowing the HEP community to make informed decisions on the costs and benefits of the use of modern C++.

  8. Development of Modern Performance Assessment Tools and Capabilities for Underground Disposal of Transuranic Waste at WIPP

    NASA Astrophysics Data System (ADS)

    Zeitler, T.; Kirchner, T. B.; Hammond, G. E.; Park, H.

    2014-12-01

    The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. In a broad modernization effort, the DOE has overseen the transfer of these codes to modern hardware and software platforms. Additionally, there is a current effort to establish new performance assessment capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Improvements to the current computational environment will result in greater detail in the final models due to the parallelization afforded by the modern code. Parallelization will allow for relatively faster calculations, as well as a move from a two-dimensional calculation grid to a three-dimensional grid. The result of the modernization effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S Department of Energy.

  9. Integrated Performance of Next Generation High Data Rate Receiver and AR4JA LDPC Codec for Space Communications

    NASA Technical Reports Server (NTRS)

    Cheng, Michael K.; Lyubarev, Mark; Nakashima, Michael A.; Andrews, Kenneth S.; Lee, Dennis

    2008-01-01

Low-density parity-check (LDPC) codes are the state-of-the-art in forward error correction (FEC) technology that exhibits capacity-approaching performance. The Jet Propulsion Laboratory (JPL) has designed a family of LDPC codes that are similar in structure and therefore lead to a single decoder implementation. The Accumulate-Repeat-by-4-Jagged-Accumulate (AR4JA) code design offers a family of codes with rates 1/2, 2/3, 4/5 and lengths 1024, 4096, 16384 information bits. Performance is less than one dB from capacity for all combinations. Integrating a stand-alone LDPC decoder with a commercial-off-the-shelf (COTS) receiver poses additional challenges beyond building a single receiver-decoder unit from scratch. In this work, we outline the issues and show that these additional challenges can be overcome by simple solutions. To demonstrate that an LDPC decoder can be made to work seamlessly with a COTS receiver, we interface an AR4JA LDPC decoder developed on a field-programmable gate array (FPGA) with a modern high data rate receiver and measure the combined receiver-decoder performance. Through optimizations that include an improved frame synchronizer and different soft-symbol scaling algorithms, we show that a combined implementation loss of less than one dB is possible and therefore most of the coding gain evident in theory can also be obtained in practice. Our techniques can benefit any modem that utilizes an advanced FEC code.

  10. Promoting survival: A grounded theory study of consequences of modern health practices in Ouramanat region of Iranian Kurdistan

    PubMed Central

    Mohammadpur, Ahmad; Rezaei, Mehdi; Sadeghi, Rasoul

    2010-01-01

    The aim of this qualitative study is to explore the way people using modern health care perceive its consequences in Ouraman-e-Takht region of Iranian Kurdistan. Ouraman-e-Takht is a rural, highly mountainous and dry region located in the southwest Kurdistan province of Iran. Recently, modern health practices have been introduced to the region. The purpose of this study was to investigate, from the Ouramains' point of view, the impact that modern health services and practices have had on the Ouraman traditional way of life. Interview data from respondents were analyzed by using grounded theory. Promoting survival was the core category that explained the impact that modern health practices have had on the Ouraman region. The people of Ouraman interpreted modern health practices as increasing their quality of life and promoting their survival. Results are organized around this core category in a paradigm model consisting of conditions, interactions, and consequences. This model can be used to understand the impact of change from the introduction of modern health on a traditional society. PMID:20640020

  11. Promoting survival: A grounded theory study of consequences of modern health practices in Ouramanat region of Iranian Kurdistan.

    PubMed

    Mohammadpur, Ahmad; Rezaei, Mehdi; Sadeghi, Rasoul

    2010-05-14

    The aim of this qualitative study is to explore the way people using modern health care perceive its consequences in Ouraman-e-Takht region of Iranian Kurdistan. Ouraman-e-Takht is a rural, highly mountainous and dry region located in the southwest Kurdistan province of Iran. Recently, modern health practices have been introduced to the region. The purpose of this study was to investigate, from the Ouramains' point of view, the impact that modern health services and practices have had on the Ouraman traditional way of life. Interview data from respondents were analyzed by using grounded theory. Promoting survival was the core category that explained the impact that modern health practices have had on the Ouraman region. The people of Ouraman interpreted modern health practices as increasing their quality of life and promoting their survival. Results are organized around this core category in a paradigm model consisting of conditions, interactions, and consequences. This model can be used to understand the impact of change from the introduction of modern health on a traditional society.

  12. Integrated Devices and Systems | Grid Modernization | NREL

    Science.gov Websites

Topics: energy storage models; microgrids; grid simulation and power hardware-in-the-loop; grid standards and codes. Contact: Barry Mather, Ph.D.

  13. Attitudes of Trainers and Medical Students towards Using Modern Practices

    ERIC Educational Resources Information Center

    Hadzhiiliev, Vassil Stefanov; Dobreva, Zhaneta Stoykova

    2011-01-01

    The development of universities as independent scientific centers determines their mission to incorporate the most modern achievements of science into the students' practical training. This research on the attitudes of the participants in this process towards the use of modern practices encompasses both trainers and students, and it consists of…

  14. The Evolution of Random Number Generation in MUVES

    DTIC Science & Technology

    2017-01-01

This report documents the evolution of random number generation in MUVES, including the mathematical basis and statistical justification for algorithms used in the code. The working code provided produces results identical to the current... questionable numerical and statistical properties. The development of the modern system is traced through software change requests, resulting in a random number...

  15. Dependency graph for code analysis on emerging architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shashkov, Mikhail Jurievich; Lipnikov, Konstantin

The directed acyclic dependency graph (DAG) is becoming the standard for modern multi-physics codes. The ideal DAG is the true block-scheme of a multi-physics code. It is therefore a convenient object for in situ analysis of the cost of computations and of algorithmic bottlenecks related to statistically frequent data motion and the dynamical machine state.

  16. Comparison of LEWICE 1.6 and LEWICE/NS with IRT experimental data from modern air foil tests

    DOT National Transportation Integrated Search

    1998-01-01

    A research project is underway at NASA Lewis to produce a computer code which can accurately predict ice growth under any meteorological conditions for any aircraft surface. The most recent release of this code is LEWICE 1.6. This code is modular in ...

  17. Space Shuttle Debris Impact Tool Assessment Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard; Rayos, Elonsio M.; Campbell, Charles H.; Rickman, Steven L.; Larsen, Curtis E.

    2007-01-01

    Complex computer codes are used to estimate thermal and structural reentry loads on the Shuttle Orbiter induced by ice and foam debris impact during ascent. Such debris can create cavities in the Shuttle Thermal Protection System. The sizes and shapes of these cavities are approximated to accommodate a code limitation that requires simple "shoebox" geometries to describe the cavities -- rectangular areas and planar walls that are at constant angles with respect to vertical. These approximations induce uncertainty in the code results. The Modern Design of Experiments (MDOE) has recently been applied to develop a series of resource-minimal computational experiments designed to generate low-order polynomial graduating functions to approximate the more complex underlying codes. These polynomial functions were then used to propagate cavity geometry errors to estimate the uncertainty they induce in the reentry load calculations performed by the underlying code. This paper describes a methodological study focused on evaluating the application of MDOE to future operational codes in a rapid and low-cost way to assess the effects of cavity geometry uncertainty.

  18. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction

    PubMed Central

    Wu, Yueying; Liu, Pengyu; Gao, Yuan; Jia, Kebin

    2016-01-01

    High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to implement the otherness encoding for ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems. PMID:27814367

  19. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction.

    PubMed

    Wu, Yueying; Liu, Pengyu; Gao, Yuan; Jia, Kebin

    2016-01-01

    High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to implement the otherness encoding for ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems.
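The ROI-dependent quantization step can be sketched as follows (a toy illustration, not the paper's actual HM15.0 implementation; the block size and QP offsets are assumptions): coding blocks flagged as diagnostic ROI receive a lower QP (finer quantization), while non-ROI background receives a higher QP.

```python
def assign_qp(roi_mask, base_qp=32, roi_offset=-6, bg_offset=+4, block=16):
    """Return a per-block QP map: lower QP (higher fidelity) inside the ROI.

    roi_mask: 2D list of 0/1 pixel flags; all parameter values are
    illustrative assumptions, not the paper's settings.
    """
    h, w = len(roi_mask), len(roi_mask[0])
    qp_map = []
    for by in range(0, h, block):
        row = []
        for bx in range(0, w, block):
            # A block counts as ROI if any pixel inside it is flagged
            in_roi = any(roi_mask[y][x]
                         for y in range(by, min(by + block, h))
                         for x in range(bx, min(bx + block, w)))
            row.append(base_qp + (roi_offset if in_roi else bg_offset))
        qp_map.append(row)
    return qp_map

# 32x32 frame whose top-left 16x16 quadrant is the ROI
mask = [[1 if (y < 16 and x < 16) else 0 for x in range(32)] for y in range(32)]
qp = assign_qp(mask)
print(qp)  # [[26, 36], [36, 36]]
```

A real encoder applies such a map within the H.265/HEVC quad-tree partitioning; this sketch only shows the per-block QP bookkeeping.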

  20. Ethics, health care and spinal cord injury: research, practice and finance.

    PubMed

    Donovan, W H

    2011-02-01

    Dating back to ancient times, mankind has been absorbed with 'doing the right thing', that is, behaving in ways approved by the society and culture of the era in which they lived. This has been and still is especially true for the medical and related health-care professions. Laws and professional codes have evolved over the years that provide guidelines as to how physicians should treat patients, beginning with the one authored by Hippocrates. Only more recently, however, have laws and codes been created to cover health-care research and the advances in health-care practice that have been brought to light by that research. Although these discoveries have clearly impacted the quality and duration of life for people with spinal cord injury and other maladies, they have also raised questions that go beyond the science. Questions such as when, why, how and for how long should such treatments be applied often relate more to what a society and its culture will condone, and the answers can differ and have differed among societies depending on the prevailing ethics and morals. Modern codes and laws have been created so that the trust people have traditionally placed in their healers will not be violated or misused as happened during wars past, especially in Nazi Germany. This paper will trace the evolution of the rules that medical researchers, practitioners and payers for treatment must now follow, and explain why honesty must prevail in guiding all their efforts.

  1. Standardized development of computer software. Part 1: Methods

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    This work is a two-volume set on standards for modern software engineering methodology. This volume presents a tutorial and practical guide to the efficient development of reliable computer software, a unified and coordinated discipline for design, coding, testing, documentation, and project organization and management. The aim of the monograph is to provide formal disciplines for increasing the probability of securing software that is characterized by high degrees of initial correctness, readability, and maintainability, and to promote practices which aid in the consistent and orderly development of a total software system within schedule and budgetary constraints. These disciplines are set forth as a set of rules to be applied during software development to drastically reduce the time traditionally spent in debugging, to increase documentation quality, to foster understandability among those who must come in contact with it, and to facilitate operations and alterations of the program as requirements on the program environment change.

  2. Practical Bayesian tomography

    NASA Astrophysics Data System (ADS)

    Granade, Christopher; Combes, Joshua; Cory, D. G.

    2016-03-01

    In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we address all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby (2012 Phys. Rev. A 85 052120) and by Ferrie (2014 New J. Phys. 16 093035), to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first priors on quantum states and channels that allow for including useful experimental insight. Finally, we develop a method that allows tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
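The Bayesian point-estimation idea can be illustrated with a much-simplified classical analogue (illustrative only; the paper treats full quantum states and channels with sequential Monte Carlo methods, and the counts below are assumed): estimate p = P(measuring 0) for a single qubit from repeated measurements via a posterior evaluated on a grid.

```python
import numpy as np

# Grid-based Bayesian update for a single measurement probability p.
p_grid = np.linspace(0.0, 1.0, 1001)
prior = np.ones_like(p_grid)            # flat prior over [0, 1]

n0, n1 = 80, 20                         # assumed counts of outcomes 0 and 1
posterior = prior * p_grid**n0 * (1.0 - p_grid)**n1   # binomial likelihood
posterior /= posterior.sum()            # normalize on the grid

point_estimate = float((p_grid * posterior).sum())    # posterior mean
print(point_estimate)  # close to the exact posterior mean 81/102 = 0.794...
```

Region estimators follow the same pattern: accumulate posterior mass around the mode until a desired credible level is reached.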

  3. The pedagogical ebb and flow of human patient simulation: empowering through a process of fading support.

    PubMed

    Parker, Brian Corey; Myrick, Florence

    2012-07-01

    The use of the high-fidelity human patient simulator (HPS)-based clinical scenario in undergraduate nursing education is a powerful learning tool, well suited to modern nursing students' preference for immersive construction of knowledge through the provision of contextually rich reality-based practice and social discourse. The purpose of this study was to explore the social-psychological processes that occur within HPS-based clinical scenarios. Grounded theory method was used to study students and faculty sampled from a Western Canadian baccalaureate nursing program. The process of leveled coding generated a substantive theory that has the potential to enable educators to empower students through the use of fading support, a twofold process composed of adaptive scaffolding and dynamic assessment that challenges students to realistically self-regulate and transform their frame of reference for nursing practice, while limiting the threats that traditional HPS-based curriculum can impose. Copyright 2012, SLACK Incorporated.

  4. The long overdue medical specialty: bioethiatrics.

    PubMed

    Kevorkian, J

    1986-11-01

    Traditional bioethical codes have been unable to cope with the results of modern technology and the drastic changes in life patterns. The medical profession can reestablish bioethical order and reassert leadership through a new and urgently needed medical specialty, which the author tentatively calls bioethiatrics or bioethiatry. Bioethiatrics embodies a unique combination of ethical action and moral judgment. Training for the specialty would start with a residency program, consisting of thorough training in philosophy and religion coupled with continued experience in clinical medicine and indoctrination in contemporary research. Requirements would include the practice of general medicine for at least two years after internship, the passing of oral and written examinations after four years of residency, board certification, and subsequent periodic evaluations. Bioethiatricians would assume all the usual privileges, obligations, and risks associated with the practice of any medical specialty, thereby averting unnecessary ethical crises and ensuring a more rational response to present and future moral challenges.

  5. The Long Overdue Medical Specialty: Bioethiatrics

    PubMed Central

    Kevorkian, Jack

    1986-01-01

    Traditional bioethical codes have been unable to cope with the results of modern technology and the drastic changes in life patterns. The medical profession can reestablish bioethical order and reassert leadership through a new and urgently needed medical specialty, which the author tentatively calls bioethiatrics or bioethiatry. Bioethiatrics embodies a unique combination of ethical action and moral judgment. Training for the specialty would start with a residency program, consisting of thorough training in philosophy and religion coupled with continued experience in clinical medicine and indoctrination in contemporary research. Requirements would include the practice of general medicine for at least two years after internship, the passing of oral and written examinations after four years of residency, board certification, and subsequent periodic evaluations. Bioethiatricians would assume all the usual privileges, obligations, and risks associated with the practice of any medical specialty, thereby averting unnecessary ethical crises and ensuring a more rational response to present and future moral challenges. PMID:3795285

  6. Central Heat Plant Modernization: FY98 Update and Recommendations.

    DTIC Science & Technology

    1999-12-01

    Boiler and Pressure Vessel Code suggests an inspection frequency of 12 months for... Services (HQDA, 28 April 1997). ASME International, Boiler and Pressure Vessel Code (ASME International, New York, NY, 1995). Bloomquist, R.G., J.D. Nimmons, and K...

  7. Main functions, recent updates, and applications of Synchrotron Radiation Workshop code

    NASA Astrophysics Data System (ADS)

    Chubar, Oleg; Rakitin, Maksim; Chen-Wiegart, Yu-Chen Karen; Chu, Yong S.; Fluerasu, Andrei; Hidas, Dean; Wiegart, Lutz

    2017-08-01

    The paper presents an overview of the main functions and new application examples of the "Synchrotron Radiation Workshop" (SRW) code. SRW supports high-accuracy calculations of different types of synchrotron radiation, and simulations of propagation of fully-coherent radiation wavefronts, partially-coherent radiation from a finite-emittance electron beam of a storage ring source, and time-/frequency-dependent radiation pulses of a free-electron laser, through X-ray optical elements of a beamline. An extended library of physical-optics "propagators" for different types of reflective, refractive and diffractive X-ray optics with their typical imperfections, implemented in SRW, enables simulation of practically any X-ray beamline in a modern light source facility. The high accuracy of the calculation methods used in SRW allows for multiple applications of this code, not only in the area of development of instruments and beamlines for new light source facilities, but also in areas such as electron beam diagnostics, commissioning and performance benchmarking of insertion devices and individual X-ray optical elements of beamlines. Applications of SRW in these areas, facilitating development and advanced commissioning of beamlines at the National Synchrotron Light Source II (NSLS-II), are described.

  8. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk being misplaced through mischief or lost in severe windstorms and thunderstorms. This presentation will discuss the design and development of a cloud-based, free application utilizing open-source technologies, called Flag the Technology Cloud (FTTCloud), for allowing agricultural stakeholders to color code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible for wider usability on devices of different sizes: smartphones, tablets, desktops, and laptops.

  9. MATLAB for laser speckle contrast analysis (LASCA): a practice-based approach

    NASA Astrophysics Data System (ADS)

    Postnikov, Eugene B.; Tsoy, Maria O.; Postnov, Dmitry E.

    2018-04-01

    Laser Speckle Contrast Analysis (LASCA) is one of the most powerful modern methods for revealing blood dynamics. The experimental design and theory for this method are well established, and the computational recipe is often regarded as trivial. However, the achieved performance and spatial resolution may differ considerably between implementations. We present a mini-review of known approaches to spatial laser speckle contrast data processing and their realization in MATLAB code, providing an explicit correspondence to the mathematical representation and a discussion of available implementations. We also present an algorithm based on the 2D Haar wavelet transform, likewise supplied with program code. This new method provides an opportunity to introduce horizontal, vertical and diagonal speckle contrasts; it may be used for processing highly anisotropic images of vascular trees. We provide a comparative analysis of the accuracy of vascular pattern detection and of the processing times, with special attention to details of the used MATLAB procedures.
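The core LASCA quantity is simple: the local spatial contrast K = sigma/mu of pixel intensities over a small sliding window. A minimal NumPy sketch (the window size and synthetic test images are assumptions; the paper's MATLAB code is the reference implementation):

```python
import numpy as np

def spatial_contrast(img, win=7):
    """Spatial speckle contrast K = std/mean over a win x win sliding window."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    K = np.zeros((h - win + 1, w - win + 1))
    for i in range(K.shape[0]):
        for j in range(K.shape[1]):
            patch = img[i:i + win, j:j + win]
            mu = patch.mean()
            K[i, j] = patch.std() / mu if mu > 0 else 0.0
    return K

rng = np.random.default_rng(1)
# Fully developed static speckle has K near 1; motion blurs it toward 0.
static = rng.exponential(1.0, size=(64, 64))   # exponential intensity stats
blurred = static * 0.1 + 0.9                   # low-contrast (flow-like) region
Ks = spatial_contrast(static)
Kb = spatial_contrast(blurred)
print(Ks.mean(), Kb.mean())
```

Vectorizing the window statistics (e.g. via convolution) is where implementations differ in speed and resolution, which is the comparison the paper makes.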

  10. Performance Analysis, Design Considerations, and Applications of Extreme-Scale In Situ Infrastructures

    DOE PAGES

    Ayachit, Utkarsh; Bauer, Andrew; Duque, Earl P. N.; ...

    2016-11-01

    A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. Our paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: Scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.

  11. An assessment of the Bhutanese traditional medicine for its ethnopharmacology, ethnobotany and ethnoquality: Textual understanding and the current practices.

    PubMed

    Wangchuk, Phurpa; Pyne, Stephen G; Keller, Paul A

    2013-06-21

    This study involves the assessment of the Bhutanese traditional medicine (BTM), which was integrated with mainstream biomedicine in 1967 to provide primary health care services in the country. It caters to 20-30% of the daily out-patients within 49 traditional medicine units attached to 20 district modern hospitals and 29 Basic Health Units in the country. This study presents the ethnopharmacological, ethnobotanical and ethnoquality concepts in relation to mainstream Tibetan medicine and describes the current practices of BTM. Experienced BTM practitioners (Drung-tshos and Smen-pas) were selected using a convenience sampling method and were interviewed using an open questionnaire followed by informal discussions. The corpus of BTM, Tibetan and scientific literature was obtained, and the information on ethnopharmacological, ethnoquality and ethnobotanical concepts and current practices of BTM was extracted. This study found that the BTM shares many similarities in terms of materia medica, pharmacopoeia and the principles and concepts of ethnopharmacology and ethnobotany with mainstream Tibetan medicine. However, the resourceful Bhutanese Drung-tshos and Smen-pas have adapted this medical system to the local language, culture, disease trend, health care needs and their familiarity with the locally available medicinal ingredients, making it particular to the country. A number of notable distinctions observed in the current practices include a code of classification of diseases (only 79 of 404 types of disorders recognized), formulations (only 103 of thousands of formulation types currently used), usage of medicinal plants (only 229 of thousands of described species) and selected treatment procedures (golden needle and water therapy). The BTM was found to cater to 20-30% of daily out-patients visiting 49 modern hospitals and basic health units in the country. The BTM has evolved from Tibetan medicine.
While the pharmacopoeia, ethnopharmacology, ethnobotany and ethnoquality aspects share commonalities with mainstream Tibetan medicine, some practices are unique to BTM. These unique current practices include its formulations, medicinal plant collection and usage, and treatment procedures including golden needle and water therapy. This could be a promising source of information for the rediscovery of useful remedies, the development of modern phytotherapeutics and the establishment of efficient quality control measures. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  12. An assessment of seismic monitoring in the United States; requirement for an Advanced National Seismic System

    USGS Publications Warehouse

    ,

    1999-01-01

    This report assesses the status, needs, and associated costs of seismic monitoring in the United States. It sets down the requirement for an effective, national seismic monitoring strategy and an advanced system linking national, regional, and urban monitoring networks. Modernized seismic monitoring can provide alerts of imminent strong earthquake shaking; rapid assessment of distribution and severity of earthquake shaking (for use in emergency response); warnings of a possible tsunami from an offshore earthquake; warnings of volcanic eruptions; information for correctly characterizing earthquake hazards and for improving building codes; and data on response of buildings and structures during earthquakes, for safe, cost-effective design, engineering, and construction practices in earthquake-prone regions.

  13. IEEE 1547 Standards Advancing Grid Modernization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basso, Thomas; Chakraborty, Sudipta; Hoke, Andy

    Technology advances including development of advanced distributed energy resources (DER) and grid-integrated operations and controls functionalities have surpassed the requirements in current standards and codes for DER interconnection with the distribution grid. The full revision of IEEE Standards 1547 (requirements for DER-grid interconnection and interoperability) and 1547.1 (test procedures for conformance to 1547) are establishing requirements and best practices for state-of-the-art DER including variable renewable energy sources. The revised standards will also address challenges associated with interoperability and transmission-level effects, in addition to strictly addressing the distribution grid needs. This paper provides the status and future direction of the ongoing development focus for the 1547 standards.

  14. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine-specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  15. Correcting quantum errors with entanglement.

    PubMed

    Brun, Todd; Devetak, Igor; Hsieh, Min-Hsiu

    2006-10-20

    We show how entanglement shared between encoder and decoder can simplify the theory of quantum error correction. The entanglement-assisted quantum codes we describe do not require the dual-containing constraint necessary for standard quantum error-correcting codes, thus allowing us to "quantize" all of classical linear coding theory. In particular, efficient modern classical codes that attain the Shannon capacity can be made into entanglement-assisted quantum codes attaining the hashing bound (closely related to the quantum capacity). For systems without large amounts of shared entanglement, these codes can also be used as catalytic codes, in which a small amount of initial entanglement enables quantum communication.
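The classical building block being "quantized" here is the linear code with its parity-check matrix and syndrome decoding. A minimal classical [7,4] Hamming example in Python (illustrative only; entanglement-assisted constructions lift such classical codes to quantum ones without the dual-containing constraint):

```python
import numpy as np

# Parity-check matrix of the [7,4] Hamming code; column i is the binary
# representation of i+1, so the syndrome directly names the flipped bit.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def correct(word):
    """Correct any single bit-flip via syndrome decoding."""
    syndrome = H @ word % 2
    pos = int("".join(map(str, syndrome)), 2)  # 0 means 'no error detected'
    if pos:
        word = word.copy()
        word[pos - 1] ^= 1
    return word

codeword = np.array([1, 0, 1, 0, 1, 0, 1])   # a valid Hamming codeword
noisy = codeword.copy()
noisy[4] ^= 1                                 # flip bit 5
print((correct(noisy) == codeword).all())     # True
```

In the quantum setting the analogous syndrome is extracted by measuring stabilizer operators rather than parity sums, but the decoding logic is the same shape.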

  16. TRIQS: A toolbox for research on interacting quantum systems

    NASA Astrophysics Data System (ADS)

    Parcollet, Olivier; Ferrero, Michel; Ayral, Thomas; Hafermann, Hartmut; Krivenko, Igor; Messio, Laura; Seth, Priyanka

    2015-11-01

    We present the TRIQS library, a Toolbox for Research on Interacting Quantum Systems. It is an open-source, computational physics library providing a framework for the quick development of applications in the field of many-body quantum physics, and in particular, strongly-correlated electronic systems. It supplies components to develop codes in a modern, concise and efficient way: e.g. Green's function containers, a generic Monte Carlo class, and simple interfaces to HDF5. TRIQS is a C++/Python library that can be used from either language. It is distributed under the GNU General Public License (GPLv3). State-of-the-art applications based on the library, such as modern quantum many-body solvers and interfaces between density-functional-theory codes and dynamical mean-field theory (DMFT) codes are distributed along with it.
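As a hedged, library-free sketch of the kind of object a Green's function container holds (not TRIQS's actual API; beta and the level energy are assumed values): the free-fermion Matsubara Green's function G(iw_n) = 1/(iw_n - eps) with w_n = (2n+1)*pi/beta, stored on a frequency mesh.

```python
import numpy as np

beta, eps = 10.0, 0.5     # assumed inverse temperature and level energy
n = np.arange(-128, 128)
iw = 1j * (2 * n + 1) * np.pi / beta   # fermionic Matsubara frequencies
G = 1.0 / (iw - eps)                   # free Green's function on the mesh

# Two properties a container can exploit: the 1/(i w_n) high-frequency
# tail, and the symmetry G(-i w_n) = conj(G(i w_n)).
print(abs(G[-1] * iw[-1] - 1.0) < 0.01)   # True
```

A container bundles such data with its mesh, tail expansion, and HDF5 serialization, which is what the library components mentioned above provide.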

  17. A Silent Revolution: From Sketching to Coding--A Case Study on Code-Based Design Tool Learning

    ERIC Educational Resources Information Center

    Xu, Song; Fan, Kuo-Kuang

    2017-01-01

    With the rise of information technology, Computer Aided Design activities are becoming more modern and more complex, but learning how to operate these new design tools has become the main problem facing each designer. This study was aimed at finding problems encountered during the code-based design tool learning period of…

  18. Changes in the social context and conduct of eating in four Nordic countries between 1997 and 2012.

    PubMed

    Holm, Lotte; Lauridsen, Drude; Lund, Thomas Bøker; Gronow, Jukka; Niva, Mari; Mäkelä, Johanna

    2016-08-01

    How have eating patterns changed in modern life? In public and academic debate concern has been expressed that the social function of eating may be challenged by de-structuration and the dissolution of traditions. We analyzed changes in the social context and conduct of eating in four Nordic countries over the period 1997-2012. We focused on three interlinked processes often claimed to be distinctive of modern eating: delocalization of eating from private households to commercial settings, individualization in the form of more eating alone, and informalization, implying more casual codes of conduct. We based the analysis on data from two surveys conducted in Denmark, Finland, Norway and Sweden in 1997 and 2012. The surveys reported in detail one day of eating in representative samples of adult populations in the four countries (N = 4823 and N = 8242). We compared data regarding where, with whom, and for how long people ate, and whether parallel activities took place while eating. While Nordic people's primary location for eating remained the home and the workplace, the practices of eating in haste, and while watching television increased and using tablets, computers and smartphones while eating was frequent in 2012. Propensity to eat alone increased slightly in Denmark and Norway, and decreased slightly in Sweden. While such practices vary with socio-economic background, regression analysis showed several changes were common across the Nordic populations. However, the new practice of using tablets, computers, and smartphones while eating was strongly associated with young age. Further, each of the practices appeared to be related to different types of meal. We conclude that while the changes in the social organization of eating were not dramatic, signs of individualization and informalization could be detected. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Application of Modern Fortran to Spacecraft Trajectory Design and Optimization

    NASA Technical Reports Server (NTRS)

    Williams, Jacob; Falck, Robert D.; Beekman, Izaak B.

    2018-01-01

    In this paper, applications of the modern Fortran programming language to the field of spacecraft trajectory optimization and design are examined. Modern object-oriented Fortran has many advantages for scientific programming, although many legacy Fortran aerospace codes have not been upgraded to use the newer standards (or have been rewritten in other languages perceived to be more modern). NASA's Copernicus spacecraft trajectory optimization program, originally a combination of Fortran 77 and Fortran 95, has attempted to keep up with modern standards and makes significant use of the new language features. Various algorithms and methods are presented from trajectory tools such as Copernicus, as well as modern Fortran open source libraries and other projects.

  20. Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators

    PubMed Central

    Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew

    2014-01-01

    Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as the Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved a substantial speedup over the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided GPU speedups over both the sequential implementation and a parallelized OpenMP implementation. An implementation of OpenMP on the Intel MIC coprocessor provided speedups with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in multi-dimensional media. PMID:24497950
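The kernel being accelerated in such studies is, at heart, a finite-difference stencil swept over the grid each timestep. A NumPy sketch of one 2D wave-equation update (the grid size, dt, dx, and wave speed are assumed values, not the cardiac model's parameters):

```python
import numpy as np

def step(u_prev, u, c=1.0, dt=0.01, dx=0.1):
    """One leapfrog update of the 2D wave equation u_tt = c^2 (u_xx + u_yy).

    This interior loop nest is what OpenACC/OpenCL/OpenMP would parallelize
    across the grid; NumPy slicing plays that role here.
    """
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] +
                       u[1:-1, 2:] + u[1:-1, :-2] -
                       4 * u[1:-1, 1:-1]) / dx**2
    return 2 * u - u_prev + (c * dt)**2 * lap

# Gaussian pulse in the middle of the grid (boundary values held fixed)
x = np.linspace(-1, 1, 101)
X, Y = np.meshgrid(x, x)
u0 = np.exp(-50 * (X**2 + Y**2))
u1 = step(u0, u0)      # first step from rest (u_prev = u)
u2 = step(u0, u1)
print(u2.shape)
```

The chosen c*dt/dx = 0.1 keeps the explicit scheme stable; in an accelerated version only the `step` body changes (pragmas or kernel code), not the surrounding time loop.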

  1. RELAP-7 Level 2 Milestone Report: Demonstration of a Steady State Single Phase PWR Simulation with RELAP-7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Andrs; Ray Berry; Derek Gaston

    The document contains the simulation results of a steady state model PWR problem with the RELAP-7 code. The RELAP-7 code is the next generation nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework - MOOSE (Multi-Physics Object-Oriented Simulation Environment). This report summarizes the initial results of simulating a model steady-state single phase PWR problem using the current version of the RELAP-7 code. The major purpose of this demonstration simulation is to show that RELAP-7 code can be rapidly developed to simulate single-phase reactor problems. RELAP-7 is a new project started on October 1st, 2011. It will become the main reactor systems simulation toolkit for RISMC (Risk Informed Safety Margin Characterization) and the next generation tool in the RELAP reactor safety/systems analysis application series (the replacement for RELAP5). The key to the success of RELAP-7 is the simultaneous advancement of physical models, numerical methods, and software design while maintaining a solid user perspective. Physical models include both PDEs (Partial Differential Equations) and ODEs (Ordinary Differential Equations) and experimentally based closure models. RELAP-7 will eventually utilize well posed governing equations for multiphase flow, which can be strictly verified. Closure models used in RELAP5 and newly developed models will be reviewed and selected to reflect the progress made during the past three decades. RELAP-7 uses modern numerical methods, which allow implicit time integration, higher order schemes in both time and space, and strongly coupled multi-physics simulations. RELAP-7 is written in the object-oriented programming language C++. Its development follows modern software design paradigms. The code is easy to read, develop, maintain, and couple with other codes.
Most importantly, the modern software design allows the RELAP-7 code to evolve with time. RELAP-7 is a MOOSE-based application. MOOSE (Multiphysics Object-Oriented Simulation Environment) is a framework for solving computational engineering problems in a well-planned, managed, and coordinated way. By leveraging millions of lines of open source software packages, such as PETSC (a nonlinear solver developed at Argonne National Laboratory) and LibMesh (a Finite Element Analysis package developed at University of Texas), MOOSE significantly reduces the expense and time required to develop new applications. Numerical integration methods and mesh management for parallel computation are provided by MOOSE. Therefore RELAP-7 code developers only need to focus on physics and user experiences. By using the MOOSE development environment, RELAP-7 code is developed by following the same modern software design paradigms used for other MOOSE development efforts. There are currently over 20 different MOOSE based applications ranging from 3-D transient neutron transport, detailed 3-D transient fuel performance analysis, to long-term material aging. Multi-physics and multiple dimensional analyses capabilities can be obtained by coupling RELAP-7 and other MOOSE based applications and by leveraging with capabilities developed by other DOE programs. This allows restricting the focus of RELAP-7 to systems analysis-type simulations and gives priority to retain and significantly extend RELAP5's capabilities.« less

  2. Code modernization and modularization of APEX and SWAT watershed simulation models

    USDA-ARS?s Scientific Manuscript database

    SWAT (Soil and Water Assessment Tool) and APEX (Agricultural Policy / Environmental eXtender) are, respectively, large- and small-watershed simulation models derived from EPIC (Environmental Policy Integrated Climate), a field-scale agroecology simulation model. All three models are coded in FORTRAN an...

  3. The Plasma Simulation Code: A modern particle-in-cell code with patch-based load-balancing

    NASA Astrophysics Data System (ADS)

    Germaschewski, Kai; Fox, William; Abbott, Stephen; Ahmadi, Narges; Maynard, Kristofor; Wang, Liang; Ruhl, Hartmut; Bhattacharjee, Amitava

    2016-08-01

    This work describes the Plasma Simulation Code (PSC), an explicit, electromagnetic particle-in-cell code with support for particle shape functions of different order. We review the basic components of the particle-in-cell method as well as the computational architecture of PSC, which supports modular algorithms and data structures. We then describe and analyze in detail a distinguishing feature of PSC: patch-based load balancing using space-filling curves, which is shown to lead to major efficiency gains over unbalanced methods and over a simpler balancing method used previously.
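    The patch-based balancing scheme can be illustrated with a short sketch. The code below is not from PSC; it uses a Morton (Z-order) curve, one common choice of space-filling curve, and a simple greedy cut into contiguous chunks of roughly equal load. All names and the work model are illustrative assumptions.

```python
def morton_index(ix, iy, bits=8):
    """Interleave the bits of 2-D patch coordinates into a Z-order key."""
    key = 0
    for b in range(bits):
        key |= ((ix >> b) & 1) << (2 * b)
        key |= ((iy >> b) & 1) << (2 * b + 1)
    return key

def balance(patches, nranks):
    """Assign patches (dicts with grid coords and a work estimate) to ranks.

    Patches are ordered along the space-filling curve and cut into
    contiguous chunks of roughly equal total load, which keeps each
    rank's set of patches spatially compact.
    """
    ordered = sorted(patches, key=lambda p: morton_index(p["ix"], p["iy"]))
    total = sum(p["load"] for p in ordered)
    target = total / nranks
    assignment, rank, acc = [], 0, 0.0
    for p in ordered:
        # Move to the next rank once its share of the total load is filled.
        if acc >= target * (rank + 1) and rank < nranks - 1:
            rank += 1
        assignment.append((rank, p))
        acc += p["load"]
    return assignment
```

    Because patches that are adjacent along the curve tend to be adjacent in space, each rank receives a compact region of the domain, which keeps halo-exchange traffic low even as loads shift.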

  4. Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language

    NASA Astrophysics Data System (ADS)

    Heaphy, R. T.; Burke, M. P.; Love, J. T.

    2015-12-01

    Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems are limited in their ability to integrate with modern software and hardware and to leverage parallel computing, which has left voids in optimization, pre-processing, and post-processing tools. Advances in technology and in our scientific understanding of environmental processes over the last 30 years mandate upgrades to HSPF so that it can evolve and remain a premier tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) converting the code to a modern, widely accepted, open-source, high-performance language; and (2) converting the model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages such as C and FORTRAN, integrating Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed.
The converted code has been tested against HSPF's suite of "test" runs, showing good agreement and similar execution times when using the Numba compiler. Continued verification of the converted code's accuracy against more complex legacy applications, along with improvement of execution times through an intelligent network change detection tool, is currently underway, and preliminary results will be presented.
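    The Numba pattern described above can be sketched roughly as follows. This is not HSPF code, just a toy storage-routing loop of the kind a just-in-time compiler accelerates; the fallback decorator keeps the sketch runnable even where Numba is not installed.

```python
import numpy as np

try:
    from numba import njit  # just-in-time compiler, if available
except ImportError:          # fall back to plain Python so the sketch still runs
    def njit(func):
        return func

@njit
def route_storage(inflow, k, dt):
    """Toy linear-reservoir routing loop (illustrative only, not an HSPF routine).

    Integrates S' = inflow - k*S with explicit Euler and returns the
    outflow series k*S. Tight per-step loops like this are where Numba's
    compilation recovers FORTRAN-like speed.
    """
    n = inflow.shape[0]
    out = np.empty(n)
    s = 0.0
    for i in range(n):
        s += (inflow[i] - k * s) * dt
        out[i] = k * s
    return out
```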

  5. Computing Legacy Software Behavior to Understand Functionality and Security Properties: An IBM/370 Demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linger, Richard C; Pleszkoch, Mark G; Prowell, Stacy J

    Organizations maintaining mainframe legacy software can benefit from code modernization and incorporation of security capabilities to address the current threat environment. Oak Ridge National Laboratory is developing the Hyperion system to compute the behavior of software as a means to gain understanding of software functionality and security properties. Computation of functionality is critical to revealing security attributes, which are in fact specialized functional behaviors of software. Oak Ridge is collaborating with MITRE Corporation on a demonstration project to compute the behavior of legacy IBM Assembly Language code for a federal agency. The ultimate goal is to understand functionality and security vulnerabilities as a basis for code modernization. This paper reports on the first phase: defining functional semantics for IBM Assembly instructions and conducting behavior computation experiments.

  6. Interoception, contemplative practice, and health

    PubMed Central

    Farb, Norman; Daubenmier, Jennifer; Price, Cynthia J.; Gard, Tim; Kerr, Catherine; Dunn, Barnaby D.; Klein, Anne Carolyn; Paulus, Martin P.; Mehling, Wolf E.

    2015-01-01

    Interoception can be broadly defined as the sense of signals originating within the body. As such, interoception is critical for our sense of embodiment, motivation, and well-being. And yet, despite its importance, interoception remains poorly understood within modern science. This paper reviews interdisciplinary perspectives on interoception, with the goal of presenting a unified perspective from diverse fields such as neuroscience, clinical practice, and contemplative studies. It is hoped that this integrative effort will advance our understanding of how interoception determines well-being, and identify the central challenges to such understanding. To this end, we introduce an expanded taxonomy of interoceptive processes, arguing that many of these processes can be understood through an emerging predictive coding model for mind–body integration. The model, which describes the tension between expected and felt body sensation, parallels contemplative theories, and implicates interoception in a variety of affective and psychosomatic disorders. We conclude that maladaptive construal of bodily sensations may lie at the heart of many contemporary maladies, and that contemplative practices may attenuate these interpretative biases, restoring a person’s sense of presence and agency in the world. PMID:26106345

  7. Measurement Techniques for Clock Jitter

    NASA Technical Reports Server (NTRS)

    Lansdowne, Chatwin; Schlesinger, Adam

    2012-01-01

    NASA is in the process of modernizing its communications infrastructure to accompany the development of a Crew Exploration Vehicle (CEV) to replace the shuttle. With this effort comes the opportunity to infuse more advanced coded modulation techniques, including low-density parity-check (LDPC) codes that offer greater coding gains than the current capability. However, in order to take full advantage of these codes, the ground segment receiver synchronization loops must be able to operate at a lower signal-to-noise ratio (SNR) than supported by equipment currently in use.

  8. Neural Decoder for Topological Codes

    NASA Astrophysics Data System (ADS)

    Torlai, Giacomo; Melko, Roger G.

    2017-07-01

    We present an algorithm for error correction in topological codes that exploits modern machine learning techniques. Our decoder is constructed from a stochastic neural network called a Boltzmann machine, of the type extensively used in deep learning. We provide a general prescription for the training of the network and a decoding strategy that is applicable to a wide variety of stabilizer codes with very little specialization. We demonstrate the neural decoder numerically on the well-known two-dimensional toric code with phase-flip errors.
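    For readers unfamiliar with the setting, the decoder's input is a syndrome computed from the code's stabilizer (parity-check) structure. The minimal sketch below uses a 3-bit repetition code and a lookup table standing in for the trained Boltzmann machine; it illustrates the decoding problem, not the paper's neural method.

```python
import numpy as np

# Parity-check matrix of the 3-bit repetition code (a toy stabilizer code).
H = np.array([[1, 1, 0],
              [0, 1, 1]])

def syndrome(error):
    """Syndrome s = H e (mod 2); the decoder sees s, never the error itself."""
    return tuple(H @ error % 2)

# A decoder maps each syndrome to a likely error. This lookup table stands
# in for the trained stochastic neural network of the paper.
DECODER = {
    (0, 0): np.array([0, 0, 0]),
    (1, 0): np.array([1, 0, 0]),
    (1, 1): np.array([0, 1, 0]),
    (0, 1): np.array([0, 0, 1]),
}

def correct(error):
    """Apply the decoder's recovery; single-bit errors return to the codespace."""
    return (error + DECODER[syndrome(error)]) % 2
```

    On the toric code the same structure holds, but the syndrome-to-error map is far too large to tabulate, which is the gap a learned decoder fills.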

  9. Quality of recording of diabetes in the UK: how does the GP's method of coding clinical data affect incidence estimates? Cross-sectional study using the CPRD database

    PubMed Central

    Tate, A Rosemary; Dungey, Sheena; Glew, Simon; Beloff, Natalia; Williams, Rachael; Williams, Tim

    2017-01-01

    Objective To assess the effect of coding quality on estimates of the incidence of diabetes in the UK between 1995 and 2014. Design A cross-sectional analysis examining diabetes coding from 1995 to 2014 and how the choice of codes (diagnosis codes vs codes which suggest diagnosis) and quality of coding affect estimated incidence. Setting Routine primary care data from 684 practices contributing to the UK Clinical Practice Research Datalink (data contributed from Vision (INPS) practices). Main outcome measure Incidence rates of diabetes and how they are affected by (1) GP coding and (2) excluding ‘poor’ quality practices with at least 10% of incident patients inaccurately coded between 2004 and 2014. Results Incidence rates and accuracy of coding varied widely between practices, and the trends differed according to the selected category of code. If diagnosis codes were used, the incidence of type 2 diabetes increased sharply until 2004 (when the UK Quality and Outcomes Framework was introduced), then flattened off until 2009, after which it decreased. If non-diagnosis codes were included, the numbers continued to increase until 2012. Although coding quality improved over time, 15% of the 666 practices that contributed data between 2004 and 2014 were labelled ‘poor’ quality. When these practices were dropped from the analyses, the downward trend in the incidence of type 2 diabetes after 2009 became less marked and incidence rates were higher. Conclusions In contrast to some previous reports, diabetes incidence (based on diagnostic codes) appears not to have increased since 2004 in the UK. The choice of codes can make a significant difference to incidence estimates, as can the quality of recording. Codes and data quality should be checked when assessing incidence rates using GP data. PMID:28122831

  10. Code of Fair Testing Practices in Education (Revised)

    ERIC Educational Resources Information Center

    Educational Measurement: Issues and Practice, 2005

    2005-01-01

    A note from the Working Group of the Joint Committee on Testing Practices: The "Code of Fair Testing Practices in Education (Code)" prepared by the Joint Committee on Testing Practices (JCTP) has just been revised for the first time since its initial introduction in 1988. The revision of the Code was inspired primarily by the revision of…

  11. 76 FR 77237 - U.S. National Authority for the WHO Global Code of Practice on the International Recruitment of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-12

    ... Authority for the WHO Global Code of Practice on the International Recruitment of Health Personnel; Notice... Global Code of Practice on the International Recruitment of Health Personnel, notice is hereby given of... Global Code of Practice on International Recruitment of Health Personnel is ``to establish and promote...

  12. An algebraic hypothesis about the primeval genetic code architecture.

    PubMed

    Sánchez, Robersy; Grau, Ricardo

    2009-09-01

    A plausible architecture of an ancient genetic code is derived from an extended base-triplet vector space over the Galois field of the extended base alphabet {D, A, C, G, U}, where the symbol D represents one or more hypothetical bases with unspecific pairings. We hypothesized that the high degeneracy of a primeval genetic code with five bases, together with the gradual origin and improvement of a primeval DNA repair system, could have made possible the transition from ancient to modern genetic codes. Our results suggest that the Watson-Crick base pairings G≡C and A=U and the non-specific base pairing of the hypothetical ancestral base D, used to define the sum and product operations, are sufficient to determine the coding constraints of the primeval and the modern genetic codes, as well as the transition from the former to the latter. Geometrical and algebraic properties of this vector space reveal that the present codon assignment of the standard genetic code could have been induced from a primeval codon assignment. In addition, the Fourier spectrum of the extended DNA genome sequences derived from multiple sequence alignment suggests that the so-called period-3 property of present coding DNA sequences could also have existed in ancient coding DNA sequences. Phylogenetic analyses performed with metrics defined in the N-dimensional vector space (B(3))(N) of DNA sequences, and with the new evolutionary model presented here, also suggest that an ancient DNA coding sequence with five or more bases does not contradict the expected evolutionary history.
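    The sum and product operations over the five-letter alphabet can be illustrated with arithmetic modulo 5; the base-to-element ordering below is an illustrative assumption, not necessarily the assignment used by the authors.

```python
# Extended base alphabet mapped onto Z_5 = GF(5). The ordering is an
# illustrative assumption, not necessarily the one used in the paper.
BASES = ["D", "A", "C", "G", "U"]
IDX = {b: i for i, b in enumerate(BASES)}

def base_sum(x, y):
    """Sum of two bases, inherited from addition modulo 5 in GF(5)."""
    return BASES[(IDX[x] + IDX[y]) % 5]

def base_prod(x, y):
    """Product of two bases, inherited from multiplication modulo 5."""
    return BASES[(IDX[x] * IDX[y]) % 5]

def triplet_sum(t1, t2):
    """Component-wise sum of codons, i.e. vectors in the space (GF(5))^3."""
    return "".join(base_sum(a, b) for a, b in zip(t1, t2))
```

    Under such a mapping every codon is a vector of three field elements, so distances and algebraic transformations between codons are well defined, which is what makes the geometric arguments in the abstract possible.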

  13. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  14. ITK: enabling reproducible research and open science.

    PubMed

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  15. General practice--a post-modern specialty?

    PubMed Central

    Mathers, N; Rowland, S

    1997-01-01

    The 'modern' view of the world is based on the premise that we can discover the essential truth of the world using scientific method. The assumption is made that knowledge so acquired has been 'uncontaminated' by the mind of the investigator. Post-modern theory, however, is concerned with the process of knowing and how our minds are part of the process, i.e. our perceptions of reality and the relationships between different concepts are important influences on our ways of knowing. The values of post-modern theory are those of uncertainty, many different voices and experiences of reality and multifaceted descriptions of truth. These values are closer to our experience of general practice than the 'modern' values of scientific rationalism and should be reflected in a new curriculum for general practice. PMID:9167325

  16. Values and principles evident in current health promotion practice.

    PubMed

    Gregg, Jane; O'Hara, Lily

    2007-04-01

    Modern health promotion practice needs to respond to complex health issues that have multiple interrelated determinants. This requires an understanding of the values and principles of health promotion. A literature review was undertaken to explore the values and principles evident in current health promotion theory and practice. A broad range of values and principles are espoused as being integral to modern health promotion theory and practice. Although there are some commonalities across these lists, there is no recognised, authoritative set of values and principles accepted as fundamental and applicable to modern health promotion. There is a continuum of values and principles evident in health promotion practice from those associated with holistic, ecological, salutogenic health promotion to those more in keeping with conventional health promotion. There is a need for a system of values and principles consistent with modern health promotion that enables practitioners to purposefully integrate these values and principles into their understanding of health, as well as their needs assessment, planning, implementation and evaluation practice.

  17. Exploring multiliteracies, student voice, and scientific practices in two elementary classrooms

    NASA Astrophysics Data System (ADS)

    Allison, Elizabeth Rowland

    This study explored the voices of children in a changing world with evolving needs and new opportunities. The workplaces of rapidly moving capitalist societies value creativity, collaboration, and critical-thinking skills, which are of growing importance and are manifesting themselves in modern K-12 science classroom cultures (Gee, 2000; New London Group, 2000). This study explored issues of multiliteracies and student voice set within the context of teaching and learning in 4th- and 5th-grade science classrooms. The purpose of the study was to ascertain what and how multiliteracies and scientific practices (NGSS Lead States, 2013c) are implemented, explore how multiliteracies influence students' voices, and investigate teacher and student perceptions of multiliteracies, student voice, and scientific practices. Grounded in a constructivist framework, the study employed a multiple case study design in two elementary classrooms. Through observations, student focus groups and interviews, and teacher interviews, a detailed narrative was created to describe the range of multiliteracies, student voice, and scientific practices that occurred within the science classroom context. Using grounded theory analysis, data were coded and analyzed to reveal emergent themes. Data analysis revealed that these two classrooms were enriched with multiliteracies that serve, metaphorically, as breeding grounds for student voice. In the modern classroom, defined as a space where information is instantly accessible through the Internet, multiliteracies can be developed through inquiry-based, collaborative, and technology-rich experiences. Scientific literacy, cultivated through student communication and collaboration, is arguably a multiliteracy that has not been considered in the literature and should be, as an integral component of overall individual literacy in the 21st century. Findings revealed four themes.
Three themes suggest that teachers address several modes of multiliteracies in science but identify barriers to integrating multiliteracies and scientific practices into science teaching. These barriers include time, increased standards accountability, and lack of comfort with effective integration of technology. The fourth theme revealed that students have the ability to shape and define their learning while supporting other voices through collaborative science experiences.

  18. Novel Scalable 3-D MT Inverse Solver

    NASA Astrophysics Data System (ADS)

    Kuvshinov, A. V.; Kruglyakov, M.; Geraskin, A.

    2016-12-01

    We present a new, robust and fast three-dimensional (3-D) magnetotelluric (MT) inverse solver. As the forward modelling engine, the highly scalable solver extrEMe [1] is used. The (regularized) inversion is based on an iterative gradient-type optimization (a quasi-Newton method) and exploits an adjoint-source approach for fast calculation of the gradient of the misfit. The inverse solver is able to deal with highly detailed and contrasting models, allows for working (separately or jointly) with any type of MT response (single-site and/or inter-site), and supports massive parallelization. Different parallelization strategies implemented in the code allow for optimal usage of the available computational resources for a given problem setup. To parameterize the inverse domain, a mask approach is implemented, meaning that any subset of forward-modelling cells can be merged in order to account for the (usually) irregular distribution of observation sites. We report results of 3-D numerical experiments aimed at analysing the robustness, performance, and scalability of the code. In particular, our computational experiments, carried out on platforms ranging from modern laptops to high-performance clusters, demonstrate practically linear scalability of the code up to thousands of nodes. 1. Kruglyakov, M., A. Geraskin, A. Kuvshinov, 2016. Novel accurate and scalable 3-D MT forward solver based on a contracting integral equation method, Computers and Geosciences, in press.
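    The iterative gradient-type optimization at the heart of such an inversion can be sketched on a toy problem. Here the forward operator is a small explicit matrix rather than a 3-D EM solver, and plain gradient descent stands in for the quasi-Newton method; everything below is an illustrative assumption, not the authors' code.

```python
import numpy as np

def invert(G, d_obs, alpha=0.1, lr=0.05, n_iter=500):
    """Regularized gradient descent for min ||G m - d||^2 + alpha ||m||^2.

    G plays the role of the forward operator. In a real MT inversion the
    gradient of the misfit would come from an adjoint-source computation
    (one extra forward-type solve) instead of an explicit matrix product.
    """
    m = np.zeros(G.shape[1])
    for _ in range(n_iter):
        r = G @ m - d_obs                     # data residual
        grad = 2 * G.T @ r + 2 * alpha * m    # misfit + regularization gradient
        m -= lr * grad                        # gradient-type model update
    return m
```

    The attraction of the adjoint approach is that the gradient costs roughly one additional forward solve per iteration, independent of the number of model parameters, which is what makes gradient-type 3-D inversion tractable.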

  19. Xyce parallel electronic simulator users guide, version 6.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of Sandia National Laboratories' electrical designers. Development has focused on improving capability over the current state of the art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from the solver algorithms, allowing new types of analysis to be developed without requiring the implementation of analysis-specific device models; device models specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase: a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
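    The DAE formulation mentioned above can be illustrated on a minimal circuit-like example; this is a generic implicit-Euler sketch, not Xyce's actual formulation or API.

```python
import numpy as np

def rc_dae(u=1.0, R=1.0, C=1.0, dt=0.01, n=1000):
    """Implicit-Euler solve of a small DAE (an RC circuit driven by source u).

    Unknowns y = (v, i):
        C*dv/dt - i = 0          (differential equation)
        i - (u - v)/R = 0        (algebraic constraint)
    Each step solves the linear system arising from the implicit
    discretization; the device equations enter only through that system,
    which is the separation of device models from solvers that a DAE
    formulation buys. Generic sketch only.
    """
    v = 0.0
    for _ in range(n):
        # Implicit Euler: C*(v_new - v)/dt = i_new, with i_new = (u - v_new)/R
        A = np.array([[C / dt, -1.0],
                      [1.0 / R, 1.0]])
        b = np.array([C / dt * v, u / R])
        v, i = np.linalg.solve(A, b)
    return v, i
```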

  20. Xyce parallel electronic simulator users' guide, Version 6.0.1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of Sandia National Laboratories' electrical designers. Development has focused on improving capability over the current state of the art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from the solver algorithms, allowing new types of analysis to be developed without requiring the implementation of analysis-specific device models; device models specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase: a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  21. Xyce parallel electronic simulator users guide, version 6.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of Sandia National Laboratories' electrical designers. Development has focused on improving capability over the current state of the art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from the solver algorithms, allowing new types of analysis to be developed without requiring the implementation of analysis-specific device models; device models specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase: a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  22. User input verification and test driven development in the NJOY21 nuclear data processing code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trainer, Amelia Jo; Conlin, Jeremy Lloyd; McCartney, Austin Paul

    Before physically meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g., cross sections and angular distributions). Perhaps the most popular and widely trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides validation of user input. By giving rapid and helpful responses to users while they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document serves as a discussion of the current state of input checking and testing practices in NJOY21.
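    Input validation of the kind described, collecting helpful messages rather than aborting on the first problem, can be sketched as follows; the card and field names below are hypothetical, not NJOY21's actual input schema.

```python
def validate_card(card, schema):
    """Check a card (dict of user inputs) against a schema, collecting all
    helpful messages rather than failing on the first error.

    Hypothetical card/schema names -- illustrative only.
    """
    errors = []
    for name, (typ, lo, hi) in schema.items():
        if name not in card:
            errors.append(f"missing required field '{name}'")
            continue
        value = card[name]
        if not isinstance(value, typ):
            errors.append(f"'{name}' must be {typ.__name__}, "
                          f"got {type(value).__name__}")
        elif not (lo <= value <= hi):
            errors.append(f"'{name}' = {value} outside allowed range [{lo}, {hi}]")
    return errors

# Hypothetical schema: field -> (expected type, min, max)
SCHEMA = {"mat": (int, 1, 9999), "temp": (float, 0.0, 5000.0)}
```

    Reporting every problem in one pass is what makes validation feel "rapid and helpful" while a user is still drafting an input file.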

  3. Adding EUNIS and VAULT rocket data to the VSO with Modern Perl frameworks

    NASA Astrophysics Data System (ADS)

    Mansky, Edmund

    2017-08-01

    A new Perl code is described that uses the modern object-oriented Moose framework to add EUNIS and VAULT rocket data to the Virtual Solar Observatory website. The code permits easy correction of FITS headers in cases where required FITS fields are missing from the original data files. The code makes novel use of the Moose method modifiers “before” and “after” to build in dependencies, so that database tables are created before the data are loaded, and file-dependent tables are validated after the loading is completed. Also described is the computation and loading of the deferred FITS field CHECKSUM into the database following the loading and validation of the file-dependent tables. The loading of the EUNIS 2006 and 2007 flight data and the VAULT 2.0 flight data is described in detail as an illustrative example.
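
    The “before”/“after” dependency pattern above is a Perl/Moose feature, but the same idea can be mimicked in Python with a small decorator; the sketch below is only an illustration of the ordering guarantee (create tables, then load, then validate), not the VSO code itself.

```python
# Mimic Moose-style "before"/"after" method modifiers with a decorator,
# to illustrate enforcing an ordering around a load step.

def with_hooks(before=None, after=None):
    """Run `before` ahead of the wrapped method and `after` once it returns."""
    def decorate(method):
        def wrapper(self, *args, **kwargs):
            if before:
                before(self)
            result = method(self, *args, **kwargs)
            if after:
                after(self)
            return result
        return wrapper
    return decorate

class Loader:
    def __init__(self):
        self.log = []

    @with_hooks(before=lambda self: self.log.append("create tables"),
                after=lambda self: self.log.append("validate tables"))
    def load_data(self):
        self.log.append("load data")

loader = Loader()
loader.load_data()
assert loader.log == ["create tables", "load data", "validate tables"]
```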

  4. Development of Fuel Shuffling Module for PHISICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allan Mabe; Andrea Alfonsi; Cristian Rabiti

    2013-06-01

    The PHISICS (Parallel and Highly Innovative Simulation for the INL Code System) [4] code toolkit has been under development at the Idaho National Laboratory. This package is intended to provide a modern analysis tool for reactor physics investigation. It is designed to maximize accuracy for a given availability of computational resources and to give state-of-the-art tools to the modern nuclear engineer. This is achieved by implementing several different algorithms and meshing approaches from which users can choose, in order to balance their computational resources against their accuracy needs. The software is completely modular in order to simplify the independent development of modules by different teams and future maintenance. The package is coupled with the thermal-hydraulic code RELAP5-3D [3]. In the following, the structure of the different PHISICS modules is briefly recalled, focusing on the new shuffling module (SHUFFLE), the subject of this paper.

  5. A Short Research Note on Calculating Exact Distribution Functions and Random Sampling for the 3D NFW Profile

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Howlett, Cullan

    2018-06-01

    In this short note we publish the analytic quantile function for the Navarro, Frenk & White (NFW) profile. All known published and coded methods for sampling from the 3D NFW PDF use either accept-reject or numeric interpolation (sometimes via a lookup table) to project random uniform samples through the quantile function and produce samples of the radius. This is a common requirement in N-body initial condition (IC), halo occupation distribution (HOD), and semi-analytic modelling (SAM) work for correctly assigning particles or galaxies to positions given an assumed concentration for the NFW profile. Using this analytic description allows for much faster and cleaner code to solve a common numeric problem in modern astronomy. We release R and Python versions of simple code that achieves this sampling, which we note is trivial to reproduce in any modern programming language.
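
    Inverse-transform sampling of the 3D NFW profile can be sketched as follows. The enclosed-mass fraction mu(x) = [ln(1+x) − x/(1+x)] / [ln(1+c) − c/(1+c)] (with x = r/r_s) can be inverted in closed form using the Lambert W function; the derivation below is our own re-derivation of this kind of analytic quantile, not the paper's released code, and the round-trip check makes it self-verifying.

```python
# Analytic inverse-transform sampling of the 3D NFW profile.
# Solving ln(1+x) - x/(1+x) = y for x via the substitution u = 1/(1+x)
# gives u*exp(-u) = exp(-(1+y)), i.e. x = -1/W0(-exp(-1-y)) - 1.
import numpy as np
from scipy.special import lambertw

def nfw_quantile(p, c):
    """Radius x = r/r_s enclosing mass fraction p of an NFW halo of concentration c."""
    mu_c = np.log(1.0 + c) - c / (1.0 + c)   # un-normalised mass out to x = c
    y = p * mu_c
    return np.real(-1.0 / lambertw(-np.exp(-1.0 - y), 0) - 1.0)

def nfw_mass_fraction(x, c):
    """Forward CDF, used to check the quantile by round trip."""
    mu = lambda t: np.log(1.0 + t) - t / (1.0 + t)
    return mu(x) / mu(c)

# Draw 10,000 radii for a c = 5 halo by pushing uniforms through the quantile.
rng = np.random.default_rng(42)
c = 5.0
radii = nfw_quantile(rng.uniform(size=10_000), c)
```

    Because the quantile is analytic, no lookup table or accept-reject loop is needed, which is the speed and cleanliness advantage the note describes.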

  6. Development of an object-oriented ORIGEN for advanced nuclear fuel modeling applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skutnik, S.; Havloej, F.; Lago, D.

    2013-07-01

    The ORIGEN package serves as the core depletion and decay calculation module within the SCALE code system. A recent major refactoring of the ORIGEN code architecture, part of an overall modernization of the SCALE code system, has both greatly enhanced its maintainability and afforded several new capabilities useful for incorporating depletion analysis into other code frameworks. This paper presents an overview of the improved ORIGEN code architecture (including the methods and data structures introduced) as well as current and potential future applications utilizing the new ORIGEN framework. (authors)

  7. 76 FR 12308 - Modernizing the FCC Form 477 Data Program; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-07

    ... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Parts 1, 20, and 43 [WCB: WC Docket Nos. 07-38, 09-190, 10-132, 11-10; FCC 11-14] Modernizing the FCC Form 477 Data Program; Correction AGENCY: Federal..., Deputy Manager. [FR Doc. 2011-5095 Filed 3-4-11; 8:45 am] BILLING CODE 6712-01-P ...

  8. Accounting for overdispersion when determining primary care outliers for the identification of chronic kidney disease: learning from the National Chronic Kidney Disease Audit.

    PubMed

    Kim, Lois G; Caplin, Ben; Cleary, Faye; Hull, Sally A; Griffith, Kathryn; Wheeler, David C; Nitsch, Dorothea

    2017-04-01

    Early diagnosis of chronic kidney disease (CKD) facilitates best management in primary care. Testing coverage of those at risk, and its translation into subsequent diagnostic coding, will affect observed CKD prevalence. Using initial data from 915 general practitioner (GP) practices taking part in a UK national audit, we seek to apply appropriate methods to identify outlying practices in terms of CKD stages 3-5 prevalence and diagnostic coding. We estimate expected numbers of CKD stages 3-5 cases in each practice, adjusted for key practice characteristics, and further inflate the control limits to account for overdispersion related to unobserved factors (including unobserved risk factors for CKD, and between-practice differences in coding and testing). GP practice prevalence of coded CKD stages 3-5 ranges from 0.04% to 7.8%. Practices differ considerably in coding of CKD in individuals where CKD is indicated following testing (ranging from 0% to 97% of those with a glomerular filtration rate <60 mL/min/1.73 m²). After adjusting for risk factors and overdispersion, the number of 'extreme' practices is reduced from 29% to 2.6% for the low coded-CKD-prevalence outcome, from 21% to 1% for high uncoded CKD stage, and from 22% to 2.4% for low total (coded and uncoded) CKD prevalence. Thirty-one practices are identified as outliers for at least one of these outcomes. These can then be categorized into practices needing to address testing, coding or data storage/transfer issues. GP practice prevalence of coded CKD shows wide variation. Accounting for overdispersion is crucial in providing useful information about outlying practices for CKD prevalence. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
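
    The overdispersion adjustment works by widening funnel-plot control limits with a multiplicative factor estimated from winsorised standardised residuals, in the spirit of Spiegelhalter's method. The sketch below is a generic illustration under that assumption, not the audit's actual methodology or data.

```python
# Generic sketch: widen funnel-plot control limits by a multiplicative
# overdispersion factor phi, estimated from winsorised z-scores.
# Illustrative only, not the National CKD Audit's code.
import math

def overdispersion_factor(observed, expected):
    """Estimate phi from winsorised standardised residuals z = (O-E)/sqrt(E)."""
    z = sorted((o - e) / math.sqrt(e) for o, e in zip(observed, expected))
    lo, hi = z[len(z) // 10], z[-len(z) // 10 - 1]   # winsorise at the 10% tails
    z = [min(max(v, lo), hi) for v in z]
    return max(1.0, sum(v * v for v in z) / len(z))  # never shrink limits

def control_limits(expected, phi, z_crit=3.0):
    """Approximate 3-sigma limits on the observed count for each practice."""
    return [(e - z_crit * math.sqrt(phi * e), e + z_crit * math.sqrt(phi * e))
            for e in expected]
```

    With phi = 1 these reduce to ordinary Poisson-based limits; unexplained between-practice variation inflates phi and correspondingly fewer practices flag as 'extreme', which is the pattern reported above.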

  9. Delivery and postpartum practices among new mothers in Laputta, Myanmar: intersecting traditional and modern practices and beliefs.

    PubMed

    Diamond-Smith, Nadia; Thet, May Me; Khaing, Ei Ei; Sudhinaraset, May

    2016-09-01

    Myanmar is witnessing increased access to modern maternity care, along with shifting norms and practices. Past research has documented low rates of facility-based deliveries in the country, along with adverse maternal and child health outcomes. Research has also documented diverse traditional practices in the postpartum period, related to maternity care and maternal food intake. Through 34 qualitative interviews with women who recently gave birth and their mothers-in-law in one township in Myanmar (Laputta), we explore factors influencing decision-making around postpartum care and the practices that women engage in. We find that women use both modern and traditional providers because different types of providers play particular roles in the delivery and postpartum period. Despite knowledge about healthy foods to eat postpartum, many women restrict the intake of certain foods, and mothers-in-law's beliefs in these practices are particularly strong. Findings suggest that women and their families are balancing two different sets of practices and beliefs, which at times come into conflict. Educational campaigns and programmes should address both modern and traditional beliefs and practices to help women access safe care and improve their own and their children's health.

  10. Do physicians have an ethical obligation to care for patients with AIDS?

    PubMed Central

    Angoff, N. R.

    1991-01-01

    This paper responds to the question: Do physicians have an ethical obligation to care for patients with acquired immunodeficiency syndrome (AIDS)? First, the social and political milieu in which this question arises is sampled. Here physicians as well as other members of the community are found declaring an unwillingness to be exposed to people with AIDS. Next, laws, regulations, ethical codes and principles, and the history of the practice of medicine are examined, and the literature as it pertains to these areas is reviewed. The obligation to care for patients with AIDS, however, cannot be located in an orientation to morality defined in rules and codes and an appeal to legalistic fairness. By turning to the orientation to morality that emerges naturally from connection and is defined in caring, the physicians' ethical obligation to care for patients with AIDS is found. Through an exploration of the writings of modern medical ethicists, it is clear that the purpose of the practice of medicine is healing, which can only be accomplished in relationship to the patient. It is in relationship to patients that the physician has the opportunity for self-realization. In fact, the physician is physician in relationship to patients and only to the extent that he or she acts virtuously by being morally responsible for and to those patients. Not to do so diminishes the physician's ethical ideal, a vision of the physician as good physician, which has consequences for the physician's capacity to care and for the practice of medicine. PMID:1788990

  11. Promoting research integrity in the geosciences

    NASA Astrophysics Data System (ADS)

    Mayer, Tony

    2015-04-01

    Conducting research in a responsible manner, in compliance with codes of research integrity, is essential. The geosciences, as with all other areas of research endeavour, have had their fair share of misconduct cases and causes célèbres. As research becomes more global, more collaborative and more cross-disciplinary, the need for all concerned to work to the same high standards becomes imperative. Modern technology makes it far easier to 'cut and paste' or to manipulate imagery in Photoshop to falsify results, even as it makes research easier and more meaningful. So we need to promote the highest standards of research integrity and the responsible conduct of research. While ultimately responsibility for misconduct rests with the individual, institutions and the academic research system have to take steps to alleviate the pressure on researchers and to promote good practice through training programmes and mentoring. The role of the World Conferences on Research Integrity in promoting the importance of research integrity, and statements about good practice, will be presented, and the need for training and mentoring programmes will be discussed.

  12. Changing the paradigm for engineering ethics.

    PubMed

    Schmidt, Jon Alan

    2014-12-01

    Modern philosophy recognizes two major ethical theories: deontology, which encourages adherence to rules and fulfillment of duties or obligations; and consequentialism, which evaluates morally significant actions strictly on the basis of their actual or anticipated outcomes. Both involve the systematic application of universal abstract principles, reflecting the culturally dominant paradigm of technical rationality. Professional societies promulgate codes of ethics with which engineers are expected to comply (deontology), while courts and the public generally assign liability to engineers primarily in accordance with the results of their work, whether intended or unintended (consequentialism). A third option, prominent in ancient philosophy, has reemerged recently: virtue ethics, which recognizes that sensitivity to context and practical judgment are indispensable in particular concrete situations, and therefore rightly focuses on the person who acts, rather than the action itself. Beneficial character traits--i.e., virtues--are identified within a specific social practice in light of the internal goods that are unique to it. This paper proposes a comprehensive framework for implementing virtue ethics within engineering.

  13. Continuous integration for concurrent MOOSE framework and application development on GitHub

    DOE PAGES

    Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.; ...

    2015-11-20

    For the past several years, Idaho National Laboratory’s MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repositories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors on the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project’s development roadmap. Several specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process, are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.

  14. Continuous integration for concurrent MOOSE framework and application development on GitHub

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.

    For the past several years, Idaho National Laboratory’s MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repositories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors on the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project’s development roadmap. Several specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process, are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.

  15. How to Write a Reproducible Paper

    NASA Astrophysics Data System (ADS)

    Irving, D. B.

    2016-12-01

    The geosciences have undergone a computational revolution in recent decades, to the point where almost all modern research relies heavily on software and code. Despite this profound change in the research methods employed by geoscientists, the reporting of computational results has changed very little in academic journals. This lag has led to something of a reproducibility crisis, whereby it is impossible to replicate and verify most of today's published computational results. While it is tempting to decry the slow response of journals and funding agencies in the face of this crisis, there are very few examples of reproducible research upon which to base new communication standards. In an attempt to address this deficiency, this presentation will describe a procedure for reporting computational results that was employed in a recent Journal of Climate paper. The procedure was developed to be consistent with recommended computational best practices and seeks to minimize the time burden on authors, which has been identified as the most important barrier to publishing code. It should provide a starting point for geoscientists looking to publish reproducible research, and could be adopted by journals as a formal minimum communication standard.

  16. Fundamentals, current state of the development of, and prospects for further improvement of the new-generation thermal-hydraulic computational HYDRA-IBRAE/LM code for simulation of fast reactor systems

    NASA Astrophysics Data System (ADS)

    Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.

    2016-02-01

    The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal-cooled fast reactor systems under normal operation and anticipated operational occurrences and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and gives grounds for the necessity of developing a new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and phenomena are singled out that require detailed analysis and the development of models to be correctly described by the system thermal-hydraulic code in question. Information on the functionality of the computational code is provided, viz., the thermal-hydraulic two-phase model, the properties of the sodium and lead coolants, the closing equations for simulation of the heat-mass exchange processes, the models describing the processes that take place during a steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as the possibilities of taking advantage of modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, the introduction of new models into it, and the enhancement of its usability. It is shown that the program of development and practical application of the code will allow carrying out, in the near future, the computations needed to analyze the safety of potential NPP projects at a qualitatively higher level.

  17. Quality of recording of diabetes in the UK: how does the GP's method of coding clinical data affect incidence estimates? Cross-sectional study using the CPRD database.

    PubMed

    Tate, A Rosemary; Dungey, Sheena; Glew, Simon; Beloff, Natalia; Williams, Rachael; Williams, Tim

    2017-01-25

    To assess the effect of coding quality on estimates of the incidence of diabetes in the UK between 1995 and 2014. A cross-sectional analysis examining diabetes coding from 1995 to 2014 and how the choice of codes (diagnosis codes vs codes which suggest diagnosis) and quality of coding affect estimated incidence. Routine primary care data from 684 practices contributing to the UK Clinical Practice Research Datalink (data contributed from Vision (INPS) practices). Incidence rates of diabetes and how they are affected by (1) GP coding and (2) excluding 'poor' quality practices with at least 10% of incident patients inaccurately coded between 2004 and 2014. Incidence rates and accuracy of coding varied widely between practices, and the trends differed according to the selected category of code. If diagnosis codes were used, the incidence of type 2 diabetes increased sharply until 2004 (when the UK Quality Outcomes Framework was introduced), then flattened off until 2009, after which it decreased. If non-diagnosis codes were included, the numbers continued to increase until 2012. Although coding quality improved over time, 15% of the 666 practices that contributed data between 2004 and 2014 were labelled 'poor' quality. When these practices were dropped from the analyses, the downward trend in the incidence of type 2 diabetes after 2009 became less marked and incidence rates were higher. In contrast to some previous reports, diabetes incidence (based on diagnostic codes) appears not to have increased since 2004 in the UK. Choice of codes can make a significant difference to incidence estimates, as can quality of recording. Codes and data quality should be checked when assessing incidence rates using GP data. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  18. Geospace simulations using modern accelerator processor technology

    NASA Astrophysics Data System (ADS)

    Germaschewski, K.; Raeder, J.; Larson, D. J.

    2009-12-01

    OpenGGCM (Open Geospace General Circulation Model) is a well-established numerical code simulating the Earth's space environment. The most computing-intensive part is the MHD (magnetohydrodynamics) solver that models the plasma surrounding Earth and its interaction with Earth's magnetic field and the solar wind flowing in from the sun. Like other global magnetosphere codes, OpenGGCM's realism is currently limited by computational constraints on grid resolution. OpenGGCM has been ported to make use of the added computational power of modern accelerator-based processor architectures, in particular the Cell processor. The Cell architecture is a novel inhomogeneous multicore architecture capable of achieving up to 230 GFLOPS on a single chip. The University of New Hampshire recently acquired a PowerXCell 8i based computing cluster, and here we report initial performance results of OpenGGCM. Realizing the high theoretical performance of the Cell processor is a programming challenge, though. We implemented the MHD solver using a multi-level parallelization approach: on the coarsest level, the problem is distributed to processors based upon the usual domain decomposition approach. Then, on each processor, the problem is divided into 3D columns, each of which is handled by the memory-limited SPEs (synergistic processing elements) slice by slice. Finally, SIMD instructions are used to fully exploit the SIMD FPUs in each SPE. Memory management needs to be handled explicitly by the code, using DMA to move data from main memory to the per-SPE local store and vice versa. We use a modern technique, automatic code generation, which shields the application programmer from having to deal with all of the implementation details just described, keeping the code much more easily maintainable. Our preliminary results indicate excellent performance, a speed-up of a factor of 30 compared to the unoptimized version.

  19. Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers

    DOE PAGES

    Wang, Bei; Ethier, Stephane; Tang, William; ...

    2017-06-29

    The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization, have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidth, interconnects and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.

  20. Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Bei; Ethier, Stephane; Tang, William

    The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization, have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidth, interconnects and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.

  1. Billing and coding knowledge: a comparative survey of professional coders, practicing orthopedic surgeons, and orthopedic residents.

    PubMed

    Wiley, Kevin F; Yousuf, Tariq; Pasque, Charles B; Yousuf, Khalid

    2014-06-01

    Medical knowledge and surgical skills are necessary to become an effective orthopedic surgeon. To run an efficient practice, the surgeon must also possess a basic understanding of medical business practices, including billing and coding. In this study, we surveyed and compared the level of billing and coding knowledge among current orthopedic residents (PGY3 and higher), academic and private practice attending orthopedic surgeons, and orthopedic coding professionals. According to the survey results, residents and fellows have a similar knowledge of coding and billing, regardless of their level of training or the type of business education received in residency. Most residents would like formal training in coding, billing, and practice management didactics; this is consistent with data from previous studies.

  2. Secret Codes, Remainder Arithmetic, and Matrices.

    ERIC Educational Resources Information Center

    Peck, Lyman C.

    This pamphlet is designed for use as enrichment material for able junior and senior high school students who are interested in mathematics. No more than a clear understanding of basic arithmetic is expected. Students are introduced to ideas from number theory and modern algebra by learning mathematical ways of coding and decoding secret messages.…
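
    The remainder-arithmetic coding idea taught in the pamphlet can be sketched with an affine cipher on the 26-letter alphabet: each letter x is encoded as (a·x + b) mod 26, and decoding uses the modular inverse of a (which exists whenever a is coprime with 26). The keys below are arbitrary examples.

```python
# Affine cipher: a concrete use of remainder arithmetic for secret codes.
# Letters A-Z are mapped to 0-25, encoded as (a*x + b) mod 26, and
# decoded with the modular inverse of a. Keys a=5, b=8 are arbitrary.

def affine_encode(text, a=5, b=8):
    """Encode uppercase A-Z text; a must be coprime with 26 so decoding works."""
    return "".join(chr((a * (ord(ch) - 65) + b) % 26 + 65) for ch in text)

def affine_decode(text, a=5, b=8):
    a_inv = pow(a, -1, 26)                 # modular inverse of a mod 26
    return "".join(chr(a_inv * (ord(ch) - 65 - b) % 26 + 65) for ch in text)

secret = affine_encode("ATTACKATDAWN")
assert affine_decode(secret) == "ATTACKATDAWN"
```

    Setting a = 1 recovers the simpler Caesar-style shift cipher, so the same modular framework covers both.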

  3. The Nuclear Energy Knowledge and Validation Center – Summary of Activities Conducted in FY15

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans David; Hong, Bonnie Colleen

    2016-05-01

    The Nuclear Energy Knowledge and Validation Center (NEKVaC) is a new initiative by the Department of Energy and the Idaho National Laboratory to coordinate and focus the resources and expertise that exist within the DOE complex toward solving issues in modern nuclear code validation. In time, code owners, users, and developers will view the Center as a partner and essential resource for acquiring the best practices and latest techniques for validating codes, for guidance in planning and executing experiments, for facilitating access to, and maximizing the usefulness of, existing data, and for preserving knowledge for continual use by nuclear professionals and organizations for their own validation needs. The scope of the Center covers many inter-related activities which will need to be cultivated carefully in the near term and managed properly once the Center is fully functional. Three areas comprise the principal mission: 1) identification and prioritization of projects that extend the field of validation science and its application to modern codes, 2) adaptation or development of best practices and guidelines for high-fidelity multiphysics/multiscale analysis code development and associated experiment design, and 3) definition of protocols for data acquisition and knowledge preservation, together with a portal for access to databases currently scattered among numerous organizations. These mission areas, while each having a unique focus, are inter-dependent and complementary. Likewise, all activities supported by the NEKVaC (both near-term and long-term) must possess elements supporting all three. This cross-cutting nature is essential to ensuring that activities and supporting personnel do not become 'stove-piped', i.e. focused so much on a specific function that the activity itself becomes the objective rather than achieving the larger vision. Achieving the broader vision will require a healthy and accountable level of activity in each of the areas. This will take time and significant DOE support. Growing too fast (budget-wise) will not allow ideas to mature, lessons to be learned, and taxpayer money to be spent responsibly. The process should be initiated with a small set of tasks, executed over a short but reasonable term, that will exercise most if not all aspects of the Center’s potential operation. The initial activities described in this report have a high potential for near-term success in demonstrating Center objectives, and also allow working out some of the issues in task execution and communication between functional elements while raising awareness of the Center and cementing stakeholder buy-in. This report begins with a description of the mission areas, specifically the role played by each and the types of activities for which they are responsible. It then lists and describes the proposed near-term tasks upon which future efforts can build.

  4. Parallel-vector computation for structural analysis and nonlinear unconstrained optimization problems

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.

    1990-01-01

    Practical engineering applications can often be formulated as constrained optimization problems. There are several solution algorithms for solving a constrained optimization problem. One approach is to convert a constrained problem into a series of unconstrained problems; unconstrained solution algorithms can then be used as part of the constrained solution algorithms. Structural optimization is an iterative process: one starts with an initial design, then performs a finite element structural analysis to calculate the response of the system (displacements, stresses, eigenvalues, etc.). Based upon the sensitivity information on the objective and constraint functions, an optimizer such as ADS or IDESIGN can be used to find a new, improved design. For the structural analysis phase, the equation solver for the system of simultaneous linear equations plays a key role, since it is needed for static, eigenvalue, and dynamic analysis. For practical, large-scale structural analysis-synthesis applications, computational time can be excessively large. Thus, it is necessary to have a new structural analysis-synthesis code which employs new solution algorithms to exploit both the parallel and vector capabilities offered by modern, high-performance computers such as the Convex, Cray-2, and Cray-YMP. The objective of this research project is, therefore, to incorporate the latest developments in the parallel-vector equation solver PVSOLVE into a widely used finite-element production code such as SAP-4. Furthermore, several nonlinear unconstrained optimization subroutines have also been developed and tested in a parallel computing environment. The unconstrained optimization subroutines are not only useful in their own right but can also be incorporated into a constrained optimization code such as ADS.
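    The analysis-synthesis loop described above can be sketched in miniature. A hypothetical Python illustration, not from the report: a quadratic penalty converts a constrained design problem into a series of unconstrained problems, each handed to a simple gradient-descent routine standing in for an optimizer such as ADS or IDESIGN. The objective and constraint functions are invented stand-ins for the finite-element response.

```python
# Hypothetical illustration (not the paper's code): a penalty method turns a
# constrained design problem into a series of unconstrained problems, each
# solved by a simple gradient-descent "optimizer".

def objective(x):          # stand-in for, e.g., structural weight
    return (x - 3.0) ** 2

def constraint(x):         # g(x) <= 0 required; stand-in for a stress limit
    return 1.0 - x         # feasible when x >= 1

def penalized(x, r):
    # unconstrained surrogate: objective plus penalty for violating g(x) <= 0
    return objective(x) + r * max(0.0, constraint(x)) ** 2

def grad(f, x, h=1e-6):
    # central-difference "sensitivity information"
    return (f(x + h) - f(x - h)) / (2 * h)

def minimize(f, x0, lr=0.1, iters=500):
    # gradient descent standing in for an unconstrained optimization subroutine
    x = x0
    for _ in range(iters):
        x -= lr * grad(f, x)
    return x

x = 0.0
for r in (1.0, 10.0, 100.0):    # series of unconstrained problems
    x = minimize(lambda v: penalized(v, r), x)
print(round(x, 2))
```

    Each pass tightens the penalty weight, so the unconstrained minima approach the constrained optimum.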

  5. Coding for urologic office procedures.

    PubMed

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. Containment Sodium Chemistry Models in MELCOR.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louie, David; Humphries, Larry L.; Denman, Matthew R

    To meet regulatory needs for sodium fast reactors’ future development, including licensing requirements, Sandia National Laboratories is modernizing MELCOR, a severe accident analysis computer code developed for the U.S. Nuclear Regulatory Commission (NRC). Specifically, Sandia is modernizing MELCOR to include the capability to model sodium reactors, with the effort focused primarily on the containment response aspects of sodium reactor accidents. Sandia began modernizing MELCOR in 2013 to allow a sodium coolant, rather than the water used in conventional light water reactors. In the past three years, Sandia has been implementing the sodium chemistry containment models from CONTAIN-LMR, a legacy NRC code, into MELCOR. These chemistry models include spray fire, pool fire, and atmosphere chemistry models. Only the first two have been implemented so far, though all of these models are intended for MELCOR. A new package called “NAC” has been created to manage the sodium chemistry models more efficiently. In 2017 Sandia began validating the implemented models in MELCOR by simulating available experiments. The CONTAIN-LMR sodium models also include sodium atmosphere chemistry and sodium-concrete interaction models. This paper presents the sodium property models, the implemented models, implementation issues, and a path towards validation against existing experimental data.

  7. SUPREM-DSMC: A New Scalable, Parallel, Reacting, Multidimensional Direct Simulation Monte Carlo Flow Code

    NASA Technical Reports Server (NTRS)

    Campbell, David; Wysong, Ingrid; Kaplan, Carolyn; Mott, David; Wadsworth, Dean; VanGilder, Douglas

    2000-01-01

    An AFRL/NRL team has recently been selected to develop a scalable, parallel, reacting, multidimensional (SUPREM) Direct Simulation Monte Carlo (DSMC) code for the DoD user community under the High Performance Computing Modernization Office (HPCMO) Common High Performance Computing Software Support Initiative (CHSSI). This paper will introduce the JANNAF Exhaust Plume community to this three-year development effort and present the overall goals, schedule, and current status of this new code.

  8. Ethics in sports medicine.

    PubMed

    Dunn, Warren R; George, Michael S; Churchill, Larry; Spindler, Kurt P

    2007-05-01

    Physicians have struggled with the medical ramifications of athletic competition since ancient Greece, where rational medicine and organized athletics originated. Historically, the relationship between sport and medicine was adversarial because of conflicts between health and sport. However, modern sports medicine has emerged with the goal of improving performance and preventing injury, and the concept of the "team physician" has become an integral part of athletic culture. With this distinction come unique ethical challenges because the customary ethical norms for most forms of clinical practice, such as confidentiality and patient autonomy, cannot be translated easily into sports medicine. The particular areas of medical ethics that present unique challenges in sports medicine are informed consent, third parties, advertising, confidentiality, drug use, and innovative technology. Unfortunately, there is no widely accepted code of sports medicine ethics that adequately addresses these issues.

  9. High-Performance Design Patterns for Modern Fortran

    DOE PAGES

    Haveraaen, Magne; Morris, Karla; Rouson, Damian; ...

    2015-01-01

    This paper presents ideas for using coordinate-free numerics in modern Fortran to achieve code flexibility in the partial differential equation (PDE) domain. We also show how Fortran, over the last few decades, has changed to become a language well-suited for state-of-the-art software development. Fortran’s new coarray distributed data structure, the language’s class mechanism, and its side-effect-free, pure procedure capability provide the scaffolding on which we implement HPC software. These features empower compilers to organize parallel computations with efficient communication. We present some programming patterns that support asynchronous evaluation of expressions composed of parallel operations on distributed data. We implemented these patterns using coarrays and the message passing interface (MPI), and compared the codes’ complexity and performance. The MPI code is much more complex and depends on external libraries. The MPI code on Cray hardware using the Cray compiler is 1.5–2 times faster than the coarray code on the same hardware. The Intel compiler implements coarrays atop Intel’s MPI library, with the result apparently being 2–2.5 times slower than manually coded MPI despite exhibiting nearly linear scaling efficiency. As compilers mature and further improvements to coarrays come in Fortran 2015, we expect this performance gap to narrow.
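    The asynchronous-evaluation pattern can be suggested in a few lines. A hypothetical Python sketch (the paper's implementations use Fortran coarrays and MPI, not Python): independent, side-effect-free terms of an expression over a field are evaluated concurrently and combined once both futures complete; the field and term functions are invented for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch of the pattern described in the abstract: independent
# terms of a PDE-style expression are evaluated asynchronously, then combined
# once all futures have completed.

data = list(range(8))              # stand-in for a distributed field

def laplacian_term(u):             # pure, side-effect-free term
    return [4 * v for v in u]

def source_term(u):                # another independent term
    return [v + 1 for v in u]

with ThreadPoolExecutor() as pool:
    f1 = pool.submit(laplacian_term, data)   # terms launched concurrently
    f2 = pool.submit(source_term, data)
    # combine only after both asynchronous evaluations finish
    result = [a + b for a, b in zip(f1.result(), f2.result())]

print(result[:3])
```

    Because the two terms share no mutable state, their evaluation order does not matter, which is the property the paper's pure-procedure patterns exploit.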

  10. ALEGRA -- A massively parallel h-adaptive code for solid dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Summers, R.M.; Wong, M.K.; Boucheron, E.A.

    1997-12-31

    ALEGRA is a multi-material, arbitrary-Lagrangian-Eulerian (ALE) code for solid dynamics designed to run on massively parallel (MP) computers. It combines the features of modern Eulerian shock codes, such as CTH, with those of modern Lagrangian structural analysis codes, using an unstructured grid. ALEGRA is being developed for use on teraflop supercomputers to conduct advanced three-dimensional (3D) simulations of shock phenomena important to a variety of systems. ALEGRA was designed with the Single Program Multiple Data (SPMD) paradigm, in which the mesh is decomposed into sub-meshes so that each processor gets a single sub-mesh with approximately the same number of elements. Using this approach, the authors have been able to produce a single code that can scale from one processor to thousands of processors. A current major effort is to develop efficient, high-precision simulation capabilities for ALEGRA, without the computational cost of using a global, highly resolved mesh, through flexible, robust h-adaptivity of finite elements. H-adaptivity is the dynamic refinement of the mesh by subdividing elements, thus changing the characteristic element size and reducing numerical error. The authors are working on several major technical challenges that must be met to make effective use of HAMMER on MP computers.
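    The SPMD decomposition described above can be sketched with a trivial 1-D block partition. A hypothetical Python illustration, not ALEGRA's actual partitioner: element indices are split so that each processor's sub-mesh holds approximately the same number of elements.

```python
# Hypothetical sketch of the SPMD idea: split a mesh's elements into
# sub-meshes with near-equal element counts, one per processor.
# (A trivial 1-D block partition, not ALEGRA's real partitioner.)

def decompose(num_elements, num_procs):
    base, extra = divmod(num_elements, num_procs)
    submeshes, start = [], 0
    for p in range(num_procs):
        # the first `extra` processors take one additional element
        size = base + (1 if p < extra else 0)
        submeshes.append(range(start, start + size))
        start += size
    return submeshes

parts = decompose(num_elements=10, num_procs=4)
print([len(p) for p in parts])
```

    Every processor then runs the same program on its own sub-mesh, which is what lets a single code scale from one processor to thousands.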

  11. Divergent evolution of part of the involucrin gene in the hominoids: Unique intragenic duplications in the gorilla and human

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teumer, J.; Green, H.

    1989-02-01

    The gene for involucrin, an epidermal protein, has been remodeled in the higher primates. Most of the coding region of the human gene consists of a modern segment of repeats derived from a 10-codon sequence present in the ancestral segment of the gene. The modern segment can be divided into early, middle, and late regions. The authors report here the nucleotide sequence of three alleles of the gorilla involucrin gene. Each possesses a modern segment homologous to that of the human and consisting of 10-codon repeats. The early and middle regions are similar to the corresponding regions of the human allele and are nearly identical among the different gorilla alleles. The late region consists of recent duplications whose pattern is unique in each of the gorilla alleles and in the human allele. The early region is located in what is now the 3' third of the modern segment, and the late, polymorphic region is located in what is now the 5' third. Therefore, as the modern segment expanded during evolution, its 3' end became stabilized, and continuing duplications became confined to its 5' end. The expansion of the involucrin coding region, which began long before the separation of the gorilla and human, has continued in both species after their separation.

  12. Variation in clinical coding lists in UK general practice: a barrier to consistent data entry?

    PubMed

    Tai, Tracy Waize; Anandarajah, Sobanna; Dhoul, Neil; de Lusignan, Simon

    2007-01-01

    Routinely collected general practice computer data are used for quality improvement; poor data quality, including inconsistent coding, can reduce their usefulness. We aimed to document the diversity of data entry systems currently in use in UK general practice and highlight possible implications for data quality. General practice volunteers provided screen shots of the clinical coding screen they would use to code a diagnosis or problem title in the clinical consultation. The six clinical conditions examined were: depression, cystitis, type 2 diabetes mellitus, sore throat, tired all the time, and myocardial infarction. We looked at the picking lists generated for these problem titles in the EMIS, IPS, GPASS, and iSOFT general practice clinical computer systems, using the Triset browser as a gold standard for comparison. A mean of 19.3 codes is offered in the picking list after entering a diagnosis or problem title. EMIS produced the longest picking lists and GPASS the shortest, with a mean number of choices of 35.2 and 12.7, respectively. Approximately three-quarters (73.5%) of codes are diagnoses, one-eighth (12.5%) are symptom codes, and the remainder come from a range of Read chapters. There was no readily detectable consistent order in which codes were displayed. Velocity coding, whereby commonly used codes are placed higher in the picking list, results in variation between practices even where they have the same brand of computer system. Current systems for clinical coding promote diversity rather than consistency of clinical coding. As the UK moves towards an integrated health IT system, consistency of coding will become more important. A standardised, limited list of codes for primary care might help address this need.
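    Velocity coding as described above can be sketched simply. A hypothetical Python illustration (the codes and usage history are invented): codes used most often by a practice float to the top of its picking list, which is why two practices running the same system can see differently ordered lists.

```python
from collections import Counter

# Hypothetical sketch of "velocity coding": each practice's most frequently
# used codes are promoted to the top of its picking list, so the same
# software shows different orderings at different practices.

picking_list = ["Depression NOS", "Depressive disorder", "Reactive depression"]
usage_history = ["Depressive disorder", "Depressive disorder",
                 "Reactive depression"]          # this practice's past choices

counts = Counter(usage_history)
# sort the picking list by descending local usage frequency
ordered = sorted(picking_list, key=lambda code: -counts[code])
print(ordered)
```

    A practice with a different usage history would produce a different ordering from the same picking list, illustrating the inconsistency the paper highlights.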

  13. Managers, Teachers, Students, and Parents' Opinions Concerning Changes on Dress Code Practices as an Educational Policy

    ERIC Educational Resources Information Center

    Birel, Firat Kiyas

    2016-01-01

    Problem Statement: Dressing for school has been intensely disputed and has led to periodic changes in dress codes since the foundation of the Turkish republic. Practitioners have tried to put some new practices related to school dress codes into practice for redressing former dress code issues involving mandatory dress standards for both students…

  14. Embedding Secure Coding Instruction into the IDE: Complementing Early and Intermediate CS Courses with ESIDE

    ERIC Educational Resources Information Center

    Whitney, Michael; Lipford, Heather Richter; Chu, Bill; Thomas, Tyler

    2018-01-01

    Many of the software security vulnerabilities that people face today can be remediated through secure coding practices. A critical step toward the practice of secure coding is ensuring that our computing students are educated on these practices. We argue that secure coding education needs to be included across a computing curriculum. We are…

  15. Special Section: Complementary and Alternative Medicine (CAM): Acupuncture From Ancient Practice to Modern Science

    MedlinePlus

    An overview of acupuncture tracing its evolution from ancient practice to modern science, including what acupuncture is and the share of U.S. adults who use it.

  16. Concept analysis: malpractice and modern-day nursing practice.

    PubMed

    Weld, Konstantine Keian; Garmon Bibb, Sandra C

    2009-01-01

    The concept of malpractice can mean different things depending upon the context in which the term is used. This can lead to confusion about the standard of care required of nurses engaged in modern-day nursing practice. This paper examines the attributes and characteristics of the concept of malpractice using Walker and Avant's (2005) eight-step methodology. Data sources: CINAHL, PubMed, and PsycINFO. Exposure to malpractice liability is an unfortunate consequence of modern-day nursing practice. An understanding of malpractice will assist nurses in identifying situations that may expose them to legal liability and, it is hoped, lead to improved patient care.

  17. Barriers of modern contraceptive practices among Asian women: a mini literature review.

    PubMed

    Najafi-Sharjabad, Fatemeh; Zainiyah Syed Yahya, Sharifah; Abdul Rahman, Hejar; Hanafiah Juni, Muhamad; Abdul Manaf, Rosliza

    2013-07-22

    Family planning has been cited as essential to the achievement of the Millennium Development Goals (MDGs). Family planning has a direct impact on women's health and on the consequences of each pregnancy. The use of modern contraception among Asian women is below the global average. In Asia, a majority of unintended pregnancies result from the use of traditional contraceptive methods or no method at all, which leads to unsafe induced abortion. Cultural attitudes, lack of knowledge of methods and reproduction, sociodemographic factors, and health service barriers are the main obstacles to modern contraceptive practice among Asian women. Culturally sensitive family planning programs, health system reform, and reproductive health education through mass media to create awareness of the benefits of planned parenthood are effective strategies to improve modern contraceptive practice among Asian women.

  18. Barriers of Modern Contraceptive Practices among Asian Women: A Mini Literature Review

    PubMed Central

    Najafi-Sharjabad, Fatemeh; Syed Yahya, Sharifah Zainiyah; Rahman, Hejar Abdul; Hanafiah, Muhamad; Abdul Manaf, Rosliza

    2013-01-01

    Family planning has been cited as essential to the achievement of the Millennium Development Goals (MDGs). Family planning has a direct impact on women's health and on the consequences of each pregnancy. The use of modern contraception among Asian women is below the global average. In Asia, a majority of unintended pregnancies result from the use of traditional contraceptive methods or no method at all, which leads to unsafe induced abortion. Cultural attitudes, lack of knowledge of methods and reproduction, sociodemographic factors, and health service barriers are the main obstacles to modern contraceptive practice among Asian women. Culturally sensitive family planning programs, health system reform, and reproductive health education through mass media to create awareness of the benefits of planned parenthood are effective strategies to improve modern contraceptive practice among Asian women. PMID:23985120

  19. Assessment of uncertainties of the models used in thermal-hydraulic computer codes

    NASA Astrophysics Data System (ADS)

    Gricay, A. S.; Migrov, Yu. A.

    2015-09-01

    The article addresses the problem of determining the statistical characteristics of variable parameters (the variation range and distribution law) in analyzing the uncertainty and sensitivity of calculation results to uncertainty in input data. A comparative analysis of modern approaches to uncertainty in input data is presented. The need to develop an alternative method for estimating the uncertainty of model parameters used in thermal-hydraulic computer codes, in particular in the closing correlations of the loop thermal-hydraulics block, is shown. Such a method should involve minimal subjectivity and must be based on objective quantitative assessment criteria. The method includes three sequential stages: selecting experimental data satisfying the specified criteria, identifying the key closing correlation using a sensitivity analysis, and carrying out case calculations followed by statistical processing of the results. By using the method, one can estimate the uncertainty range of a variable parameter and establish its distribution law in that range, provided that the experimental information is sufficiently representative. Practical application of the method is demonstrated using the problem of estimating the uncertainty of a parameter appearing in the model describing the transition to post-burnout heat transfer used in the thermal-hydraulic computer code KORSAR. The study revealed the need to narrow the previously established uncertainty range of this parameter and to replace the uniform distribution law in that range with the Gaussian distribution law. The proposed method can be applied to different thermal-hydraulic computer codes. In some cases, its application can make it possible to achieve a smaller degree of conservatism in the expert estimates of uncertainties pertinent to the model parameters used in computer codes.
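    The third stage of the method, case calculations followed by statistical processing, can be suggested with a toy example. A hypothetical Python sketch, not the KORSAR workflow: repeated fits of a stand-in correlation coefficient against synthetic noisy measurements are processed statistically to estimate the parameter's mean, spread, and an approximate range under a Gaussian law.

```python
import random
import statistics

# Hypothetical sketch of the method's third stage: many case calculations of
# a model parameter's best-fit value against synthetic "experimental" data,
# followed by statistical processing to estimate its range and spread.
# The linear model and noise level are invented stand-ins.

random.seed(0)
true_k = 2.0                          # stand-in closing-correlation coefficient

def case_calculation():
    # each case fits k from one noisy measurement y = k * x + noise
    x = 1.0
    y = true_k * x + random.gauss(0.0, 0.1)
    return y / x

estimates = [case_calculation() for _ in range(1000)]
mean = statistics.mean(estimates)
sd = statistics.stdev(estimates)
lo, hi = mean - 2 * sd, mean + 2 * sd   # ~95% range under a Gaussian law
print(round(mean, 2), round(sd, 2))
```

    In the real method the "case calculations" are code runs against selected experiments, and the processed statistics decide both the narrowed range and the distribution law assigned to the parameter.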

  20. [Prevention and control of substance abuse in the workplace: a new and significant opportunity for the occupational physician].

    PubMed

    Riboldi, L; Porru, S; Feltrin, G; Latocca, R; Bonzini, M; Bordini, L; Ferrario, M M

    2009-01-01

    Substance abuse is nowadays a recurrent theme in the daily practice of occupational physicians (OP), mainly owing to recent legislation prescribing mandatory assessments for workers performing job tasks involving danger to third parties. While some degree of bureaucracy is inevitable and legislation seems to be inclined towards deterrence, it is recommended to take advantage of the opportunities offered for practical interventions which, in accordance with science and ethics, the OP can carry out in the workplace. Risk assessment, health surveillance, fitness for work, health promotion and cooperation in management issues are the areas of intervention required for the OP to fully accomplish his role in the practice of modern occupational health. We propose specific activities for the OP so as to highlight roles and obligations, based on available scientific evidence and established codes of ethics. Lastly, we wish to emphasize the overall role of the OP in taking on responsibilities shared jointly with all the parties and in the approach to the substance abuse problem in all workplaces with the ultimate goal of acting for the benefit of workers, enterprises and society in general.

  1. Cultures of simulations vs. cultures of calculations? The development of simulation practices in meteorology and astrophysics

    NASA Astrophysics Data System (ADS)

    Sundberg, Mikaela

    While the distinction between theory and experiment is often used to discuss the place of simulation from a philosophical viewpoint, other distinctions are possible from a sociological perspective. Turkle (1995) distinguishes between cultures of calculation and cultures of simulation and relates these cultures to modernity and postmodernity, respectively. What can we understand about contemporary simulation practices in science by looking at them from the point of view of these two computer cultures? What new questions does such an analysis raise for further studies? On the basis of two case studies, the present paper compares and discusses simulation activities in astrophysics and meteorology. It argues that simulation practices manifest aspects of both of these cultures simultaneously, but in different situations. By employing the dichotomies surface/depth, play/seriousness, and extreme/reasonable to characterize and operationalize cultures of calculation and cultures of simulation as sensitizing concepts, the analysis shows how simulation code work shifts from development to use, the importance of, but also resistance towards, visualization, and how simulation modelers play with extreme values yet also try to achieve reasonable results compared to observations.

  2. A 21st Century Training Model for Flexible, Quick, and Life-Long Workforce Development

    DTIC Science & Technology

    2016-02-01

    specialty code. Differentiation and tailored training are made possible through modern talent management. When Joslin entered Initial Skills...associated with the IST pipeline, but also identified five overarching themes: Talent Management, Asynchronous Training, Modularity (coaching...augmented reality. Figure 1: The combination of modern recruitment, talent management, and modular training both in the school house and online speed

  3. Shifting Codes: Education or Regulation? Trainee Teachers and the Code of Conduct and Practice in England

    ERIC Educational Resources Information Center

    Spendlove, David; Barton, Amanda; Hallett, Fiona; Shortt, Damien

    2012-01-01

    In 2009, the General Teaching Council for England (GTCE) introduced a revised Code of Conduct and Practice (2009) for registered teachers. The code also applies to all trainee teachers who are provisionally registered with the GTCE and who could be liable to a charge of misconduct during their periods of teaching practice. This paper presents the…

  4. Automated Coding Software: Development and Use to Enhance Anti-Fraud Activities*

    PubMed Central

    Garvin, Jennifer H.; Watzlaf, Valerie; Moeini, Sohrab

    2006-01-01

    This descriptive research project identified characteristics of automated coding systems that have the potential to detect improper coding and to minimize improper or fraudulent coding practices in the setting of automated coding used with the electronic health record (EHR). Recommendations were also developed for software developers and users of coding products to maximize anti-fraud practices. PMID:17238546

  5. Articles on Practical Cybernetics. Computer-Developed Computers; Heuristics and Modern Sciences; Linguistics and Practice; Cybernetics and Moral-Ethical Considerations; and Men and Machines at the Chessboard.

    ERIC Educational Resources Information Center

    Berg, A. I.; And Others

    Five articles which were selected from a Russian language book on cybernetics and then translated are presented here. They deal with the topics of: computer-developed computers, heuristics and modern sciences, linguistics and practice, cybernetics and moral-ethical considerations, and computer chess programs. (Author/JY)

  6. MODTRAN6: a major upgrade of the MODTRAN radiative transfer code

    NASA Astrophysics Data System (ADS)

    Berk, Alexander; Conforti, Patrick; Kennett, Rosemary; Perkins, Timothy; Hawes, Frederick; van den Bosch, Jeannette

    2014-06-01

    The MODTRAN6 radiative transfer (RT) code is a major advancement over earlier versions of the MODTRAN atmospheric transmittance and radiance model. This version of the code incorporates modern software architecture including an application programming interface, enhanced physics features including a line-by-line algorithm, a supplementary physics toolkit, and new documentation. The application programming interface has been developed for ease of integration into user applications. The MODTRAN code has been restructured towards a modular, object-oriented architecture to simplify upgrades as well as facilitate integration with other developers' codes. MODTRAN now includes a line-by-line algorithm for high-resolution RT calculations as well as coupling to optical scattering codes for easy implementation of custom aerosols and clouds.

  7. The role of non-technical skills in surgery

    PubMed Central

    Agha, Riaz A.; Fowler, Alexander J.; Sevdalis, Nick

    2015-01-01

    Non-technical skills are of increasing importance in surgery and surgical training. A traditional focus on technical skills acquisition and competence is no longer enough for the delivery of a modern, safe surgical practice. This review discusses the importance of non-technical skills and the values that underpin successful modern surgical practice. This narrative review used a number of written and online sources; there was no specific search strategy of defined databases. Modern surgical practice requires technical and non-technical skills, evidence-based practice, an emphasis on lifelong learning, monitoring of outcomes, and a supportive institutional and health service framework. Finally, these requirements need to be combined with a number of personal and professional values, including integrity, professionalism, and compassionate, patient-centred care. PMID:26904193

  8. Radiation oncology services in the modern era: evolving patterns of usage and payments in the office setting for medicare patients from 2000 to 2010.

    PubMed

    Shen, Xinglei; Showalter, Timothy N; Mishra, Mark V; Barth, Sanford; Rao, Vijay; Levin, David; Parker, Laurence

    2014-07-01

    We evaluated long-term changes in the volume of and payments for radiation oncology services in the intensity-modulated radiation therapy (IMRT) era, from 2000 to 2010, using a database of Medicare claims. We used the Medicare Physician/Supplier Procedure Summary Master File (PSPSMF) for each year from 2000 to 2010 to tabulate the volume and payments for radiation oncology services. This database provides a summary of each billing code submitted to Medicare Part B. We identified all codes used in radiation oncology services and categorized billing codes by treatment modality and place of service, focusing our analysis on office-based practices. Total office-based patient volume increased 8.2% from 2000 to 2010, whereas total payments increased 217%. Overall payments increased dramatically from 2000 to 2007 but subsequently plateaued from 2008 to 2010. Increases in complexity of care, and image guidance in particular, have also resulted in higher payments. The cost of radiation oncology services increased from 2000 to 2010, mostly due to IMRT, but also with a significant contribution from increased overall complexity of care. A cost adjustment occurred after 2007, limiting further growth of payments. Future health policy studies should explore the potential for further cost containment, including differences in use between freestanding and hospital outpatient facilities. Copyright © 2014 by American Society of Clinical Oncology.

  9. Xyce Parallel Electronic Simulator Users' Guide Version 6.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Aadithya, Karthik Venkatraman; Mei, Ting

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase: a message passing parallel implementation that allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
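    The benefit of isolating device models from solver algorithms can be suggested with a toy nodal analysis. A hypothetical Python sketch, not Xyce's API: each device "stamps" its contribution into the system of equations, so the solver needs no knowledge of particular devices. The one-node resistor divider below is invented for illustration.

```python
# Hypothetical sketch of the design idea (not Xyce's actual code): device
# models "stamp" their contributions into a nodal system, keeping the solver
# independent of any particular device. One unknown node, two resistors,
# one 5 V source, so the "system" is 1x1.

class Resistor:
    """Device model: contributes its conductance to the nodal equations."""

    def __init__(self, other_voltage, ohms):
        self.g = 1.0 / ohms          # conductance
        self.v_other = other_voltage  # voltage on the far terminal

    def stamp(self, G, I):
        G[0] += self.g                # diagonal conductance entry
        I[0] += self.g * self.v_other  # current injected from the far side

devices = [Resistor(other_voltage=5.0, ohms=1000.0),   # to the 5 V source
           Resistor(other_voltage=0.0, ohms=1000.0)]   # to ground

G, I = [0.0], [0.0]
for d in devices:
    d.stamp(G, I)         # solver never inspects device internals

v = I[0] / G[0]           # solve the 1x1 nodal system
print(v)
```

    Adding a new device type means adding a new `stamp` method, not modifying the solver, which is the separation the DAE formulation is described as enabling.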

  10. Understanding and Writing G & M Code for CNC Machines

    ERIC Educational Resources Information Center

    Loveland, Thomas

    2012-01-01

    In modern CAD and CAM manufacturing companies, engineers design parts for machines and consumable goods. Many of these parts are cut on CNC machines. Whether using a CNC lathe, milling machine, or router, the ideas and designs of engineers must be translated into a machine-readable form called G & M Code that can be used to cut parts to precise…
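    The translation into machine-readable G & M code can be sketched briefly. A hypothetical Python illustration: a simple square toolpath is emitted as standard G-code motion commands (G00 rapid move, G01 linear feed); the part dimensions and feed rate are invented.

```python
# Hypothetical sketch: translating a simple square toolpath into G-code,
# the machine-readable form described in the article. G00 (rapid move) and
# G01 (linear feed) are standard commands; the part size and feed rate
# here are invented for illustration.

def square_path(side, feed):
    corners = [(0, 0), (side, 0), (side, side), (0, side), (0, 0)]
    lines = ["G00 X0 Y0"]                       # rapid move to the start point
    for x, y in corners[1:]:
        lines.append(f"G01 X{x} Y{y} F{feed}")  # cut each edge at feed rate
    return lines

program = square_path(side=10, feed=100)
print("\n".join(program))
```

    A CAM system does essentially this translation at scale, turning geometry from the engineer's design into motion commands the CNC controller executes.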

  11. Modern Teaching Methods in Physics with the Aid of Original Computer Codes and Graphical Representations

    ERIC Educational Resources Information Center

    Ivanov, Anisoara; Neacsu, Andrei

    2011-01-01

    This study describes the possibility and advantages of utilizing simple computer codes to complement the teaching techniques for high school physics. The authors have begun working on a collection of open source programs which allow students to compare the results and graphics from classroom exercises with the correct solutions and further more to…

  12. Improving coding accuracy in an academic practice.

    PubMed

    Nguyen, Dana; O'Mara, Heather; Powell, Robert

    2017-01-01

    Practice management has become an increasingly important component of graduate medical education. This applies to every practice environment: private, academic, and military. One of the most critical aspects of practice management is documentation and coding for physician services, as they directly affect the financial success of any practice. Our quality improvement project aimed to implement a new and innovative method for teaching billing and coding in a longitudinal fashion in a family medicine residency. We hypothesized that implementation of a new teaching strategy would increase coding accuracy rates among residents and faculty. Design: single group, pretest-posttest. Setting: military family medicine residency clinic. Study populations: 7 faculty physicians and 18 resident physicians participated as learners in the project. Educational intervention: monthly structured coding learning sessions in the academic curriculum that involved learner-presented cases, small group case review, and large group discussion. Main outcome measures: overall coding accuracy (compliance) percentage and coding accuracy per year group for the subjects who were able to participate longitudinally. Statistical tests used: average coding accuracy for population; paired t test to assess improvement between 2 intervention periods, both aggregate and by year group. Overall coding accuracy rates remained stable over the course of time regardless of the modality of the educational intervention. A paired t test was conducted to compare coding accuracy rates at baseline (mean (M)=26.4%, SD=10%) to accuracy rates after all educational interventions were complete (M=26.8%, SD=12%); t24=-0.127, P=.90. Didactic teaching and small group discussion sessions did not improve overall coding accuracy in a residency practice. Future interventions could focus on educating providers at the individual level.
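    The paired t test used in the study above can be sketched with the Python standard library. The accuracy values below are illustrative placeholders, not the study's data:

    ```python
    # Paired t test on pre/post coding-accuracy rates, standard library only.
    import math
    import statistics

    baseline = [0.25, 0.30, 0.22, 0.28, 0.26, 0.31, 0.24]  # pre-intervention accuracy
    followup = [0.26, 0.29, 0.25, 0.27, 0.28, 0.30, 0.23]  # post-intervention accuracy

    diffs = [f - b for f, b in zip(followup, baseline)]    # paired differences
    n = len(diffs)
    # t statistic: mean difference divided by its standard error
    t_stat = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
    ```

    A |t| near zero, like the study's t24 = -0.127, indicates no detectable change between the two periods.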

  13. Practice-Oriented Teachers' Training: Innovative Approach

    ERIC Educational Resources Information Center

    Shukshina, Tatjana I.; Gorshenina, Svetlana N.; Buyanova, Irina B.; Neyasova, Irina A.

    2016-01-01

    Modernization of Russian education meets the global trend of professionalization of teachers' training which assumes strengthening the practical orientation of educational programs as a significant factor in increasing the competitiveness of the teacher in the modern educational environment. The purpose of the article is to identify and…

  14. Ethical and educational considerations in coding hand surgeries.

    PubMed

    Lifchez, Scott D; Leinberry, Charles F; Rivlin, Michael; Blazar, Philip E

    2014-07-01

    To assess treatment coding knowledge and practices among residents, fellows, and attending hand surgeons. Through the use of 6 hypothetical cases, we developed a coding survey to assess coding knowledge and practices. We e-mailed this survey to residents, fellows, and attending hand surgeons. Additionally, we asked 2 professional coders to code these cases. A total of 71 participants completed the survey out of 134 people to whom the survey was sent (response rate = 53%). We observed marked disparity in codes chosen among surgeons and among professional coders. Results of this study indicate that coding knowledge, not just its ethical application, had a major role in coding procedures accurately. Surgical coding is an essential part of a hand surgeon's practice and is not well learned during residency or fellowship. Whereas ethical issues such as deliberate unbundling and upcoding may have a role in inaccurate coding, lack of knowledge among surgeons and coders has a major role as well. Coding has a critical role in every hand surgery practice. Inconsistencies among those polled in this study reveal that an increase in education on coding during training and improvement in the clarity and consistency of the Current Procedural Terminology coding rules themselves are needed. Copyright © 2014 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  15. Measurement and computer simulation of antennas on ships and aircraft for results of operational reliability

    NASA Astrophysics Data System (ADS)

    Kubina, Stanley J.

    1989-09-01

    The review of the status of computational electromagnetics by Miller and the exposition by Burke of the developments in one of the more important computer codes in the application of the electric field integral equation method, the Numerical Electromagnetic Code (NEC), coupled with Molinet's summary of progress in techniques based on the Geometrical Theory of Diffraction (GTD), provide a clear perspective on the maturity of the modern discipline of computational electromagnetics and its potential. Audone's exposition of the application to the computation of Radar Scattering Cross-section (RCS) indicates the breadth of practical applications, and his exploitation of modern near-field measurement techniques reminds one of progress in the measurement discipline, which is essential to the validation or calibration of computational modeling methodology when applied to complex structures such as aircraft and ships. The latter monograph also presents some comparison results with computational models. Some of the results presented for scale model and flight measurements show serious disagreements in the lobe structure that would require detailed examination. This also applies to the radiation patterns obtained by flight measurement compared with those obtained using wire-grid models and integral equation modeling methods. In the examples which follow, an attempt is made to match measurement results completely over the entire 2 to 30 MHz HF range for antennas on a large patrol aircraft. The problems of validating computer models of HF antennas on a helicopter, and of using computer models to generate radiation pattern information that cannot be obtained by measurement, are discussed. Also discussed is the use of NEC computer models to analyze top-side ship configurations for which measurement results are not available, so that only self-validation measures, or at best comparisons with an alternate GTD computer modeling technique, are possible.

  16. Listening to Brain Microcircuits for Interfacing With External World—Progress in Wireless Implantable Microelectronic Neuroengineering Devices

    PubMed Central

    Nurmikko, Arto V.; Donoghue, John P.; Hochberg, Leigh R.; Patterson, William R.; Song, Yoon-Kyu; Bull, Christopher W.; Borton, David A.; Laiwalla, Farah; Park, Sunmee; Ming, Yin; Aceros, Juan

    2011-01-01

    Acquiring neural signals at high spatial and temporal resolution directly from brain microcircuits and decoding their activity to interpret commands and/or prior planning activity, such as motion of an arm or a leg, is a prime goal of modern neurotechnology. Its practical aims include assistive devices for subjects whose normal neural information pathways are not functioning due to physical damage or disease. On the fundamental side, researchers are striving to decipher the code of multiple neural microcircuits which collectively make up nature’s amazing computing machine, the brain. By implanting biocompatible neural sensor probes directly into the brain, in the form of microelectrode arrays, it is now possible to extract information from interacting populations of neural cells with spatial and temporal resolution at the single cell level. With parallel advances in the application of statistical and mathematical tools for deciphering the neural code extracted from populations of correlated neurons, significant understanding has been achieved of those brain commands that control, e.g., the motion of an arm in a primate (monkey or a human subject). These developments are accelerating the work on neural prosthetics where brain derived signals may be employed to bypass, e.g., an injured spinal cord. One key element in achieving the goals for practical and versatile neural prostheses is the development of fully implantable wireless microelectronic “brain-interfaces” within the body, a point of special emphasis of this paper. PMID:21654935

  17. The Literate Lives of Chamorro Women in Modern Guam

    ERIC Educational Resources Information Center

    Santos-Bamba, Sharleen J.Q.

    2010-01-01

    This ethnographic study traces the language and literacy attitudes, perceptions, and practices of three generations of indigenous Chamorro women in modern Guam. Through the lens of postcolonial theory, cultural literacy, intergenerational transmission theory, community of practice, and language and identity, this study examines how literacy is…

  18. Selected Problems of Applying the Law in Adaptation and Modernization of Buildings in Poland

    NASA Astrophysics Data System (ADS)

    Korbel, Wojciech

    2016-06-01

    One of the major problems in the contemporary process of building modernization in Poland is the plurality of differing interpretations of certain legal terms in the current building code. Incorrect interpretation results in incorrect applications to the authorities for the proper building permit and, in effect, causes a loss of time and money. The article attempts to identify some of these problems and seeks solutions to them through the evolutionary method of building law creation.

  19. Utilizing codes of ethics in health professions education.

    PubMed

    Dahnke, Michael D

    2014-10-01

    Codes of ethics abound in health care, the aims and purposes of which are multiple and varied, from operating as a decision making tool to acting as a standard of practice that can be operational in a legal context to providing a sense of elevated seriousness and professionalism within a field of practice. There is some doubt and controversy, however, regarding the value and use of these codes both in professional practice and in the education of healthcare professionals. I intend to review and analyze the various aims and purposes of ethics codes particularly within the study and practice of healthcare in light of various criticisms of codes of ethics. After weighing the strength and import of these criticisms, I plan to explore effective means for utilizing such codes as part of the ethics education of healthcare professionals. While noting significant limitations of this tool, both in practice and in education, I plan to demonstrate its potential usefulness as well, in both generating critical thinking within the study of ethics and as a guide for practice for the professional.

  20. Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David W. Nigg, Principal Investigator; Kevin A. Steuhm, Project Manager

    Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance, and to some extent, experiment management, are inconsistent with the state of modern nuclear engineering practice, and are difficult, if not impossible, to properly verify and validate (V&V) according to modern standards. Furthermore, the legacy staff knowledge required for application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In late 2009, the Idaho National Laboratory (INL) initiated a focused effort, the ATR Core Modeling Update Project, to address this situation through the introduction of modern high-fidelity computational software and protocols. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF). The ATR Core Modeling Update Project, targeted for full implementation in phase with the next anticipated ATR Core Internals Changeout (CIC) in the 2014-2015 time frame, began during the last quarter of Fiscal Year 2009, and has just completed its third full year. Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL under various licensing arrangements. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose.
Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009, Cycle 145A through Cycle 151B, was successfully completed during 2012. This major effort supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR Core Safety Analysis Package (CSAP) preparation process, in parallel with the established PDQ-based methodology, beginning late in Fiscal Year 2012. Acquisition of the advanced SERPENT (VTT-Finland) and MC21 (DOE-NR) Monte Carlo stochastic neutronics simulation codes was also initiated during the year and some initial applications of SERPENT to ATRC experiment analysis were demonstrated. These two new codes will offer significant additional capability, including the possibility of full-3D Monte Carlo fuel management support capabilities for the ATR at some point in the future. Finally, a capability for rigorous sensitivity analysis and uncertainty quantification based on the TSUNAMI system has been implemented and initial computational results have been obtained. This capability will have many applications as a tool for understanding the margins of uncertainty in the new models as well as for validation experiment design and interpretation.

  1. Long Non-Coding RNAs Regulating Immunity in Insects

    PubMed Central

    Satyavathi, Valluri; Ghosh, Rupam; Subramanian, Srividya

    2017-01-01

    Recent advances in modern technology have led to the understanding that not all genetic information is coded into protein and that the genomes of each and every organism including insects produce non-coding RNAs that can control different biological processes. Among RNAs identified in the last decade, long non-coding RNAs (lncRNAs) represent a repertoire of a hidden layer of internal signals that can regulate gene expression in physiological, pathological, and immunological processes. Evidence shows the importance of lncRNAs in the regulation of host–pathogen interactions. In this review, an attempt has been made to view the role of lncRNAs regulating immune responses in insects. PMID:29657286

  2. Projection of Patient Condition Code Distributions Based on Mechanism of Injury

    DTIC Science & Technology

    2003-01-01

    The Medical Readiness and Strategic Plan (MRSP) 1998-2004 requires that the military services develop a method for linking real world patient load...data with modern Patient Condition (PC) codes to enable planners to forecast medical workload and resource requirements. Determination of the likely...various levels of medical care. Medical planners and logisticians plan for medical contingencies based on anticipated patient streams, distributions of

  3. Green disease in optical coherence tomography diagnosis of glaucoma.

    PubMed

    Sayed, Mohamed S; Margolis, Michael; Lee, Richard K

    2017-03-01

    Optical coherence tomography (OCT) has become an integral component of modern glaucoma practice. Utilizing color codes, OCT analysis has rendered glaucoma diagnosis and follow-up simpler and faster for the busy clinician. However, green labeling of OCT parameters suggesting normal values may confer a false sense of security, potentially leading to missed diagnoses of glaucoma and/or glaucoma progression. Conditions in which OCT color coding may be falsely negative (i.e., green disease) are identified. Early glaucoma in which retinal nerve fiber layer (RNFL) thickness and optic disc parameters, albeit labeled green, are asymmetric in both eyes may result in glaucoma being undetected. Progressively decreasing RNFL thickness may reveal the presence of progressive glaucoma that, because of green labeling, can be missed by the clinician. Other ocular conditions that can increase RNFL thickness can make the diagnosis of coexisting glaucoma difficult. Recently introduced progression analysis features of OCT may help detect green disease. Recognition of green disease is of paramount importance in diagnosing and treating glaucoma. Understanding the limitations of imaging technologies coupled with evaluation of serial OCT analyses, prompt clinical examination, and structure-function correlation is important to avoid missing real glaucoma requiring treatment.

  4. Numerical investigation of galloping instabilities in Z-shaped profiles.

    PubMed

    Gomez, Ignacio; Chavez, Miguel; Alonso, Gustavo; Valero, Eusebio

    2014-01-01

    Aeroelastic effects are relatively common in the design of modern civil constructions such as office blocks, airport terminal buildings, and factories. Typical flexible structures exposed to the action of wind are shading devices, normally slats or louvers. A typical cross-section for such elements is a Z-shaped profile, made out of a central web and two side wings. Galloping instabilities are often determined in practice using the Glauert-Den Hartog criterion. This criterion relies on accurate predictions of the dependence of the aerodynamic force coefficients on the angle of attack. This paper presents the results of a parametric analysis, based on numerical simulation, performed on different Z-shaped louvers to determine translational galloping instability regions. These numerical results have been validated against a parametric analysis of Z-shaped profiles based on static wind tunnel tests. For this validation, the DLR TAU Code, a standard code within the European aeronautical industry, has been used. The study focuses on the numerical prediction of galloping, which is presented visually through stability maps. Comparisons between numerical and experimental data are given for various meshes and turbulence models.
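    The Glauert-Den Hartog criterion the abstract relies on states that a section is prone to transverse galloping when dCl/dα + Cd < 0. A minimal Python sketch, with illustrative coefficient data (not measurements from the paper):

    ```python
    import numpy as np

    def den_hartog(alpha_deg, cl, cd):
        """H = dCl/dalpha + Cd (alpha in radians); H < 0 flags galloping-prone angles."""
        alpha = np.radians(np.asarray(alpha_deg, dtype=float))
        dcl_dalpha = np.gradient(np.asarray(cl, dtype=float), alpha)
        return dcl_dalpha + np.asarray(cd, dtype=float)

    alpha_deg = [0, 2, 4, 6, 8]              # angle of attack, degrees
    cl = [0.00, 0.15, 0.10, -0.05, -0.20]    # lift coefficient (illustrative)
    cd = [0.80, 0.82, 0.85, 0.90, 0.95]      # drag coefficient (illustrative)

    H = den_hartog(alpha_deg, cl, cd)
    unstable = H < 0    # angles where the criterion predicts instability
    ```

    In practice the coefficients come from wind tunnel tests or CFD, and the sign of H over geometry and angle of attack is what stability maps of this kind summarize.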

  5. Certifying an Irreducible 1024-Dimensional Photonic State Using Refined Dimension Witnesses.

    PubMed

    Aguilar, Edgar A; Farkas, Máté; Martínez, Daniel; Alvarado, Matías; Cariñe, Jaime; Xavier, Guilherme B; Barra, Johanna F; Cañas, Gustavo; Pawłowski, Marcin; Lima, Gustavo

    2018-06-08

    We report on a new class of dimension witnesses, based on quantum random access codes, which are a function of the recorded statistics and which have different bounds for all possible decompositions of a high-dimensional physical system. The witness thus certifies the dimension of the system and has the new distinct feature of identifying whether the high-dimensional system is decomposable in terms of lower-dimensional subsystems. To demonstrate the practicability of this technique, we used it to experimentally certify the generation of an irreducible 1024-dimensional photonic quantum state, thereby certifying that the state is neither multipartite nor encoded using noncoupled different degrees of freedom of a single photon. Our protocol should find applications in a broad class of modern quantum information experiments addressing the generation of high-dimensional quantum systems, where quantum tomography may become intractable.

  6. Certifying an Irreducible 1024-Dimensional Photonic State Using Refined Dimension Witnesses

    NASA Astrophysics Data System (ADS)

    Aguilar, Edgar A.; Farkas, Máté; Martínez, Daniel; Alvarado, Matías; Cariñe, Jaime; Xavier, Guilherme B.; Barra, Johanna F.; Cañas, Gustavo; Pawłowski, Marcin; Lima, Gustavo

    2018-06-01

    We report on a new class of dimension witnesses, based on quantum random access codes, which are a function of the recorded statistics and which have different bounds for all possible decompositions of a high-dimensional physical system. The witness thus certifies the dimension of the system and has the new distinct feature of identifying whether the high-dimensional system is decomposable in terms of lower-dimensional subsystems. To demonstrate the practicability of this technique, we used it to experimentally certify the generation of an irreducible 1024-dimensional photonic quantum state, thereby certifying that the state is neither multipartite nor encoded using noncoupled different degrees of freedom of a single photon. Our protocol should find applications in a broad class of modern quantum information experiments addressing the generation of high-dimensional quantum systems, where quantum tomography may become intractable.

  7. Origins of tmRNA: the missing link in the birth of protein synthesis?

    PubMed

    Macé, Kevin; Gillet, Reynald

    2016-09-30

    The RNA world hypothesis refers to the early period on earth in which RNA was central in assuring both genetic continuity and catalysis. The end of this era coincided with the development of the genetic code and protein synthesis, symbolized by the appearance of the first non-random messenger RNA (mRNA). Modern transfer-messenger RNA (tmRNA) is a unique hybrid molecule which has the properties of both mRNA and transfer RNA (tRNA). It acts as a key molecule during trans-translation, a major quality control pathway of modern bacterial protein synthesis. tmRNA shares many common characteristics with ancestral RNA. Here, we present a model in which proto-tmRNAs were the first molecules on earth to support non-random protein synthesis, explaining the emergence of the early genetic code. In this way, proto-tmRNA could be the missing link between the first mRNA and tRNA molecules and modern ribosome-mediated protein synthesis. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. Health and healing: spiritual, pharmaceutical, and mechanical medicine.

    PubMed

    Hutch, Richard A

    2013-09-01

    Modern medical practice is identified as a relatively recent way of approaching human ill health in the wide scope of how people have addressed sickness throughout history and across a wide range of cultures. The ideological biases of medical or "allopathic" (disease as "other" or "outsider") practice are identified and grafted onto other perspectives on how people not engaged in modern medicine have achieved healing and health. Alternative forms of healing and health open a consideration of ethnomedicine, many forms of which are unknown and, hence, untested by modern medical research. Ethnomedicine the world over and throughout human history has displayed unique spiritual (vitalism), pharmaceutical (herbs/drugs), and mechanical (manipulation/surgery) approaches to treating illness. The argument is that modern allopathic medicine would do well to consider such "world medicine" as having valuable alternative and complementary therapies, the use of which could enhance contemporary medical advice and practice.

  9. Physiological patterns during practice of the Transcendental Meditation technique compared with patterns while reading Sanskrit and a modern language.

    PubMed

    Travis, F; Olson, T; Egenes, T; Gupta, H K

    2001-07-01

    This study tested the prediction that reading Vedic Sanskrit texts, without knowledge of their meaning, produces a distinct physiological state. We measured EEG, breath rate, heart rate, and skin conductance during: (1) 15-min Transcendental Meditation (TM) practice; (2) 15-min reading verses of the Bhagavad Gita in Sanskrit; and (3) 15-min reading the same verses translated in German, Spanish, or French. The two reading conditions were randomly counterbalanced, and subjects filled out experience forms between each block to reduce carryover effects. Skin conductance levels significantly decreased during both reading Sanskrit and TM practice, and increased slightly during reading a modern language. Alpha power and coherence were significantly higher when reading Sanskrit and during TM practice, compared to reading modern languages. Similar physiological patterns when reading Sanskrit and during practice of the TM technique suggests that the state gained during TM practice may be integrated with active mental processes by reading Sanskrit.

  10. Potential application of item-response theory to interpretation of medical codes in electronic patient records

    PubMed Central

    2011-01-01

    Background: Electronic patient records are generally coded using extensive sets of codes but the significance of the utilisation of individual codes may be unclear. Item response theory (IRT) models are used to characterise the psychometric properties of items included in tests and questionnaires. This study asked whether the properties of medical codes in electronic patient records may be characterised through the application of item response theory models. Methods: Data were provided by a cohort of 47,845 participants from 414 family practices in the UK General Practice Research Database (GPRD) with a first stroke between 1997 and 2006. Each eligible stroke code, out of a set of 202 OXMIS and Read codes, was coded as either recorded or not recorded for each participant. A two-parameter IRT model was fitted using marginal maximum likelihood estimation. Estimated parameters from the model were considered to characterise each code with respect to the latent trait of stroke diagnosis. The location parameter is referred to as a calibration parameter, while the slope parameter is referred to as a discrimination parameter. Results: There were 79,874 stroke code occurrences available for analysis. Utilisation of codes varied between family practices, with intraclass correlation coefficients of up to 0.25 for the most frequently used codes. IRT analyses were restricted to 110 Read codes. Calibration and discrimination parameters were estimated for 77 (70%) codes that were endorsed for 1,942 stroke patients. Parameters were not estimated for the remaining more frequently used codes. Discrimination parameter values ranged from 0.67 to 2.78, while calibration parameter values ranged from 4.47 to 11.58. The two-parameter model gave a better fit to the data than either the one- or three-parameter models. However, high chi-square values for about a fifth of the stroke codes were suggestive of poor item fit.
Conclusion: The application of item response theory models to coded electronic patient records might contribute to identifying medical codes that offer poor discrimination or low calibration. This might indicate the need for improved coding sets or a requirement for improved clinical coding practice. However, in this study estimates were only obtained for a small proportion of participants and there was some evidence of poor model fit. There was also evidence of variation in the utilisation of codes between family practices, raising the possibility that, in practice, the properties of codes may vary for different coders. PMID:22176509
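    The two-parameter model described above assigns each code a discrimination a and a calibration (location) b, with the probability of a code being endorsed following a logistic curve in the latent trait θ. A minimal sketch using parameter values drawn from the ranges reported in the abstract (the specific numbers are illustrative):

    ```python
    import math

    def p_endorse(theta, a, b):
        """2PL item response function: P(endorsed | theta) = 1 / (1 + exp(-a*(theta - b)))."""
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    # A strongly discriminating code (a = 2.78) separates records sharply near its
    # calibration point; a weak one (a = 0.67) changes slowly with the latent trait.
    p_sharp = p_endorse(theta=5.0, a=2.78, b=4.47)
    p_flat = p_endorse(theta=5.0, a=0.67, b=4.47)
    ```

    At θ = b the endorsement probability is exactly 0.5 regardless of discrimination, which is what makes b a location parameter; codes with low a are the poorly discriminating ones the conclusion flags.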

  11. PyORBIT: A Python Shell For ORBIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jean-Francois Ostiguy; Jeffrey Holmes

    2003-07-01

    ORBIT is a code developed at SNS to simulate beam dynamics in accumulation rings and synchrotrons. The code is structured as a collection of external C++ modules for SuperCode, a high level interpreter shell developed at LLNL in the early 1990s. SuperCode is no longer actively supported and there has for some time been interest in replacing it with a modern scripting language, while preserving the feel of the original ORBIT program. In this paper, we describe a new version of ORBIT where the role of SuperCode is assumed by Python, a free, well-documented and widely supported object-oriented scripting language. We also compare PyORBIT to ORBIT from the standpoint of features, performance and future expandability.

  12. Codes of environmental management practice: Assessing their potential as a tool for change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nash, J.; Ehrenfeld, J.

    1997-12-31

    Codes of environmental management practice emerged as a tool of environmental policy in the late 1980s. Industry and other groups have developed codes for two purposes: to change the environmental behavior of participating firms and to increase public confidence in industry's commitment to environmental protection. This review examines five codes of environmental management practice: Responsible Care, the International Chamber of Commerce's Business Charter for Sustainable Development, ISO 14000, the CERES Principles, and The Natural Step. The first three codes have been drafted and promoted primarily by industry; the others have been developed by non-industry groups. These codes have spurred participating firms to introduce new practices, including the institution of environmental management systems, public environmental reporting, and community advisory panels. The extent to which codes are introducing a process of cultural change is considered in terms of four dimensions: new consciousness, norms, organization, and tools. 94 refs., 3 tabs.

  13. Community to Classroom: Reflections on Community-Centered Pedagogy in Contemporary Modern Dance Technique

    ERIC Educational Resources Information Center

    Fitzgerald, Mary

    2017-01-01

    This article reflects on the ways in which socially engaged arts practices can contribute to reconceptualizing the contemporary modern dance technique class as a powerful site of social change. Specifically, the author considers how incorporating socially engaged practices into pedagogical models has the potential to foster responsible citizenship…

  14. 77 FR 75885 - Control of Communicable Diseases: Foreign; Scope and Definitions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-26

    ... primary authority supporting this rulemaking is section 361 of the Public Health Service Act (42 U.S.C... the scope and definitions to part 71 to reflect modern science and current practices. HHS/CDC has... products'' in subpart F. This revision more adequately reflects modern science and current practice which...

  15. Understanding the workplace culture of a special care nursery.

    PubMed

    Wilson, Valerie J; McCormack, Brendan G; Ives, Glenice

    2005-04-01

    This paper presents findings from the first phase of a research study focusing on implementation and evaluation of emancipatory practice development strategies. Understanding the culture of practice is essential to undertaking effective developments in practice. Culture is a dominant feature of discussions about modernizing health care, yet few studies have been undertaken that systematically evaluate the development of effective practice cultures. The study intervention is that of emancipatory practice development with an integrated evaluation approach based on Realistic Evaluation. The aim of Realistic Evaluation is to evaluate relationships between Context (setting), Mechanism (process characteristics) and Outcome (arising from the context-mechanism configuration). This first phase of the study focuses on uncovering the context (in particular the culture) of the Special Care Nursery in order to evaluate the emancipatory practice development processes and outcomes. Data collection methods included survey, participant observation and interview. Cognitive mapping, constant comparative method and coding were used to analyse the data. Findings. Four key categories were identified: Teamwork, Learning in Practice, Inevitability of Change and Family-Centred Care and collectively these formed a central category of Core Values and Beliefs. A number of themes were identified in each category, and reflected tensions that existed between differing values and beliefs within the culture of the unit. Understanding values and beliefs is an important part of understanding a workplace culture. Whilst survey methods are capable of outlining espoused workplace characteristics, observation of staff interactions and perceptions gives an understanding of culture as a living entity manifested through interpersonal relationships. Attempts at changing workplace cultures should start from the clarification of values held among staff in that culture.

  16. Higher Education in Further Education Colleges: Indirectly Funded Partnerships: Codes of Practice for Franchise and Consortia Arrangements. Report.

    ERIC Educational Resources Information Center

    Higher Education Funding Council for England, Bristol.

    This report provides codes of practice for two types of indirectly funded partnerships entered into by higher education institutions and further education sector colleges: franchises and consortia. The codes of practice set out guidance on the principles that should be reflected in the franchise and consortia agreements that underpin indirectly…

  17. "SEN's Completely Different Now": Critical Discourse Analysis of Three "Codes of Practice for Special Educational Needs" (1994, 2001, 2015)

    ERIC Educational Resources Information Center

    Lehane, Teresa

    2017-01-01

    Regardless of the differing shades of neo-liberalism, successive governments have claimed to champion the cause of "special educational needs and/or disability" (SEND) through official Codes of Practice in 1994, 2001 and 2015. This analysis and comparison of the three Codes of Practice aims to contribute to the debate by exploring…

  18. MEDICAL OPERATIONS IN DENIED ENVIRONMENTS (MODE): ARE OUR AF MEDICS READY

    DTIC Science & Technology

    2016-02-28

    modernization spending, more than the sum of the previous three administrations combined.16 Regional actors believe China’s increased A2/AD capabilities...requirements makes achieving the right personnel with sufficient medical readiness especially challenging.37 20 AF planners use unit type codes ... Codes (AFSCs) as a manpower-classification system to group together personnel that have similar duties, skills, and required training. The Air Force

  19. Innovative E-portal for prevention and therapeutic programme for treatment of the obesity and overweight in health-tourism

    NASA Astrophysics Data System (ADS)

    Zuzda, Jolanta G.; Półjanowicz, Wiesław; Latosiewicz, Robert; Borkowski, Piotr; Bierkus, Mirosław; Moska, Owidiusz

    2017-11-01

    Modern technologies enable overweight and obese people to enjoy physical activity. We have developed an electronic portal containing rotational exercises useful in the fight against these disorders. Easy access is provided by QR codes placed on the web site and easily scanned with personal electronic equipment (smartphones). QR codes can also be printed and hung in various places in health-tourism facilities.

  20. CFD Sensitivity Analysis of a Modern Civil Transport Near Buffet-Onset Conditions

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Allison, Dennis O.; Biedron, Robert T.; Buning, Pieter G.; Gainer, Thomas G.; Morrison, Joseph H.; Rivers, S. Melissa; Mysko, Stephen J.; Witkowski, David P.

    2001-01-01

    A computational fluid dynamics (CFD) sensitivity analysis is conducted for a modern civil transport at several conditions ranging from mostly attached flow to flow with substantial separation. Two different Navier-Stokes computer codes and four different turbulence models are utilized, and results are compared both to wind tunnel data at flight Reynolds number and to flight data. In-depth CFD sensitivities to grid, code, spatial differencing method, aeroelastic shape, and turbulence model are described for conditions near buffet onset (a condition at which significant separation exists). In summary, given a grid of sufficient density for a given aeroelastic wing shape, the combined approximate error band in CFD at conditions near buffet onset due to code, spatial differencing method, and turbulence model is 6% in lift, 7% in drag, and 16% in moment. The two biggest contributors to this uncertainty are turbulence model and code. Computed results agree well with wind tunnel surface pressure measurements both for an overspeed 'cruise' case and for a case with small trailing-edge separation. At and beyond buffet onset, computed results agree well over the inner half of the wing, but shock location is predicted too far aft at some of the outboard stations. Lift, drag, and moment curves are predicted in good agreement with experimental results from the wind tunnel.

  1. The influence of commenting validity, placement, and style on perceptions of computer code trustworthiness: A heuristic-systematic processing approach.

    PubMed

    Alarcon, Gene M; Gamble, Rose F; Ryan, Tyler J; Walter, Charles; Jessup, Sarah A; Wood, David W; Capiola, August

    2018-07-01

    Computer programs are a ubiquitous part of modern society, yet little is known about the psychological processes that underlie reviewing code. We applied the heuristic-systematic model (HSM) to investigate the influence of computer code comments on perceptions of code trustworthiness. The study explored the influence of validity, placement, and style of comments in code on trustworthiness perceptions and time spent on code. Results indicated valid comments led to higher trust assessments and more time spent on the code. Properly placed comments led to lower trust assessments and had a marginal effect on time spent on code; however, the effect was no longer significant after controlling for effects of the source code. Low style comments led to marginally higher trustworthiness assessments, but high style comments led to longer time spent on the code. Several interactions were also found. Our findings suggest the relationship between code comments and perceptions of code trustworthiness is not as straightforward as previously thought. Additionally, the current paper extends the HSM to the programming literature. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. The application of coded excitation technology in medical ultrasonic Doppler imaging

    NASA Astrophysics Data System (ADS)

    Li, Weifeng; Chen, Xiaodong; Bao, Jing; Yu, Daoyin

    2008-03-01

    Medical ultrasonic Doppler imaging is one of the most important domains of modern medical imaging technology. Applying coded excitation in a medical ultrasonic Doppler imaging system offers higher SNR and deeper penetration than a conventional pulse-echo imaging system; it also improves image quality and enhances sensitivity to weak signals, and a properly chosen code benefits the received spectrum of the Doppler signal. First, this paper analyzes the application of coded excitation technology in medical ultrasonic Doppler imaging systems, showing the advantages and promise of coded excitation, and then introduces its principles and theory. Second, we compare several code sequences (including chirp and pseudo-chirp signals, Barker codes, Golay complementary sequences, M-sequences, etc.). Considering mainlobe width, range sidelobe level, signal-to-noise ratio, and Doppler-signal sensitivity, we choose Barker codes as the code sequence. Finally, we design the coded excitation circuit. The results in B-mode imaging and Doppler flow measurement matched our expectations, demonstrating the advantages of applying coded excitation technology in a Digital Medical Ultrasonic Doppler Endoscope Imaging System.
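
    The Barker-code choice above can be illustrated with a short sketch (not from the paper): the length-13 Barker sequence has an aperiodic autocorrelation whose mainlobe is 13 while every sidelobe has magnitude at most 1, which is why it suits pulse-compression excitation.

```python
import numpy as np

# Length-13 Barker sequence (standard +/-1 convention).
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])

def aperiodic_autocorrelation(code):
    """Full aperiodic autocorrelation of a +/-1 sequence."""
    return np.correlate(code, code, mode="full")

acf = aperiodic_autocorrelation(barker13)
mainlobe = acf[len(barker13) - 1]            # zero-lag peak
sidelobes = np.delete(acf, len(barker13) - 1)

print(mainlobe)                  # 13
print(np.abs(sidelobes).max())   # 1
```

    The ratio of mainlobe to worst sidelobe (13:1) is the "range sidelobe level" criterion the abstract mentions.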

  3. Repeats of base oligomers as the primordial coding sequences of the primeval earth and their vestiges in modern genes.

    PubMed

    Ohno, S

    1984-01-01

    Three outstanding properties uniquely qualify repeats of base oligomers as the primordial coding sequences of all polypeptide chains. First, when compared with randomly generated base sequences in general, they are more likely to have long open reading frames. Second, periodical polypeptide chains specified by such repeats are more likely to assume either alpha-helical or beta-sheet secondary structures than are polypeptide chains of random sequence. Third, provided that the number of bases in the oligomeric unit is not a multiple of 3, these internally repetitious coding sequences are impervious to randomly sustained base substitutions, deletions, and insertions. This is because the recurring periodicity of their polypeptide chains is given by three consecutive copies of the oligomeric unit translated in three different reading frames. Accordingly, when one reading frame is open, the other two are automatically open as well, all three being capable of coding for polypeptide chains of identical periodicity. Under this circumstance, a frame shift due to the deletion or insertion of a number of bases that is not a multiple of 3 fails to alter the downstream amino acid sequence, and even a base change causing premature chain-termination can silence only one of the three potential coding units. Newly arisen coding sequences in modern organisms are oligomeric repeats, and most of the older genes retain various vestiges of their original internal repetitions. Some of the genes (e.g., oncogenes) have even inherited the property of being impervious to randomly sustained base changes.
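
    The reading-frame argument can be checked mechanically. A hypothetical sketch with an arbitrary 4-base unit (4 is not a multiple of 3): every reading frame of the repeat yields a codon stream that itself repeats every lcm(4,3)/3 = 4 codons, so all three frames encode periodic polypeptides, as the abstract claims.

```python
from math import lcm

UNIT = "TCAG"                      # hypothetical oligomeric unit, not from the paper
repeat = UNIT * 12                 # a primordial-style repeat sequence

def codons(seq, frame):
    """Split seq into codons starting at the given frame offset (0, 1, or 2)."""
    s = seq[frame:]
    return [s[i:i + 3] for i in range(0, len(s) - 2, 3)]

period = lcm(len(UNIT), 3) // 3    # codon period of each frame: 4
for frame in range(3):
    cs = codons(repeat, frame)
    reps = len(cs) // period
    # every frame's codon stream repeats with the same period
    assert cs[:period] * reps == cs[:period * reps]
    print(frame, cs[:period])
```

    Each frame prints a different 4-codon motif, but all three are strictly periodic, which is the sense in which all three frames stay "open" together.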

  4. Studying Innovation Technologies in Modern Education

    ERIC Educational Resources Information Center

    Stukalenko, Nina M.; Zhakhina, Bariya B.; Kukubaeva, Asiya K.; Smagulova, Nurgul K.; Kazhibaeva, Gulden K.

    2016-01-01

    In modern society, innovation technologies expand to almost every field of human activity, including such wide field as education. Due to integrating innovation technologies into the educational process practice, this phenomenon gained special significance within improvement and modernization of the established educational system. Currently, the…

  5. Criminal Code Modernization and Simplification Act of 2013

    THOMAS, 113th Congress

    Rep. Sensenbrenner, F. James, Jr. [R-WI-5

    2013-05-07

    House - 06/20/2013 Referred to the Subcommittee on Crime, Terrorism, Homeland Security, and Investigations. Status: Introduced.

  6. The PLUTO code for astrophysical gasdynamics .

    NASA Astrophysics Data System (ADS)

    Mignone, A.

    Present numerical codes appeal to a consolidated theory based on finite-difference and Godunov-type schemes. In this context we have developed a versatile numerical code, PLUTO, suitable for the solution of high Mach number flows in 1, 2, and 3 spatial dimensions and in different systems of coordinates. Different hydrodynamic modules and algorithms may be independently selected to properly describe Newtonian, relativistic, MHD, or relativistic MHD fluids. The modular structure exploits a general framework for integrating a system of conservation laws, built on modern Godunov-type shock-capturing schemes. The code is freely distributed under the GNU public license and is available for download to the astrophysical community at the URL http://plutocode.to.astro.it.
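
    As a minimal illustration of the Godunov-type finite-volume updates such codes build on (a sketch, not PLUTO's own algorithm): first-order upwind fluxes for linear advection on a periodic grid, which conserve the integral of the solution exactly and capture the discontinuity without oscillation.

```python
import numpy as np

def godunov_advection_step(u, a, dt, dx):
    """One first-order Godunov step for u_t + a u_x = 0 with a > 0, periodic BCs.
    The exact Riemann solution at each interface picks the upwind (left) state."""
    flux = a * u
    return u - (dt / dx) * (flux - np.roll(flux, 1))

# advect a square pulse once around a periodic domain
n, a = 100, 1.0
dx = 1.0 / n
dt = 0.5 * dx / a                     # CFL number 0.5
x = np.arange(n) * dx
u = np.where((x > 0.25) & (x < 0.5), 1.0, 0.0)
total0 = u.sum()
for _ in range(200):
    u = godunov_advection_step(u, a, dt, dx)

# conservative form: total "mass" is preserved to round-off
print(abs(u.sum() - total0) < 1e-10)
```

    The telescoping flux differences are what make the scheme conservative; PLUTO generalizes this skeleton to systems of conservation laws with approximate Riemann solvers.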

  7. VERAIn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan

    2015-02-16

    CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete an LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA input into an XML file that is used as input to the different VERA codes.

  8. Art-House Cinema, Avant-Garde Film, and Dramatic Modernism

    ERIC Educational Resources Information Center

    Cardullo, Bert

    2011-01-01

    In this article, the author talks about art-house cinema, avant-garde film, and dramatic modernism. He believes that the most important modes of film practice are art-house cinema and the avant-garde, both of which contrast with the classical Hollywood mode of film practice. While the latter is characterized by its commercial imperative, corporate…

  9. Philosophy, Methodology and Action Research

    ERIC Educational Resources Information Center

    Carr, Wilfred

    2006-01-01

    The aim of this paper is to examine the role of methodology in action research. It begins by showing how, as a form of inquiry concerned with the development of practice, action research is nothing other than a modern 20th century manifestation of the pre-modern tradition of practical philosophy. It then draws in Gadamer's powerful vindication of…

  10. The commerce of professional psychology and the new ethics code.

    PubMed

    Koocher, G P

    1994-11-01

    The 1992 version of the American Psychological Association's Ethical Principles of Psychologists and Code of Conduct brings some changes in requirements and new specificity to the practice of psychology. The impact of the new code on therapeutic contracts, informed consent to psychological services, advertising, financial aspects of psychological practice, and other topics related to the commerce of professional psychology are discussed. The genesis of many new thrusts in the code is reviewed from the perspective of psychological service provider. Specific recommendations for improved attention to ethical matters in professional practice are made.

  11. Xyce™ Parallel Electronic Simulator Users' Guide, Version 6.5.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Aadithya, Karthik V.; Mei, Ting

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase -- a message-passing parallel implementation -- which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The information herein is subject to change without notice. Copyright © 2002-2016 Sandia Corporation. All rights reserved.

  12. The "schola medica salernitana": the forerunner of the modern university medical schools.

    PubMed

    de Divitiis, Enrico; Cappabianca, Paolo; de Divitiis, Oreste

    2004-10-01

    The schola medica salernitana is considered the oldest medical school of modern civilization. Salerno's long medical tradition began during the Greco-Roman period in a Greek colony named Elea, where Parmenides decided to found a medical school. The fame of the school became more and more important during the 10th century, and it was best known in the 11th century. In the middle of 12th century, the school was at its apogee, and Salerno provided a notable contribution to the formulation of a medical curriculum for medieval universities. The most famous work of the Salernitan School was the Regimen Sanitatis Saleritanum, a Latin poem of rational, dietetic, and hygienic precepts, many of them still valid today. The school also produced a physician's reference book, with advice on how to treat a patient, a sort of code of conduct to help the physician to respect the patient and his or her relatives. The first science-based surgery appeared on the scene of the discredited medieval practice in Salerno, thanks to Roger of Salerno and his fellows. He wrote a book on surgery, called Rogerina or Post Mundi Fabricam, in which surgery from head to toe is described, with surprising originality. The important contribution to the School of Salerno made by women as female practitioners is outlined, and among them, Trotula de Ruggiero was the most renowned. The period when the School of Salerno, universally recognized as the forerunner of the modern universities, became a government academy was when Frederick II reigned over the Kingdom of the Two Sicilies, as Emperor of the Holy Roman Empire.

  13. Pastoral care in hospitals: a literature review.

    PubMed

    Proserpio, Tullio; Piccinelli, Claudia; Clerici, Carlo Alfredo

    2011-01-01

    This literature review investigates the potential contribution of the pastoral care provided in hospitals by hospital chaplains, as part of an integrated view of patient care, particularly in institutions dealing with severe disease. A search was conducted in the Medline database covering the last 10 years. Ninety-eight articles were considered concerning the modern hospital chaplains' relationships and the principal procedures and practices associated with their roles, i.e., their relations with the scientific world, with other religious figures in the community, with other faiths and religious confessions, with other public health professionals and operators, with colleagues in professional associations and training activities, and with the hospital organization as a whole, as well as their patient assessment activities and the spiritual-religious support they provide, also for the patients' families. Improvements are needed on several fronts to professionalize the pastoral care provided in hospitals and modernize the figure of the hospital chaplain. These improvements include better relations between modern chaplains and the hospital organization and scientific world; more focus on a scientific approach to their activities and on evaluating the efficacy of pastoral care activities; greater clarity in the definition of the goals, methods and procedures; the design of protocols and a stance on important ethical issues; respect for the various faiths, different cultures and both religious and nonreligious or secularized customs; greater involvement in the multidisciplinary patient care teams, of which the hospital chaplains are an integral part; stronger integration with public health operators and cooperation with the psychosocial professions; specific training on pastoral care and professional certification of chaplains; and the development of shared ethical codes for the profession.

  14. Increasing Trend of Fatal Falls in Older Adults in the United States, 1992 to 2005: Coding Practice or Reporting Quality?

    PubMed

    Kharrazi, Rebekah J; Nash, Denis; Mielenz, Thelma J

    2015-09-01

    To investigate whether changes in death certificate coding and reporting practices explain part or all of the recent increase in the rate of fatal falls in adults aged 65 and older in the United States. Trends in coding and reporting practices of fatal falls were evaluated under mortality coding schemes for International Classification of Diseases (ICD), Ninth Revision (1992-1998) and Tenth Revision (1999-2005). United States, 1992 to 2005. Individuals aged 65 and older with falls listed as the underlying cause of death (UCD) on their death certificates. The primary outcome was annual fatal falls rates per 100,000 U.S. residents aged 65 and older. Coding practice was assessed through analysis of trends in rates of specific UCD fall ICD e-codes over time. Reporting quality was assessed by examining changes in the location on the death certificate where fall e-codes were reported, in particular, the percentage of fall e-codes recorded in the proper location on the death certificate. Fatal falls rates increased over both time periods: 1992 to 1998 and 1999 to 2005. A single falls e-code was responsible for the increasing trend of fatal falls overall from 1992 to 1998 (E888, other and unspecified fall) and from 1999 to 2005 (W18, other falls on the same level), whereas trends for other falls e-codes remained stable. Reporting quality improved steadily throughout the study period. Better reporting quality, not coding practices, contributed to the increasing rate of fatal falls in older adults in the United States from 1992 to 2005. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.

  15. A code of ethics for nurse educators: revised.

    PubMed

    Rosenkoetter, Marlene M; Milstead, Jeri A

    2010-01-01

    Nurse educators have the responsibility of assisting students and their colleagues with understanding and practicing ethical conduct. There is an inherent responsibility to keep codes current and relevant for existing nursing practice. The code presented here is a revision of the Code of ethics for nurse educators originally published in 1983 and includes changes that are intended to provide for that relevancy.

  16. C++ Coding Standards for the AMP Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas M; Clarno, Kevin T

    2009-09-01

    This document provides an initial starting point to define the C++ coding standards used by the AMP nuclear fuel performance integrated code project and is a part of AMP's software development process. This document draws from the experiences, and documentation [1], of the developers of the Marmot Project at Los Alamos National Laboratory. Much of the software in AMP will be written in C++. The power of C++ can be abused easily, resulting in code that is difficult to understand and maintain. This document gives the practices that should be followed on the AMP project for all new code that is written. The intent is not to be onerous but to ensure that the code can be readily understood by the entire code team and serve as a basis for collectively defining a set of coding standards for use in future development efforts. At the end of the AMP development in fiscal year (FY) 2010, all developers will have experience with the benefits, restrictions, and limitations of the standards described and will collectively define a set of standards for future software development. External libraries that AMP uses do not have to meet these requirements, although we encourage external developers to follow these practices. For any code of which AMP takes ownership, the project will decide on any changes on a case-by-case basis. The practices that we are using in the AMP project have been in use in the Denovo project [2] for several years. The practices build on those given in References [3-5]; the practices given in these references should also be followed. Some of the practices given in this document can also be found in [6].

  17. A new Fortran 90 program to compute regular and irregular associated Legendre functions (new version announcement)

    NASA Astrophysics Data System (ADS)

    Schneider, Barry I.; Segura, Javier; Gil, Amparo; Guan, Xiaoxu; Bartschat, Klaus

    2018-04-01

    This is a revised and updated version of a modern Fortran 90 code to compute the regular P_l^m(x) and irregular Q_l^m(x) associated Legendre functions for all x ∈ (-1, +1) (on the cut) and |x| > 1 and integer degree (l) and order (m). The necessity to revise the code comes as a consequence of comments by Prof. James Bremer of the UC Davis Mathematics Department, who discovered that there were errors in the code for large integer degree and order for the normalized regular Legendre functions on the cut.
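
    A minimal sketch of the standard upward recurrence for the regular functions on the cut, with the Condon-Shortley phase; the published Fortran 90 code also covers the irregular Q_l^m, |x| > 1, normalization, and the large-degree regime where naive recurrence loses accuracy, none of which this toy version attempts.

```python
import math

def plm(l, m, x):
    """Regular associated Legendre P_l^m(x) for -1 <= x <= 1, 0 <= m <= l."""
    # seed: P_m^m = (-1)^m (2m-1)!! (1-x^2)^{m/2}
    pmm = 1.0
    somx2 = math.sqrt((1.0 - x) * (1.0 + x))
    fact = 1.0
    for _ in range(m):
        pmm *= -fact * somx2
        fact += 2.0
    if l == m:
        return pmm
    # P_{m+1}^m = x (2m+1) P_m^m
    pmmp1 = x * (2 * m + 1) * pmm
    if l == m + 1:
        return pmmp1
    # (l-m) P_l^m = x (2l-1) P_{l-1}^m - (l+m-1) P_{l-2}^m
    for ll in range(m + 2, l + 1):
        pmm, pmmp1 = pmmp1, (x * (2 * ll - 1) * pmmp1 - (ll + m - 1) * pmm) / (ll - m)
    return pmmp1

print(plm(2, 0, 0.5))   # (3x^2 - 1)/2 at x = 0.5 -> -0.125
```

    Spot-checking against closed forms such as P_2^1(x) = -3x sqrt(1-x^2) confirms the recurrence at small degree.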

  18. Listening to Brain Microcircuits for Interfacing With External World-Progress in Wireless Implantable Microelectronic Neuroengineering Devices: Experimental systems are described for electrical recording in the brain using multiple microelectrodes and short range implantable or wearable broadcasting units.

    PubMed

    Nurmikko, Arto V; Donoghue, John P; Hochberg, Leigh R; Patterson, William R; Song, Yoon-Kyu; Bull, Christopher W; Borton, David A; Laiwalla, Farah; Park, Sunmee; Ming, Yin; Aceros, Juan

    2010-01-01

    Acquiring neural signals at high spatial and temporal resolution directly from brain microcircuits and decoding their activity to interpret commands and/or prior planning activity, such as motion of an arm or a leg, is a prime goal of modern neurotechnology. Its practical aims include assistive devices for subjects whose normal neural information pathways are not functioning due to physical damage or disease. On the fundamental side, researchers are striving to decipher the code of multiple neural microcircuits which collectively make up nature's amazing computing machine, the brain. By implanting biocompatible neural sensor probes directly into the brain, in the form of microelectrode arrays, it is now possible to extract information from interacting populations of neural cells with spatial and temporal resolution at the single cell level. With parallel advances in the application of statistical and mathematical tools for deciphering the neural code extracted from populations of correlated neurons, significant understanding has been achieved of those brain commands that control, e.g., the motion of an arm in a primate (monkey or human subject). These developments are accelerating the work on neural prosthetics where brain-derived signals may be employed to bypass, e.g., an injured spinal cord. One key element in achieving the goals for practical and versatile neural prostheses is the development of fully implantable wireless microelectronic "brain-interfaces" within the body, a point of special emphasis of this paper.

  19. High Performance Object-Oriented Scientific Programming in Fortran 90

    NASA Technical Reports Server (NTRS)

    Norton, Charles D.; Decyk, Viktor K.; Szymanski, Boleslaw K.

    1997-01-01

    We illustrate how Fortran 90 supports object-oriented concepts by example of plasma particle computations on the IBM SP. Our experience shows that Fortran 90 and object-oriented methodology give high performance while providing a bridge from Fortran 77 legacy codes to modern programming principles. All of our object-oriented Fortran 90 codes execute more quickly than the equivalent C++ versions, yet the abstraction modelling capabilities used for scientific programming are comparably powerful.

  20. 2-Step scalar deadzone quantization for bitplane image coding.

    PubMed

    Auli-Llinas, Francesc

    2013-12-01

    Modern lossy image coding systems generate a quality progressive codestream that, truncated at increasing rates, produces an image with decreasing distortion. Quality progressivity is commonly provided by an embedded quantizer that employs uniform scalar deadzone quantization (USDQ) together with a bitplane coding strategy. This paper introduces a 2-step scalar deadzone quantization (2SDQ) scheme that achieves same coding performance as that of USDQ while reducing the coding passes and the emitted symbols of the bitplane coding engine. This serves to reduce the computational costs of the codec and/or to code high dynamic range images. The main insights behind 2SDQ are the use of two quantization step sizes that approximate wavelet coefficients with more or less precision depending on their density, and a rate-distortion optimization technique that adjusts the distortion decreases produced when coding 2SDQ indexes. The integration of 2SDQ in current codecs is straightforward. The applicability and efficiency of 2SDQ are demonstrated within the framework of JPEG2000.
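
    For orientation, a sketch of plain USDQ, the baseline the paper improves on: the index keeps sign and magnitude with a zero bin twice as wide as the others, and reconstruction places the value partway into the decoded bin. The two density-dependent step sizes and the rate-distortion adjustment that define 2SDQ are not reproduced here.

```python
import math

def usdq_quantize(c, step):
    """Deadzone index: sign(c) * floor(|c| / step); the zero bin spans (-step, step)."""
    return int(math.copysign(math.floor(abs(c) / step), c))

def usdq_dequantize(q, step, gamma=0.5):
    """Reconstruct at a fraction gamma into the bin: sign(q) * (|q| + gamma) * step."""
    return 0.0 if q == 0 else math.copysign((abs(q) + gamma) * step, q)

coeffs = [-7.3, -0.4, 0.9, 3.2, 10.1]   # illustrative wavelet coefficients
idx = [usdq_quantize(c, 1.0) for c in coeffs]
rec = [usdq_dequantize(q, 1.0) for q in idx]
print(idx)   # [-7, 0, 0, 3, 10]
print(rec)   # [-7.5, 0.0, 0.0, 3.5, 10.5]
```

    In a bitplane codec the index is not sent at once; its bits are emitted plane by plane, which is why reducing the number of coding passes (as 2SDQ does) cuts computational cost.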

  1. Performance tuning of N-body codes on modern microprocessors: I. Direct integration with a hermite scheme on x86_64 architecture

    NASA Astrophysics Data System (ADS)

    Nitadori, Keigo; Makino, Junichiro; Hut, Piet

    2006-12-01

    The main performance bottleneck of gravitational N-body codes is the force calculation between two particles. We have succeeded in speeding up this pair-wise force calculation by factors between 2 and 10, depending on the code and the processor on which the code is run. These speed-ups were obtained by writing highly fine-tuned code for x86_64 microprocessors. Any existing N-body code, running on these chips, can easily incorporate our assembly code programs. In the current paper, we present an outline of our overall approach, which we illustrate with one specific example: the use of a Hermite scheme for a direct N^2-type integration on a single 2.0 GHz Athlon 64 processor, for which we obtain an effective performance of 4.05 Gflops, for double-precision accuracy. In subsequent papers, we will discuss other variations, including combinations with N log N codes, single-precision implementations, and performance on other microprocessors.
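
    The pair-wise kernel being tuned can be sketched in plain Python (the paper's actual speed-ups come from hand-written x86_64 assembly, not shown): a Hermite scheme needs both the softened acceleration and its time derivative (jerk) for every pair, which is what makes the inner loop expensive.

```python
import numpy as np

def accel_and_jerk(pos, vel, mass, eps2=1e-4):
    """Direct O(N^2) sum of softened accelerations and jerks (Plummer softening eps2)."""
    n = len(mass)
    acc = np.zeros((n, 3))
    jerk = np.zeros((n, 3))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dr = pos[j] - pos[i]
            dv = vel[j] - vel[i]
            r2 = dr @ dr + eps2
            r3inv = r2 ** -1.5
            rv = dr @ dv
            acc[i] += mass[j] * r3inv * dr
            # d/dt of (m dr / r^3): m (dv / r^3 - 3 (dr.dv) dr / r^5)
            jerk[i] += mass[j] * (dv * r3inv - 3.0 * rv * dr * r2 ** -2.5)
    return acc, jerk
```

    With acceleration and jerk per particle, the Hermite scheme builds a third-order Taylor predictor and a corrector; vectorizing or hand-coding this double loop is where the factor 2-10 speed-up is won.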

  2. Some Practical Universal Noiseless Coding Techniques

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.

    1994-01-01

    Report discusses noiseless data-compression-coding algorithms, performance characteristics, and practical considerations in implementation of algorithms in coding modules composed of very-large-scale integrated circuits. Report also has value as tutorial document on data-compression-coding concepts. Coding techniques and concepts in question "universal" in sense that, in principle, applicable to streams of data from variety of sources. However, discussion oriented toward compression of high-rate data generated by spaceborne sensors for lower-rate transmission back to earth.
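
    A sketch of the Rice (Golomb power-of-two) code family that underlies these techniques, assuming nonnegative integer inputs: each value is split into a unary-coded quotient and a k-bit binary remainder, with the parameter k adapted to the data rate.

```python
def rice_encode(n, k):
    """Encode nonnegative integer n: unary quotient, '0' terminator, k-bit remainder."""
    q, r = n >> k, n & ((1 << k) - 1)
    rem = format(r, f"0{k}b") if k else ""
    return "1" * q + "0" + rem

def rice_decode(bits, k):
    """Invert rice_encode for a single codeword."""
    q = bits.index("0")                          # unary quotient ends at first '0'
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r

codeword = rice_encode(19, 3)
print(codeword)                   # '110011' : quotient 2 ('110'), remainder 3 ('011')
print(rice_decode(codeword, 3))   # 19
```

    The "universal" aspect comes from choosing k per block of samples so the code tracks the source statistics without an explicit probability model.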

  3. Comparison of professional values of Taiwanese and United States nursing students.

    PubMed

    Alfred, Danita; Yarbrough, Susan; Martin, Pam; Mink, Janice; Lin, Yu-Hua; Wang, Liching S

    2013-12-01

    Globalization is a part of modern life. Sharing a common set of professional nursing values is critical in this global environment. The purpose of this research was to examine the professional values of nursing students from two distinct cultural perspectives. Nurse educators in Taiwan partnered with nurse educators in the United States to compare professional values of their respective graduating nursing students. The American Nurses Association Code of Ethics served as the philosophical framework for this examination. The convenience sample comprised 94 Taiwanese students and 168 US students. Both groups reported high scores on an overall measure of values. They did differ substantially on the relative importance of individual items related to advocacy, competence, education, self-evaluation, professional advancement, and professional associations. Global implications for the collaborative practice of nurses from different cultures working together can be improved by first recognizing and then attending to these differences in value priorities.

  4. High-Performance 3D Compressive Sensing MRI Reconstruction Using Many-Core Architectures.

    PubMed

    Kim, Daehyun; Trzasko, Joshua; Smelyanskiy, Mikhail; Haider, Clifton; Dubey, Pradeep; Manduca, Armando

    2011-01-01

    Compressive sensing (CS) describes how sparse signals can be accurately reconstructed from many fewer samples than required by the Nyquist criterion. Since MRI scan duration is proportional to the number of acquired samples, CS has been gaining significant attention in MRI. However, the computationally intensive nature of CS reconstructions has precluded their use in routine clinical practice. In this work, we investigate how different throughput-oriented architectures can benefit one CS algorithm and what levels of acceleration are feasible on different modern platforms. We demonstrate that a CUDA-based code running on an NVIDIA Tesla C2050 GPU can reconstruct a 256 × 160 × 80 volume from an 8-channel acquisition in 19 seconds, which is in itself a significant improvement over the state of the art. We then show that Intel's Knights Ferry can perform the same 3D MRI reconstruction in only 12 seconds, bringing CS methods even closer to clinical viability.

  5. Supporting metabolomics with adaptable software: design architectures for the end-user.

    PubMed

    Sarpe, Vladimir; Schriemer, David C

    2017-02-01

    Large and disparate sets of LC-MS data are generated by modern metabolomics profiling initiatives, and while useful software tools are available to annotate and quantify compounds, the field requires continued software development in order to sustain methodological innovation. Advances in software development practices allow for a new paradigm in tool development for metabolomics, where increasingly the end-user can develop or redeploy utilities ranging from simple algorithms to complex workflows. Resources that provide an organized framework for development are described and illustrated with LC-MS processing packages that have leveraged their design tools. Full access to these resources depends in part on coding experience, but the emergence of workflow builders and pluggable frameworks strongly reduces the skill level required. Developers in the metabolomics community are encouraged to use these resources and design content for uptake and reuse. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Science, ethics and war: a pacifist's perspective.

    PubMed

    Kovac, Jeffrey

    2013-06-01

    This article considers the ethical aspects of the question: should a scientist engage in war-related research, particularly use-inspired or applied research directed at the development of the means for the better waging of war? Because scientists are simultaneously professionals, citizens of a particular country, and human beings, they are subject to conflicting moral and practical demands. There are three major philosophical views concerning the morality of war that are relevant to this discussion: realism, just war theory and pacifism. In addition, the requirements of professional codes of ethics and common morality contribute to an ethical analysis of the involvement of scientists and engineers in war-related research and technology. Because modern total warfare, which is facilitated by the work of scientists and engineers, results in the inevitable killing of innocents, it follows that most, if not all, war-related research should be considered at least as morally suspect and probably as morally prohibited.

  7. Energy-modeled flight in a wind field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feldman, M.A.; Cliff, E.M.

    Optimal shaping of aerospace trajectories has provided the motivation for much modern study of optimization theory and algorithms. Current industrial practice favors approaches where the continuous-time optimal control problem is transcribed to a finite-dimensional nonlinear programming problem (NLP) by a discretization process. Two such formulations are implemented in the POST and the OTIS codes. In the present paper we use a discretization that is specially adapted to the flight problem of interest. Among the unique aspects of the present discretization are: a least-squares formulation for certain kinematic constraints; the use of energy ideas to enforce Newton's Laws; and the inclusion of large-magnitude horizontal winds. In the next section we shall provide a description of the flight problem and its NLP representation. Following this we provide some details of the constraint formulation. Finally, we present an overview of the NLP problem.

  8. Active euthanasia in pre-modern society, 1500-1800: learned debates and popular practices.

    PubMed

    Stolberg, Michael

    2007-08-01

    Historians of medical ethics have found that active euthanasia, in the sense of intentionally hastening the death of terminally-ill patients, was considered unacceptable in the Christian West before the 1870s. This paper presents a range of early modern texts on the issue which reflect a learned awareness of practices designed to shorten the lives of dying patients which were widely accepted among the lay public. Depriving the dying abruptly of their head-rest or placing them flat on the cold floor may strike us as merely symbolic today, but early moderns associated such measures with very concrete and immediate effects. In this sense, the intentional hastening of death in agonising patients had an accepted place in pre-modern popular culture. These practices must, however, be put into their proper context. Death was perceived more as a transition to the after-life and contemporary notions of dying could make even outright suffocation appear as an act of compassion which merely helped the soul depart from the body at the divinely ordained hour of death. The paper concludes with a brief comparison of early modern arguments with those of today.

  9. Current issues in billing and coding in interventional pain medicine.

    PubMed

    Manchikanti, L

    2000-10-01

    Interventional pain management is a dynamic field, with changes occurring on a daily basis, not only in technology but also in regulations that have a substantial financial impact on practices. Regulations are imposed not only by the federal government and other regulatory agencies, but also by a multitude of other payors, state governments and medical boards. Documentation of medical necessity, with coding that correlates with multiple components of the patient's medical record, operative report, and billing statement, is extremely important. Numerous changes which have occurred in the practice of interventional pain management in the new millennium continue to impact the financial viability of interventional pain practices along with patient access to these services. Thus, while complying with billing and coding regulations and maintaining proper, effective, and ethical practice of pain management, it is also essential for physicians to understand financial aspects and the impact of various practice patterns. This article provides guidelines which are meant to offer practical considerations for billing and coding of interventional techniques in the management of chronic pain, based on the current state of the art and science of interventional pain management. Hence, these guidelines do not constitute inflexible treatment, coding, billing or documentation recommendations. It is expected that a provider will establish a plan of care on a case-by-case basis, taking into account an individual patient's medical condition, personal needs, and preferences, along with the physician's experience, and that billing and coding practices will be developed in a similar manner. Based on an individual patient's needs, treatment, billing and coding different from what is outlined here may be not only warranted but essential.

  10. [Discussion on logistics management of medical consumables].

    PubMed

    Deng, Sutong; Wang, Miao; Jiang, Xiali

    2011-09-01

    Management of medical consumables is an important part of modern hospital management. In modern medical behavior, drugs and medical devices act directly on the patient and are important factors affecting the quality of medical practice. With the increasing use of medical materials, and based on practical application, this article proposes a management model for medical consumables and discusses the essence of medical materials logistics management.

  11. Developing Pedagogical Expertise in Modern Language Learning and Specific Learning Difficulties through Collaborative and Open Educational Practices

    ERIC Educational Resources Information Center

    Gallardo, Matilde; Heiser, Sarah; Arias McLaughlin, Ximena

    2017-01-01

    This paper analyses teachers' engagement with collaborative and open educational practices to develop their pedagogical expertise in the field of modern language (ML) learning and specific learning difficulties (SpLD). The study analyses the findings of a staff development initiative at the Department of Languages, Open University, UK, in 2013,…

  12. Performance of Trellis Coded 256 QAM super-multicarrier modem VLSI's for SDH interface outage-free digital microwave radio

    NASA Astrophysics Data System (ADS)

    Aikawa, Satoru; Nakamura, Yasuhisa; Takanashi, Hitoshi

    1994-02-01

    This paper describes the performance of an outage-free SDH (Synchronous Digital Hierarchy) interface 256 QAM modem. An outage-free DMR (Digital Microwave Radio) is achieved by a high-coding-gain trellis coded SPORT QAM and Super Multicarrier modem. A new frame format and its associated circuits connect the outage-free modem to the SDH interface. The newly designed VLSI's are key devices for developing the modem. As overall modem performance, BER (bit error rate) characteristics and equipment signatures are presented. A coding gain of 4.7 dB (at a BER of 10(exp -4)) is obtained using SPORT 256 QAM and Viterbi decoding. This coding gain is realized by trellis coding as well as by increasing the transmission rate. The roll-off factor is decreased to maintain the same frequency occupation and modulation level as an ordinary SDH 256 QAM modem.

  13. Valuing Science: A Turkish-American Comparison

    ERIC Educational Resources Information Center

    Titrek, Osman; Cobern, William W.

    2011-01-01

    The process of modernization began in Turkey under the reform government of Mustafa Kemal Ataturk (1881-1938). Turkey officially became a secular nation seeking to develop a modern economy with modern science and technology and political democracy. Turkey also has long been, and remains, a deeply religious society. Specifically, the practice of…

  14. Code of practice for food handler activities.

    PubMed

    Smith, T A; Kanas, R P; McCoubrey, I A; Belton, M E

    2005-08-01

    The food industry regulates various aspects of food handler activities, according to legislation and customer expectations. The purpose of this paper is to provide a code of practice which delineates a set of working standards for food handler hygiene, handwashing, use of protective equipment, wearing of jewellery and body piercing. The code was developed by a working group of occupational physicians with expertise in both food manufacturing and retail, using a risk assessment approach. Views were also obtained from other occupational physicians working within the food industry and the relevant regulatory bodies. The final version of the code (available in full as Supplementary data in Occupational Medicine Online) therefore represents a broad consensus of opinion. The code of practice represents a set of minimum standards for food handler suitability and activities, based on a practical assessment of risk, for application in food businesses. It aims to provide useful working advice to food businesses of all sizes.

  15. QR codes: next level of social media.

    PubMed

    Gottesman, Wesley; Baum, Neil

    2013-01-01

    The QR code, which is short for quick response code, system was invented in Japan for the auto industry. Its purpose was to track vehicles during manufacture; it was designed to allow high-speed component scanning. Now the scanning can be easily accomplished via cell phone, making the technology useful and within reach of your patients. There are numerous applications for QR codes in the contemporary medical practice. This article describes QR codes and how they might be applied for marketing and practice management.

  16. History of teaching anatomy in India: from ancient to modern times.

    PubMed

    Jacob, Tony George

    2013-01-01

    Safe clinical practice is based on a sound knowledge of the structure and function of the human body. Thus, knowledge of anatomy has been an essential tool in the practice of healthcare throughout the ages. The history of anatomy in India traces from the Paleolithic Age to the Indus Valley Civilization, the Vedic Times, the Islamic Dynasties, the modern Colonial Period, and finally to Independent India. The course of the study of anatomy, despite accompanying controversies and periods of latencies, has been fascinating. This review takes the reader through various periods of Indian medicine and the role of anatomy in the field of medical practice. It also provides a peek into the modern system of pedagogy in anatomical sciences in India. Copyright © 2013 American Association of Anatomists.

  17. Astronomy education and the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, Robert J.

    2016-01-01

    The Astrophysics Source Code Library (ASCL) is an online registry of source codes used in refereed astrophysics research. It currently lists nearly 1,200 codes and covers all aspects of computational astrophysics. How can this resource be of use to educators and to the graduate students they mentor? The ASCL serves as a discovery tool for codes that can be used for one's own research. Graduate students can also investigate existing codes to see how common astronomical problems are approached numerically in practice, and use these codes as benchmarks for their own solutions to these problems. Further, they can deepen their knowledge of software practices and techniques through examination of others' codes.

  18. Wolf-Rayet stars, black holes and the first detected gravitational wave source

    NASA Astrophysics Data System (ADS)

    Bogomazov, A. I.; Cherepashchuk, A. M.; Lipunov, V. M.; Tutukov, A. V.

    2018-01-01

    The recently discovered burst of gravitational waves GW150914 provides a good new chance to verify the current view on the evolution of close binary stars. Modern population synthesis codes help to study this evolution from two main-sequence stars up to the formation of two final remnant degenerate dwarfs, neutron stars or black holes (Masevich and Tutukov, 1988). To study the evolution of the GW150914 predecessor we use the "Scenario Machine" code presented by Lipunov et al. (1996). The scenario modeling conducted in this study allowed us to describe the evolution of systems for which the final stage is a massive BH+BH merger. We find that the initial mass of the primary component can be 100–140 M⊙ and the initial separation of the components can be 50–350 R⊙. Our calculations show the plausibility of modern evolutionary scenarios for binary stars and of the population synthesis modeling based on them.

  19. Master of Puppets: Cooperative Multitasking for In Situ Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-01-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
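The cooperative-multitasking idea can be miniaturized in a few lines. The toy below is not Henson itself (which works with position-independent executables on supercomputers); it uses Python generators to show the control-flow pattern: a "simulation" yields each time step to an "analysis" routine, so data is consumed while still in memory rather than written to disk.

```python
# Toy sketch of coroutine-style in situ processing: the simulation hands
# control (and its current state) to the analysis side at every step.
def simulation(steps):
    state = 0.0
    for t in range(steps):
        state += 1.5          # stand-in for advancing the simulation
        yield t, state        # suspend here; analysis runs next

def analyze(run):
    totals = []
    for t, state in run:      # each iteration resumes the simulation
        totals.append(state)  # in situ analysis: no file I/O round-trip
    return totals

print(analyze(simulation(4)))  # [1.5, 3.0, 4.5, 6.0]
```

The same interleaving, done with real coroutines and shared memory, is what lets simulation and analysis executables cooperate without serializing every time step.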

  20. Henson v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-04-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The developers present a novel design for running multiple codes in situ: using coroutines and position-independent executables they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of their design, and illustrate its versatility by tackling analysis tasks with different computational requirements. Their design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.

  1. Characterizing Mathematics Classroom Practice: Impact of Observation and Coding Choices

    ERIC Educational Resources Information Center

    Ing, Marsha; Webb, Noreen M.

    2012-01-01

    Large-scale observational measures of classroom practice increasingly focus on opportunities for student participation as an indicator of instructional quality. Each observational measure necessitates making design and coding choices on how to best measure student participation. This study investigated variations of coding approaches that may be…

  2. Relationships between antenatal and postnatal care and post-partum modern contraceptive use: evidence from population surveys in Kenya and Zambia.

    PubMed

    Do, Mai; Hotchkiss, David

    2013-01-04

    It is often assumed, with little supportive, empirical evidence, that women who use maternal health care are more likely than those who do not to use modern contraceptives. This study aims to add to the existing literature on associations between the use of antenatal (ANC) and post-natal care (PNC) and post-partum modern contraceptives. Data come from the most recent Demographic and Health Surveys (DHS) in Kenya (2008-09) and Zambia (2007). Study samples include women who had a live birth within five years before the survey (3,667 in Kenya and 3,587 in Zambia). Multivariate proportional hazard models were used to examine the associations between the intensity of ANC and PNC service use and a woman's adoption of modern contraceptives after a recent live birth. Tests of exogeneity confirmed that the intensity of ANC and PNC service use and post-partum modern contraceptive practice were not influenced by common unobserved factors. Cox proportional hazard models showed significant associations between the service intensity of ANC and PNC and post-partum modern contraceptive use in both countries. This relationship is largely due to ANC services; no significant associations were observed between PNC service intensity and post-partum FP practice. While the lack of associations between PNC and post-partum FP use may be due to the limited measure of PNC service intensity, the study highlights a window of opportunity to promote the use of modern contraceptives after childbirth through ANC service delivery. Depending on the availability of data, further research should take into account community- and facility-level factors that may influence modern contraceptive use in examining associations between ANC and PNC use and post-partum FP practice.

  3. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    NASA Astrophysics Data System (ADS)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.; Porter, D.; O'Neill, B. J.; Nolting, C.; Edmon, P.; Donnert, J. M. F.; Jones, T. W.

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  4. The Los Alamos Supernova Light Curve Project: Current Projects and Future Directions

    NASA Astrophysics Data System (ADS)

    Wiggins, Brandon Kerry; Los Alamos Supernovae Research Group

    2015-01-01

    The Los Alamos Supernova Light Curve Project models supernovae in the ancient and modern universe to determine the luminosities and observability of certain supernova events and to explore the physics of supernovae in the local universe. The project utilizes RAGE, Los Alamos' radiation hydrodynamics code, to evolve the explosions of progenitors prepared in well-established stellar evolution codes. RAGE allows us to capture events such as shock breakout and collisions of ejecta with shells of material, which cannot be modeled well in other codes. RAGE's dumps are then ported to LANL's SPECTRUM code, which uses LANL's OPLIB opacities database to calculate light curves and spectra. In this paper, we summarize our recent work in modeling supernovae.

  5. Spousal communication on family planning and perceived social support for contraceptive practices in a sample of Malaysian women

    PubMed Central

    Najafi-Sharjabad, Fatemeh; Rahman, Hejar Abdul; Hanafiah, Muhamad; Syed Yahya, Sharifah Zainiyah

    2014-01-01

    Background: In Malaysia, contraceptive prevalence rate (CPR) during past three decades has been steady, with only 34% of women practicing modern contraception. The aim of this study was to determine the factors associated with modern contraceptive practices with a focus on spousal communication and perceived social support among married women working in the university. Materials and Methods: A cross-sectional study was carried out using self-administered structured questionnaire. The association between variables were assessed using Chi-square test, independent sample t-test, and logistic regression. Results: Overall, 36.8% of women used modern contraceptive methods. Significant association was found between contraceptive practice and ethnicity (P = 0.003), number of pregnancies (P < 0.001), having child (P = 0.003), number of children (P < 0.001), positive history of mistimed pregnancy (P = 0.006), and experience of unwanted pregnancy (P = 0.003). The final model showed Malay women were 92% less likely to use modern contraception as compared to non-Malay women. Women who discussed about family planning with their spouses were more likely to practice modern contraception than the women who did not [odds ratio (OR): 2.2, Confidence Interval (CI): 1.3–3.7]. Those women with moderate (OR: 4.9, CI: 1.6–10.8) and strong (OR: 14, CI: 4.5–26.4) perception of social support for contraceptive usage were more likely to use modern contraception than the women with poor perception of social support. Conclusion: Spousal communication regarding family planning would be an effective way to motivate men for supporting and using contraceptives. Family planning education initiatives should target both men and women, particularly high-risk cases, for promoting healthy timing and spacing of pregnancies. Ethnic disparities need to be considered in planning reproductive health programs. PMID:25949248

  6. A browser-based tool for conversion between Fortran NAMELIST and XML/HTML

    NASA Astrophysics Data System (ADS)

    Naito, O.

    A browser-based tool for conversion between Fortran NAMELIST and XML/HTML is presented. It runs on an HTML5 compliant browser and generates reusable XML files to aid interoperability. It also provides a graphical interface for editing and annotating variables in NAMELIST, hence serves as a primitive code documentation environment. Although the tool is not comprehensive, it could be viewed as a test bed for integrating legacy codes into modern systems.
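The conversion idea the abstract describes can be sketched compactly. The snippet below is a hypothetical illustration, not the browser tool itself: it handles a single NAMELIST group with scalar values only, using the standard library; the group name and variables are made-up examples.

```python
import re
import xml.etree.ElementTree as ET

# Minimal NAMELIST -> XML sketch: find one "&group ... /" block, then emit
# one XML element per "name = value" pair inside it.
namelist = """
&params
  nx = 128
  dt = 0.5
  label = 'run1'
/
"""

group = re.search(r"&(\w+)(.*?)/", namelist, re.S)
root = ET.Element(group.group(1))
for name, value in re.findall(r"(\w+)\s*=\s*([^\n]+)", group.group(2)):
    ET.SubElement(root, name).text = value.strip().strip("'")

xml = ET.tostring(root, encoding="unicode")
print(xml)  # <params><nx>128</nx><dt>0.5</dt><label>run1</label></params>
```

A full tool must also cope with arrays, repeat counts, comments, and multiple groups, which is where a real parser earns its keep; the reusable XML output is what enables the interoperability the article aims for.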

  7. National Combustion Code, a Multidisciplinary Combustor Design System, Will Be Transferred to the Commercial Sector

    NASA Technical Reports Server (NTRS)

    Steele, Gynelle C.

    1999-01-01

    The NASA Lewis Research Center and Flow Parametrics will enter into an agreement to commercialize the National Combustion Code (NCC). This multidisciplinary combustor design system utilizes computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. This integrated system can facilitate and enhance various phases of the design and analysis process.

  8. Evaluation of Evidence for Altered Behavior and Auditory Deficits in Fishes Due to Human-Generated Noise Sources

    DTIC Science & Technology

    2006-04-01

    prepared by the Research and Animal Care Branch, Code 2351, of the Biosciences Division, Code 235, SSC San Diego. This is a work of the United... and Animal Care Branch, under authority of M. Rothe, Head, Biosciences Division. EXECUTIVE SUMMARY: In this study, we have evaluated peer... sharks, skates, and rays) and teleost fishes (modern bony fishes) and provide recommendations for research to address remaining issues. Clear responses

  9. Profile and birthing practices of Maranao traditional birth attendants.

    PubMed

    Maghuyop-Butalid, Roselyn; Mayo, Norhanifa A; Polangi, Hania T

    2015-01-01

    This study determined the profile and birthing practices, in both modern and traditional ways, among Maranao traditional birth attendants (TBAs) in Lanao del Norte, Philippines. It employed a descriptive research design. The respondents were 50 Maranao TBAs selected through the snowball sampling technique. A questionnaire was developed by the researchers to identify the respondents' modern birthing practices utilizing the Essential Intrapartum and Newborn Care (EINC) Protocol. To determine their profile and traditional birthing practices, items from a previous study and the respondents' personal claims were adapted. This study shows that Maranao TBAs have low compliance with the EINC Protocol and often practice traditional birthing interventions, thus increasing the risk of complications to both mother and newborn.

  10. A need for a code of ethics in science communication?

    NASA Astrophysics Data System (ADS)

    Benestad, R. E.

    2009-09-01

    The modern Western civilization and high standard of living are to a large extent the 'fruits' of scientific endeavor over generations. Some examples include the longer life expectancy due to progress in the medical sciences, and changes in infrastructure associated with the utilization of electromagnetism. Modern meteorology is not possible without state-of-the-art digital computers, satellites, remote sensing, and communications. Science is also of relevance for policy making, e.g. the present hot topic of climate change. Climate scientists have recently become much exposed to media focus and mass communications, a task for which many are not trained. Furthermore, science, communication, and politics have different objectives, and do not necessarily mix. Scientists have an obligation to provide unbiased information, and a code of ethics is needed to give guidance on acceptable and unacceptable conduct. Some examples of questionable conduct in Norway include using the title 'Ph.D' to imply scientific authority when the person never obtained such an academic degree, or writing biased and one-sided articles in a Norwegian encyclopedia that do not reflect the scientific consensus. It is proposed here that a set of guidelines (for scientists and journalists) and a code of conduct could provide recommendations on how to act in the media, similar to a code of conduct for carrying out research, to which everyone could agree, even when disagreeing on specific scientific questions.

  11. A New Image Encryption Technique Combining Hill Cipher Method, Morse Code and Least Significant Bit Algorithm

    NASA Astrophysics Data System (ADS)

    Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi

    2018-01-01

    Cybercrime is one of the most serious threats. One effort to reduce cybercrime is to find new techniques for securing data, such as combinations of cryptography, steganography and watermarking. Cryptography and steganography are growing data-security sciences, and combining them is one way to improve data integrity. New techniques are created by combining several algorithms; one such combination incorporates the Hill cipher method and Morse code. Morse code is one of the communication codes used in the Scouting field; it consists of dots and lines (dashes). Combining these modern and classical concepts helps maintain data integrity. The combination of these three methods is expected to yield new algorithms that improve the security of data, especially images.
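As a concrete illustration of one ingredient, here is a minimal Hill-cipher sketch; the Morse-code and least-significant-bit stages the abstract combines with it are omitted. The 2×2 key matrix is a textbook choice, not taken from the paper, and must be invertible mod 26 for decryption to work.

```python
import numpy as np

# Hill cipher over A-Z: split the text into 2-letter blocks and multiply each
# block by the key matrix mod 26. Decryption applies the key's inverse mod 26.
KEY = np.array([[3, 3], [2, 5]])          # det = 9, invertible mod 26
KEY_INV = np.array([[15, 17], [20, 9]])   # modular inverse of KEY mod 26

def hill(text, key):
    nums = [ord(c) - ord('A') for c in text]
    if len(nums) % 2:                      # pad to a whole number of blocks
        nums.append(ord('X') - ord('A'))
    out = []
    for i in range(0, len(nums), 2):
        block = np.array(nums[i:i + 2])
        out.extend((key @ block) % 26)     # linear transform mod 26
    return ''.join(chr(int(v) + ord('A')) for v in out)

ct = hill("HELP", KEY)
pt = hill(ct, KEY_INV)
print(ct, pt)  # HIAT HELP
```

In the scheme the abstract outlines, output like this would then be re-encoded (e.g. via Morse) and hidden in an image's least significant bits, so each method covers a different weakness of the others.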

  12. DRG coding practice: a nationwide hospital survey in Thailand.

    PubMed

    Pongpirul, Krit; Walker, Damian G; Rahman, Hafizur; Robinson, Courtland

    2011-10-31

    Diagnosis Related Group (DRG) payment is preferred by healthcare reform in various countries, but its implementation in resource-limited countries has not been fully explored. This study aimed to (1) compare the characteristics of hospitals in Thailand that were audited with those that were not and (2) develop a simplified scale to measure hospital coding practice. A questionnaire survey was conducted of 920 hospitals in the Summary and Coding Audit Database (SCAD hospitals, all of which were audited in 2008 because of suspicious reports of possible DRG miscoding); the survey also included 390 non-SCAD hospitals. The questionnaire asked about the general demographics of the hospitals and the hospital coding structure and process, and also included a set of 63 opinion-oriented items on current hospital coding practice. Descriptive statistics and exploratory factor analysis (EFA) were used for data analysis. SCAD and non-SCAD hospitals differed in many aspects, especially the number of medical statisticians, the experience of medical statisticians and physicians, and the number of certified coders. Factor analysis revealed a simplified 3-factor, 20-item model to assess hospital coding practice and classify hospital intention. Hospital providers should not be assumed capable of producing high-quality DRG codes, especially in resource-limited settings.

  13. Rethinking critical reflection on care: late modern uncertainty and the implications for care ethics.

    PubMed

    Vosman, Frans; Niemeijer, Alistair

    2017-12-01

    Care ethics as initiated by Gilligan, Held, Tronto and others (in the nineteen eighties and nineties) has from its onset been critical towards ethical concepts established in modernity, like 'autonomy', alternatively proposing to think from within relationships and to pay attention to power. In this article the question is raised whether renewal in this same critical vein is necessary and possible as late modern circumstances require rethinking the care ethical inquiry. Two late modern realities that invite to rethink care ethics are complexity and precariousness. Late modern organizations, like the general hospital, codetermined by various (control-, information-, safety-, accountability-) systems are characterized by complexity and the need for complexity reduction, both permeating care practices. By means of a heuristic use of the concept of precariousness, taken as the installment of uncertainty, it is shown that relations and power in late modern care organizations have changed, precluding the use of a straightforward domination idea of power. In the final section a proposition is made how to rethink the care ethical inquiry in order to take late modern circumstances into account: inquiry should always be related to the concerns of people and practitioners from within care practices.

  14. An international survey of building energy codes and their implementation

    DOE PAGES

    Evans, Meredydd; Roshchanka, Volha; Graham, Peter

    2017-08-01

    Buildings are key to low-carbon development everywhere, and many countries have introduced building energy codes to improve energy efficiency in buildings. Yet, building energy codes can only deliver results when the codes are implemented. For this reason, studies of building energy codes need to consider implementation of building energy codes in a consistent and comprehensive way. This research identifies elements and practices in implementing building energy codes, covering codes in 22 countries that account for 70% of global energy use in buildings. These elements and practices include: comprehensive coverage of buildings by type, age, size, and geographic location; an implementation framework that involves a certified agency to inspect construction at critical stages; and building materials that are independently tested, rated, and labeled. Training and supporting tools are another element of successful code implementation. Some countries have also introduced compliance evaluation studies, which suggested that tightening energy requirements would only be meaningful when also addressing gaps in implementation (Pitt & Sherry, 2014; U.S. DOE, 2016b). This article provides examples of practices that countries have adopted to assist with implementation of building energy codes.

  15. An international survey of building energy codes and their implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Meredydd; Roshchanka, Volha; Graham, Peter

    Buildings are key to low-carbon development everywhere, and many countries have introduced building energy codes to improve energy efficiency in buildings. Yet, building energy codes can only deliver results when the codes are implemented. For this reason, studies of building energy codes need to consider implementation of building energy codes in a consistent and comprehensive way. This research identifies elements and practices in implementing building energy codes, covering codes in 22 countries that account for 70% of global energy use in buildings. These elements and practices include: comprehensive coverage of buildings by type, age, size, and geographic location; an implementation framework that involves a certified agency to inspect construction at critical stages; and building materials that are independently tested, rated, and labeled. Training and supporting tools are another element of successful code implementation. Some countries have also introduced compliance evaluation studies, which suggested that tightening energy requirements would only be meaningful when also addressing gaps in implementation (Pitt & Sherry, 2014; U.S. DOE, 2016b). This article provides examples of practices that countries have adopted to assist with implementation of building energy codes.

  16. WDEC: A Code for Modeling White Dwarf Structure and Pulsations

    NASA Astrophysics Data System (ADS)

    Bischoff-Kim, Agnès; Montgomery, Michael H.

    2018-05-01

    The White Dwarf Evolution Code (WDEC), written in Fortran, makes models of white dwarf stars. It is fast, versatile, and includes the latest physics. The code evolves hot (∼100,000 K) input models down to a chosen effective temperature by relaxing the models to be solutions of the equations of stellar structure. The code can also be used to obtain g-mode oscillation modes for the models. WDEC has a long history going back to the late 1960s. Over the years, it has been updated and re-packaged for modern computer architectures and has specifically been used in computationally intensive asteroseismic fitting. Generations of white dwarf astronomers and dozens of publications have made use of the WDEC, although the last true instrument paper is the original one, published in 1975. This paper discusses the history of the code, necessary to understand why it works the way it does, details the physics and features in the code today, and points the reader to where to find the code and a user guide.
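
    As a flavor of the g-mode asteroseismology such models support, the standard asymptotic relation for the g-mode period spacing can be evaluated numerically. The Brunt-Väisälä profile and stellar radius below are invented toy values, not WDEC output:

```python
import numpy as np

# Toy evaluation of the asymptotic g-mode period spacing,
#   DeltaPi_l = 2*pi**2 / ( sqrt(l*(l+1)) * integral(N/r dr) ),
# a standard asteroseismic relation. The Brunt-Vaisala profile N(r)
# and the radius are illustrative placeholders, not a real model.
R_star = 8.5e6                                  # stellar radius [m], illustrative
r = np.linspace(0.05 * R_star, R_star, 2000)    # radial grid over the g-mode cavity
N = 0.1 * np.exp(-((r / R_star - 0.5) / 0.2) ** 2)  # toy N(r) [rad/s]

ell = 1
f = N / r
integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(r))  # trapezoidal rule
delta_pi = 2 * np.pi**2 / (np.sqrt(ell * (ell + 1)) * integral)
print(f"asymptotic l={ell} period spacing: {delta_pi:.1f} s")
```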

  17. The environmental impact of dairy production: 1944 compared with 2007.

    PubMed

    Capper, J L; Cady, R A; Bauman, D E

    2009-06-01

    A common perception is that pasture-based, low-input dairy systems characteristic of the 1940s were more conducive to environmental stewardship than modern milk production systems. The objective of this study was to compare the environmental impact of modern (2007) US dairy production with historical production practices as exemplified by the US dairy system in 1944. A deterministic model based on the metabolism and nutrient requirements of the dairy herd was used to estimate resource inputs and waste outputs per billion kg of milk. Both the modern and historical production systems were modeled using characteristic management practices, herd population dynamics, and production data from US dairy farms. Modern dairy practices require considerably fewer resources than dairying in 1944 with 21% of animals, 23% of feedstuffs, 35% of the water, and only 10% of the land required to produce the same 1 billion kg of milk. Waste outputs were similarly reduced, with modern dairy systems producing 24% of the manure, 43% of CH4, and 56% of N2O per billion kg of milk compared with equivalent milk from historical dairying. The carbon footprint per billion kilograms of milk produced in 2007 was 37% of equivalent milk production in 1944. To fulfill the increasing requirements of the US population for dairy products, it is essential to adopt management practices and technologies that improve productive efficiency, allowing milk production to be increased while reducing resource use and mitigating environmental impact.

  18. Evaluation of Proteus as a Tool for the Rapid Development of Models of Hydrologic Systems

    NASA Astrophysics Data System (ADS)

    Weigand, T. M.; Farthing, M. W.; Kees, C. E.; Miller, C. T.

    2013-12-01

    Models of modern hydrologic systems can be complex and involve a variety of operators with varying character. The goal is to implement approximations of such models that are both efficient for the developer and computationally efficient, a set of naturally competing objectives. Proteus is a Python-based toolbox that supports prototyping of model formulations as well as a wide variety of modern numerical methods and parallel computing. We used Proteus to develop numerical approximations for three models: Richards' equation, a brine flow model derived using the Thermodynamically Constrained Averaging Theory (TCAT), and a multiphase TCAT-based tumor growth model. For Richards' equation, we investigated discontinuous Galerkin solutions with higher-order time integration based on the backward difference formulas. The TCAT brine flow model was implemented using Proteus, and a variety of numerical methods were compared to hand-coded solutions. Finally, an existing tumor growth model was implemented in Proteus to introduce more advanced numerics and allow the code to be run in parallel. From these three example models, Proteus was found to be an attractive open-source option for rapidly developing high-quality code for solving existing and evolving computational science models.
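
    A minimal sketch of backward-difference-formula time stepping of the kind mentioned above, applied to the stiff scalar test problem y' = λy rather than to Richards' equation (the implicit solve is trivial here precisely because the problem is linear and scalar):

```python
import numpy as np

# Sketch of BDF2 (a second-order backward difference formula) on the
# stiff scalar test problem y' = lam*y, y(0) = 1. This is a toy model
# problem, not the Proteus discretization of Richards' equation.
lam, h, steps = -50.0, 0.01, 200

y = np.empty(steps + 1)
y[0] = 1.0
y[1] = y[0] / (1.0 - h * lam)   # bootstrap the one-step history with backward Euler

for n in range(steps - 1):
    # BDF2: 3*y[n+2] - 4*y[n+1] + y[n] = 2*h*f(y[n+2]), solved for y[n+2]
    y[n + 2] = (4.0 * y[n + 1] - y[n]) / (3.0 - 2.0 * h * lam)

exact = np.exp(lam * h * steps)
err = abs(y[-1] - exact)
print(err)  # tiny: the A-stable BDF2 damps the stiff decay without oscillation
```

    For a nonlinear PDE such as Richards' equation, the same formula would lead to a nonlinear system in the unknowns at the new time level, solved by Newton iteration at each step.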

  19. Practice Location Characteristics of Non-Traditional Dental Practices.

    PubMed

    Solomon, Eric S; Jones, Daniel L

    2016-04-01

    Current and future dental school graduates are increasingly likely to choose a non-traditional dental practice (a group practice managed by a dental service organization, or a corporate practice with employed dentists) for their initial practice experience. In addition, the growth of non-traditional practices, which are located primarily in major urban areas, could accelerate the movement of dentists to those areas and contribute to geographic disparities in the distribution of dental services. To help the profession understand the implications of these developments, the aim of this study was to compare the location characteristics of non-traditional and traditional dental practices. After identifying non-traditional practices across the United States, the authors located those practices and traditional dental practices geographically by zip code. Non-traditional dental practices were found to represent about 3.1% of all dental practices, but they had a greater impact on the marketplace, with almost twice the average number of staff and annual revenue. Virtually all non-traditional dental practices were located in zip codes that also had a traditional dental practice. Zip codes with non-traditional practices differed significantly from zip codes with only a traditional dental practice: the populations in areas with non-traditional practices had higher income and education levels and were slightly younger and proportionally more Hispanic; those practices also had a much higher likelihood of being located in a major metropolitan area. Dental educators and leaders need to understand the impact of these trends in the practice environment in order both to prepare graduates for practice and to make decisions about planning for the workforce of the future.
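
    The co-location finding above amounts to a set intersection by zip code; a toy sketch with hypothetical zip codes, not the study's data:

```python
# Sketch of the co-location check: which non-traditional practice zip
# codes also contain a traditional practice? All zip codes below are
# hypothetical examples.
traditional_zips = {"75201", "75204", "75219", "73301", "10001"}
non_traditional_zips = {"75201", "75219", "10001"}

co_located = non_traditional_zips & traditional_zips
share = len(co_located) / len(non_traditional_zips)
print(f"{share:.0%} of non-traditional zips also host a traditional practice")
```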

  20. An Investigation of the Applicability of Modern Management Processes by Industrial Managers in Turkey.

    ERIC Educational Resources Information Center

    Lauter, Geza Peter

    This study noted American concepts of modern management which Turkish industrial managers tend to find difficult; identified cultural, economic, and other factors that impede application of modern management processes; and compared the practices of American overseas managers with those of Turkish managers of domestic firms. Managerial performance…

  1. Why different countries manage death differently: a comparative analysis of modern urban societies.

    PubMed

    Walter, Tony

    2012-03-01

    The sociology of death, dying and bereavement tends to take as its implicit frame either the nation state or a homogenous modernity. Between-nation differences in the management of death and dying are either ignored or untheorized. This article seeks to identify the factors that can explain both similarities and differences in the management of death between different modern western nations. Structural factors which affect all modern nations include urbanization and the division of labour leading to the dominance of professionals, migration, rationality and bureaucracy, information technology and the risk society. How these sociologically familiar structural features are responded to, however, depends on national histories, institutions and cultures. Historically, key transitional periods to modernity, different in different nations, necessitated particular institutional responses in the management of dying and dead bodies. Culturally, key factors include individualism versus collectivism, religion, secularization, boundary regulation, and expressivism. Global flows of death practices depend significantly on subjugated nations' perceptions of colonialism, neo-colonialism and modernity, which can lead to a dominant power's death practices being either imitated or rejected. © London School of Economics and Political Science 2012.

  2. FY 2002 strategic plan, Georgia Department of Transportation

    DOT National Transportation Integrated Search

    2001-08-01

    The Georgia DOT is authorized by Title 32 of the Georgia Code to organize, administer and operate an efficient, modern system of public roads, highways and other modes of transportation including public transit, rail, aviation, ports and bicycle and ...

  3. Code-Mixing as a Bilingual Instructional Strategy

    ERIC Educational Resources Information Center

    Jiang, Yih-Lin Belinda; García, Georgia Earnest; Willis, Arlette Ingram

    2014-01-01

    This study investigated code-mixing practices, specifically the use of L2 (English) in an L1 (Chinese) class in a U.S. bilingual program. Our findings indicate that the code-mixing practices made and prompted by the teacher served five pedagogical functions: (a) to enhance students' bilingualism and bilingual learning, (b) to review and…

  4. EASY-II Renaissance: n, p, d, α, γ-induced Inventory Code System

    NASA Astrophysics Data System (ADS)

    Sublet, J.-Ch.; Eastwood, J. W.; Morgan, J. G.

    2014-04-01

    The European Activation SYstem has been re-engineered and re-written in modern programming languages so as to answer today's and tomorrow's needs in terms of activation, transmutation, depletion, decay and processing of radioactive materials. The new FISPACT-II inventory code development project has allowed us to embed many more features in terms of energy range: up to GeV; incident particles: alpha, gamma, proton, deuteron and neutron; and neutron physics: self-shielding effects, temperature dependence and covariance, so as to cover all anticipated application needs: nuclear fission and fusion, accelerator physics, isotope production, stockpile and fuel cycle stewardship, materials characterization and life, and storage cycle management. In parallel, the maturity of modern, truly general purpose libraries encompassing thousands of target isotopes such as TENDL-2012, the evolution of the ENDF-6 format and the capabilities of the latest generation of processing codes PREPRO, NJOY and CALENDF have allowed the activation code to be fed with more robust, complete and appropriate data: cross sections with covariance, probability tables in the resonance ranges, kerma, dpa, gas and radionuclide production and 24 decay types. All such data for the five most important incident particles (n, p, d, α, γ) are placed in evaluated data files up to an incident energy of 200 MeV. The resulting code system, EASY-II, is designed as a functional replacement for the previous European Activation System, EASY-2010. It includes many new features and enhancements, and already benefits from the feedback from extensive validation and verification activities performed with its predecessor.
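
    The core inventory problem such activation/depletion codes solve is a system of coupled decay and transmutation equations. For a simple chain A → B → C the Bateman equations have a closed form, sketched here with illustrative decay constants (not evaluated nuclear data):

```python
import numpy as np

# Closed-form Bateman solution for a two-step decay chain A -> B -> C
# (C stable), the simplest instance of the inventory problem an
# activation/depletion code solves. Decay constants are illustrative
# placeholders, not evaluated nuclear data.
lam_a, lam_b = 0.3, 0.05   # decay constants [1/s], hypothetical
n0 = 1.0e6                 # initial number of A atoms

def chain(t):
    na = n0 * np.exp(-lam_a * t)
    nb = n0 * lam_a / (lam_b - lam_a) * (np.exp(-lam_a * t) - np.exp(-lam_b * t))
    nc = n0 - na - nb      # C is stable, so total atoms are conserved
    return na, nb, nc

na, nb, nc = chain(10.0)
print(na + nb + nc)  # approximately 1e6: conservation by construction
```

    A production code generalizes this to thousands of nuclides by exponentiating (or otherwise integrating) a large sparse transition matrix built from the evaluated data libraries.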

  5. A qualitative study of DRG coding practice in hospitals under the Thai Universal Coverage scheme.

    PubMed

    Pongpirul, Krit; Walker, Damian G; Winch, Peter J; Robinson, Courtland

    2011-04-08

    In the Thai Universal Coverage health insurance scheme, hospital providers are paid for their inpatient care using Diagnosis Related Group-based retrospective payment, for which the quality of the diagnosis and procedure codes is crucial. However, there has been limited understanding of which health care professions are involved and how the diagnosis and procedure coding is actually done within hospital settings. The objective of this study is to detail hospital coding structure and process, and to describe the roles of key hospital staff and other related internal dynamics in Thai hospitals that affect the quality of data submitted for inpatient care reimbursement. Research involved qualitative semi-structured interviews with 43 participants at 10 hospitals chosen to represent a range of hospital sizes (small/medium/large), locations (urban/rural), and types (public/private). Hospital coding practice has structural and process components. While the structural component includes human resources, hospital committees, and information technology infrastructure, the process component comprises all activities from patient discharge to submission of the diagnosis and procedure codes. At least eight health care professional disciplines are involved in the coding process, which comprises seven major steps, each of which involves different hospital staff: 1) Discharge Summarization, 2) Completeness Checking, 3) Diagnosis and Procedure Coding, 4) Code Checking, 5) Relative Weight Challenging, 6) Coding Report, and 7) Internal Audit. Hospital coding practice can be affected by at least five main factors: 1) Internal Dynamics, 2) Management Context, 3) Financial Dependency, 4) Resource and Capacity, and 5) External Factors. Hospital coding practice comprises both structural and process components, involves many health care professional disciplines, and varies greatly across hospitals as a result of five main factors.
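
    The financial stake behind coding quality, and behind the "Relative Weight Challenging" step, can be sketched in a few lines: under DRG-based payment, reimbursement is typically the product of a DRG's relative weight and a base rate, so a miscoded diagnosis shifts payment directly. All codes and figures below are hypothetical:

```python
# Sketch of how DRG-based retrospective payment turns submitted codes
# into reimbursement: each DRG carries a relative weight (RW), and
# payment = RW * hospital base rate. All figures are hypothetical.
BASE_RATE = 10_000.0  # base rate per RW unit, hypothetical currency

# Hypothetical DRG table: code -> relative weight
DRG_WEIGHTS = {
    "simple_pneumonia": 0.9,
    "pneumonia_with_complication": 1.6,
}

def payment(drg_code: str) -> float:
    """Reimbursement for one admission under the given DRG code."""
    return DRG_WEIGHTS[drg_code] * BASE_RATE

# Miscoding a complication changes reimbursement, which is why the
# audit steps (code checking, relative-weight challenge) exist.
delta = payment("pneumonia_with_complication") - payment("simple_pneumonia")
print(delta)  # the difference one complication code makes
```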

  6. A qualitative study of DRG coding practice in hospitals under the Thai Universal Coverage Scheme

    PubMed Central

    2011-01-01

    Background In the Thai Universal Coverage health insurance scheme, hospital providers are paid for their inpatient care using Diagnosis Related Group-based retrospective payment, for which the quality of the diagnosis and procedure codes is crucial. However, there has been limited understanding of which health care professions are involved and how the diagnosis and procedure coding is actually done within hospital settings. The objective of this study is to detail hospital coding structure and process, and to describe the roles of key hospital staff and other related internal dynamics in Thai hospitals that affect the quality of data submitted for inpatient care reimbursement. Methods Research involved qualitative semi-structured interviews with 43 participants at 10 hospitals chosen to represent a range of hospital sizes (small/medium/large), locations (urban/rural), and types (public/private). Results Hospital coding practice has structural and process components. While the structural component includes human resources, hospital committees, and information technology infrastructure, the process component comprises all activities from patient discharge to submission of the diagnosis and procedure codes. At least eight health care professional disciplines are involved in the coding process, which comprises seven major steps, each of which involves different hospital staff: 1) Discharge Summarization, 2) Completeness Checking, 3) Diagnosis and Procedure Coding, 4) Code Checking, 5) Relative Weight Challenging, 6) Coding Report, and 7) Internal Audit. Hospital coding practice can be affected by at least five main factors: 1) Internal Dynamics, 2) Management Context, 3) Financial Dependency, 4) Resource and Capacity, and 5) External Factors. Conclusions Hospital coding practice comprises both structural and process components, involves many health care professional disciplines, and varies greatly across hospitals as a result of five main factors. PMID:21477310

  7. Scheduling observational and physical practice: influence on the coding of simple motor sequences.

    PubMed

    Ellenbuerger, Thomas; Boutin, Arnaud; Blandin, Yannick; Shea, Charles H; Panzer, Stefan

    2012-01-01

    The main purpose of the present experiment was to determine the coordinate system used in the development of movement codes when observational and physical practice are scheduled across practice sessions. The task was to reproduce a 1,300-ms spatial-temporal pattern of elbow flexions and extensions. An intermanual transfer paradigm with a retention test and two effector (contralateral limb) transfer tests was used. The mirror effector transfer test required the same pattern of homologous muscle activation and sequence of limb joint angles as that performed or observed during practice, and the non-mirror effector transfer test required the same spatial pattern of movements as that performed or observed. The test results following the first acquisition session replicated the findings of Gruetzmacher, Panzer, Blandin, and Shea (2011). The results following the second acquisition session indicated a strong advantage for participants who received physical practice in both practice sessions or received observational practice followed by physical practice. This advantage was found on both the retention and the mirror transfer tests compared to the non-mirror transfer test. These results demonstrate that codes based in motor coordinates can be developed relatively quickly and effectively for a simple spatial-temporal movement sequence when participants are provided with physical practice or observation followed by physical practice, but physical practice followed by observational practice, or observational practice alone, limits the development of codes based in motor coordinates.

  8. Are nursing codes of practice ethical?

    PubMed

    Pattison, S

    2001-01-01

    This article provides a theoretical critique from a particular 'ideal type' ethical perspective of professional codes in general and the United Kingdom Central Council for Nursing, Midwifery and Health Visiting (UKCC) Code of professional conduct (reprinted on pp. 77-78) in particular. Having outlined a specific 'ideal type' of what ethically informed and aware practice may be, the article examines the extent to which professional codes may be likely to elicit and engender such practice. Because of their terminological inexactitudes and confusions, their arbitrary values and principles, their lack of helpful ethical guidance, and their exclusion of ordinary moral experience, a number of contemporary professional codes in health and social care can be arraigned as ethically inadequate. The UKCC Code of professional conduct embodies many of these flaws, and others besides. Some of its weaknesses in this respect are anatomized before some tentative suggestions are offered for the reform of codes and the engendering of greater ethical awareness among professionals in the light of greater public ethical concerns and values.

  9. The barriers to clinical coding in general practice: a literature review.

    PubMed

    de Lusignan, S

    2005-06-01

    Clinical coding is variable in UK general practice. The reasons for this remain undefined. This review explains why there are no readily available alternatives to recording structured clinical data and reviews the barriers to recording structured clinical data. Methods used included a literature review of bibliographic databases, university health informatics departments, and national and international medical informatics associations. The results show that the current state of development of computers and data processing means there is no practical alternative to coding data. The identified barriers to clinical coding are: the limitations of the coding systems and terminologies and the skill gap in their use; recording structured data in the consultation takes time and is distracting; the level of motivation of primary care professionals; and the priority within the organization. A taxonomy is proposed to describe the barriers to clinical coding. This can be used to identify barriers to coding and facilitate the development of strategies to overcome them.

  10. Patient and health care professional views and experiences of computer agent-supported health care.

    PubMed

    Neville, Ron G; Greene, Alexandra C; Lewis, Sue

    2006-01-01

    To explore patient and health care professional (HCP) views towards the use of multi-agent computer systems in their GP practice. Qualitative analysis of in-depth interviews and analysis of transcriptions. Urban health centre in Dundee, Scotland. Five representative healthcare professionals and 11 patients. Emergent themes from interviews revealed participants' attitudes and beliefs, which were coded and indexed. Patients and HCPs had similar beliefs, attitudes and views towards the implementation of multi-agent systems (MAS). Both felt modern communication methods were useful to supplement, not supplant, face-to-face consultations between doctors and patients. This was based on the immense trust these patients placed in their doctors in this practice, which extended to trust in their choice of communication technology and security. Rapid access to medical information increased patients' sense of shared partnership and self-efficacy. Patients and HCPs expressed respect for each other's time and were keen to embrace technology that made interactions more efficient, including for the altruistic benefit of others less technically competent. Patients and HCPs welcomed the introduction of agent technology to the delivery of health care. Widespread use will depend more on the trust patients place in their own GP than on technological issues.

  11. Computer-assisted total hip arthroplasty: coding the next generation of navigation systems for orthopedic surgery.

    PubMed

    Renkawitz, Tobias; Tingart, Markus; Grifka, Joachim; Sendtner, Ernst; Kalteis, Thomas

    2009-09-01

    This article outlines the scientific basis and a state-of-the-art application of computer-assisted orthopedic surgery in total hip arthroplasty (THA) and provides a future perspective on this technology. Computer-assisted orthopedic surgery in primary THA has the potential to couple 3D simulations with real-time evaluations of surgical performance, which has brought these developments from the research laboratory all the way to clinical use. Imageless navigation systems, which require no additional pre- or intra-operative image acquisition, have been shown to significantly reduce the variability in positioning the acetabular component and to measure leg length and offset changes precisely during THA. More recently, computer-assisted orthopedic surgery systems have opened a new frontier for accurate surgical practice in minimally invasive, tissue-preserving THA. The future generation of imageless navigation systems will switch from simple measurement tasks to real navigation tools. These software algorithms will treat the cup and stem as components of a coupled biomechanical system, navigating the orthopedic surgeon toward an optimized complementary component orientation rather than fixed intraoperative target values, and they are expected to have a high impact on clinical practice and postoperative functionality in modern THA.

  12. On the Ottoman consent documents for medical interventions and the modern concept of informed consent.

    PubMed

    Kara, Mahmut A; Aksoy, Sahin

    2006-09-01

    Information for patients prior to medical intervention is one of the principles of modern medical practice. In this study, we looked at an earlier practice of this principle. Ottoman judges kept record books called sicil. One of the categories in sicils was the consent document called riza senedi, a patient-physician contract approved by the courts. These contracts served especially to protect physicians from punishment if the patient died. It is not clear whether patients were informed properly or not. Consent for minors was obtained from parents. However, how a situation in which an adult lacked the capacity to consent was handled is not clear from these documents. No sign of free withdrawal of consent was found in these records. Owing to the legal system of the Ottoman State, these contracts were related to Islamic law rather than modern civil law. We aim, in this paper, to present a legal practice that may be considered an early example of the informed consent practice.

  13. Avoid lost discoveries, because of violations of standard assumptions, by using modern robust statistical methods.

    PubMed

    Wilcox, Rand; Carlson, Mike; Azen, Stan; Clark, Florence

    2013-03-01

    Recently, there have been major advances in statistical techniques for assessing central tendency and measures of association. The practical utility of modern methods has been documented extensively in the statistics literature, but they remain underused and relatively unknown in clinical trials. Our objective was to address this issue. STUDY DESIGN AND PURPOSE: The first purpose was to review common problems associated with standard methodologies (low power, lack of control over type I errors, and incorrect assessments of the strength of the association). The second purpose was to summarize some modern methods that can be used to circumvent such problems. The third purpose was to illustrate the practical utility of modern robust methods using data from the Well Elderly 2 randomized controlled trial. In multiple instances, robust methods uncovered differences among groups and associations among variables that were not detected by classic techniques. In particular, the results demonstrated that details of the nature and strength of the association were sometimes overlooked when using ordinary least squares regression and Pearson correlation. Modern robust methods can make a practical difference in detecting and describing differences between groups and associations between variables. Such procedures should be applied more frequently when analyzing trial-based data. Copyright © 2013 Elsevier Inc. All rights reserved.
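
    A toy illustration of the point about robust estimators of central tendency: a 20% trimmed mean resists gross outliers that drag the ordinary mean. The data below are synthetic, not from the Well Elderly 2 trial:

```python
import numpy as np

# Toy comparison of the ordinary mean and a 20% trimmed mean on data
# with gross outliers. Data are synthetic; the point is only that the
# trimmed mean resists the outliers while the mean does not.
rng = np.random.default_rng(1)
sample = np.concatenate([
    rng.normal(loc=0.0, scale=1.0, size=95),  # well-behaved bulk, centered at 0
    [50.0, 60.0, 70.0, 80.0, 90.0],           # five gross outliers
])

def trimmed_mean(x, proportion=0.2):
    """Mean after discarding the lowest/highest `proportion` of values."""
    x = np.sort(x)
    g = int(proportion * x.size)   # observations trimmed from each tail
    return x[g:x.size - g].mean()

print(round(sample.mean(), 2))         # pulled far above 0 by the outliers
print(round(trimmed_mean(sample), 2))  # stays near the center of the bulk
```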

  14. Modern roundabouts for Oregon

    DOT National Transportation Integrated Search

    1998-06-01

    This report reviews current research and practice on modern roundabouts, both in the US and other countries. The report compares the advantages and disadvantages of roundabouts, summarizes safety implications, and discusses pedestrian and bicyclist c...

  15. [Modern-day slavery as a public health issue].

    PubMed

    Leão, Luís Henrique da Costa

    2016-12-01

    Modern-day slave labor is one of the most pernicious and persistent social problems in Brazil. In the light of the need to implement a national occupational health policy, this paper discusses slave labor as a public health concern, highlighting possibilities for broadening strategies for vigilance and comprehensive care for this specific working population. Exploratory qualitative research was carried out based on the "social construction of reality" proposed by Lenoir, Berger and Luckmann. The investigation consisted of a theoretical review of modern-day slave labor on the national and international scene within the scope of the human, social and public health sciences and an analysis of social and political practices to tackle modern-day slave labor was conducted in the State of Rio de Janeiro. Semi-structured individual and group interviews with workers and representatives of social movements and public institutions were organized. The results reveal the theoretical and practical dimensions of slave labor and its relations with the health field and highlight the role and potential of public health in the enhancing of vigilance practices and health care of workers subjected to these chronic social conditions.

  16. Incurable suffering from the "hiatus theoreticus"? Some epistemological problems in modern medicine and the clinical relevance of philosophy of medicine.

    PubMed

    Paul, N

    1998-06-01

    Neither the question of whether all theoretical medical knowledge can at least be described as scientific, nor the question of how exactly the existing scientific and theoretical medical knowledge is accessed during clinical problem-solving, has so far been sufficiently answered. Scientific theories play an important role in controlling clinical practice and improving the quality of clinical care in modern medicine on the one hand, and making it vindicable on the other. Therefore, the vagueness of the unexplicated interrelations between medicine's stock of knowledge and medical practice appears as a gap in the theoretical concept of modern medicine, which can be described as a "Hiatus theoreticus" in the anatomy of medicine. A central intention of the paper is to analyze the role of philosophy of medicine in clarifying the theoretical basis of medical practice. Clinical relevance and normativity, in the sense of modern theory of science, are suggested as criteria to establish a differentiation between philosophy of medicine as a primary medical discipline and the application of general philosophy in medicine.

  17. A Critical Evaluation of Phrónêsis as a Key Tool for Professional Excellence for Modern Managers

    ERIC Educational Resources Information Center

    Thomas, Shinto

    2017-01-01

    Phrónêsis, or practical wisdom, is an important element of Aristotelian virtue ethics. This paper is an attempt to study what is meant by Phrónêsis; how it might be understood, reinterpreted, applied, and extended in contemporary professional management practice; and its role in enhancing professional excellence in modern managers. Phrónêsis can…

  18. IFRP studies child-spacing trends in sub-Saharan Africa.

    PubMed

    1981-10-01

    The IFRP (International Fertility Research Program) conducted a study in late 1980 of the changing patterns of breast feeding, postpartum abstinence, and use of modern contraceptive methods among women of reproductive age in Lagos, Nigeria, the largest city in tropical Africa. There is concern among family planning experts that traditional child-spacing practices are breaking down in sub-Saharan Africa without being sufficiently replaced by modern fertility control methods. The household survey showed a total fertility rate of 6.56. Compared to the mean cumulative fertility of 6.19 for women aged 45-49, it appears that fertility is actually increasing in Lagos. The women reported low levels of awareness and practice of modern family planning methods. Fewer than one-quarter were currently using any method of family planning: 7% were using a modern method (mostly pills), 3% a conventional method (condoms), and 14% only a traditional method. Many of these women did not consider either breast feeding or sexual abstinence a means of contraception. Breast feeding was practiced by 85% of the women during the first 2 months, with rates falling off sharply after that. Child spacing, rather than fertility limitation, was the motivation of most of the women practicing family planning.

  19. Code of Practice for Scientific Diving: Principles for the Safe Practice of Scientific Diving in Different Environments. Unesco Technical Papers in Marine Science 53.

    ERIC Educational Resources Information Center

    Flemming, N. C., Ed.; Max, M. D., Ed.

    This publication has been prepared to provide scientific divers with guidance on safe practice under varying experimental and environmental conditions. The Code offers advice and recommendations on administrative practices, insurance, terms of employment, medical standards, training standards, dive planning, safety with different breathing gases…

  20. Modeling transonic aerodynamic response using nonlinear systems theory for use with modern control theory

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    1993-01-01

    The presentation begins with a brief description of the motivation and approach that has been taken for this research. This will be followed by a description of the Volterra Theory of Nonlinear Systems and the CAP-TSD code which is an aeroelastic, transonic CFD (Computational Fluid Dynamics) code. The application of the Volterra theory to a CFD model and, more specifically, to a CAP-TSD model of a rectangular wing with a NACA 0012 airfoil section will be presented.

  1. The journey from forensic to predictive materials science using density functional theory

    DOE PAGES

    Schultz, Peter A.

    2017-09-12

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. The impact of first-principles electronic structure in physics and materials science, however, lagged owing to the extra formal and computational demands of bulk calculations.

  2. The journey from forensic to predictive materials science using density functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, Peter A.

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. The impact of first-principles electronic structure in physics and materials science, however, lagged owing to the extra formal and computational demands of bulk calculations.

  3. Superdense Coding over Optical Fiber Links with Complete Bell-State Measurements

    NASA Astrophysics Data System (ADS)

    Williams, Brian P.; Sadlier, Ronald J.; Humble, Travis S.

    2017-02-01

    Adapting quantum communication to modern networking requires transmitting quantum information through a fiber-based infrastructure. We report the first demonstration of superdense coding over optical fiber links, taking advantage of a complete Bell-state measurement enabled by time-polarization hyperentanglement, linear optics, and common single-photon detectors. We demonstrate the highest single-qubit channel capacity to date utilizing linear optics, 1.665 ± 0.018, and we provide a full experimental implementation of a hybrid, quantum-classical communication protocol for image transfer.
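    The encoding scheme underlying superdense coding can be illustrated with a small numerical toy model. The sketch below (plain NumPy linear algebra; it represents the textbook protocol, not the authors' fiber-optic experiment) encodes two classical bits on one qubit of a shared Bell pair via Pauli operations and recovers them with a complete Bell-state measurement:

```python
import numpy as np

# Pauli operators the sender applies to encode two classical bits
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = {(0, 0): I2, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}

# Shared Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# The four Bell states form the receiver's measurement basis
bell_basis = [
    np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2),   # Phi+
    np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2),   # Psi+
    np.array([1, 0, 0, -1], dtype=complex) / np.sqrt(2),  # Phi-
    np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2),  # Psi-
]

def send(bits):
    """Encode two bits by applying a Pauli to the sender's qubit only."""
    U = np.kron(PAULIS[bits], I2)  # acts on qubit 1, identity on qubit 2
    return U @ phi_plus

def receive(state):
    """Complete Bell-state measurement: project onto the four Bell states."""
    probs = [abs(np.vdot(b, state)) ** 2 for b in bell_basis]
    # Map the measured Bell state back to the encoded bit pair
    return [(0, 0), (0, 1), (1, 0), (1, 1)][int(np.argmax(probs))]

for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert receive(send(bits)) == bits  # round trip recovers both bits
```

Because one qubit carries two bits, the ideal channel capacity is 2; the paper's 1.665 reflects experimental imperfections.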

  4. Comparison of the sand liquefaction estimated based on codes and practical earthquake damage phenomena

    NASA Astrophysics Data System (ADS)

    Fang, Yi; Huang, Yahong

    2017-12-01

    Estimating sand liquefaction according to design codes is an important part of geotechnical design. Sometimes, however, the results fail to match the damage observed in actual earthquakes. Based on the damage from the Tangshan earthquake and the local engineering geological conditions, three typical sites were chosen, the sand liquefaction probability at each was evaluated using the method in the Code for Seismic Design of Buildings, and the results were compared with the liquefaction phenomena observed in the earthquake. The comparison shows that the difference between code-based liquefaction estimates and practical earthquake damage is mainly attributable to two aspects. First, the primary reasons include the disparity between the seismic fortification intensity and the actual seismic oscillation, changes in the groundwater level, the thickness of the overlying non-liquefied soil layer, local site effects, and personal error. Second, although the judgment methods in the codes are broadly applicable, the limitations of their underlying data and qualitative anomalies in the judgment formulas also contribute to the difference.

  5. Internal Corrosion Control of Water Supply Systems Code of Practice

    EPA Science Inventory

    This Code of Practice is part of a series of publications by the IWA Specialist Group on Metals and Related Substances in Drinking Water. It complements the following IWA Specialist Group publications: 1. Best Practice Guide on the Control of Lead in Drinking Water 2. Best Prac...

  6. GloVe C++ v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Jonathan A.

    2015-12-02

    This code implements the GloVe algorithm for learning word vectors from a text corpus, using a modern C++ approach. The algorithm is described in the open literature in the referenced paper by Jeffrey Pennington, Richard Socher, and Christopher D. Manning.
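    The GloVe objective itself is a weighted least-squares fit of word and context vectors to the logarithms of co-occurrence counts. The following NumPy sketch restates that published objective on a tiny, made-up co-occurrence matrix (illustrative only; the package above is the C++ implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny symmetric word-word co-occurrence matrix X; illustrative values only
X = np.array([[0., 4., 2.],
              [4., 0., 8.],
              [2., 8., 0.]])
V, dim, x_max, alpha, lr = X.shape[0], 5, 10.0, 0.75, 0.05

W = rng.normal(scale=0.1, size=(V, dim))    # word vectors
Wc = rng.normal(scale=0.1, size=(V, dim))   # context vectors
b, bc = np.zeros(V), np.zeros(V)            # word and context biases

def weight(x):
    """GloVe weighting function f(x): (x/x_max)^alpha, capped at 1."""
    return (x / x_max) ** alpha if x < x_max else 1.0

def loss():
    """J = sum over nonzero X_ij of f(X_ij) (w_i.wc_j + b_i + bc_j - log X_ij)^2."""
    total = 0.0
    for i in range(V):
        for j in range(V):
            if X[i, j] > 0:
                err = W[i] @ Wc[j] + b[i] + bc[j] - np.log(X[i, j])
                total += weight(X[i, j]) * err ** 2
    return total

def sgd_epoch():
    """One stochastic-gradient pass over the nonzero co-occurrences."""
    for i in range(V):
        for j in range(V):
            if X[i, j] == 0:
                continue
            err = W[i] @ Wc[j] + b[i] + bc[j] - np.log(X[i, j])
            g = 2.0 * weight(X[i, j]) * err
            W[i], Wc[j] = W[i] - lr * g * Wc[j], Wc[j] - lr * g * W[i]
            b[i] -= lr * g
            bc[j] -= lr * g

before = loss()
for _ in range(50):
    sgd_epoch()
assert loss() < before  # training reduces the weighted least-squares objective
```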

  7. Code Switching in English Language Teaching (ELT) Teaching Practice in Turkey: Student Teacher Practices, Beliefs and Identity

    ERIC Educational Resources Information Center

    Bilgin, Sezen Seymen

    2016-01-01

    Code switching involves the interplay of two languages and, as well as serving linguistic functions, has social and psychological implications. In the context of English language teaching, these psychological implications reveal themselves as teachers' thought processes. While the nature of code switching in language classrooms has been widely…

  8. Hospital Coding Practice, Data Quality, and DRG-Based Reimbursement under the Thai Universal Coverage Scheme

    ERIC Educational Resources Information Center

    Pongpirul, Krit

    2011-01-01

    In the Thai Universal Coverage scheme, hospital providers are paid for their inpatient care using Diagnosis Related Group (DRG) reimbursement. Questionable quality of the submitted DRG codes has been of concern whereas knowledge about hospital coding practice has been lacking. The objectives of this thesis are (1) To explore hospital coding…

  9. Globalisation, Post-Modernity and the State: Comparative Education Facing the Third Millennium.

    ERIC Educational Resources Information Center

    Welch, Anthony R.

    2001-01-01

    Globalization and post-modernity are linked to changes in the nature of late capitalism and crises in the modern state. Neither offers much in practice to the much-needed renewal of democracy, including in education. Indeed, both arguably contribute to a trend towards individualism, and a retreat from democratic engagement and visions of the…

  10. The Sacred or the Profane: The Challenge of Modern Dance in Religious Educational Settings

    ERIC Educational Resources Information Center

    Clement, Karen

    2008-01-01

    The article addresses the utilization of modern dance compositional approaches in the development of sacred dance works. A brief history of sacred dance in the Western Church is traced as a foundation for students' stereotypical approaches to dance and religion. Also examined is the 20th Century modern dance choreographers' practice of…

  11. Innovative Teaching Practice: Traditional and Alternative Methods (Challenges and Implications)

    ERIC Educational Resources Information Center

    Nurutdinova, Aida R.; Perchatkina, Veronika G.; Zinatullina, Liliya M.; Zubkova, Guzel I.; Galeeva, Farida T.

    2016-01-01

    The relevance of the present issue is caused by the strong need for alternative methods of foreign language learning and for the language training and retraining of modern professionals. The aim of the article is to identify the basic techniques and skills in using various modern techniques in the context of modern educational tasks. The…

  12. Christian Higher Education in a Changing Context: Shifting from Pillars to Practices

    ERIC Educational Resources Information Center

    Broer, Nico A.; Hoogland, Jan; van der Stoep, Jan

    2017-01-01

    The process of global modernization has reached a new phase. In many parts of the world, societies have become so complex that the logic that characterized first modernity no longer works. Simultaneously, societies are confronted by huge and complex side effects of modern rationality, such as climate change, migration influx, global inequality,…

  13. Organization-based self-development prescriptive model for the promotion of professional development of Iranian clinical nurses.

    PubMed

    Rahimaghaee, Flora; Nayeri, Nahid Dehghan; Mohammadi, Eesa; Salavati, Shahram

    2015-01-01

    Professional development is reiterated in the new definition of modern organizations as a serious undertaking of organizations. This article aims to present and describe a prescriptive model to increase the quality of professional development of Iranian nurses within an organization-based framework. This article is an outcome of the results of a study based on grounded theory describing how Iranian nurses develop. The study adopted purposive sampling, and the initial participants were experienced clinical nurses; the study then continued by theoretical sampling and involved 21 participants in all. Data were mainly collected through interviews. Analysis began with open coding and continued with axial coding and selective coding. Trustworthiness was ensured by applying Lincoln and Guba's criteria of credibility, dependability, and confirmability. Based on the data gathered in the study and a thorough review of related literature, a prescriptive model was designed using the methodology of Walker and Avant (2005). In this model, the first main component is a three-part structure: reformation to establish a value-assigning structure, a position for human resource management, and a job redesign. The second component is a set of opportunities for organization-oriented development, pursued through the following strategies: raising the sensitivity of the organization toward development, goal setting and planning the development of human resources, and improving management practices. Through this model, clinical nurses' professional development can transform the profession from an individual, randomized activity into more planned and systematized services. This model can lead to a better quality of care.

  14. Cross-terminology mapping challenges: a demonstration using medication terminological systems.

    PubMed

    Saitwal, Himali; Qing, David; Jones, Stephen; Bernstam, Elmer V; Chute, Christopher G; Johnson, Todd R

    2012-08-01

    Standardized terminological systems for biomedical information have provided considerable benefits to biomedical applications and research. However, practical use of this information often requires mapping across terminological systems-a complex and time-consuming process. This paper demonstrates the complexity and challenges of mapping across terminological systems in the context of medication information. It provides a review of medication terminological systems and their linkages, then describes a case study in which we mapped proprietary medication codes from an electronic health record to SNOMED CT and the UMLS Metathesaurus. The goal was to create a polyhierarchical classification system for querying an i2b2 clinical data warehouse. We found that three methods were required to accurately map the majority of actively prescribed medications. Only 62.5% of source medication codes could be mapped automatically. The remaining codes were mapped using a combination of semi-automated string comparison with expert selection, and a completely manual approach. Compound drugs were especially difficult to map: only 7.5% could be mapped using the automatic method. General challenges to mapping across terminological systems include (1) the availability of up-to-date information to assess the suitability of a given terminological system for a particular use case, and to assess the quality and completeness of cross-terminology links; (2) the difficulty of correctly using complex, rapidly evolving, modern terminologies; (3) the time and effort required to complete and evaluate the mapping; (4) the need to address differences in granularity between the source and target terminologies; and (5) the need to continuously update the mapping as terminological systems evolve. Copyright © 2012 Elsevier Inc. All rights reserved.
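    The semi-automated string-comparison stage described above can be sketched with the standard library's SequenceMatcher: rank target terms by similarity, auto-accept near-exact matches, queue mid-range scores for expert review, and fall through to manual mapping otherwise. All codes, drug names, and thresholds below are hypothetical stand-ins, not the study's data:

```python
from difflib import SequenceMatcher

# Hypothetical local medication labels and target terminology entries
local_codes = {
    "LOC001": "amoxicillin 500 mg oral capsule",
    "LOC002": "lisinopril 10mg tab",
    "LOC003": "APAP/oxycodone 325-5 mg",   # compound drug: hard to match
}
target_terms = {
    "T100": "amoxicillin 500 mg oral capsule",
    "T200": "lisinopril 10 mg oral tablet",
    "T300": "oxycodone hydrochloride 5 mg / acetaminophen 325 mg oral tablet",
}

def propose_mappings(local_codes, target_terms,
                     auto_threshold=0.95, review_threshold=0.60):
    """Rank target terms by string similarity for each local code."""
    auto, review, manual = {}, {}, []
    for code, name in local_codes.items():
        scored = sorted(
            ((SequenceMatcher(None, name.lower(), t.lower()).ratio(), tid)
             for tid, t in target_terms.items()),
            reverse=True)
        best_score, best_tid = scored[0]
        if best_score >= auto_threshold:
            auto[code] = best_tid                       # auto-accepted
        elif best_score >= review_threshold:
            review[code] = (best_tid, round(best_score, 2))  # expert review
        else:
            manual.append(code)                         # fully manual mapping
    return auto, review, manual

auto, review, manual = propose_mappings(local_codes, target_terms)
```

Here the exact match is accepted automatically, the near match is queued for expert selection, and the compound drug falls through to manual mapping, mirroring the three-tier workflow the abstract describes.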

  15. Organization-based self-development prescriptive model for the promotion of professional development of Iranian clinical nurses

    PubMed Central

    Rahimaghaee, Flora; Nayeri, Nahid Dehghan; Mohammadi, Eesa; Salavati, Shahram

    2015-01-01

    Background: Professional development is reiterated in the new definition of modern organizations as a serious undertaking of organizations. This article aims to present and describe a prescriptive model to increase the quality of professional development of Iranian nurses within an organization-based framework. Materials and Methods: This article is an outcome of the results of a study based on grounded theory describing how Iranian nurses develop. The study adopted purposive sampling, and the initial participants were experienced clinical nurses; the study then continued by theoretical sampling and involved 21 participants in all. Data were mainly collected through interviews. Analysis began with open coding and continued with axial coding and selective coding. Trustworthiness was ensured by applying Lincoln and Guba's criteria of credibility, dependability, and confirmability. Based on the data gathered in the study and a thorough review of related literature, a prescriptive model was designed using the methodology of Walker and Avant (2005). Results: In this model, the first main component is a three-part structure: reformation to establish a value-assigning structure, a position for human resource management, and a job redesign. The second component is a set of opportunities for organization-oriented development, pursued through the following strategies: raising the sensitivity of the organization toward development, goal setting and planning the development of human resources, and improving management practices. Conclusions: Through this model, clinical nurses’ professional development can transform the profession from an individual, randomized activity into more planned and systematized services. This model can lead to a better quality of care. PMID:26457100

  16. Cross-terminology mapping challenges: A demonstration using medication terminological systems

    PubMed Central

    Saitwal, Himali; Qing, David; Jones, Stephen; Bernstam, Elmer; Chute, Christopher G.; Johnson, Todd R.

    2015-01-01

    Standardized terminological systems for biomedical information have provided considerable benefits to biomedical applications and research. However, practical use of this information often requires mapping across terminological systems—a complex and time-consuming process. This paper demonstrates the complexity and challenges of mapping across terminological systems in the context of medication information. It provides a review of medication terminological systems and their linkages, then describes a case study in which we mapped proprietary medication codes from an electronic health record to SNOMED-CT and the UMLS Metathesaurus. The goal was to create a polyhierarchical classification system for querying an i2b2 clinical data warehouse. We found that three methods were required to accurately map the majority of actively prescribed medications. Only 62.5% of source medication codes could be mapped automatically. The remaining codes were mapped using a combination of semi-automated string comparison with expert selection, and a completely manual approach. Compound drugs were especially difficult to map: only 7.5% could be mapped using the automatic method. General challenges to mapping across terminological systems include (1) the availability of up-to-date information to assess the suitability of a given terminological system for a particular use case, and to assess the quality and completeness of cross-terminology links; (2) the difficulty of correctly using complex, rapidly evolving, modern terminologies; (3) the time and effort required to complete and evaluate the mapping; (4) the need to address differences in granularity between the source and target terminologies; and (5) the need to continuously update the mapping as terminological systems evolve. PMID:22750536

  17. Development of an Object-Oriented Turbomachinery Analysis Code within the NPSS Framework

    NASA Technical Reports Server (NTRS)

    Jones, Scott M.

    2014-01-01

    During the preliminary or conceptual design phase of an aircraft engine, the turbomachinery designer needs to estimate the effects of a large number of design parameters, such as flow size, stage count, blade count, and radial position, on the weight and efficiency of a turbomachine. Computer codes are invariably used to perform this task; however, such codes are often very old, written in outdated languages with arcane input files, and rarely adaptable to new architectures or unconventional layouts. Given the need to perform these kinds of preliminary design trades, a modern 2-D turbomachinery design and analysis code has been written using the Numerical Propulsion System Simulation (NPSS) framework. This paper discusses the development of the governing equations and the structure of the primary objects used in OTAC.

  18. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  19. Reactive transport codes for subsurface environmental simulation

    DOE PAGES

    Steefel, C. I.; Appelo, C. A. J.; Arora, B.; ...

    2014-09-26

    A general description of the mathematical and numerical formulations used in modern numerical reactive transport codes relevant for subsurface environmental simulations is presented. The formulations are followed by short descriptions of commonly used and available subsurface simulators that consider continuum representations of flow, transport, and reactions in porous media. These formulations are applicable to most of the subsurface environmental benchmark problems included in this special issue. The list of codes described briefly here includes PHREEQC, HPx, PHT3D, OpenGeoSys (OGS), HYTEC, ORCHESTRA, TOUGHREACT, eSTOMP, HYDROGEOCHEM, CrunchFlow, MIN3P, and PFLOTRAN. The descriptions include a high-level list of capabilities for each of the codes, along with a selective list of applications that highlight their capabilities and historical development.

  20. SPAMCART: a code for smoothed particle Monte Carlo radiative transfer

    NASA Astrophysics Data System (ADS)

    Lomax, O.; Whitworth, A. P.

    2016-10-01

    We present a code for generating synthetic spectral energy distributions and intensity maps from smoothed particle hydrodynamics simulation snapshots. The code is based on the Lucy Monte Carlo radiative transfer method, i.e., it follows discrete luminosity packets as they propagate through a density field, and then uses their trajectories to compute the radiative equilibrium temperature of the ambient dust. The sources can be extended and/or embedded, and discrete and/or diffuse. The density is not mapped on to a grid, and therefore the calculation is performed at exactly the same resolution as the hydrodynamics. We present two example calculations using this method. First, we demonstrate that the code strictly adheres to Kirchhoff's law of radiation. Second, we present synthetic intensity maps and spectra of an embedded protostellar multiple system. The algorithm uses data structures that are already constructed for other purposes in modern particle codes. It is therefore relatively simple to implement.
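    The packet-propagation idea behind Lucy-style Monte Carlo transfer can be shown in a deliberately cut-down form: a 1-D uniform slab with pure absorption, sampling each packet's interaction depth and tallying its path length per cell. This is only a sketch of the sampling step with made-up parameters; the actual code works on 3-D particle fields and handles scattering, re-emission, and radiative equilibrium:

```python
import math
import random

random.seed(42)

N_PACKETS = 20000
SLAB_TAU = 2.0          # total optical depth of the slab
N_CELLS = 10
d_tau = SLAB_TAU / N_CELLS

absorbed = 0
path = [0.0] * N_CELLS  # path length (in optical depth) tallied per cell

for _ in range(N_PACKETS):
    # Sample the optical depth to interaction: tau = -ln(xi)
    tau_interact = -math.log(random.random())
    tau = min(tau_interact, SLAB_TAU)   # packet escapes past SLAB_TAU
    # Accumulate the packet's trajectory through each cell it crosses;
    # these tallies are what a Lucy estimator would turn into temperatures
    for c in range(N_CELLS):
        lo, hi = c * d_tau, (c + 1) * d_tau
        if tau > lo:
            path[c] += min(tau, hi) - lo
    if tau_interact < SLAB_TAU:
        absorbed += 1

# The absorbed fraction should approach 1 - exp(-SLAB_TAU)
frac = absorbed / N_PACKETS
assert abs(frac - (1.0 - math.exp(-SLAB_TAU))) < 0.02
```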

  1. OSIRIS - an object-oriented parallel 3D PIC code for modeling laser and particle beam-plasma interaction

    NASA Astrophysics Data System (ADS)

    Hemker, Roy

    1999-11-01

    The advances in computational speed now make it possible to do full 3D PIC simulations of laser-plasma and beam-plasma interactions, but at the same time the increased complexity of these problems makes it necessary to apply modern approaches like object-oriented programming to the development of simulation codes. We report here on our progress in developing an object-oriented parallel 3D PIC code using Fortran 90. In its current state the code contains algorithms for 1D, 2D, and 3D simulations in Cartesian coordinates and for 2D cylindrically-symmetric geometry. For all of these algorithms the code allows for a moving simulation window and arbitrary domain decomposition for any number of dimensions. Recent 3D simulation results on the propagation of intense laser and electron beams through plasmas will be presented.
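    One building block of any PIC code, depositing particle charge onto the grid, can be sketched briefly. The fragment below implements cloud-in-cell weighting on a 1-D periodic grid and checks charge conservation; it is a generic illustration with made-up parameters, not OSIRIS itself (which is relativistic, parallel, and multi-dimensional), and it omits the field solve and particle push:

```python
import math

NG = 32          # grid points
L = 1.0          # domain length (periodic)
N = 1000         # particles
dx = L / NG

# Uniformly spaced particle positions; total charge normalized to 1
x = [(i + 0.5) * L / N for i in range(N)]

def deposit(positions, q=1.0 / N):
    """Cloud-in-cell deposition: each particle shares its charge between
    its two nearest grid points, with periodic boundaries."""
    rho = [0.0] * NG
    for xi in positions:
        s = xi / dx - 0.5
        j = int(math.floor(s)) % NG     # left grid point
        w = s - math.floor(s)           # fractional distance to it
        rho[j] += q * (1 - w) / dx
        rho[(j + 1) % NG] += q * w / dx
    return rho

rho = deposit(x)
# Charge conservation: grid charge equals total particle charge
assert abs(sum(r * dx for r in rho) - 1.0) < 1e-9
```

Because the two weights per particle sum to one, the scheme conserves charge exactly, which is why linear (CIC) weighting is the usual default in electrostatic PIC codes.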

  2. NORTICA—a new code for cyclotron analysis

    NASA Astrophysics Data System (ADS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-12-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER [1], developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is to assist in setting up and tuning the cyclotrons, taking into account imperfections in the main field and the extraction channel. The computer platform for the package is an Alpha Station with the UNIX operating system and an X-Windows graphic interface. A multiple-programming-language approach was used in order to combine the reliability of the numerical algorithms developed in the laboratory over a long period of time with the friendliness of a modern-style user interface. This paper describes the capability and features of the codes in their present state.

  3. DRG coding practice: a nationwide hospital survey in Thailand

    PubMed Central

    2011-01-01

    Background Diagnosis Related Group (DRG) payment is preferred by healthcare reform in various countries, but its implementation in resource-limited countries has not been fully explored. Objectives This study aimed (1) to compare the characteristics of hospitals in Thailand that were audited with those that were not and (2) to develop a simplified scale to measure hospital coding practice. Methods A questionnaire survey was conducted of 920 hospitals in the Summary and Coding Audit Database (SCAD hospitals, all of which were audited in 2008 because of suspicious reports of possible DRG miscoding); the survey also included 390 non-SCAD hospitals. The questionnaire asked about the general demographics of the hospitals and the hospital coding structure and process, and also included a set of 63 opinion-oriented items on current hospital coding practice. Descriptive statistics and exploratory factor analysis (EFA) were used for data analysis. Results SCAD and non-SCAD hospitals differed in many aspects, especially the number of medical statisticians, the experience of medical statisticians and physicians, and the number of certified coders. Factor analysis revealed a simplified 3-factor, 20-item model to assess hospital coding practice and classify hospital intention. Conclusion Hospital providers should not be assumed capable of producing high-quality DRG codes, especially in resource-limited settings. PMID:22040256
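    The exploratory factor analysis step can be illustrated generically: simulate item responses driven by a known number of latent factors and recover that number with the Kaiser eigenvalue-greater-than-one rule. The data below are synthetic, not the Thai survey responses, and a real EFA would also rotate and inspect the loadings:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate responses to 20 items driven by 3 latent factors
n_resp, n_items, n_factors = 500, 20, 3
loadings = np.zeros((n_items, n_factors))
for i in range(n_items):
    loadings[i, i % n_factors] = 0.8        # each item loads on one factor
scores = rng.normal(size=(n_resp, n_factors))
noise = rng.normal(scale=0.5, size=(n_resp, n_items))
X = scores @ loadings.T + noise             # (n_resp, n_items) response matrix

# Kaiser criterion: retain factors whose correlation-matrix eigenvalue > 1
R = np.corrcoef(X, rowvar=False)
eigvals = np.linalg.eigvalsh(R)[::-1]       # descending order
retained = int(np.sum(eigvals > 1.0))
assert retained == n_factors                # the 3-factor structure is recovered
```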

  4. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. 
For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the community land model, CLM, part of the open-source community earth system model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it's transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.

  5. 78 FR 51139 - Notice of Proposed Changes to the National Handbook of Conservation Practices for the Natural...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-20

    ... (Code 324), Field Border (Code 386), Filter Strip (Code 393), Land Smoothing (Code 466), Livestock... the implementation requirement document to the specifications and plans. Filter Strip (Code 393)--The...

  6. Fast and Adaptive Lossless Onboard Hyperspectral Data Compression System

    NASA Technical Reports Server (NTRS)

    Aranki, Nazeeh I.; Keymeulen, Didier; Kimesh, Matthew A.

    2012-01-01

    Modern hyperspectral imaging systems are able to acquire far more data than can be downlinked from a spacecraft. Onboard data compression helps to alleviate this problem, but requires a system capable of power efficiency and high throughput. Software solutions have limited throughput performance and are power-hungry. Dedicated hardware solutions can provide both high throughput and power efficiency, while taking the load off of the main processor. Thus, a hardware compression system was developed. The implementation uses a field-programmable gate array (FPGA). The implementation is based on the fast lossless (FL) compression algorithm reported in Fast Lossless Compression of Multispectral-Image Data (NPO-42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26, which achieves excellent compression performance and has low complexity. This algorithm performs predictive compression using an adaptive filtering method, and uses adaptive Golomb coding. The implementation also packetizes the coded data. The FL algorithm is well suited for implementation in hardware. In the FPGA implementation, one sample is compressed every clock cycle, which makes for a fast and practical real-time solution for space applications. Benefits of this implementation are: 1) The underlying algorithm achieves a combination of low complexity and compression effectiveness that exceeds that of techniques currently in use. 2) The algorithm requires no training data or other specific information about the nature of the spectral bands for a fixed instrument dynamic range. 3) Hardware acceleration provides a throughput improvement of 10 to 100 times vs. the software implementation. A prototype of the compressor is available in software, but it runs at a speed that does not meet spacecraft requirements.
The hardware implementation targets the Xilinx Virtex IV FPGAs, and makes the use of this compressor practical for Earth satellites as well as beyond-Earth missions with hyperspectral instruments.
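The predictive-compression-plus-adaptive-Golomb structure described above can be illustrated with a short sketch. This is a hypothetical, simplified illustration only (previous-sample predictor, adaptive Rice/Golomb parameter chosen from a running mean); it is not the actual NPO-42517 FL algorithm or its FPGA pipeline:

```python
def to_nonneg(d):
    """Map a signed prediction residual to a non-negative integer."""
    return 2 * d if d >= 0 else -2 * d - 1

def from_nonneg(v):
    return v // 2 if v % 2 == 0 else -(v + 1) // 2

def rice_bits(v, k):
    """Rice/Golomb code: unary quotient, then k remainder bits."""
    q, r = v >> k, v & ((1 << k) - 1)
    return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

def encode(samples):
    """Predict each sample as the previous one; Rice-code the mapped
    residuals with k adapted from the running mean residual magnitude."""
    bits, prev, total, count = [], 0, 4, 2   # small priors seed adaptation
    for s in samples:
        v = to_nonneg(s - prev)
        k = max(0, (total // count).bit_length() - 1)
        bits.append(rice_bits(v, k))
        total, count, prev = total + v, count + 1, s
    return "".join(bits)

def decode(bitstream, n):
    """Mirror of encode(): the decoder tracks the same adaptation state."""
    out, prev, total, count, i = [], 0, 4, 2, 0
    for _ in range(n):
        k = max(0, (total // count).bit_length() - 1)
        q = 0
        while bitstream[i] == "1":
            q += 1
            i += 1
        i += 1                                   # skip the terminating 0
        r = int(bitstream[i:i + k], 2) if k else 0
        i += k
        v = (q << k) | r
        s = prev + from_nonneg(v)
        out.append(s)
        total, count, prev = total + v, count + 1, s
    return out
```

Because encoder and decoder update the same adaptation state in the same order, the stream round-trips without any side information, which is the property that makes this family of coders attractive for one-sample-per-clock hardware.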

  7. A simple approach to improve recording of concerns about child maltreatment in primary care records: developing a quality improvement intervention

    PubMed Central

    Woodman, Jenny; Allister, Janice; Rafi, Imran; de Lusignan, Simon; Belsey, Jonathan; Petersen, Irene; Gilbert, Ruth

    2012-01-01

    Background: Information is lacking on how concerns about child maltreatment are recorded in primary care records. Aim: To determine how the recording of child maltreatment concerns can be improved. Design and setting: Development of a quality improvement intervention involving: clinical audit, a descriptive survey, telephone interviews, a workshop, database analyses, and consensus development in UK general practice. Method: Descriptive analyses and incidence estimates were carried out based on 11 study practices and 442 practices in The Health Improvement Network (THIN). Telephone interviews, a workshop, and a consensus development meeting were conducted with lead GPs from 11 study practices. Results: The rate of children with at least one maltreatment-related code was 8.4/1000 child years (11 study practices, 2009–2010), and 8.0/1000 child years (THIN, 2009–2010). Of 25 patients with known maltreatment, six had no maltreatment-related codes recorded, but all had relevant free text, scanned documents, or codes. When stating their reasons for undercoding maltreatment concerns, GPs cited damage to the patient relationship, uncertainty about which codes to use, and having concerns about recording information on other family members in the child’s records. Consensus recommendations are to record the code ‘child is cause for concern’ as a red flag whenever maltreatment is considered, and to use a list of codes arranged around four clinical concepts, with an option for a templated short data entry form. Conclusion: GPs under-record maltreatment-related concerns in children’s electronic medical records. As failure to use codes makes it impossible to search or audit these cases, an approach designed to be simple and feasible to implement in UK general practice was recommended. PMID:22781996

  8. Practice management education during surgical residency.

    PubMed

    Jones, Kory; Lebron, Ricardo A; Mangram, Alicia; Dunn, Ernest

    2008-12-01

    Surgical education has undergone radical changes in the past decade. The introduction of laparoscopic surgery and endovascular techniques has required program directors to alter surgical training. The 6 competencies are now in place. One issue that still needs to be addressed is the business aspect of surgical practice. Often residents complete their training with minimal or no knowledge of coding charges or of the basics of setting up a practice. We present our program, which has been in place over the past 2 years and is designed to teach residents practice management. The program begins with a series of 10 lectures given monthly beginning in August. Topics include an introduction to the types of practices available, negotiating a contract, managed care, and marketing the practice. Both medical and surgical residents attend these conferences. In addition, the surgical residents meet monthly with the business office to discuss billing and coding issues. These are didactic sessions combined with in-house chart reviews of surgical coding. In the third phase of the practice management plan, the coding team, along with the program director, attends the outpatient clinic to review in real time the evaluation and management coding of clinic visits. Resident evaluations were completed for each of the practice management lectures. The responses were recorded on a Likert scale. The scores ranged from 4.1 to 4.8 (average, 4.3). Highest scores were given to lectures concerning negotiating employee agreements, recruiting contracts, malpractice insurance, and risk management. The medical education department has tracked resident coding compliance over the past 2 years. Surgical coding compliance increased from 36% to 88% over a 12-month period. The program director who participated in the educational process increased his accuracy from 50% to 90% over the same time period.
When residents finish their surgical training they need to be ready to enter the world of business. These needs will be present whether pursuing a career in academic medicine or the private sector. A program that focuses on the business aspect of surgery enables the residents to better navigate the future while helping to fulfill the systems-based practice competency.

  9. A Pragmatic Approach to the Application of the Code of Ethics in Nursing Education.

    PubMed

    Tinnon, Elizabeth; Masters, Kathleen; Butts, Janie

    The code of ethics for nurses was written for nurses in all settings. However, the language focuses primarily on the nurse in context of the patient relationship, which may make it difficult for nurse educators to internalize the code to inform practice. The purpose of this article is to explore the code of ethics, establish that it can be used to guide nurse educators' practice, and provide a pragmatic approach to application of the provisions.

  10. CMCpy: Genetic Code-Message Coevolution Models in Python

    PubMed Central

    Becich, Peter J.; Stark, Brian P.; Bhat, Harish S.; Ardell, David H.

    2013-01-01

    Code-message coevolution (CMC) models represent the coevolution of a genetic code and a population of protein-coding genes (“messages”). Formally, CMC models are sets of quasispecies coupled together for fitness through a shared genetic code. Although CMC models offer plausible explanations for the origin of multiple genetic code traits by natural selection, useful modern implementations of CMC models have not previously been available. To meet this need we present CMCpy, an object-oriented Python API and command-line executable front-end that can reproduce all published results of CMC models. CMCpy implements multiple solvers for leading eigenpairs of quasispecies models. We also present novel analytical results that extend and generalize applications of perturbation theory to quasispecies models, and we pioneer the application of a homotopy method for quasispecies with non-unique maximally fit genotypes. Our results therefore facilitate the computational and analytical study of a variety of evolutionary systems. CMCpy is free open-source software available from http://pypi.python.org/pypi/CMCpy/. PMID:23532367
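The core numerical task mentioned above, solving for the leading eigenpair of a quasispecies model, can be sketched with a plain power iteration. The matrix construction and function names below are illustrative assumptions, not CMCpy's actual API: the quasispecies matrix is taken as W = Q · diag(f), with Q a column-stochastic mutation matrix and f the replication rates, whose Perron eigenvalue is the mean fitness and whose eigenvector is the stationary genotype distribution:

```python
import numpy as np

def leading_eigenpair(Q, f, tol=1e-12, max_iter=10000):
    """Power iteration for the leading eigenpair of the quasispecies
    matrix W = Q @ diag(f).  Q is a column-stochastic mutation matrix,
    f a vector of replication rates.  Returns (mean fitness, stationary
    genotype frequencies normalized to sum to 1)."""
    W = Q @ np.diag(f)
    x = np.full(len(f), 1.0 / len(f))        # uniform starting distribution
    lam = 0.0
    for _ in range(max_iter):
        y = W @ x
        lam_new = y.sum()                    # eigenvalue estimate under L1 normalization
        x_new = y / lam_new
        if np.abs(x_new - x).max() < tol:
            return lam_new, x_new
        x, lam = x_new, lam_new
    return lam, x
```

For a two-genotype toy model with symmetric mutation and a twofold fitness advantage, the iteration converges to the Perron eigenvalue of W, matching a direct eigendecomposition.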

  11. Global Magnetohydrodynamic Simulation Using High Performance FORTRAN on Parallel Computers

    NASA Astrophysics Data System (ADS)

    Ogino, T.

    High Performance Fortran (HPF) is one of the modern, widely used approaches to achieving high-performance parallel computation. We have translated a 3-dimensional magnetohydrodynamic (MHD) simulation code of the Earth's magnetosphere from VPP Fortran to HPF/JA on the Fujitsu VPP5000/56 vector-parallel supercomputer; the MHD code was fully vectorized and fully parallelized in VPP Fortran. The overall performance and capability of the HPF MHD code proved to be almost comparable to that of the VPP Fortran version. A 3-dimensional global MHD simulation of the Earth's magnetosphere was performed at a speed of over 400 Gflops, with an efficiency of 76.5% on the VPP5000/56 in vector and parallel computation, which permitted comparison with catalog values. We conclude that fluid and MHD codes that are fully vectorized and fully parallelized in VPP Fortran can be translated with relative ease to HPF/JA, and a code in HPF/JA may be expected to perform comparably to the same code written in VPP Fortran.

  12. Analysis of Variance in the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    Deloach, Richard

    2010-01-01

    This paper is a tutorial introduction to the analysis of variance (ANOVA), intended as a reference for aerospace researchers who are being introduced to the analytical methods of the Modern Design of Experiments (MDOE), or who may have other opportunities to apply this method. One-way and two-way fixed-effects ANOVA, as well as random effects ANOVA, are illustrated in practical terms that will be familiar to most practicing aerospace researchers.
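As a concrete illustration of the one-way fixed-effects case mentioned above (an illustrative sketch, not code from the MDOE reference), the F statistic is the ratio of the between-group mean square to the within-group mean square:

```python
def one_way_anova(groups):
    """One-way fixed-effects ANOVA.  Partitions total variation into a
    between-group sum of squares (SSB) and a within-group sum of squares
    (SSW), then forms F = MSB / MSW with (k-1, n-k) degrees of freedom."""
    k = len(groups)                                # number of treatment levels
    n = sum(len(g) for g in groups)                # total number of observations
    grand = sum(sum(g) for g in groups) / n        # grand mean
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ssw = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    msb, msw = ssb / (k - 1), ssw / (n - k)
    return msb / msw, (k - 1, n - k)

# Three hypothetical treatment groups of three observations each:
F, dof = one_way_anova([[1, 2, 3], [2, 3, 4], [4, 5, 6]])
```

For these illustrative data the group means are 2, 3, and 5, giving SSB = 14 on 2 degrees of freedom and SSW = 6 on 6, so F = 7.0; the researcher would compare this against the F(2, 6) reference distribution.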

  13. Construction safety program for the National Ignition Facility, July 30, 1999 (NIF-0001374-OC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benjamin, D W

    1999-07-30

    These rules apply to all LLNL employees, non-LLNL employees (including contract labor, supplemental labor, vendors, personnel matrixed/assigned from other National Laboratories, participating guests, visitors and students) and contractors/subcontractors. The General Rules-Code of Safe Practices shall be used by management to promote accident prevention through indoctrination, safety and health training, and on-the-job application. As a condition of contract award, all contractors and subcontractors and their employees must certify on Form S and H A-l that they have read and understand, or have been briefed and understand, the National Ignition Facility OCIP Project General Rules-Code of Safe Practices. (An interpreter must brief those employees who do not speak or read English fluently.) In addition, all contractors and subcontractors shall adopt a written General Rules-Code of Safe Practices that relates to their operations. The General Rules-Code of Safe Practices must be posted at a conspicuous location at the job site office or be provided to each supervisory employee, who shall have it readily available. Copies of the General Rules-Code of Safe Practices can also be included in employee safety pamphlets.

  14. Words and works in the history of alchemy.

    PubMed

    Nummedal, Tara E

    2011-06-01

    This essay considers the implications of a shift in focus from ideas to practices in the history of alchemy. On the one hand, it is argued, this new attention to practice highlights the diversity of ways that early modern Europeans engaged alchemy, ranging from the literary to the entrepreneurial and artisanal, as well as the broad range of social and cultural spaces that alchemists inhabited. At the same time, however, recent work has demonstrated what most alchemists shared, namely, a penchant for reading, writing, making, and doing, all at the same time. Any history of early modern alchemy, therefore, must attend to all of these practices, as well as the interplay among them. In this sense, alchemy offers a model for thinking and writing about early modern science more generally, particularly in light of recent work that has explored the intersection of scholarly, artisanal, and entrepreneurial forms of knowledge in the early modern period.

  15. Exploring the Utility of Sequential Analysis in Studying Informal Formative Assessment Practices

    ERIC Educational Resources Information Center

    Furtak, Erin Marie; Ruiz-Primo, Maria Araceli; Bakeman, Roger

    2017-01-01

    Formative assessment is a classroom practice that has received much attention in recent years for its established potential at increasing student learning. A frequent analytic approach for determining the quality of formative assessment practices is to develop a coding scheme and determine frequencies with which the codes are observed; however,…

  16. Code of Sustainable Practice in Occupational and Environmental Health and Safety for Corporations.

    PubMed

    Castleman, Barry; Allen, Barbara; Barca, Stefania; Bohme, Susanna Rankin; Henry, Emmanuel; Kaur, Amarjit; Massard-Guilbaud, Genvieve; Melling, Joseph; Menendez-Navarro, Alfredo; Renfrew, Daniel; Santiago, Myrna; Sellers, Christopher; Tweedale, Geoffrey; Zalik, Anna; Zavestoski, Stephen

    2008-01-01

    At a conference held at Stony Brook University in December 2007, "Dangerous Trade: Histories of Industrial Hazard across a Globalizing World," participants endorsed a Code of Sustainable Practice in Occupational and Environmental Health and Safety for Corporations. The Code outlines practices that would ensure corporations enact the highest health and environmentally protective measures in all the locations in which they operate. Corporations should observe international guidelines on occupational exposure to air contaminants, plant safety, air and water pollutant releases, hazardous waste disposal practices, remediation of polluted sites, public disclosure of toxic releases, product hazard labeling, sale of products for specific uses, storage and transport of toxic intermediates and products, corporate safety and health auditing, and corporate environmental auditing. Protective measures in all locations should be consonant with the most protective measures applied anywhere in the world, and should apply to the corporations' subsidiaries, contractors, suppliers, distributors, and licensees of technology. Key words: corporations, sustainability, environmental protection, occupational health, code of practice.

  17. Re-Assessing Practice: Visual Art, Visually Impaired People and the Web.

    ERIC Educational Resources Information Center

    Howell, Caro; Porter, Dan

    The latest development to come out of ongoing research at Tate Modern, London's new museum of modern art, is i-Map art resources for blind and partially sighted people that are delivered online. Currently i-Map explores the work of Matisse and Picasso, their innovations, influences and personal motivations, as well as key concepts in modern art.…

  18. Human Science for Human Freedom? Piaget's Developmental Research and Foucault's Ethical Truth Games

    ERIC Educational Resources Information Center

    Zhao, Guoping

    2012-01-01

    The construction of the modern subject and the pursuit of human freedom and autonomy, as well as the practice of human science has been pivotal in the development of modern education. But for Foucault, the subject is only the effect of discourses and power-knowledge arrangements, and modern human science is part of the very arrangement that has…

  19. From Leo Strauss to Collapse Theory: Considering the Neoconservative Attack on Modernity and the Work of Education

    ERIC Educational Resources Information Center

    Smith, David Geoffrey

    2008-01-01

    This paper locates the work of Leo Strauss within the broader conservative assault on modernity and especially its roots in liberalism. Four themes from Strauss's work are identified, then hermeneutically engaged for their relevance to educational practice in global times. The four themes are: (1) the liberal/modern concept of an open society is…

  20. Sophistry, the Sophists and modern medical education.

    PubMed

    Macsuibhne, S P

    2010-01-01

    The term 'sophist' has become a term of intellectual abuse in both general discourse and that of educational theory. However the actual thought of the fifth century BC Athenian-based philosophers who were the original Sophists was very different from the caricature. In this essay, I draw parallels between trends in modern medical educational practice and the thought of the Sophists. Specific areas discussed are the professionalisation of medical education, the teaching of higher-order characterological attributes such as personal development skills, and evidence-based medical education. Using the specific example of the Sophist Protagoras, it is argued that the Sophists were precursors of philosophical approaches and practices of enquiry underlying modern medical education.

  1. Developing and Modifying Behavioral Coding Schemes in Pediatric Psychology: A Practical Guide

    PubMed Central

    McMurtry, C. Meghan; Chambers, Christine T.; Bakeman, Roger

    2015-01-01

    Objectives: To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. Methods: This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. Results: A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Conclusions: Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. PMID:25416837

  2. Validation Data and Model Development for Fuel Assembly Response to Seismic Loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardet, Philippe; Ricciardi, Guillaume

    2016-01-31

    Vibrations are inherently present in nuclear reactors, especially in the cores and steam generators of pressurized water reactors (PWR). They can have significant effects on local heat transfer and on wear and tear in the reactor, and they often set safety margins. The simulation of these multiphysics phenomena from first principles requires the coupling of several codes, which is one of the most challenging tasks in modern computer simulation. Here an ambitious multiphysics, multidisciplinary validation campaign is conducted. It relied on an integrated team of experimentalists and code developers to acquire benchmark and validation data for fluid-structure interaction codes. Data are focused on PWR fuel bundle behavior during seismic transients.

  3. Perception of "no code" and the role of the nurse.

    PubMed

    Honan, S; Helseth, C C; Bakke, J; Karpiuk, K; Krsnak, G; Torkelson, R

    1991-01-01

    CPR is now the rule rather than the exception and death is often viewed as the ultimate failure in modern medicine, rather than the final event of the natural life process (Stevens, 1986). The "No Code" concept has created a major dilemma in health care. An interagency collaborative study was conducted to ascertain the perceptions of nurses, physicians, and laypersons about this issue. This article deals primarily with the nurse's role and perceptions of the "No Code" issue. The comparison of nurses' perceptions with those of physicians and laypersons is unique to this study. Based on this research, suggestions are presented that will assist nursing educators and health care professionals in managing this complex dilemma.

  4. A Tool for Longitudinal Beam Dynamics in Synchrotrons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostiguy, J.-F.; Lebedev, V. A.

    2017-05-01

    A number of codes are available to simulate longitudinal dynamics in synchrotrons. Some established ones include TIBETAN, LONG1D, ESME and ORBIT. While they embody a wealth of accumulated wisdom and experience, most of these codes were written decades ago, and to some extent they reflect the constraints of their time. As a result, there is interest in updated tools that take better advantage of modern software and hardware capabilities. At Fermilab, the PIP-II project has provided the impetus for the development of such a tool. In this contribution, we discuss design decisions and code architecture. A selection of test cases based on an initial prototype is also presented.

  5. Facilitating Internet-Scale Code Retrieval

    ERIC Educational Resources Information Center

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  6. Empirically evaluating the WHO global code of practice on the international recruitment of health personnel's impact on four high-income countries four years after adoption.

    PubMed

    Tam, Vivian; Edge, Jennifer S; Hoffman, Steven J

    2016-10-12

    Shortages of health workers in low-income countries are exacerbated by the international migration of health workers to more affluent countries. This problem is compounded by the active recruitment of health workers by destination countries, particularly Australia, Canada, the UK and the USA. The World Health Organization (WHO) adopted a voluntary Code of Practice in May 2010 to mitigate tensions between health workers' right to migrate and the shortage of health workers in source countries. The first empirical impact evaluation of this Code was conducted 11 months after its adoption and demonstrated a lack of impact on health workforce recruitment policy and practice in the short term. This second empirical impact evaluation was conducted 4 years post-adoption using the same methodology, to determine whether there have been any changes in the perceived utility, applicability, and implementation of the Code in the medium term. Forty-four respondents representing government, civil society and the private sector from Australia, Canada, the UK and the USA completed an email-based survey evaluating their awareness of the Code, its perceived impact, changes to policy or recruitment practices resulting from the Code, and the effectiveness of non-binding codes generally. The same survey instrument from the original study was used to facilitate direct comparability of responses. Key lessons were identified through thematic analysis. The main findings between the initial impact evaluation and the current one are unchanged. Both sets of key informants reported no significant policy or regulatory changes to health worker recruitment in their countries as a direct result of the Code, owing to its lack of incentives, institutional mechanisms and interest mobilizers. Participants emphasized the existence of previous bilateral and regional codes, the WHO Code's non-binding nature, and the primacy of competing domestic healthcare priorities in explaining this perceived lack of impact.
The Code has probably still not produced the tangible improvements in health worker flows it aspired to achieve. Several actions, including a focus on developing bilateral codes, linking the Code to topical global priorities, and reframing the Code's purpose to emphasize health system sustainability, are proposed to improve the Code's uptake and impact.

  7. [Occupational medicine: practice and ethical requirements of the new law on health and safety in the workplace (legislative decree 81/2008)].

    PubMed

    Franco, Giuliano; Mora, Erika

    2009-01-01

    Decisions in occupational health may involve ethical conflicts arising from clashes between stakeholders' interests. Codes of ethics can provide a practical guide to solving such dilemmas. The new law on health and safety in the workplace in Italy (decree 81/2008) states that occupational health practice must comply with the code of ethics of the International Commission on Occupational Health. The universally acknowledged ethical principles of beneficence/nonmaleficence, autonomy, and justice, which are the basis of the Charter of Fundamental Rights of the European Union, inspired this code. Although the code is not a systematic textbook of occupational health ethics and does not cover all possible aspects arising in practice, making decisions based on it will assure their effectiveness and compliance with ethical principles, beyond formal respect of the law.

  8. BeiDou Signal Acquisition with Neumann–Hoffman Code Modulation in a Degraded Channel

    PubMed Central

    Zhao, Lin; Liu, Aimeng; Ding, Jicheng; Wang, Jing

    2017-01-01

    With the modernization of global navigation satellite systems (GNSS), secondary codes, also known as Neumann–Hoffman (NH) codes, are modulated onto the satellite signal to obtain better positioning performance. However, this attenuates the acquisition sensitivity of classic integration algorithms because of the frequent bit transitions introduced by the NH codes. Taking weak BeiDou navigation satellite system (BDS) signals as its subject, the present study analyzes the side effect of NH codes on acquisition in detail and derives a straightforward formula, which indicates that bit transitions decrease the frequency accuracy. To meet the requirement of carrier-tracking loop initialization, a frequency recalculation algorithm based on a verified fast Fourier transform (FFT) is proposed to mitigate this effect; at the same time, the starting point of the NH codes is found. A differential correction is then used to improve the acquisition accuracy of the code phase. Monte Carlo simulations and real BDS data tests demonstrate that the new structure is superior to conventional algorithms in both detection probability and frequency accuracy in a degraded channel. PMID:28208776
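The FFT-based parallel code-phase search that acquisition schemes like the one above build on can be sketched as follows. This is a generic, simplified illustration (real-valued baseband, no Doppler search, no NH-code handling), not the authors' algorithm; one FFT/IFFT pair evaluates the circular correlation against every cyclic shift of the local code at once:

```python
import numpy as np

def fft_acquire(received, local_code):
    """Parallel code-phase search via circular correlation: correlate the
    received samples against all cyclic shifts of the local PRN code in a
    single FFT/IFFT pass and return the shift with the largest peak."""
    corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(local_code)))
    return int(np.argmax(np.abs(corr)))
```

Compared with a serial search over every code phase, this replaces N length-N correlations with an O(N log N) transform, which is why FFT-based acquisition is standard practice in software GNSS receivers.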

  9. Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base

    NASA Astrophysics Data System (ADS)

    Savage, B.; Snoke, J. A.

    2017-12-01

    The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size, from 16 to 32 bits in the 1980s and from 32 to 64 bits in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems, including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OSX processors (PowerPC and x86), and Windows (Cygwin). Traces of these systems are still visible in the source code and associated comments. A major concern while improving and maintaining a routinely used legacy code is the fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. However, there have been thousands of downloads a year of the package, either as source code or as binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC, including the use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion.
For the past thirty-plus years, SAC files contained a fixed-length header. Time and distance-related values are stored in single precision, which has become a problem with the increase in desired precision for data compared to thirty years ago. A future goal is to address this precision problem, but in a backward compatible manner. We would also like to transition SAC to a more open source license.
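The single-precision header limitation described above is easy to demonstrate. The numbers below are illustrative, not taken from any particular SAC file: near a time value of roughly one year in seconds, adjacent float32 values are about 2 seconds apart, so sub-second offsets are rounded away, while float64 retains them:

```python
import numpy as np

# A time offset of ~one year, in seconds, plus half a millisecond.
# Near 3.15e7 the spacing between adjacent float32 values is about
# 2 seconds, so the 0.5 ms is rounded away; float64 preserves it.
big = 86400.0 * 365
t = big + 0.0005

assert np.float64(t) != np.float64(big)   # double precision keeps the offset
assert np.float32(t) == np.float32(big)   # single precision loses it
```

This is the backward-compatibility tension the maintainers face: the fixed-length header stores these fields as 32-bit floats, so higher-precision timing cannot be recorded without changing the on-disk format.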

  10. Compliance with the International Code of Marketing of breast-milk substitutes: an observational study of pediatricians' waiting rooms.

    PubMed

    Dodgson, Joan E; Watkins, Amanda L; Bond, Angela B; Kintaro-Tagaloa, Cheryl; Arellano, Alondra; Allred, Patrick A

    2014-04-01

    The importance of breastmilk as a primary preventive intervention is widely known and understood by most healthcare providers. The actions or non-actions that healthcare providers take toward promoting and supporting breastfeeding families make a difference in the success and duration of breastfeeding. Recognizing this relationship, the World Health Organization developed the International Code of Marketing of Breast-milk Substitutes (the Code), which defines best practices in breastfeeding promotion, including in physicians' offices. A pediatric practice's waiting room is often a family's first experience with pediatric care. The specific aims of this study were to describe (1) Code compliance, (2) the demographic factors affecting Code compliance, and (3) the amount and type of breastfeeding-supportive materials available in pediatricians' waiting rooms. An observational cross-sectional design was used to collect data from 163 (82%) of the pediatric practices in Maricopa County, Arizona. None of the 100 waiting rooms that had any materials displayed (61%) was found to be completely Code compliant, with 81 of the offices having formula-promotional materials readily available. Waiting rooms in higher-income areas offered more non-Code-compliant materials and gifts. Breastfeeding support information and materials were lacking in all but 18 (18%) offices. A positive relationship (t(97) = -2.31, p = 0.02) was found between the presence of breastfeeding educational materials and higher-income areas. We were able to uncover some practice-related patterns that impact families and potentially undermine breastfeeding success. To move current practices toward breastfeeding-friendly physicians' offices, change is needed.

  11. Advanced Computational Methods for Monte Carlo Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.

  12. New Opportunities of Low-Cost Photogrammetry for Culture Heritage Preservation

    NASA Astrophysics Data System (ADS)

    Shults, R.

    2017-05-01

    In this paper, the use of low-cost photogrammetry in combination with the capabilities of modern smartphones is considered. The research was carried out on the example of documenting a historical fortification of World War II, the Kiev Fortified Region. Brief historical information about the object of research is given. The possibilities of using modern smartphones as measuring instruments are considered. To obtain high-quality results, the smartphone's camera was calibrated, and the calibration results were then used to perform 3D modeling of the defense facilities. Three defense structures in different states (destroyed, partially destroyed, and operating) were photographed. Based on the results of photography using coded targets, 3D object models were constructed. To verify the accuracy of the 3D modeling, control measurements of the lines between the coded targets at the objects were performed. The obtained results are satisfactory, and the technology considered in the paper can be recommended for use in archaeological and historical studies.

  13. Code of Ethics for Rehabilitation Educators and Counselors: A Call for Evidence-Based Practice

    ERIC Educational Resources Information Center

    Burker, Eileen J.; Kazukauskas, Kelly A.

    2010-01-01

    Given the emphasis on evidence-based practice (EBP) in the 2010 Code of Professional Ethics for Rehabilitation Counselors, it has become even more critical for rehabilitation educators and rehabilitation counselors to understand EBP, how to implement it in teaching and in practice, and how to access available EBP resources. This paper defines and…

  14. The call of the sirens: ethically navigating the sea of nonvalidated therapies.

    PubMed

    Grimmett, M R; Sulmasy, D P

    1998-01-01

    Medical research and innovation are vital to the advancement of medicine and, ultimately, benefit society and individual patients. However, the ethical principles of beneficence, respect for persons, and justice must guide the development and implementation of new practices. Ethical codes governing clinical practice and research already warn practitioners to avoid the use of nonvalidated practices outside of controlled clinical trials. Nonetheless, lack of compliance with these codes places many patients at risk for harm. Ophthalmologists, as well as all physicians, must recommit themselves to these ethical principles and codes and establish more vigorous peer-review methods to protect patients from nonvalidated practices that are implemented without a scientific basis.

  15. Sustainability of current agriculture practices, community perception, and implications for ecosystem health: an Indian study.

    PubMed

    Sarkar, Atanu; Patil, Shantagouda; Hugar, Lingappa B; vanLoon, Gary

    2011-12-01

    In order to support agribusiness and to attain food security for ever-increasing populations, most countries in the world have embraced modern agricultural technologies. Ecological consequences of the technocentric approaches, and their sustainability and impacts on human health have, however, not received adequate attention particularly in developing countries. India is one country that has undergone a rapid transformation in the field of agriculture by adopting strategies of the Green Revolution. This article provides a comparative analysis of the effects of older and newer paradigms of agricultural practices on ecosystem and human health within the larger context of sustainability. The study was conducted in three closely situated areas where different agricultural practices were followed: (a) the head-end of a modern canal-irrigated area, (b) an adjacent dryland, and (c) an area (the ancient area) that has been provided with irrigation for some 800 years. Data were collected by in-depth interviews of individual farmers, focus-group discussions, participatory observations, and from secondary sources. The dryland, receiving limited rainfall, continues to practice diverse cropping centered to a large extent on traditional coarse cereals and uses only small amounts of chemical inputs. On the other hand, modern agriculture in the head-end emphasizes continuous cropping of rice supported by extensive and indiscriminate use of agrochemicals. Market forces have, to a significant degree, influenced the ancient area to abandon much of its early practices of organic farming and to take up aspects of modern agricultural practice. Rice cultivation in the irrigated parts has changed the local landscape and vegetation and has augmented the mosquito population, which is a potential vector for malaria, Japanese encephalitis and other diseases. Nevertheless, despite these problems, perceptions of adverse environmental effects are lowest in the heavily irrigated area.

  16. Using Gemba Boards to Facilitate Evidence-Based Practice in Critical Care.

    PubMed

    Bourgault, Annette M; Upvall, Michele J; Graham, Alison

    2018-06-01

    Tradition-based practices lack supporting research evidence and may be harmful or ineffective. Engagement of key stakeholders is a critical step toward facilitating evidence-based practice change. Gemba, derived from Japanese, refers to the real place where work is done. Gemba boards (visual management tools) appear to be an innovative method to engage stakeholders and facilitate evidence-based practice. To explore the use of gemba boards and gemba huddles to facilitate practice change. Twenty-two critical care nurses participated in interviews in this qualitative, descriptive study. Thematic analysis was used to code and categorize interview data. Two researchers reached consensus on coding and derived themes. Data were managed with qualitative analysis software. The code gemba occurred most frequently; a secondary analysis was performed to explore its impact on practice change. Four themes were derived from the gemba code: (1) facilitation of staff, leadership, and interdisciplinary communication, (2) transparency of outcome data, (3) solicitation of staff ideas and feedback, and (4) dissemination of practice changes. Gemba boards and gemba huddles became part of the organizational culture for promoting and disseminating evidence-based practices. Unit-based, publicly located gemba boards and huddles have become key components of evidence-based practice culture. Gemba is both a tool and a process to engage team members and the public to generate clinical questions and to plan, implement, and evaluate practice changes. Future research on the effectiveness of gemba boards to facilitate evidence-based practice is warranted. ©2018 American Association of Critical-Care Nurses.

  17. Second order gyrokinetic theory for particle-in-cell codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tronko, Natalia; Bottino, Alberto; Sonnendrücker, Eric

    2016-08-15

    The main idea of the gyrokinetic dynamical reduction consists in the systematic removal of the fast-scale motion (the gyromotion) from the dynamics of the plasma, resulting in a considerable simplification and a significant gain of computational time. The gyrokinetic Maxwell–Vlasov equations are nowadays implemented in numerical codes for modeling (both laboratory and astrophysical) strongly magnetized plasmas. Different versions of the reduced set of equations exist, depending on the construction of the gyrokinetic reduction procedure and the approximations performed in the derivation. The purpose of this article is to explicitly show the connection between the general second-order gyrokinetic Maxwell–Vlasov system derived from the modern gyrokinetic theory and the model currently implemented in the global electromagnetic particle-in-cell code ORB5. Necessary information about the modern gyrokinetic formalism is given, together with a consistent derivation of the gyrokinetic Maxwell–Vlasov equations from first principles. The variational formulation of the dynamics is used to obtain the corresponding energy conservation law, which in turn is used to verify the energy conservation diagnostics currently implemented in ORB5. This work fits within the context of the code verification project VeriGyro, currently run at IPP (Max Planck Institute for Plasma Physics) in collaboration with other European institutions.

  18. Creating a Culture of Safety Around Bar-Code Medication Administration: An Evidence-Based Evaluation Framework.

    PubMed

    Kelly, Kandace; Harrington, Linda; Matos, Pat; Turner, Barbara; Johnson, Constance

    2016-01-01

    Bar-code medication administration (BCMA) effectiveness is contingent upon compliance with best-practice protocols. We developed a 4-phased BCMA evaluation program to evaluate the degree of integration of current evidence into BCMA policies, procedures, and practices; identify barriers to best-practice BCMA use; and modify BCMA practice in concert with changes to the practice environment. This program provides an infrastructure for frontline nurses to partner with hospital leaders to continually evaluate and improve BCMA using a systematic process.

  19. Communicating with families about post-mortems: practice guidance.

    PubMed

    Henderson, Nicola

    2006-02-01

    In January 2001 the Chief Medical Officer announced the Public Inquiry (Redfern Report) into post-mortem practice at Alder Hey Hospital in Liverpool. It was expected that this inquiry report would influence post-mortem practice in general and communication with parents in particular and in May 2003 a code of practice for clinical staff was produced by the Department of Health (DH) (2003a). This article discusses the code of practice Families and Post Mortems and explores the relevance of these recommendations to neonatal and children's nurses.

  20. Statistical mechanics of broadcast channels using low-density parity-check codes.

    PubMed

    Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David

    2003-03-01

    We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.
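    The defining operation of an LDPC code is the sparse parity-check constraint: a word is a valid codeword exactly when its syndrome is zero mod 2. A minimal sketch with a toy parity-check matrix (not one from the paper, which analyzes much larger random constructions):

```python
# Toy sparse parity-check matrix H for a length-6 code (rows = checks).
# A received word r is a valid codeword iff every check sums to 0 mod 2.
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
]

def syndrome(H, word):
    """Mod-2 syndrome of `word`; all zeros means 'valid codeword'."""
    return [sum(h * x for h, x in zip(row, word)) % 2 for row in H]

codeword = [0, 0, 0, 0, 0, 0]   # trivially valid
corrupted = [1, 0, 0, 0, 0, 0]  # single bit flip
print(syndrome(H, codeword))    # [0, 0, 0]
print(syndrome(H, corrupted))   # [1, 0, 1]
```

In practice the belief propagation decoder discussed in the paper iterates messages over exactly this sparse check structure until the syndrome vanishes or an iteration limit is reached.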

  1. Deciphering the transcriptional cis-regulatory code.

    PubMed

    Yáñez-Cuna, J Omar; Kvon, Evgeny Z; Stark, Alexander

    2013-01-01

    Information about developmental gene expression resides in defined regulatory elements, called enhancers, in the non-coding part of the genome. Although cells reliably utilize enhancers to orchestrate gene expression, a cis-regulatory code that would allow their interpretation has remained one of the greatest challenges of modern biology. In this review, we summarize studies from the past three decades that describe progress towards revealing the properties of enhancers and discuss how recent approaches are providing unprecedented insights into regulatory elements in animal genomes. Over the next years, we believe that the functional characterization of regulatory sequences in entire genomes, combined with recent computational methods, will provide a comprehensive view of genomic regulatory elements and their building blocks and will enable researchers to begin to understand the sequence basis of the cis-regulatory code. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Creation of fully vectorized FORTRAN code for integrating the movement of dust grains in interplanetary environments

    NASA Technical Reports Server (NTRS)

    Colquitt, Walter

    1989-01-01

    The main objective is to improve the performance of a specific FORTRAN computer code from the Planetary Sciences Division of NASA/Johnson Space Center when used on a modern vectorizing supercomputer. The code is used to calculate orbits of dust grains that separate from comets and asteroids. This code accounts for influences of the sun and 8 planets (neglecting Pluto), solar wind, and solar light pressure including Poynting-Robertson drag. Calculations allow one to study the motion of these particles as they are influenced by the Earth or one of the other planets. Some of these particles become trapped just beyond the Earth for long periods of time. These integer period resonances vary from 3 orbits of the Earth and 2 orbits of the particles to as high as 14 to 13.
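    The integer period resonances mentioned above correspond, via Kepler's third law (a = P^(2/3) AU for P in years), to particular semimajor axes just beyond Earth's orbit. A short sketch, assuming the j:(j-1) exterior resonances from 3:2 up to 14:13 cited in the abstract:

```python
# Semimajor axes (AU) of exterior j:(j-1) mean-motion resonances with
# Earth, from Kepler's third law a = P**(2/3) with P in years.
# The abstract cites trapped-particle resonances from 3:2 up to 14:13.
def resonance_axis(j):
    period = j / (j - 1)        # particle period in years (j Earth orbits
    return period ** (2.0 / 3.0)  # per j-1 particle orbits)

for j in range(3, 15):
    print(f"{j}:{j - 1}  a = {resonance_axis(j):.4f} AU")
```

The higher-order resonances crowd ever closer to 1 AU, which is consistent with particles lingering "just beyond the Earth" for long periods.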

  3. An audit of health products advertised for sale on chiropractic Web sites in Canada and consideration of these practices in the context of Canadian chiropractic codes of ethics and conduct.

    PubMed

    Page, Stacey A; Grod, Jaroslaw P

    2009-01-01

    This study describes the extent to which chiropractors practicing in Canada with Web sites advertise health products for sale, and considers this practice in the context of chiropractic codes of ethics and conduct. Chiropractic Web sites in Canada were identified using a public online business directory (Canada 411). The Web sites were searched, and an inventory of the health products for sale was taken. The influences of type of practice and province of practice on the sale of health products were assessed. Textual comments about health product marketing were summarized. National and provincial codes of ethics were reviewed, and the content on health product advertising was summarized. Two hundred eighty-seven Web sites were reviewed. Just over half of the Web sites contained information on health products for sale (n = 158, 54%). Orthotics were advertised most often (n = 136 practices, 47%), followed by vitamins/nutritional supplements (n = 53, 18%), pillows and supports (n = 40, 14%), and exercise/rehabilitation products (n = 20, 7%). Chiropractors in solo or group chiropractic practices were less likely to advertise health products than those in multidisciplinary practice (P < .001), whereas chiropractors in BC were less likely to advertise nutritional supplements (P < .01). Provincial codes of ethics and conduct varied in their guidelines regarding health product sales. Variations in the codes of ethics and in the proportions of practitioners advertising health products for sale across the country suggest that opinions may be divided on the acceptability of health product sales. Such practices raise questions and considerations for the chiropractic profession.

  4. Gilles Deleuze: psychiatry, subjectivity, and the passive synthesis of time.

    PubMed

    Roberts, Marc

    2006-10-01

    Although 'modern' mental health care comprises a variety of theoretical approaches and practices, the supposed identification of 'mental illness' can be understood as being made on the basis of a specific conception of subjectivity that is characteristic of 'modernity'. This is to say that any perceived 'deviation' from this characteristically 'modern self' is seen as a possible 'sign' of 'mental illness', given a 'negative determination', and conceptualized in terms of a 'deficiency' or a 'lack'; accordingly, the 'ideal' 'therapeutic' aim of 'modern' mental health care can be understood as the 'rectification' of that 'deficiency' through a 're-instatement' of the 'modern self'. Although contemporary mental health care is increasingly becoming influenced by the so-called 'death' of the 'modern self', this paper will suggest that it is the work of the 20th century French philosopher, Gilles Deleuze, that is able to provide mental health care with a coherent determination of a 'post-modern self'. However, a Deleuzian account of subjectivity stands in stark contrast to 'modernity's' conception of subjectivity and, as such, this paper will attempt to show how this 'post-modern' subjectivity challenges many of the assumptions of 'modern' mental health care. Moreover, acknowledging the complexity and the perceived difficulty of Deleuze's work, this paper will provide an account of subjectivity that can be understood as 'Deleuzian' in its orientation, rather than 'Deleuze's theory of subjectivity', and therefore, this paper also seeks to stimulate further research and discussion of Deleuze's work on subjectivity, and how that work may be able to inform, and possibly even reform, the theoretical foundations and associated diagnostic and therapeutic practices of psychiatry, psychotherapy, and mental health nursing.

  5. Developing and modifying behavioral coding schemes in pediatric psychology: a practical guide.

    PubMed

    Chorney, Jill MacLaren; McMurtry, C Meghan; Chambers, Christine T; Bakeman, Roger

    2015-01-01

    To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. The path toward HEP High Performance Computing

    NASA Astrophysics Data System (ADS)

    Apostolakis, John; Brun, René; Carminati, Federico; Gheata, Andrei; Wenzel, Sandro

    2014-06-01

    High Energy Physics code has been known for making poor use of high-performance computing architectures. Efforts to optimise HEP code on vector and RISC architectures have yielded limited results, and recent studies have shown that, on modern architectures, it achieves between 10% and 50% of peak performance. Although several successful attempts have been made to port selected codes to GPUs, no major HEP code suite has a "High Performance" implementation. With the LHC undergoing a major upgrade and a number of challenging experiments on the drawing board, HEP can no longer neglect the less-than-optimal performance of its code and has to try to make the best use of the hardware. This activity is one of the foci of the SFT group at CERN, which hosts, among others, the ROOT and Geant4 projects. The activity of the experiments is shared and coordinated via a Concurrency Forum, where experience in optimising HEP code is presented and discussed. Another activity is the Geant-V project, centred on the development of a high-performance prototype for particle transport. Achieving a good concurrency level on the emerging parallel architectures without a complete redesign of the framework can only be done by parallelizing at event level, or with a much larger effort at track level. Apart from the shareable data structures, this typically implies a multiplication factor in memory consumption compared to the single-threaded version, together with sub-optimal handling of event-processing tails. Besides this, the low-level instruction pipelining of modern processors cannot be used efficiently to speed up the program. We have implemented a framework that allows scheduling vectors of particles to an arbitrary number of computing resources in a fine-grained parallel approach. The talk will review the current optimisation activities within the SFT group, with a particular emphasis on the development perspectives towards a simulation framework able to profit best from recent technology evolution in computing.
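    The fine-grained approach of scheduling vectors of particles to an arbitrary number of computing resources can be illustrated with a deliberately simplified sketch; the basket size, worker count, and stand-in transport step below are all hypothetical, not Geant-V internals:

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of basket scheduling: particles are grouped into fixed-size
# "baskets" (vectors) and dispatched to a pool of workers, instead of
# parallelizing whole events.
def make_baskets(particles, size):
    return [particles[i:i + size] for i in range(0, len(particles), size)]

def transport(basket):
    # Stand-in for a vectorized transport step applied to one basket.
    return [p * 1.01 for p in basket]

particles = list(range(100))
baskets = make_baskets(particles, size=16)
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(transport, baskets))
stepped = [p for basket in results for p in basket]
print(len(baskets), len(stepped))  # 7 baskets covering 100 particles
```

Because baskets are much smaller than events, memory grows with the basket size times the worker count rather than with the number of in-flight events, which is the footprint advantage the abstract alludes to.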

  7. SMT-Aware Instantaneous Footprint Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roy, Probir; Liu, Xu; Song, Shuaiwen

    Modern architectures employ simultaneous multithreading (SMT) to increase thread-level parallelism. SMT threads share many functional units and the whole memory hierarchy of a physical core. Without a careful code design, SMT threads can easily contend with each other for these shared resources, causing severe performance degradation. Minimizing SMT thread contention for HPC applications running on dedicated platforms is very challenging, because they usually spawn threads within Single Program Multiple Data (SPMD) models. To address this important issue, we introduce a simple scheme for SMT-aware code optimization, which aims to reduce the memory contention across SMT threads.

  8. MPACT Standard Input User's Manual, Version 2.2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, Benjamin S.; Downar, Thomas; Fitzgerald, Andrew

    The MPACT (Michigan PArallel Characteristics based Transport) code is designed to perform high-fidelity light water reactor (LWR) analysis using whole-core pin-resolved neutron transport calculations on modern parallel-computing hardware. The code consists of several libraries which provide the functionality necessary to solve steady-state eigenvalue problems. Several transport capabilities are available within MPACT, including both 2-D and 3-D Method of Characteristics (MOC). A three-dimensional whole-core solution based on the 2D-1D solution method provides the capability for full-core depletion calculations.

  9. Exploiting Thread Parallelism for Ocean Modeling on Cray XC Supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarje, Abhinav; Jacobsen, Douglas W.; Williams, Samuel W.

    The incorporation of increasing core counts in modern processors used to build state-of-the-art supercomputers is driving application development towards the exploitation of thread parallelism, in addition to distributed-memory parallelism, with the goal of delivering efficient high-performance codes. In this work we describe the exploitation of threading and our experiences with it in a real-world ocean modeling application code, MPAS-Ocean. We present detailed performance analysis and comparisons of various approaches and configurations for threading on the Cray XC series supercomputers.

  10. Superdense Coding over Optical Fiber Links with Complete Bell-State Measurements

    DOE PAGES

    Williams, Brian P.; Sadlier, Ronald J.; Humble, Travis S.

    2017-02-01

    Adapting quantum communication to modern networking requires transmitting quantum information through a fiber-based infrastructure. In this paper, we report the first demonstration of superdense coding over optical fiber links, taking advantage of a complete Bell-state measurement enabled by time-polarization hyperentanglement, linear optics, and common single-photon detectors. Finally, we demonstrate the highest single-qubit channel capacity to date utilizing linear optics, 1.665 ± 0.018, and we provide a full experimental implementation of a hybrid, quantum-classical communication protocol for image transfer.
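    Superdense coding itself can be sketched with elementary linear algebra: Alice encodes two classical bits by applying one of four Pauli operations to her half of a shared Bell pair, and a complete Bell-state measurement on both qubits recovers the bits. The bit-to-Pauli assignment below is one common convention, not necessarily the one used in this experiment:

```python
# State amplitudes are ordered |00>, |01>, |10>, |11>, Alice's qubit first.
from math import sqrt

s = 1 / sqrt(2)
bell = {"phi+": [s, 0, 0, s], "phi-": [s, 0, 0, -s],
        "psi+": [0, s, s, 0], "psi-": [0, s, -s, 0]}

def x_alice(v):  # Pauli X on Alice's qubit: swap |0b> <-> |1b>
    return [v[2], v[3], v[0], v[1]]

def z_alice(v):  # Pauli Z on Alice's qubit: negate |1b> amplitudes
    return [v[0], v[1], -v[2], -v[3]]

def encode(bits):
    """Alice encodes (b0, b1) on her half of a shared |phi+> pair."""
    v = bell["phi+"]
    if bits[1]:
        v = x_alice(v)   # bit-flip component
    if bits[0]:
        v = z_alice(v)   # phase-flip component
    return v

def bell_measure(v):
    """Complete Bell-state measurement: identify the basis state."""
    for name, b in bell.items():
        if abs(sum(a * c for a, c in zip(v, b))) > 0.99:
            return name
    return None

for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(bits, "->", bell_measure(encode(bits)))
```

Because all four encodings land on distinct, mutually orthogonal Bell states, a *complete* Bell-state measurement (the hard part experimentally, solved here via hyperentanglement) yields two classical bits per transmitted qubit in the ideal case.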

  11. Recent advances in hypersonic technology

    NASA Technical Reports Server (NTRS)

    Dwoyer, Douglas L.

    1990-01-01

    This paper will focus on recent advances in hypersonic aerodynamic prediction techniques. Current capabilities of existing numerical methods for predicting high Mach number flows will be discussed and shortcomings will be identified. Physical models available for inclusion into modern codes for predicting the effects of transition and turbulence will also be outlined and their limitations identified. Chemical reaction models appropriate to high-speed flows will be addressed, and the impact of their inclusion in computational fluid dynamics codes will be discussed. Finally, the problem of validating predictive techniques for high Mach number flows will be addressed.

  12. Integrating advanced practice providers into medical critical care teams.

    PubMed

    McCarthy, Christine; O'Rourke, Nancy C; Madison, J Mark

    2013-03-01

    Because there is increasing demand for critical care providers in the United States, many medical ICUs for adults have begun to integrate nurse practitioners and physician assistants into their medical teams. Studies suggest that such advanced practice providers (APPs), when appropriately trained in acute care, can be highly effective in helping to deliver high-quality medical critical care and can be important elements of teams with multiple providers, including those with medical house staff. One aspect of building an integrated team is a practice model that features appropriate coding and billing of services by all providers. Therefore, it is important to understand an APP's scope of practice, when they are qualified for reimbursement, and how they may appropriately coordinate coding and billing with other team providers. In particular, understanding when and how to appropriately code for critical care services (Current Procedural Terminology [CPT] code 99291, critical care, evaluation and management of the critically ill or critically injured patient, first 30-74 min; CPT code 99292, critical care, each additional 30 min) and procedures is vital for creating a sustainable program. Because APPs will likely play a growing role in medical critical care units in the future, more studies are needed to compare different practice models and to determine the best way to deploy this talent in specific ICU settings.
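    The time rule quoted above can be expressed as a small helper. This is an illustrative sketch of the CPT descriptors only; payer-specific rules, aggregation across providers, and documentation requirements are not modeled:

```python
# CPT 99291 covers the first 30-74 minutes of critical care on a given day;
# CPT 99292 is reported for each additional 30 minutes beyond that.
# Under 30 minutes, 99291 is not reported as critical care.
def critical_care_codes(minutes):
    """Return the list of CPT codes suggested by total critical care time."""
    if minutes < 30:
        return []
    codes = ["99291"]
    extra = minutes - 74
    while extra > 0:
        codes.append("99292")
        extra -= 30
    return codes

for m in (20, 45, 74, 75, 105):
    print(m, "min ->", critical_care_codes(m))
```

For example, 105 minutes yields 99291 plus two units of 99292 under this reading; in a team with advanced practice providers, coordinating who reports which interval is exactly the coordination problem the abstract raises.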

  13. Best practice in the management of clinical coding services: Insights from a project in the Republic of Ireland, Part 2.

    PubMed

    Reid, Beth A; Ridoutt, Lee; O'Connor, Paul; Murphy, Deirdre

    2017-09-01

    This is the second of two articles about best practice in the management of coding services. The best practice project was part of a year-long project conducted in the Republic of Ireland to review the quality of the Hospital Inpatient Enquiry data for its use in activity-based funding. The four methods used to address the best practice aspect of the project were described in detail in Part 1. The results included in this article are those relating to the coding manager's background, preparation and style, clinical coder (CC) workforce adequacy, the CC workforce structure and career pathway, and the physical and psychological work environment for the clinical coding service. Examples of best practice were found in the study hospitals but there were also areas for improvement. Coding managers would benefit from greater support in the form of increased opportunities for management training and a better method for calculating CC workforce numbers. A career pathway is needed for CCs to progress from entry to expert CC, mentor, manager and quality controller. Most hospitals could benefit from investment in infrastructure that places CCs in a physical environment that tells them they are an important part of the hospital and their work is valued.

  14. Data Management for a Climate Data Record in an Evolving Technical Landscape

    NASA Astrophysics Data System (ADS)

    Moore, K. D.; Walter, J.; Gleason, J. L.

    2017-12-01

    For nearly twenty years, NASA Langley Research Center's Clouds and the Earth's Radiant Energy System (CERES) Science Team has been producing a suite of data products that forms a persistent climate data record of the Earth's radiant energy budget. Many of the team's physical scientists and key research contributors have been with the team since the launch of the first CERES instrument in 1997. This institutional knowledge is irreplaceable, and its longevity and continuity are among the reasons that the team has been so productive. Such legacy involvement, however, can also be a limiting factor. Some CERES scientists-cum-coders might possess skills that were state-of-the-field when they were emerging scientists but may now be outdated with respect to developments in software development best practices and supporting technologies. Both programming languages and processing frameworks have evolved significantly in the past twenty years, and updating one of these factors warrants consideration of updating the other. With the imminent launch of a final CERES instrument and the good health of those in flight, the CERES data record stands to continue far into the future. The CERES Science Team is, therefore, undergoing a re-architecture of its codebase to maintain compatibility with newer data processing platforms and technologies and to leverage modern software development best practices. This necessitates training our staff and consequently presents several challenges, including: (1) development continues immediately on the next "edition" of research algorithms upon release of the previous edition, so how can code be rewritten at the same time that the science algorithms are being updated and integrated? (2) With limited time to devote to training, how can we update the staff's existing skillset without slowing progress or introducing new errors? (3) The CERES Science Team is large and complex, much like the current state of its codebase, so how can we identify, in a breadth-wise manner, areas for code improvement across multiple research groups that maintain code with varying semantics but common concepts? In this work, we discuss the successes and pitfalls of this major re-architecture effort and share how we will sustain improvement into the future.

  15. Intellectual Freedom

    ERIC Educational Resources Information Center

    Knox, Emily

    2011-01-01

    Support for intellectual freedom, a concept codified in the American Library Association's Library Bill of Rights and Code of Ethics, is one of the core tenets of modern librarianship. According to the most recent interpretation of the Library Bill of Rights, academic librarians are encouraged to incorporate the principles of intellectual freedom…

  16. Social Information Processing Analysis (SIPA): Coding Ongoing Human Communication.

    ERIC Educational Resources Information Center

    Fisher, B. Aubrey; And Others

    1979-01-01

    The purpose of this paper is to present a new analytical system to be used in communication research. Unlike many existing systems devised ad hoc, this research tool, a system for interaction analysis, is embedded in a conceptual rationale based on modern systems theory. (Author)

  17. Feminization and marginalization? Women Ayurvedic doctors and modernizing health care in Nepal.

    PubMed

    Cameron, Mary

    2010-03-01

    The important diversity of indigenous medical systems around the world suggests that gender issues, well understood for Western science, may differ in significant ways for non-Western science practices and are an important component in understanding how social dimensions of women's health care are being transformed by global biomedicine. Based on ethnographic research conducted with formally trained women Ayurvedic doctors in Nepal, I identify important features of medical knowledge and practice beneficial to women patients, and I discuss these features as potentially transformed by modernizing health care development. The article explores the indirect link between Ayurveda's feminization and its marginalization, in relation to modern biomedicine, which may evolve to become more direct and consequential for women's health in the country.

  18. Modern medical practice: a profession in transition.

    PubMed

    Merry, M D

    1984-05-01

    Modern medical practice is in a state of transition. The solo practitioner is slowly giving way to the large organized groups of health care providers. Driving this force of change is a change in payment for health care services from cost plus to preestablished pricing. For the first time, medical practice patterns are having a direct impact on the financial viability of the health care institution. To maintain quality of patient care and contain costs, more and more physicians are becoming involved in the administrative side of running a hospital. This article describes the forces of change, the change itself, and the future of medicine.

  19. 77 FR 74456 - Notice of Proposed Changes to the National Handbook of Conservation Practices for the Natural...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-14

    ...), Row Arrangement (Code 557), Sprinkler System (Code 442), Tree/Shrub Site Preparation (Code 490), Waste.... Tree/Shrub Site Preparation (Code 490)--Only minor changes were made to the standard including...

  20. Toward Developing a Universal Code of Ethics for Adult Educators.

    ERIC Educational Resources Information Center

    Siegel, Irwin H.

    2000-01-01

    Presents conflicting viewpoints on a universal code of ethics for adult educators. Suggests objectives of a code (guidance for practice, policymaking direction, common reference point, shared values). Outlines content and methods for implementing a code. (SK)

  1. Development of a web-based CT dose calculator: WAZA-ARI.

    PubMed

    Ban, N; Takahashi, F; Sato, K; Endo, A; Ono, K; Hasegawa, T; Yoshitake, T; Katsunuma, Y; Kai, M

    2011-09-01

A web-based computed tomography (CT) dose calculation system (WAZA-ARI) is being developed based on modern techniques for radiation transport simulation and for software implementation. Dose coefficients were calculated in a voxel-type Japanese adult male phantom (JM phantom), using the Particle and Heavy Ion Transport code System (PHITS). In the Monte Carlo simulation, the phantom was irradiated with a 5-mm-thick, fan-shaped photon beam rotating in a plane normal to the body axis. The dose coefficients were integrated into the system, which runs as Java servlets within Apache Tomcat. Output of WAZA-ARI for the GE LightSpeed 16 was compared with dose values calculated similarly using the MIRD and ICRP Adult Male phantoms. Some differences arise from the phantom configuration, demonstrating the significance of dose calculation with appropriate phantoms. While the dose coefficients are currently available only for limited CT scanner models and scanning options, WAZA-ARI will be a useful tool in clinical practice when development is finalised.

  2. A natural history of hygiene

    PubMed Central

    Curtis, Valerie A

    2007-01-01

    In unpacking the Pandora's box of hygiene, the author looks into its ancient evolutionary history and its more recent human history. Within the box, she finds animal behaviour, dirt, disgust and many diseases, as well as illumination concerning how hygiene can be improved. It is suggested that hygiene is the set of behaviours that animals, including humans, use to avoid harmful agents. The author argues that hygiene has an ancient evolutionary history, and that most animals exhibit such behaviours because they are adaptive. In humans, responses to most infectious threats are accompanied by sensations of disgust. In historical times, religions, social codes and the sciences have all provided rationales for hygiene behaviour. However, the author argues that disgust and hygiene behaviour came first, and that the rationales came later. The implications for the modern-day practice of hygiene are profound. The natural history of hygiene needs to be better understood if we are to promote safe hygiene and, hence, win our evolutionary war against the agents of infectious disease. PMID:18923689

  3. High-Performance 3D Compressive Sensing MRI Reconstruction Using Many-Core Architectures

    PubMed Central

    Kim, Daehyun; Trzasko, Joshua; Smelyanskiy, Mikhail; Haider, Clifton; Dubey, Pradeep; Manduca, Armando

    2011-01-01

    Compressive sensing (CS) describes how sparse signals can be accurately reconstructed from many fewer samples than required by the Nyquist criterion. Since MRI scan duration is proportional to the number of acquired samples, CS has been gaining significant attention in MRI. However, the computationally intensive nature of CS reconstructions has precluded their use in routine clinical practice. In this work, we investigate how different throughput-oriented architectures can benefit one CS algorithm and what levels of acceleration are feasible on different modern platforms. We demonstrate that a CUDA-based code running on an NVIDIA Tesla C2050 GPU can reconstruct a 256 × 160 × 80 volume from an 8-channel acquisition in 19 seconds, which is in itself a significant improvement over the state of the art. We then show that Intel's Knights Ferry can perform the same 3D MRI reconstruction in only 12 seconds, bringing CS methods even closer to clinical viability. PMID:21922017
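The sparse-recovery principle behind CS can be illustrated with a toy one-dimensional sketch, assuming a random Gaussian sensing matrix and plain iterative soft-thresholding (ISTA); the sizes, penalty, and iteration count are illustrative and unrelated to the paper's GPU-based 3D MRI pipeline:

```python
import numpy as np

# Toy sparse recovery: reconstruct a k-sparse signal from m << n
# random measurements via ISTA (iterative soft-thresholding).
rng = np.random.default_rng(1)
n, m, k = 200, 80, 5                     # signal length, samples, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0.0, 1.0, k)
A = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))   # sensing matrix
y = A @ x_true                           # far fewer samples than Nyquist

lam = 0.01                               # l1 penalty (illustrative)
L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(3000):
    x = x - (A.T @ (A @ x - y)) / L      # gradient step on ||Ax - y||^2 / 2
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)   # soft-threshold

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)  # small: the sparse signal is recovered from 40% of the samples
```

The clinical reconstructions in the paper replace the random matrix with an undersampled Fourier/coil-sensitivity operator and run the analogous iterations on many-core hardware.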

  4. Understanding Radiation Thermometry. Part II

    NASA Technical Reports Server (NTRS)

    Risch, Timothy K.

    2015-01-01

    This document is a two-part course on the theory and practice of radiation thermometry. Radiation thermometry is the technique for determining the temperature of a surface or a volume by measuring the electromagnetic radiation it emits. This course covers the theory and practice of radiative thermometry and emphasizes the modern application of the field using commercially available electronic detectors and optical components. The course covers the historical development of the field, the fundamental physics of radiative surfaces, along with modern measurement methods and equipment.

  5. Understanding Radiation Thermometry. Part I

    NASA Technical Reports Server (NTRS)

Risch, Timothy K.

    2015-01-01

    This document is a two-part course on the theory and practice of radiation thermometry. Radiation thermometry is the technique for determining the temperature of a surface or a volume by measuring the electromagnetic radiation it emits. This course covers the theory and practice of radiative thermometry and emphasizes the modern application of the field using commercially available electronic detectors and optical components. The course covers the historical development of the field, the fundamental physics of radiative surfaces, along with modern measurement methods and equipment.

  6. A free interactive matching program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.-F. Ostiguy

    1999-04-16

For physicists and engineers involved in the design and analysis of beamlines (transfer lines or insertions) the lattice function matching problem is central and can be time-consuming because it involves constrained nonlinear optimization. For such problems convergence can be difficult to obtain in general without expert human intervention. Over the years, powerful codes have been developed to assist beamline designers. The canonical example is MAD (Methodical Accelerator Design) developed at CERN by Christophe Iselin. MAD, through a specialized command language, allows one to solve a wide variety of problems, including matching problems. Although in principle the MAD command interpreter can be run interactively, in practice the solution of a matching problem involves a sequence of independent trial runs. Unfortunately, but perhaps not surprisingly, there still exist relatively few tools exploiting the resources offered by modern environments to assist lattice designers with this routine and repetitive task. In this paper, we describe a fully interactive lattice matching program, written in C++ and assembled using freely available software components. An important feature of the code is that the evolution of the lattice functions during the nonlinear iterative process can be graphically monitored in real time; the user can dynamically interrupt the iterations at will to introduce new variables, freeze existing ones at their current values, and/or modify constraints. The program runs under both UNIX and Windows NT.

  7. Semantic graphs and associative memories

    NASA Astrophysics Data System (ADS)

    Pomi, Andrés; Mizraji, Eduardo

    2004-12-01

    Graphs have been increasingly utilized in the characterization of complex networks from diverse origins, including different kinds of semantic networks. Human memories are associative and are known to support complex semantic nets; these nets are represented by graphs. However, it is not known how the brain can sustain these semantic graphs. The vision of cognitive brain activities, shown by modern functional imaging techniques, assigns renewed value to classical distributed associative memory models. Here we show that these neural network models, also known as correlation matrix memories, naturally support a graph representation of the stored semantic structure. We demonstrate that the adjacency matrix of this graph of associations is just the memory coded with the standard basis of the concept vector space, and that the spectrum of the graph is a code invariant of the memory. As long as the assumptions of the model remain valid this result provides a practical method to predict and modify the evolution of the cognitive dynamics. Also, it could provide us with a way to comprehend how individual brains that map the external reality, almost surely with different particular vector representations, are nevertheless able to communicate and share a common knowledge of the world. We finish presenting adaptive association graphs, an extension of the model that makes use of the tensor product, which provides a solution to the known problem of branching in semantic nets.
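The central claim, that in the standard basis the correlation matrix memory is exactly the adjacency matrix of the association graph, can be checked in a few lines; the four-concept net below is a made-up example:

```python
import numpy as np

# A hypothetical directed semantic net over 4 concepts: (source, target).
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n = 4
E = np.eye(n)                       # standard-basis concept vectors e_i

# Correlation matrix memory: sum of outer products e_target e_source^T.
M = np.zeros((n, n))
for src, dst in edges:
    M += np.outer(E[dst], E[src])

# Adjacency matrix of the same graph (A[dst, src] = 1 per edge).
A = np.zeros((n, n))
for src, dst in edges:
    A[dst, src] = 1.0

assert np.array_equal(M, A)         # the memory IS the adjacency matrix

# Recall: presenting concept 2 retrieves the superposition of its targets.
print(M @ E[2])  # → [1. 0. 0. 1.]  (concepts 0 and 3)
```

With non-orthogonal concept vectors the identity no longer holds exactly, which is why the abstract's spectral invariant is stated for the standard-basis coding.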

  8. Professional nursing values among baccalaureate nursing students in Hong Kong.

    PubMed

    Lui, May H L; Lam, Lai Wah; Lee, Iris F K; Chien, Wai Tong; Chau, Janita P C; Ip, Wan Yim

    2008-01-01

A nursing code of professional conduct is developed to guide nurses in making appropriate clinical decisions, particularly when facing ethical dilemmas. It is of paramount importance that nurse educators understand baccalaureate nursing students' perceptions of the importance of the code of professional conduct, and the difficulties they face in implementing it, while preparing them to become practicing nurses. The Code of Professional Conduct in Hong Kong has guided nursing practice for over two decades. Nevertheless, no study has examined Hong Kong baccalaureate nursing students' perceptions of this professional code. The aim of this paper was to examine the perceptions of 263 baccalaureate nursing students about the professional code using a cross-sectional survey design. The results indicated that most items in the professional code were rated as important, and "provide safe and competent care" was rated as the most important. A few areas that the students perceived as difficult to implement are discussed, and future research is recommended. The significant differences identified among students from different years of study also highlight areas for consideration in planning educational programs to further equip students to deal with challenges in professional practice.

  9. Diabetes Mellitus Coding Training for Family Practice Residents.

    PubMed

    Urse, Geraldine N

    2015-07-01

    Although physicians regularly use numeric coding systems such as the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to describe patient encounters, coding errors are common. One of the most complicated diagnoses to code is diabetes mellitus. The ICD-9-CM currently has 39 separate codes for diabetes mellitus; this number will be expanded to more than 50 with the introduction of ICD-10-CM in October 2015. To assess the effect of a 1-hour focused presentation on ICD-9-CM codes on diabetes mellitus coding. A 1-hour focused lecture on the correct use of diabetes mellitus codes for patient visits was presented to family practice residents at Doctors Hospital Family Practice in Columbus, Ohio. To assess resident knowledge of the topic, a pretest and posttest were given to residents before and after the lecture, respectively. Medical records of all patients with diabetes mellitus who were cared for at the hospital 6 weeks before and 6 weeks after the lecture were reviewed and compared for the use of diabetes mellitus ICD-9 codes. Eighteen residents attended the lecture and completed the pretest and posttest. The mean (SD) percentage of correct answers was 72.8% (17.1%) for the pretest and 84.4% (14.6%) for the posttest, for an improvement of 11.6 percentage points (P≤.035). The percentage of total available codes used did not substantially change from before to after the lecture, but the use of the generic ICD-9-CM code for diabetes mellitus type II controlled (250.00) declined (58 of 176 [33%] to 102 of 393 [26%]) and the use of other codes increased, indicating a greater variety in codes used after the focused lecture. After a focused lecture on diabetes mellitus coding, resident coding knowledge improved. Review of medical record data did not reveal an overall change in the number of diabetic codes used after the lecture but did reveal a greater variety in the codes used.
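The reported shift away from the generic 250.00 code follows directly from the counts quoted in the abstract; a quick arithmetic check (figures taken verbatim from the text):

```python
# Pre/post-test scores and code counts quoted in the abstract above.
pre_score, post_score = 72.8, 84.4
assert round(post_score - pre_score, 1) == 11.6   # 11.6-point improvement

pre_250, pre_total = 58, 176                      # 250.00 uses before the lecture
post_250, post_total = 102, 393                   # 250.00 uses after the lecture
assert round(100 * pre_250 / pre_total) == 33     # 33% of codes pre-lecture
assert round(100 * post_250 / post_total) == 26   # 26% post: smaller share, more variety
```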

  10. Flexible use and technique extension of logistics management

    NASA Astrophysics Data System (ADS)

    Xiong, Furong

    2011-10-01

As is well known, modern logistics originated in the United States, developed in Japan, matured in Europe, and is expanding in China. This is the generally recognized track of modern logistics' historical development. Owing to China's economic and technological development, and with the construction of the Shanghai International Shipping Center and the Shanghai Yangshan international deepwater port, China's modern logistics industry will develop rapidly and catch up with the level of modern logistics in developed Western countries. In this paper, the author explores the flexible use and extension of China's modern logistics management techniques, which has practical and guiding significance.

  11. Enhanced Breakup of Entering Meteoroids by Internal Air Percolation

    NASA Astrophysics Data System (ADS)

    Melosh, H.; Tabetah, M.

    2017-12-01

It is often observed that meteoroids break up in flight while entering the Earth's atmosphere. The effective strength of such meteoroids can be deduced from their speed and the altitude at which breakup occurs. Surprisingly, the resulting strength is typically very low: only 1-5 MPa for the Chelyabinsk meteoroid. This contrasts with the measured crushing strength of about 300 MPa for the recovered fragments. This great difference in strength is usually attributed to a selection effect: the surviving fragments are stronger simply because the weaker materials were eliminated before reaching the ground. We have modeled the entry of meteoroids using a two-material computer code based on the old Los Alamos code KFIX. This code permits us to treat the solid meteoroid and atmospheric gases as two interpenetrating phases that can exchange mass, energy and momentum. Among other advantages of the code, it inherently treats the meteoroid as a porous, permeable solid, in keeping with the modern observation that most asteroids are highly porous. During these simulations we noted that compressed atmospheric gases in the bow shock readily percolate into the body of the meteoroid. This greatly increases the internal pore pressure and leads to a rapid expansion that quickly disperses the meteoroid into small fragments. As is well known from geological and engineering practice, high pore pressures greatly decrease the strength of geologic materials, and this factor may thus account for much of the discrepancy between meteoroid strength deduced from breakup and that measured on recovered fragments, although the selection effect certainly plays some role. The percolation of hot, high-pressure air into the body of entering meteoroids is a previously unrecognized process that may greatly enhance their fragmentation and dispersion. This phenomenon may explain why the ca. 100 m diameter Tunguska object disintegrated so completely before reaching the surface, and it argues that the Earth's atmosphere may be a better screen against small impacts than previously recognized.

  12. Critical Care Coding for Neurologists.

    PubMed

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  13. Coding of Neuroinfectious Diseases.

    PubMed

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  14. Diagnostic Coding for Epilepsy.

    PubMed

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  15. StagBL : A Scalable, Portable, High-Performance Discretization and Solver Layer for Geodynamic Simulation

    NASA Astrophysics Data System (ADS)

    Sanan, P.; Tackley, P. J.; Gerya, T.; Kaus, B. J. P.; May, D.

    2017-12-01

StagBL is an open-source parallel solver and discretization library for geodynamic simulation, encapsulating and optimizing operations essential to staggered-grid finite volume Stokes flow solvers. It provides a parallel staggered-grid abstraction with a high-level interface in C and Fortran. On top of this abstraction, tools are available to define boundary conditions and interact with particle systems. Tools and examples to efficiently solve Stokes systems defined on the grid are provided in small (direct solver), medium (simple preconditioners), and large (block factorization and multigrid) model regimes. By working directly with leading application codes (StagYY, I3ELVIS, and LaMEM) and providing an API and examples to integrate with others, StagBL aims to become a community tool supplying scalable, portable, reproducible performance toward novel science in regional- and planet-scale geodynamics and planetary science. By implementing kernels used by many research groups beneath a uniform abstraction layer, the library will enable optimization for modern hardware, thus reducing community barriers to large- or extreme-scale parallel simulation on modern architectures. In particular, the library will include CPU-, Manycore-, and GPU-optimized variants of matrix-free operators and multigrid components. The common layer provides a framework upon which to introduce innovative new tools. StagBL will leverage p4est to provide distributed adaptive meshes, and incorporate a multigrid convergence analysis tool. These options, in addition to a wealth of solver options provided by an interface to PETSc, will make the most modern solution techniques available from a common interface. StagBL in turn provides a PETSc interface, DMStag, to its central staggered grid abstraction. We present public version 0.5 of StagBL, including preliminary integration with application codes and demonstrations with its own demonstration application, StagBLDemo. Central to StagBL is the notion of an uninterrupted pipeline from toy/teaching codes to high-performance, extreme-scale solves. StagBLDemo replicates the functionality of an advanced MATLAB-style regional geodynamics code, thus providing users with a concrete procedure to exceed the performance and scalability limitations of smaller-scale tools.

  16. A Paradigm Shift in the Implementation of Ethics Codes in Construction Organizations in Hong Kong: Towards an Ethical Behaviour.

    PubMed

    Ho, Christabel Man-Fong; Oladinrin, Olugbenga Timo

    2018-01-30

Due to economic globalization, which has been characterized by business scandals, scholars and practitioners are increasingly engaged with the implementation of codes of ethics as a regulatory mechanism for stimulating ethical behaviour within an organization. The aim of this study is to examine various organizational practices regarding the effective implementation of codes of ethics within construction contracting companies. Views on ethics management in construction organizations, together with recommendations for improvement, were gleaned through 19 semi-structured interviews involving construction practitioners from various construction companies in Hong Kong. The findings suggest several practices for the effective implementation of codes of ethics to diffuse ethical behaviour in an organizational setting, including the introduction of effective reward schemes, the arrangement of ethics training for employees, and leadership responsiveness to reported wrongdoings. Since most construction companies in Hong Kong have codes of ethics, emphasis is placed on the practical implementation of codes within organizations. Hence, implications are drawn from the recommended measures to guide construction companies and policy makers.

  17. Wasatch: An architecture-proof multiphysics development environment using a Domain Specific Language and graph theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saad, Tony; Sutherland, James C.

To address the coding and software challenges of modern hybrid architectures, we propose an approach to multiphysics code development for high-performance computing. This approach is based on using a Domain Specific Language (DSL) in tandem with a directed acyclic graph (DAG) representation of the problem to be solved that allows runtime algorithm generation. When coupled with a large-scale parallel framework, the result is a portable development framework capable of executing on hybrid platforms and handling the challenges of multiphysics applications. In addition, we share our experience developing a code in such an environment – an effort that spans an interdisciplinary team of engineers and computer scientists.

  18. Wasatch: An architecture-proof multiphysics development environment using a Domain Specific Language and graph theory

    DOE PAGES

    Saad, Tony; Sutherland, James C.

    2016-05-04

To address the coding and software challenges of modern hybrid architectures, we propose an approach to multiphysics code development for high-performance computing. This approach is based on using a Domain Specific Language (DSL) in tandem with a directed acyclic graph (DAG) representation of the problem to be solved that allows runtime algorithm generation. When coupled with a large-scale parallel framework, the result is a portable development framework capable of executing on hybrid platforms and handling the challenges of multiphysics applications. In addition, we share our experience developing a code in such an environment – an effort that spans an interdisciplinary team of engineers and computer scientists.
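The DAG-driven runtime algorithm generation described in these two records can be sketched with the standard library; the field names and dependencies below are invented for illustration and are not Wasatch's actual API:

```python
from graphlib import TopologicalSorter

# Each computed field declares the fields it depends on; the framework
# derives a valid execution order from the DAG at runtime.
deps = {
    "density":    set(),
    "velocity":   set(),
    "momentum":   {"density", "velocity"},
    "kinetic_e":  {"density", "velocity"},
    "total_flux": {"momentum", "kinetic_e"},
}
order = list(TopologicalSorter(deps).static_order())
print(order)  # every field appears after all of its dependencies

assert order[-1] == "total_flux"
assert order.index("density") < order.index("momentum")
```

Because the graph is only assembled at runtime, independent branches (here "momentum" and "kinetic_e") can also be dispatched concurrently, which is what makes the approach attractive on hybrid architectures.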

  19. PHASE I MATERIALS PROPERTY DATABASE DEVELOPMENT FOR ASME CODES AND STANDARDS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju; Lin, Lianshan

    2013-01-01

To support the ASME Boiler and Pressure Vessel Code (BPVC) in the modern information era, development of a web-based materials property database has been initiated under the supervision of the ASME Committee on Materials. To achieve efficiency, the project draws heavily upon experience from the development of the Gen IV Materials Handbook and the Nuclear System Materials Handbook. The effort is divided into two phases. Phase I is planned to deliver a materials data file warehouse that offers a depository for various files containing raw data and background information, and Phase II will provide a relational digital database with advanced features facilitating digital data processing and management. Population of the database will start with materials property data for nuclear applications and expand to data covering the entire ASME Code and Standards, including the piping codes, as the database structure is continuously optimized. The ultimate goal of the effort is to establish a sound cyber infrastructure that supports ASME Codes and Standards development and maintenance.

  20. ANNarchy: a code generation approach to neural simulations on parallel hardware

    PubMed Central

    Vitay, Julien; Dinkelbach, Helge Ü.; Hamker, Fred H.

    2015-01-01

Many modern neural simulators focus on the simulation of networks of spiking neurons on parallel hardware. Another important framework in computational neuroscience, rate-coded neural networks, is mostly difficult or impossible to implement using these simulators. We present here the ANNarchy (Artificial Neural Networks architect) neural simulator, which allows one to easily define and simulate rate-coded and spiking networks, as well as combinations of both. The interface in Python has been designed to be close to the PyNN interface, while the definition of neuron and synapse models can be specified using an equation-oriented mathematical description similar to the Brian neural simulator. This information is used to generate C++ code that will efficiently perform the simulation on the chosen parallel hardware (multi-core system or graphical processing unit). Several numerical methods are available to transform ordinary differential equations into efficient C++ code. We compare the parallel performance of the simulator to existing solutions. PMID:26283957
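A minimal hand-rolled version of the kind of rate-coded dynamics such simulators generate code for, assuming the illustrative first-order model tau dr/dt = -r + W r + I (this is the underlying mathematics, not ANNarchy's actual model-definition API):

```python
import numpy as np

# Rate-coded network: tau dr/dt = -r + W r + I, integrated with Euler steps.
rng = np.random.default_rng(0)
n, tau, dt = 50, 10.0, 0.1
W = rng.normal(0.0, 0.05, (n, n))    # weak random recurrent weights
I = 0.5 * np.ones(n)                 # constant external input
r = np.zeros(n)                      # firing rates

for _ in range(2000):                # 200 time units = 20 tau
    r += (dt / tau) * (-r + W @ r + I)

# With weak coupling the dynamics settle to the fixed point (Id - W)^-1 I.
r_star = np.linalg.solve(np.eye(n) - W, I)
print(np.max(np.abs(r - r_star)))    # essentially converged
```

An equation-oriented simulator takes the same description as a string of ODEs and emits optimized C++ loops instead of interpreting the update in Python.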

  1. Grid Standards and Codes | Grid Modernization | NREL

    Science.gov Websites

…simulations that take advantage of advanced concepts such as hardware-in-the-loop testing… methods and solutions. Projects: Accelerating Systems Integration Standards… the goal of this project is to develop streamlined and accurate methods for New York utilities to determine…

  2. Real Time Network Monitoring and Reporting System

    ERIC Educational Resources Information Center

    Massengale, Ricky L., Sr.

    2009-01-01

With the ability of modern system developers to develop intelligent programs that allow machines to learn, modify, and evolve themselves, the current trend of reactionary methods to detect and eradicate malicious software code from infected machines is proving to be too costly. Addressing malicious software after an attack is the current methodology…

  3. Teaching with and through Teams: Student-Written, Instructor-Facilitated Case Writing and the Signatory Code

    ERIC Educational Resources Information Center

    Bailey, James; Sass, Mary; Swiercz, Paul M.; Seal, Craig; Kayes, D. Christopher

    2005-01-01

    Modern organizations prize teamwork. Management schools have responded to this reality by integrating teamwork into the curriculum. Two important challenges associated with integrating teams in the management classroom include (a) designing teamwork assignments that achieve multiple, sophisticated learning outcomes and (b) instruction in, and…

  4. Technostress: A Content Analysis.

    ERIC Educational Resources Information Center

    Clute, Robin

    This paper reports on a study that explores the literature of technostress--the anxiety over using technological equipment--both inside and outside of the library field. Fifty-eight unique articles were abstracted and evaluated. By using a coding sheet a measurement was taken of symptoms, reasons given for the "modern disease," and…

  5. Procedures and Frequencies of Embalming and Heart Extractions in Modern Period in Brittany. Contribution to the Evolution of Ritual Funerary in Europe

    PubMed Central

    Dedouit, Fabrice; Duchesne, Sylvie; Mokrane, Fatima-Zohra; Gendrot, Véronique; Gérard, Patrice; Dabernat, Henri; Crubézy, Éric; Telmon, Norbert

    2016-01-01

    The evolution of funeral practices from the Middle Ages through the Modern era in Europe is generally seen as a process of secularization. The study, through imaging and autopsy, of two mummies, five lead urns containing hearts, and more than six hundred skeletons of nobles and clergymen from a Renaissance convent in Brittany has led us to reject this view. In addition to exceptional embalming, we observed instances in which hearts alone had been extracted, a phenomenon that had never before been described, and brains alone as well, and instances in which each spouse's heart had been placed on the other's coffin. In some identified cases we were able to establish links between the religious attitudes of given individuals and either ancient Medieval practices or more modern ones generated by the Council of Trent. All of these practices, which were a function of social status, were rooted in religion. They offer no evidence of secularization whatsoever. PMID:28030554

  6. Utilization of alternative medical services by people of a north central city of Nigeria.

    PubMed

    Abodunrin, O L; Omojasola, T; Rojugbokan, O O

    2011-06-01

The use of alternative therapies (AT) is becoming more popular in recent times, especially due to the increasing cost, distrust, and limitations of modern Western medical care. There is a universal trend toward naturalness, and herbal medicine is now being modernized and accepted by people who would not previously have used it. This community-based study sought to assess the prevalence, pattern, behaviour, and determinants of AT use. It was a cross-sectional descriptive survey among adults in the city of Ilorin, Nigeria. Participants were selected by multistage sampling, and information was obtained using a semi-structured questionnaire. The total prevalence of AT use was 67.7%, while the prevalences of indigenous and foreign AT use were 44.8 and 30.4%, respectively. Among indigenous AT users, 87.5% would use both the conventional and the modernized type, while 12.5% would use only the modernized type. More than 10% were new users of AT. Respondents used AT for promotive, preventive, and curative purposes. Only 3.5% were considered safe users according to a 9-point scale. Male respondents and never-married respondents practiced safer use of alternative therapy (p<0.05). Similarly, respondents with higher educational status also had safer AT use practices (p<0.05). There is a high prevalence of unsafe AT use in Ilorin. Regulation of the advertisement and sale of unwholesome herbal medicines should be intensified. Further research to integrate the practice into modern healthcare is recommended.

  7. Tests of Exoplanet Atmospheric Radiative Transfer Codes

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph; Challener, Ryan; DeLarme, Emerson; Cubillos, Patricio; Blecic, Jasmina; Foster, Austin; Garland, Justin

    2016-10-01

    Atmospheric radiative transfer codes are used both to predict planetary spectra and in retrieval algorithms to interpret data. Observational plans, theoretical models, and scientific results thus depend on the correctness of these calculations. Yet, the calculations are complex and the codes implementing them are often written without modern software-verification techniques. In the process of writing our own code, we became aware of several others with artifacts of unknown origin and even outright errors in their spectra. We present a series of tests to verify atmospheric radiative-transfer codes. These include: simple, single-line line lists that, when combined with delta-function abundance profiles, should produce a broadened line that can be verified easily; isothermal atmospheres that should produce analytically-verifiable blackbody spectra at the input temperatures; and model atmospheres with a range of complexities that can be compared to the output of other codes. We apply the tests to our own code, Bayesian Atmospheric Radiative Transfer (BART) and to several other codes. The test suite is open-source software. We propose this test suite as a standard for verifying current and future radiative transfer codes, analogous to the Held-Suarez test for general circulation models. This work was supported by NASA Planetary Atmospheres grant NX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G.
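One of the proposed tests, that an isothermal atmosphere must return an analytically verifiable blackbody spectrum, can be sketched in a few lines; the snippet below checks only the Planck function itself against Wien's displacement law (the temperature and wavelength grid are arbitrary choices, not values from the paper):

```python
import numpy as np

# Physical constants (SI).
h, c, kB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck(wl, T):
    """Blackbody spectral radiance B_lambda(T) in W sr^-1 m^-3."""
    return (2.0 * h * c**2 / wl**5) / np.expm1(h * c / (wl * kB * T))

T = 1500.0                                   # K, e.g. a hot-Jupiter dayside
wl = np.linspace(0.5e-6, 20e-6, 200_000)     # 0.5-20 micron grid
peak = wl[np.argmax(planck(wl, T))]

# Wien's displacement law: lambda_max = b / T, with b ≈ 2.8978e-3 m K.
assert abs(peak - 2.897771955e-3 / T) / peak < 1e-3
print(peak)  # ≈ 1.93e-6 m
```

In a full radiative-transfer test, the code's emergent spectrum for an isothermal temperature profile would be compared against this Planck curve at the input temperature.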

  8. TAIR- TRANSONIC AIRFOIL ANALYSIS COMPUTER CODE

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.

    1994-01-01

The Transonic Airfoil analysis computer code, TAIR, was developed to employ a fast, fully implicit algorithm to solve the conservative full-potential equation for the steady transonic flow field about an arbitrary airfoil immersed in a subsonic free stream. The full-potential formulation is considered exact under the assumptions of irrotational, isentropic, and inviscid flow. These assumptions are valid for a wide range of practical transonic flows typical of modern aircraft cruise conditions. The primary features of TAIR include: a new fully implicit iteration scheme which is typically many times faster than classical successive line overrelaxation algorithms; a new, reliable artificial density spatial differencing scheme treating the conservative form of the full-potential equation; and a numerical mapping procedure capable of generating curvilinear, body-fitted finite-difference grids about arbitrary airfoil geometries. Three aspects emphasized during the development of the TAIR code were reliability, simplicity, and speed. The reliability of TAIR comes from two sources: the new algorithm employed and the implementation of effective convergence monitoring logic. TAIR achieves ease of use by employing a "default mode" that greatly simplifies code operation, especially by inexperienced users, and many useful options including: several airfoil-geometry input options, flexible user controls over program output, and a multiple solution capability. The speed of the TAIR code is attributed to the new algorithm and the manner in which it has been implemented. Input to the TAIR program consists of airfoil coordinates, aerodynamic and flow-field convergence parameters, and geometric and grid convergence parameters. The airfoil coordinates for many airfoil shapes can be generated in TAIR from just a few input parameters. 
Most of the other input parameters have default values which allow the user to run an analysis in the default mode by specifying only a few input parameters. Output from TAIR may include aerodynamic coefficients, the airfoil surface solution, convergence histories, and printer plots of Mach number and density contour maps. The TAIR program is written in FORTRAN IV for batch execution and has been implemented on a CDC 7600 computer with a central memory requirement of approximately 155K (octal) of 60-bit words. The TAIR program was developed in 1981.

  9. CHEETAH: A fast thermochemical code for detonation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fried, L.E.

    1993-11-01

    For more than 20 years, TIGER has been the benchmark thermochemical code in the energetic materials community. TIGER has been widely used because it gives good detonation parameters in a very short period of time. Despite its success, TIGER is beginning to show its age. The program's chemical equilibrium solver frequently crashes, especially when dealing with many chemical species. It often fails to find the C-J point. Finally, there are many inconveniences for the user stemming from the program's roots in pre-modern FORTRAN. These inconveniences often lead to mistakes in preparing input files and thus to erroneous results. We are producing a modern version of TIGER, which combines the best features of the old program with new capabilities, better computational algorithms, and improved packaging. The new code, which will evolve out of TIGER in the next few years, will be called "CHEETAH." Many of the capabilities that will be put into CHEETAH are inspired by the thermochemical code CHEQ. The new capabilities of CHEETAH are: calculation of trace levels of chemical compounds for environmental analysis; a kinetics capability, with which CHEETAH will predict chemical compositions as a function of time given individual chemical reaction rates (initial application: carbon condensation); incorporation of partial reactions; and a basis in computer-optimized JCZ3 and BKW parameters. These parameters will be fit to over 20 years of data collected at LLNL. We will run CHEETAH thousands of times to determine the best possible parameter sets. CHEETAH will also fit C-J data to JWLs and predict full-wall and half-wall cylinder velocities.

  10. Reed-Solomon Codes and the Deep Hole Problem

    NASA Astrophysics Data System (ADS)

    Keti, Matt

    In many types of modern communication, a message is transmitted over a noisy medium. When this is done, there is a chance that the message will be corrupted. An error-correcting code adds redundant information to the message which allows the receiver to detect and correct errors accrued during the transmission. We will study the famous Reed-Solomon code (found in QR codes, compact discs, deep space probes, etc.) and investigate the limits of its error-correcting capacity. It can be shown that understanding this is related to understanding the "deep hole" problem, which is a question of determining when a received message has, in a sense, incurred the worst possible corruption. We partially resolve this in its traditional context, when the code is based on the finite field F_q or F_q^*, as well as in new contexts, when it is based on a subgroup of F_q^* or the image of a Dickson polynomial. This is a new and important problem that could give insight on the true error-correcting potential of the Reed-Solomon code.
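
    The encoding underlying this setting can be sketched in a few lines: a Reed-Solomon codeword is the evaluation of a low-degree message polynomial at a fixed set of field points, and distinct messages yield codewords far apart in Hamming distance. A toy example over the prime field F_7 (the thesis works over general F_q; the small field is chosen only to keep the arithmetic readable):

```python
P = 7  # toy prime field F_7; RS codes work over any finite field F_q

def rs_encode(msg_coeffs, points):
    """Classic Reed-Solomon encoding: evaluate the message polynomial
    (coefficients low-to-high, degree < k) at each evaluation point."""
    def poly_eval(coeffs, x):
        acc = 0
        for c in reversed(coeffs):  # Horner's rule, mod P
            acc = (acc * x + c) % P
        return acc
    return [poly_eval(msg_coeffs, x) for x in points]

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

points = list(range(1, P))        # evaluate over F_7^*, so n = 6
k = 2                             # messages are polynomials of degree < 2
c1 = rs_encode([3, 1], points)    # f(x) = 3 + x
c2 = rs_encode([0, 5], points)    # g(x) = 5x
# Distinct degree-<k polynomials agree on at most k - 1 points, so any two
# codewords are at Hamming distance >= n - k + 1 = 5.
print(hamming(c1, c2))  # 5
```

    A "deep hole" is then a received word whose distance from every such codeword is as large as possible.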

  11. Unique identification code for medical fundus images using blood vessel pattern for tele-ophthalmology applications.

    PubMed

    Singh, Anushikha; Dutta, Malay Kishore; Sharma, Dilip Kumar

    2016-10-01

    Identification of fundus images during transmission and storage in databases for tele-ophthalmology applications is an important issue in the modern era. The proposed work presents a novel, accurate method for generating a unique identification code for fundus images, for use in tele-ophthalmology applications and database storage. Unlike existing methods of steganography and watermarking, this method does not tamper with the medical image: nothing is embedded, and no medical information is lost. The segmented blood vessel pattern near the optic disc, which is unique to each eye, is strategically combined with the patient ID to generate a unique identification code for each digital fundus image. The proposed method of medical image identification was tested on the publicly available DRIVE and MESSIDOR databases of fundus images, and the results are encouraging. Experimental results indicate the uniqueness of the identification code and the lossless recovery of patient identity from it for integrity verification of fundus images. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
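
    The abstract does not spell out the exact combination scheme, but the idea of a non-embedded, recoverable identifier can be illustrated with a hypothetical sketch: concatenate the patient ID with a digest of the segmented vessel pattern, so the image is never modified, the ID is recoverable by splitting the code, and tampering with the pattern is detectable. All names and the separator format below are illustrative assumptions, not the paper's method:

```python
import hashlib

def make_id_code(patient_id: str, vessel_pattern: bytes) -> str:
    """Hypothetical scheme: patient ID plus a digest of the segmented
    vessel pattern. The image itself is never modified."""
    digest = hashlib.sha256(vessel_pattern).hexdigest()[:16]
    return "{}:{}".format(patient_id, digest)

def recover_patient_id(code: str) -> str:
    """Lossless recovery: the ID is the part before the final separator."""
    return code.rsplit(":", 1)[0]

def verify(code: str, vessel_pattern: bytes) -> bool:
    """Integrity check: recompute the code from the image's vessel pattern."""
    return code == make_id_code(recover_patient_id(code), vessel_pattern)

pattern = bytes([0, 1, 1, 0, 1])       # toy stand-in for a segmented vessel map
code = make_id_code("PAT-0042", pattern)
print(recover_patient_id(code))        # PAT-0042
print(verify(code, pattern))           # True
print(verify(code, bytes([1, 0])))     # False: pattern mismatch is detected
```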

  12. Does Print Size Matter for Reading? A Review of Findings from Vision Science and Typography

    PubMed Central

    Legge, Gordon E.; Bigelow, Charles A.

    2012-01-01

    The size and shape of printed symbols determine the legibility of text. In this paper we focus on print size because of its crucial role in understanding reading performance and its significance in the history and contemporary practice of typography. We present evidence supporting the hypothesis that the distribution of print sizes in historical and contemporary publications falls within the psychophysically defined range of fluent print size — the range over which text can be read at maximum speed. The fluent range extends over a factor of 10 in angular print size (x-height) from approximately 0.2° to 2°. Assuming a standard reading distance of 40 cm (16 inches), the corresponding physical x-heights are 1.4 mm (4 points) and 14 mm (40 points). We provide new data on the distributions of print sizes in published books and newspapers and in type founders' specimens, and consider factors influencing these distributions. We discuss theoretical concepts from vision science concerning visual size coding that help inform our understanding of historical and modern typographical practices. While economic, social, technological and artistic factors influence type design and selection, we conclude that properties of human visual processing play a dominant role in constraining the distribution of print sizes in common use. PMID:21828237
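
    The angular-to-physical conversion quoted above follows from simple trigonometry: at viewing distance d, an angle θ subtends a size of 2·d·tan(θ/2). A quick check of the stated values (0.2° to 2° at 40 cm giving roughly 1.4 mm to 14 mm, i.e., about 4 to 40 points of x-height):

```python
import math

def x_height_mm(angular_deg, distance_mm=400.0):
    """Physical x-height subtending a given visual angle at a reading distance."""
    return 2.0 * distance_mm * math.tan(math.radians(angular_deg) / 2.0)

def points_from_mm(mm):
    """Convert millimetres to typographic points (1 pt = 1/72 inch)."""
    return mm / (25.4 / 72.0)

for deg in (0.2, 2.0):
    mm = x_height_mm(deg)
    print(f"{deg} deg -> {mm:.1f} mm ({points_from_mm(mm):.0f} pt) x-height")
```

    At these small angles the small-angle approximation d·θ (in radians) gives nearly identical values.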

  13. Does print size matter for reading? A review of findings from vision science and typography.

    PubMed

    Legge, Gordon E; Bigelow, Charles A

    2011-08-09

    The size and shape of printed symbols determine the legibility of text. In this paper, we focus on print size because of its crucial role in understanding reading performance and its significance in the history and contemporary practice of typography. We present evidence supporting the hypothesis that the distribution of print sizes in historical and contemporary publications falls within the psychophysically defined range of fluent print size--the range over which text can be read at maximum speed. The fluent range extends over a factor of 10 in angular print size (x-height) from approximately 0.2° to 2°. Assuming a standard reading distance of 40 cm (16 inches), the corresponding physical x-heights are 1.4 mm (4 points) and 14 mm (40 points). We provide new data on the distributions of print sizes in published books and newspapers and in typefounders' specimens, and consider factors influencing these distributions. We discuss theoretical concepts from vision science concerning visual size coding that help inform our understanding of historical and modern typographical practices. While economic, social, technological, and artistic factors influence type design and selection, we conclude that properties of human visual processing play a dominant role in constraining the distribution of print sizes in common use.

  14. Ethics, culture and nursing practice in Ghana.

    PubMed

    Donkor, N T; Andrews, L D

    2011-03-01

    This paper describes how nurses in Ghana approach ethical problems. The International Council of Nurses' (ICN) Code for Nurses (2006) that serves as the model for professional code of ethics worldwide also acknowledges respect for healthy cultural values. Using the ICN's Code and universal ethical principles as a benchmark, a survey was conducted in 2009 to ascertain how nurses in Ghana respond to ethical and cultural issues in their practice. The study was qualitative with 200 participant nurses. Data were obtained through anonymous self-administered questionnaires. Descriptive statistics were used to analyze the data. Nurses' approaches to ethical problems in Ghana do not always meet expectations of the ICN Code for Nurses. They are also informed by local ethical practices related to the institutional setting and cultural environment in the country. While some cultural values complemented the ICN's Code and universal ethical principles, others conflicted with them. These data can assist nurses to provide culturally competent solutions to ethical dilemmas in their practice. Dynamic communication between nurses and patients/clients, intentional study of local cultural beliefs, and the development of ethics education will improve the conformity between universal ethical standards and local cultural values. © 2011 The Authors. International Nursing Review © 2011 International Council of Nurses.

  15. Addressing medical coding and billing part II: a strategy for achieving compliance. A risk management approach for reducing coding and billing errors.

    PubMed Central

    Adams, Diane L.; Norman, Helen; Burroughs, Valentine J.

    2002-01-01

    Medical practice today, more than ever before, places greater demands on physicians to see more patients, provide more complex medical services and adhere to stricter regulatory rules, leaving little time for coding and billing. Yet, the need to adequately document medical records, appropriately apply billing codes and accurately charge insurers for medical services is essential to the medical practice's financial condition. Many physicians rely on office staff and billing companies to process their medical bills without ever reviewing the bills before they are submitted for payment. Some physicians may not be receiving the payment they deserve when they do not sufficiently oversee the medical practice's coding and billing patterns. This article emphasizes the importance of monitoring and auditing medical record documentation and coding application as a strategy for achieving compliance and reducing billing errors. When medical bills are submitted with missing and incorrect information, they may result in unpaid claims and loss of revenue to physicians. Addressing Medical Audits, Part I--A Strategy for Achieving Compliance--CMS, JCAHO, NCQA, published January 2002 in the Journal of the National Medical Association, stressed the importance of preparing the medical practice for audits. The article highlighted steps the medical practice can take to prepare for audits and presented examples of guidelines used by regulatory agencies to conduct both medical and financial audits. The Medicare Integrity Program was cited as an example of guidelines used by regulators to identify coding errors during an audit and deny payment to providers when improper billing occurs. For each denied claim, payments owed to the medical practice are also denied. Health care is, no doubt, a costly endeavor for health care providers, consumers and insurers.
The potential risk to physicians for improper billing may include loss of revenue, fraud investigations, financial sanction, disciplinary action and exclusion from participation in government programs. Part II of this article recommends an approach for assessing potential risk, preventing improper billing, and improving financial management of the medical practice. PMID:12078924

  16. A proto-code of ethics and conduct for European nurse directors.

    PubMed

    Stievano, Alessandro; De Marinis, Maria Grazia; Kelly, Denise; Filkins, Jacqueline; Meyenburg-Altwarg, Iris; Petrangeli, Mauro; Tschudin, Verena

    2012-03-01

    The proto-code of ethics and conduct for European nurse directors was developed as a strategic and dynamic document for nurse managers in Europe. It invites critical dialogue, reflective thinking about different situations, and the development of specific codes of ethics and conduct by nursing associations in different countries. The term proto-code is used for this document so that specifically country-orientated or organization-based and practical codes can be developed from it to guide professionals in more particular or situation-explicit reflection and values. The proto-code of ethics and conduct for European nurse directors was designed and developed by the European Nurse Directors Association's (ENDA) advisory team. This article gives short explanations of the code's preamble and two main parts: Nurse directors' ethical basis, and Principles of professional practice, which is divided into six specific points: competence, care, safety, staff, life-long learning and multi-sectorial working.

  17. Evaluation in industry of a draft code of practice for manual handling.

    PubMed

    Ashby, Liz; Tappin, David; Bentley, Tim

    2004-05-01

    This paper reports findings from a study which evaluated the draft New Zealand Code of Practice for Manual Handling. The evaluation assessed the ease of use, applicability and validity of the Code and in particular the associated manual handling hazard assessment tools, within New Zealand industry. The Code was studied in a sample of eight companies from four sectors of industry. Subjective feedback and objective findings indicated that the Code was useful, applicable and informative. The manual handling hazard assessment tools incorporated in the Code could be adequately applied by most users, with risk assessment outcomes largely consistent with the findings of researchers using more specific ergonomics methodologies. However, some changes were recommended to the risk assessment tools to improve usability and validity. The evaluation concluded that both the Code and the tools within it would benefit from simplification, improved typography and layout, and industry-specific information on manual handling hazards.

  18. A Novel Technique for Running the NASA Legacy Code LAPIN Synchronously With Simulations Developed Using Simulink

    NASA Technical Reports Server (NTRS)

    Vrnak, Daniel R.; Stueber, Thomas J.; Le, Dzu K.

    2012-01-01

    This report presents a method for running a dynamic legacy inlet simulation in concert with another dynamic simulation that uses a graphical interface. The legacy code, NASA's LArge Perturbation INlet (LAPIN) model, was coded using the FORTRAN 77 (The Portland Group, Lake Oswego, OR) programming language to run in a command shell similar to other applications that used the Microsoft Disk Operating System (MS-DOS) (Microsoft Corporation, Redmond, WA). Simulink (MathWorks, Natick, MA) is a dynamic simulation that runs on a modern graphical operating system. The product of this work has both simulations, LAPIN and Simulink, running synchronously on the same computer with periodic data exchanges. Implementing the method described in this paper avoided extensive changes to the legacy code and preserved its basic operating procedure. This paper presents a novel method that promotes inter-task data communication between the synchronously running processes.

  19. Status Report on NEAMS PROTEUS/ORIGEN Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieselquist, William A

    2016-02-18

    The US Department of Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program has contributed significantly to the development of the PROTEUS neutron transport code at Argonne National Laboratory and to the Oak Ridge Isotope Generation and Depletion (ORIGEN) depletion/decay code at Oak Ridge National Laboratory. PROTEUS’s key capability is its efficient and scalable (up to hundreds of thousands of cores) neutron transport solver on general, unstructured, three-dimensional finite-element-type meshes. The scalability and mesh generality enable the transfer of neutron and power distributions to other codes in the NEAMS toolkit for advanced multiphysics analysis. Recently, ORIGEN has received considerable modernization to provide the high-performance depletion/decay capability within the NEAMS toolkit. This work presents a description of the initial integration of ORIGEN in PROTEUS, mainly performed during FY 2015, with minor updates in FY 2016.
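
    Depletion/decay solvers such as ORIGEN integrate large systems of coupled first-order decay equations; the minimal analytic case is the two-nuclide Bateman chain, sketched here (the decay constants are illustrative, not from any ORIGEN library):

```python
import math

def bateman_two(n1_0, lam1, lam2, t):
    """Analytic Bateman solution for a two-nuclide chain 1 -> 2 (no source):
    dN1/dt = -lam1*N1,  dN2/dt = lam1*N1 - lam2*N2,  N2(0) = 0."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Illustrative constants: parent half-life 1 s, daughter half-life 10 s.
lam1 = math.log(2) / 1.0
lam2 = math.log(2) / 10.0
n1, n2 = bateman_two(1.0e6, lam1, lam2, 2.0)
print(round(n1))  # 250000: two parent half-lives leave a quarter of the atoms
```

    Production codes solve the same structure for thousands of coupled nuclides, which is why a robust matrix solver is central to ORIGEN's modernization.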

  20. An international survey of building energy codes and their implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Meredydd; Roshchanka, Volha; Graham, Peter

    Buildings are key to low-carbon development everywhere, and many countries have introduced building energy codes to improve energy efficiency in buildings. Yet, building energy codes can only deliver results when the codes are implemented. For this reason, studies of building energy codes need to consider implementation in a consistent and comprehensive way. This research identifies elements and practices in implementing building energy codes, covering codes in 22 countries that account for 70% of global energy demand from buildings. Realizing the benefits of building energy codes depends on comprehensive coverage of buildings by type, age, size, and geographic location; an implementation framework that involves a certified agency to inspect construction at critical stages; and independently tested, rated, and labeled building energy materials. Training and supporting tools are another element of successful code implementation, and their role is growing in importance, given the increasing flexibility and complexity of building energy codes. Some countries have also introduced compliance evaluation and compliance checking protocols to improve implementation. This article provides examples of practices that countries have adopted to assist with implementation of building energy codes.

  1. The Escuela Moderna Movement of Francisco Ferrer: "Por la Verdad y la Justicia."

    ERIC Educational Resources Information Center

    Fidler, Geoffrey C.

    1985-01-01

    The educational theory and practice of the Escuela Moderna (Modern School) movement of the Spanish educator Francisco Ferrer, born in 1850, are discussed. Two fundamental tendencies of the movement are child-centered education and education in didactic terms. (RM)

  2. Retraining the Modern Civil Engineer.

    ERIC Educational Resources Information Center

    Priscoli, Jerome Delli

    1983-01-01

    Discusses why modern engineering requires social science and the nature of planning. After these conceptual discussions, 12 practical tools which social science brings to engineering are reviewed. A tested approach to training engineers in these tools is then described. Tools include institutional analysis, policy profiling, and other impact…

  3. Making Early Modern Medicine: Reproducing Swedish Bitters.

    PubMed

    Ahnfelt, Nils-Otto; Fors, Hjalmar

    2016-05-01

    Historians of science and medicine have rarely applied themselves to reproducing the experiments and practices of medicine and pharmacy. This paper delineates our efforts to reproduce "Swedish Bitters," an early modern composite medicine in wide European use from the 1730s to the present. In its original formulation, it was made from seven medicinal simples: aloe, rhubarb, saffron, myrrh, gentian, zedoary and agarikon. These were mixed in alcohol together with some theriac, a composite medicine of classical origin. The paper delineates the compositional history of Swedish Bitters and the medical rationale underlying its composition. It also describes how we reproduce the medicine in a laboratory using early modern pharmaceutical methods, and analyse it using contemporary methods of pharmaceutical chemistry. Our aim is twofold: first, to show how reproducing medicines may provide a path towards a deeper understanding of the role of sensual and practical knowledge in the wider context of early modern medical culture; and second, how it may yield interesting results from the point of view of contemporary pharmaceutical science.

  4. Light Water Reactor Sustainability Program Operator Performance Metrics for Control Room Modernization: A Practical Guide for Early Design Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald Boring; Roger Lew; Thomas Ulrich

    2014-03-01

    As control rooms are modernized with new digital systems at nuclear power plants, it is necessary to evaluate the operator performance using these systems as part of a verification and validation process. There are no standard, predefined metrics available for assessing what is satisfactory operator interaction with new systems, especially during the early design stages of a new system. This report identifies the process and metrics for evaluating human system interfaces as part of control room modernization. The report includes background information on design and evaluation, a thorough discussion of human performance measures, and a practical example of how the process and metrics have been used as part of a turbine control system upgrade during the formative stages of design. The process and metrics are geared toward generalizability to other applications and serve as a template for utilities undertaking their own control room modernization activities.

  5. Some practical universal noiseless coding techniques

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1979-01-01

    Some practical adaptive techniques for the efficient noiseless coding of a broad class of such data sources are developed and analyzed. Algorithms are designed for coding discrete memoryless sources which have a known symbol probability ordering but unknown probability values. A general applicability of these algorithms to solving practical problems is obtained because most real data sources can be simply transformed into this form by appropriate preprocessing. These algorithms have exhibited performance only slightly above all entropy values when applied to real data with stationary characteristics over the measurement span. Performance considerably under a measured average data entropy may be observed when data characteristics are changing over the measurement span.
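
    Rice's adaptive schemes select among a family of simple codes on a per-block basis; the best-known member of that family is the Golomb-Rice code, sketched below for parameter k ≥ 1 (a minimal illustration of one code option, not the full adaptive algorithm from the report):

```python
def rice_encode(n, k):
    """Golomb-Rice code (k >= 1): unary quotient, '0' stop bit, k-bit remainder.
    Short codewords go to small values, matching a known probability ordering."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, "0{}b".format(k))

def rice_decode(bits, k):
    """Inverse of rice_encode for a single codeword."""
    q = bits.index("0")              # unary part: count of leading 1s
    r = int(bits[q + 1:q + 1 + k], 2)
    return (q << k) | r

# Round-trip check over a range of symbol values.
assert all(rice_decode(rice_encode(n, 3), 3) == n for n in range(64))
print(rice_encode(5, 2))  # 1001
```

    An adaptive coder in this spirit would try several values of k on each block of preprocessed samples and emit the shortest result, prefixed by the chosen option.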

  6. Hydrodynamic Simulations of Protoplanetary Disks with GIZMO

    NASA Astrophysics Data System (ADS)

    Rice, Malena; Laughlin, Greg

    2018-01-01

    Over the past several decades, the field of computational fluid dynamics has rapidly advanced as the range of available numerical algorithms and computationally feasible physical problems has expanded. The development of modern numerical solvers has provided a compelling opportunity to reconsider previously obtained results in search of yet-undiscovered effects that may be revealed through longer integration times and more precise numerical approaches. In this study, we compare the results of past hydrodynamic disk simulations with those obtained using modern numerical tools. We focus our study on the GIZMO code (Hopkins 2015), which uses meshless methods to solve the homogeneous Euler equations of hydrodynamics while eliminating problems arising from advection between grid cells. By comparing modern simulations with prior results, we hope to provide an improved understanding of the impact of fluid mechanics upon the evolution of protoplanetary disks.

  7. Multiple-Symbol Noncoherent Decoding of Uncoded and Convolutionally Coded Continuous Phase Modulation

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Raphaeli, D.

    2000-01-01

    Recently, a method for combined noncoherent detection and decoding of trellis codes (noncoherent coded modulation) has been proposed which can practically approach the performance of coherent detection.

  8. Video traffic characteristics of modern encoding standards: H.264/AVC with SVC and MVC extensions and H.265/HEVC.

    PubMed

    Seeling, Patrick; Reisslein, Martin

    2014-01-01

    Video encoding for multimedia services over communication networks has significantly advanced in recent years with the development of the highly efficient and flexible H.264/AVC video coding standard and its SVC extension. The emerging H.265/HEVC video coding standard as well as 3D video coding further advance video coding for multimedia communications. This paper first gives an overview of these new video coding standards and then examines their implications for multimedia communications by studying the traffic characteristics of long videos encoded with the new coding standards. We review video coding advances from MPEG-2 and MPEG-4 Part 2 to H.264/AVC and its SVC and MVC extensions as well as H.265/HEVC. For single-layer (nonscalable) video, we compare H.265/HEVC and H.264/AVC in terms of video traffic and statistical multiplexing characteristics. Our study is the first to examine the H.265/HEVC traffic variability for long videos. We also illustrate the video traffic characteristics and statistical multiplexing of scalable video encoded with the SVC extension of H.264/AVC as well as 3D video encoded with the MVC extension of H.264/AVC.
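
    A basic frame-size variability measure of the kind used in such traffic studies is the coefficient of variation (standard deviation over mean); a minimal sketch with toy traces (the actual study works with long encoded video traces):

```python
import statistics

def cov(frame_sizes):
    """Coefficient of variation (population std / mean) of a frame-size trace."""
    return statistics.pstdev(frame_sizes) / statistics.fmean(frame_sizes)

# Toy frame-size traces in bytes: a smooth encoding vs. a bursty one.
smooth = [1000, 1100, 950, 1050, 900]
bursty = [200, 3000, 150, 2800, 250]
print(cov(smooth) < cov(bursty))  # True
```

    A burstier trace has a higher coefficient of variation, which directly affects how much link capacity must be provisioned and how much statistical multiplexing can smooth aggregate traffic.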

  9. Video Traffic Characteristics of Modern Encoding Standards: H.264/AVC with SVC and MVC Extensions and H.265/HEVC

    PubMed Central

    2014-01-01

    Video encoding for multimedia services over communication networks has significantly advanced in recent years with the development of the highly efficient and flexible H.264/AVC video coding standard and its SVC extension. The emerging H.265/HEVC video coding standard as well as 3D video coding further advance video coding for multimedia communications. This paper first gives an overview of these new video coding standards and then examines their implications for multimedia communications by studying the traffic characteristics of long videos encoded with the new coding standards. We review video coding advances from MPEG-2 and MPEG-4 Part 2 to H.264/AVC and its SVC and MVC extensions as well as H.265/HEVC. For single-layer (nonscalable) video, we compare H.265/HEVC and H.264/AVC in terms of video traffic and statistical multiplexing characteristics. Our study is the first to examine the H.265/HEVC traffic variability for long videos. We also illustrate the video traffic characteristics and statistical multiplexing of scalable video encoded with the SVC extension of H.264/AVC as well as 3D video encoded with the MVC extension of H.264/AVC. PMID:24701145

  10. Incorporation of Whole, Ancient Grains into a Modern Asian Indian Diet: Practical Strategies to Reduce the Burden of Chronic Disease

    PubMed Central

    Dixit, Anjali A.; Azar, Kristen M. J.; Gardner, Christopher D.; Palaniappan, Latha P.

    2011-01-01

    Refined carbohydrates, such as white rice and white flour, are the mainstay of the modern Asian Indian diet, and may contribute to the rising incidence of type 2 diabetes and cardiovascular disease in this population. Prior to the 1950s, whole grains such as amaranth, barley, brown rice, millet, and sorghum were more commonly used in Asian Indian cooking. These grains and other non-Indian grains such as couscous, quinoa, and spelt are nutritionally advantageous and may be culturally acceptable carbohydrate substitutes for Asian Indians. This review focuses on practical recommendations for culturally sensitive carbohydrate modification in a modern Asian Indian diet, in an effort to reduce type 2 diabetes and cardiovascular disease in this population. PMID:21790614

  11. Professional Ethics in Teaching: Towards the Development of a Code of Practice.

    ERIC Educational Resources Information Center

    Campbell, Elizabeth

    2000-01-01

    Provides a theoretical discussion about the process of creating a professional code of ethics for educators. Discusses six key issues and questions, introducing the development of a code of professional ethics and the complexities the code should address. Includes references. (CMK)

  12. 77 FR 13513 - Modernizing the Regulation of Clinical Trials and Approaches to Good Clinical Practice; Public...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-07

    ...The Food and Drug Administration (FDA) is announcing a 2-day public hearing to obtain input from interested persons on FDA's scope and direction in modernizing the regulations, policies, and practices that apply to the conduct of clinical trials of FDA-regulated products. Clinical trials are a critical source of evidence to inform medical policy and practice, and effective regulatory oversight is needed to ensure that human subjects are protected and resulting clinical trial data are credible and accurate. FDA is aware of concerns within the clinical trial community that certain regulations and policies applicable to the conduct of clinical trials may result in inefficiencies or increased cost and may not facilitate the use of innovative methods and technological advances to improve clinical trial quality. The Agency is involved in an effort to modernize the regulatory framework that governs clinical trials and approaches to good clinical practice (GCP). The purpose of this hearing is to solicit public input from a broad group of stakeholders on the scope and direction of this effort, including encouraging the use of innovative models that may enhance the effectiveness and efficiency of the clinical trial enterprise.

  13. ECLIPPx: an innovative model for reflective portfolios in life-long learning.

    PubMed

    Cheung, C Ronny

    2011-03-01

    For healthcare professionals, the educational portfolio is the most widely used component of lifelong learning - a vital aspect of modern medical practice. When used effectively, portfolios provide evidence of continuous learning and promote reflective practice. But traditional portfolio models are in danger of becoming outmoded, in the face of changing expectations of healthcare provider competences today. Portfolios in health care have generally focused on competencies in clinical skills. However, many other domains of professional development, such as professionalism and leadership skills, are increasingly important for doctors and health care professionals, and must be addressed in amassing evidence for training and revalidation. There is a need for modern health care learning portfolios to reflect this sea change. A new model for categorising the health care portfolios of professionals is proposed. The ECLIPPx model is based on personal practice, and divides the evidence of ongoing professional learning into four categories: educational development; clinical practice; leadership, innovation and professionalism; and personal experience. The ECLIPPx model offers a new approach for personal reflection and longitudinal learning, one that gives flexibility to the user whilst simultaneously encompassing the many relatively new areas of competence and expertise that are now required of a modern doctor. © Blackwell Publishing Ltd 2011.

  14. Amalgamation of management information system into anaesthesiology practice: A boon for the modern anaesthesiologists

    PubMed Central

    Bajwa, Sukhminder Jit Singh

    2014-01-01

    Over the years, traditional anaesthesia record keeping system has been the backbone of anaesthesiology ever since its introduction in the 1890s by Dr. Harvey Cushing and Dr. Ernest A. Codman. Besides providing the important information regarding patients’ vital physiologic parameters, paper records had been a reliable source for various clinical research activities. The introduction of electronic monitoring gadgets and electronic record keeping systems has revolutionised the anaesthesiology practice to a large extent. Recently, the introduction of anaesthesia information management system (AIMS), which incorporates all the features of monitoring gadgets, such as electronic storage of large accurate data, quality assurance in anaesthesia, enhancing patient safety, ensuring legal protection, improved billing services and effecting an organisational change, is almost a revolution in modern-day anaesthesiology practice. The clinical research activities that are responsible for taking anaesthesiology discipline to higher peaks have also been boosted by the amalgamation of AIMS, enabling multicenter studies and sharing of clinical data. Barring few concerns in its installation, cost factors and functional aspects, the future of AIMS seems to be bright and will definitely prove to be a boon for modern-day anaesthesiology practice. PMID:24963173

  15. Psychoanalysis in modern mental health practice.

    PubMed

    Yakeley, Jessica

    2018-05-01

    Like any discipline, psychoanalysis has evolved considerably since its inception by Freud over a century ago, and a multitude of different psychoanalytic traditions and schools of theory and practice now exist. However, some of Freud's original ideas, such as the dynamic unconscious, a developmental approach, defence mechanisms, and transference and countertransference remain essential tenets of psychoanalytic thinking to this day. This Review outlines several areas within modern mental health practice in which contemporary adaptations and applications of these psychoanalytic concepts might offer helpful insights and improvements in patient care and management, and concludes with an overview of evidence-based psychoanalytically informed treatments and the links between psychoanalysis, attachment research, and neuroscience. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Between trust and accountability: different perspectives on the modernization of postgraduate medical training in the Netherlands.

    PubMed

    Wallenburg, Iris; van Exel, Job; Stolk, Elly; Scheele, Fedde; de Bont, Antoinette; Meurs, Pauline

    2010-06-01

    Postgraduate medical training was reformed to be more responsive to changing societal needs. In the Netherlands, as in various other Western countries, a competency-based curriculum was introduced reflecting the clinical and nonclinical roles a modern doctor should fulfill. It is still unclear, however, what this modernization process exactly comprises and what its consequences might be for clinical practice and medical work. The authors conducted a Q methodological study to investigate which different perspectives exist on the modernization of postgraduate medical training among actors involved. The authors found four distinct perspectives, reflecting the different features of medical training. The accountability perspective stresses the importance of formal regulations within medical training and the monitoring of results in order to be more transparent and accountable to society. According to the educational perspective, medical training should be more formalized and directed at the educational process. The work-life balance perspective stresses the balance between a working life and a private life, as well as the changing professional relationship between staff members and residents. The trust-based perspective reflects the classic view of medical training in which role modeling and trust are considered most important. The four perspectives on the modernization of postgraduate medical training show that various aspects of the modernization process are valued differently by stakeholders, highlighting important sources of agreement and disagreement between them. An important source of disagreement is diverging expectations of the role of physicians in modern medical practice.

  17. Unusual but sound minds: mental health indicators in spiritual individuals.

    PubMed

    Farias, Miguel; Underwood, Raphael; Claridge, Gordon

    2013-08-01

    Previous research has linked certain types of modern spirituality, including New Age and Pagan, with either benign schizotypy or insecure attachment. While the first view emphasizes a positive aspect of spiritual believers' mental health (benign schizotypy), the second view emphasizes a negative aspect, namely the unhealthy emotional compensation associated with an insecure attachment style. This study addresses these two conflicting views by comparing a sample of modern spiritual individuals (N = 114) with a contrast group of traditional religious believers (N = 86). Measures of schizotypy and attachment style were combined with mental health scales of anxiety and depression. We further assessed death anxiety to determine whether modern spiritual beliefs fulfilled a similar function as traditional religious beliefs in the reduction of existential threat. Our results support a psychological contiguity between traditional and modern spiritual believers and reinforce the need to de-stigmatize spiritual ideas and experiences. Using hierarchical regression, we showed that unusual experiences and ideas are the major predictor of engagement in modern spiritual practices. Anxiety, depression variables, and insecure attachment were not significant predictors of spirituality or correlated with them; on the other hand, the results show that spiritual believers report high social support satisfaction and this variable predicts involvement in modern spirituality. Further, spiritual practices were negatively correlated with and negatively predicted by death anxiety scores. Overall, the results strengthen the association between modern spirituality, good mental health, and general well-being. ©2012 The British Psychological Society.

  18. History of mathematics and history of science reunited?

    PubMed

    Gray, Jeremy

    2011-09-01

    For some years now, the history of modern mathematics and the history of modern science have developed independently. A step toward a reunification that would benefit both disciplines could come about through a revived appreciation of mathematical practice. Detailed studies of what mathematicians actually do, whether local or broadly based, have often led in recent work to examinations of the social, cultural, and national contexts, and more can be done. Another recent approach toward a historical understanding of the abstractness of modern mathematics has been to see it as a species of modernism, and this thesis will be tested by the raft of works on the history of modern applied mathematics currently under way.

  19. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.

    1995-12-31

    In conformity with the protocol of the workshop under the contract "Assessment of RBMK Reactor Safety Using Modern Western Codes", VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and to assess whether VNIIEF codes are suitable for RBMK-type reactor safety assessment computations. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), including cell, polycell and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results computed with the NESTLE code (USA), performed in the geometry and with the neutron constants provided by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. The kinetics computations were performed in two formulations, both developed in collaboration with NIKIET. The first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of control and protection system (CPS) movement in a core.

  20. Parallel Processable Cryptographic Methods with Unbounded Practical Security.

    ERIC Educational Resources Information Center

    Rothstein, Jerome

    Addressing the problem of protecting confidential information and data stored in computer databases from access by unauthorized parties, this paper details coding schemes which present such astronomical work factors to potential code breakers that security breaches are hopeless in any practical sense. Two procedures which can be used to encode for…

  1. RETRACTED — PMD mitigation through interleaving LDPC codes with polarization scramblers

    NASA Astrophysics Data System (ADS)

    Han, Dahai; Chen, Haoran; Xi, Lixia

    2012-11-01

    The combination of forward error correction (FEC) and distributed fast polarization scramblers (D-FPSs) has been shown to be an effective method to mitigate polarization mode dispersion (PMD) in high-speed optical fiber communication systems. In this paper, low-density parity-check (LDPC) codes are introduced into the PMD mitigation scheme with D-FPSs as one of the promising FEC codes for achieving better performance. The scrambling speed of the FPS for an LDPC (2040, 1903) coded system is discussed, and a reasonable speed of 10 MHz is obtained from the simulation results. For easy application in a practical large scale integrated (LSI) circuit, the number of iterations in decoding the LDPC codes is also investigated. The PMD tolerance and cut-off optical signal-to-noise ratio (OSNR) of LDPC codes are compared with Reed-Solomon (RS) codes under different conditions. In the simulation, interleaving the LDPC codes brings an incremental improvement in error correction, and the PMD tolerance is 10 ps at OSNR = 11.4 dB. The results indicate that LDPC codes are a viable substitute for traditional RS codes with D-FPSs, and all of the executable code files are open to researchers who have a practical LSI platform for PMD mitigation.

  2. PMD mitigation through interleaving LDPC codes with polarization scramblers

    NASA Astrophysics Data System (ADS)

    Han, Dahai; Chen, Haoran; Xi, Lixia

    2013-09-01

    The combination of forward error correction (FEC) and distributed fast polarization scramblers (D-FPSs) has been shown to be an effective method to mitigate polarization mode dispersion (PMD) in high-speed optical fiber communication systems. In this article, low-density parity-check (LDPC) codes are introduced into the PMD mitigation scheme with D-FPSs as one of the promising FEC codes for achieving better performance. The scrambling speed of the FPS for an LDPC (2040, 1903) coded system is discussed, and a reasonable speed of 10 MHz is obtained from the simulation results. For easy application in a practical large scale integrated (LSI) circuit, the number of iterations in decoding the LDPC codes is also investigated. The PMD tolerance and cut-off optical signal-to-noise ratio (OSNR) of LDPC codes are compared with Reed-Solomon (RS) codes under different conditions. In the simulation, interleaving the LDPC codes brings an incremental improvement in error correction, and the PMD tolerance is 10 ps at OSNR = 11.4 dB. The results indicate that LDPC codes are a viable substitute for traditional RS codes with D-FPSs, and all of the executable code files are open to researchers who have a practical LSI platform for PMD mitigation.
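
    The simulation code itself is not included in this record. As a minimal, self-contained illustration of the iterative LDPC decoding principle the scheme relies on, the sketch below runs hard-decision bit-flipping decoding on a toy parity-check matrix; the (7, 4) Hamming matrix and the single-bit error are invented for illustration and are unrelated to the paper's (2040, 1903) code.

```python
# Toy hard-decision bit-flipping decoder, illustrating the kind of
# iterative decoding LDPC schemes build on. Matrix and data are
# invented for illustration; real LDPC codes use large sparse matrices,
# and this toy decoder is not guaranteed to correct every error pattern.

H = [  # parity-check matrix of the (7, 4) Hamming code
    [1, 1, 1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0, 1, 0],
    [1, 0, 1, 1, 0, 0, 1],
]

def bit_flip_decode(word, H, max_iters=10):
    word = list(word)
    for _ in range(max_iters):
        # Syndrome: which parity checks are unsatisfied?
        unsatisfied = [i for i, row in enumerate(H)
                       if sum(h * b for h, b in zip(row, word)) % 2]
        if not unsatisfied:
            return word  # all checks pass: valid codeword
        # Flip the bit involved in the most unsatisfied checks.
        votes = [sum(H[i][j] for i in unsatisfied) for j in range(len(word))]
        word[votes.index(max(votes))] ^= 1
    return word

codeword = [0] * 7          # the all-zeros word is always a valid codeword
received = codeword[:]
received[0] ^= 1            # channel flips one bit
print(bit_flip_decode(received, H))  # → [0, 0, 0, 0, 0, 0, 0]
```

    The error-position votes play the role that message passing plays in full soft-decision LDPC decoders; the paper's interleaving and scrambling layers sit on top of this kind of inner decoder.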

  3. Myths of Teaching the Golf Swing.

    ERIC Educational Resources Information Center

    Kraft, Robert E.

    1987-01-01

    This article dispels 11 myths about common teaching practices and misconceptions about the modern golf swing. Each myth is counterbalanced by facts presented by researchers about appropriate movements, skills, and practices. (CB)

  4. 75 FR 46903 - Notice of Proposed Changes to the National Handbook of Conservation Practices for the Natural...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-04

    ... Treatment (Code 521D), Pond Sealing or Lining--Soil Dispersant Treatment (Code 521B), Salinity and Sodic Soil Management (Code 610), Stream Habitat Improvement and Management (Code 395), Vertical Drain (Code... the criteria section; an expansion of the considerations section to include fish and wildlife and soil...

  5. A Parallel Numerical Micromagnetic Code Using FEniCS

    NASA Astrophysics Data System (ADS)

    Nagy, L.; Williams, W.; Mitchell, L.

    2013-12-01

    Many problems in the geosciences depend on understanding the ability of magnetic minerals to provide stable paleomagnetic recordings. Numerical micromagnetic modelling allows us to calculate the domain structures found in naturally occurring magnetic materials. However, the computational cost rises exceedingly quickly with the size and complexity of the geometries that we wish to model. This problem is compounded by the fact that modern processor design no longer focuses on the speed at which calculations are performed, but rather on the number of computational units amongst which we may distribute our calculations. Consequently, to better exploit modern computational resources, our micromagnetic simulations must "go parallel". We present a parallel and scalable micromagnetics code written using FEniCS. FEniCS is a multinational collaboration involving several institutions (University of Cambridge, University of Chicago, the Simula Research Laboratory, etc.) that aims to provide a set of tools for writing scientific software, in particular software that employs the finite element method. The advantages of this approach are the leveraging of pre-existing projects from the world of scientific computing (PETSc, Trilinos, Metis/ParMetis, etc.) and exposing these so that researchers may pose problems in a manner closer to the mathematical language of their domain. Our code provides a scriptable interface (in Python) that allows users not only to run micromagnetic models in parallel, but also to perform pre- and post-processing of data.

  6. Future GOES-R global ground receivers

    NASA Astrophysics Data System (ADS)

    Dafesh, P. A.; Grayver, E.

    2006-08-01

    The Aerospace Corporation has developed an end-to-end testbed to demonstrate a wide range of modern modulation and coding alternatives for future broadcast by the GOES-R Global Rebroadcast (GRB) system. In particular, this paper describes the development of a compact, low-cost, flexible GRB digital receiver that was designed, implemented, fabricated, and tested as part of the development. This receiver demonstrates a 10-fold increase in data rate compared to the rate achievable by the current GOES generation, without a major impact on either cost or size. The digital receiver is integrated on a single PCI card with an FPGA device and analog-to-digital converters. It supports a wide range of modulations (including 8-PSK and 16-QAM) and turbo coding. With appropriate FPGA firmware and software changes, it can also be configured to receive the current (legacy) GOES signals. The receiver has been validated by sending large image files over a high-fidelity satellite channel emulator, including a space-qualified power amplifier and a white noise source. The receiver is a key component of a future GOES-R weather receiver system (also called a user terminal) that includes the antenna, low-noise amplifier, downconverter, filters, digital receiver, and receiver system software. This paper describes the receiver proof of concept and its application to providing a credible estimate of the impact of using modern modulation and coding techniques in the future GOES-R system.

  7. Gender and assistance: historical and conceptual considerations regarding assistance practices and policies.

    PubMed

    Martins, Ana Paula Vosne

    2011-12-01

    The article offers some theoretical and historical reflections on the concept of gender as it relates to the notion of assistance. It explores the political dimensions of both concepts and problematizes the dichotomy between the gender-marked realms of the political and the pre-political, a dichotomy that has greatly influenced modern political theory and thought. It then examines the modern state's care practices and the transformations in assistance which occurred within the charitable and assistance organizations that took shape in parallel and in consonance with this state action.

  8. Modern materials in fabrication of scaffolds for bone defect replacement

    NASA Astrophysics Data System (ADS)

    Bazlov, V. A.; Mamuladze, T. Z.; Pavlov, V. V.; Kirilova, I. A.; Sadovoy, M. A.

    2016-08-01

    The article defines the requirements for modern scaffold-forming materials and describes the main advantages and disadvantages of various synthetic materials. Osseointegration of synthetic scaffolds approved for use in medical practice is evaluated. Nylon 618 (certification ISO9001 1093-1-2009) is described as the most promising synthetic material used in medical practice. The authors briefly highlight the issues of individual bone grafting with the use of 3D printing technology. An example of contouring pelvis defect after removal of a giant tumor with the use of 3D models is provided.

  9. Electromagnetic on-aircraft antenna radiation in the presence of composite plates

    NASA Technical Reports Server (NTRS)

    Kan, S. H-T.; Rojas, R. G.

    1994-01-01

    The UTD-based NEWAIR3 code is modified so that it can model modern aircraft that incorporate composite plates. One good model of conductor-backed composites is the impedance boundary condition, in which the composites are replaced by surfaces with complex impedances. This impedance-plate model is then used to represent the composite plates in the NEWAIR3 code. In most applications, the aircraft distorts the desired radiation pattern of the antenna. However, test examples conducted in this report have shown that the undesired scattered fields are minimized if the right impedance values are chosen for the surface impedance plates.

  10. Field Validation of the Stability Limit of a Multi MW Turbine

    NASA Astrophysics Data System (ADS)

    Kallesøe, Bjarne S.; Kragh, Knud A.

    2016-09-01

    Long slender blades of modern multi-megawatt turbines exhibit a flutter-like instability at rotor speeds above a critical rotor speed. Knowing the critical rotor speed is crucial to a safe turbine design. The flutter-like instability can only be estimated using geometrically non-linear aeroelastic codes. In this study, the estimated rotor speed stability limit of a 7 MW state-of-the-art wind turbine is validated experimentally. The stability limit is estimated using Siemens Wind Power's in-house aeroelastic code, and the results show that the predicted stability limit is within 5% of the experimentally observed limit.

  11. Adaptive antioxidant methionine accumulation in respiratory chain complexes explains the use of a deviant genetic code in mitochondria.

    PubMed

    Bender, Aline; Hajieva, Parvana; Moosmann, Bernd

    2008-10-28

    Humans and most other animals use 2 different genetic codes to translate their hereditary information: the standard code for nuclear-encoded proteins and a modern variant of this code in mitochondria. Despite the pivotal role of the genetic code for cell biology, the functional significance of the deviant mitochondrial code has remained enigmatic since its first description in 1979. Here, we show that profound and functionally beneficial alterations on the encoded protein level were causative for the AUA codon reassignment from isoleucine to methionine observed in most mitochondrial lineages. We demonstrate that this codon reassignment leads to a massive accumulation of the easily oxidized amino acid methionine in the highly oxidative inner mitochondrial membrane. This apparently paradoxical outcome can yet be smoothly settled if the antioxidant surface chemistry of methionine is taken into account, and we present direct experimental evidence that intramembrane accumulation of methionine exhibits antioxidant and cytoprotective properties in living cells. Our results unveil that methionine is an evolutionarily selected antioxidant building block of respiratory chain complexes. Collective protein alterations can thus constitute the selective advantage behind codon reassignments, which authenticates the "ambiguous decoding" hypothesis of genetic code evolution. Oxidative stress has shaped the mitochondrial genetic code.
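
    The codon reassignment at the heart of the argument is easy to state concretely. The sketch below translates the same RNA fragment under the standard code and under the vertebrate mitochondrial code; only the handful of codons needed for the example are tabulated (a full table has 64 entries), and the fragment itself is invented for illustration.

```python
# Minimal codon tables showing the well-known vertebrate mitochondrial
# reassignments discussed above: AUA (Ile -> Met) and UGA (Stop -> Trp).
# Only the codons used in this example are included.

STANDARD = {"AUG": "Met", "AUA": "Ile", "UGG": "Trp", "UGA": "Stop"}
VERTEBRATE_MITO = dict(STANDARD, AUA="Met", UGA="Trp")  # reassigned codons

def translate(rna, table):
    codons = [rna[i:i + 3] for i in range(0, len(rna), 3)]
    return [table[c] for c in codons]

rna = "AUGAUAUGA"  # invented fragment: AUG | AUA | UGA
print(translate(rna, STANDARD))         # → ['Met', 'Ile', 'Stop']
print(translate(rna, VERTEBRATE_MITO))  # → ['Met', 'Met', 'Trp']
```

    The AUA reassignment is exactly the one the paper links to methionine accumulation in inner-membrane proteins: every AUA that encoded isoleucine under the standard code delivers methionine in most mitochondrial lineages.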

  12. Ethics in perioperative practice--patient advocacy.

    PubMed

    Schroeter, Kathryn

    2002-05-01

    Though often difficult, ethical decision making is necessary when caring for surgical patients. Perioperative nurses have to recognize ethical dilemmas and be prepared to take action based on the ethical code outlined in the American Nurses Association's (ANA's) Code of Ethics for Nurses with Interpretive Statements. In this second of a nine-part series that will help perioperative nurses relate the ANA code to their own area of practice, the author looks at the third provision statement, which addresses nurses' position as patient advocates.

  13. Distributed Joint Source-Channel Coding in Wireless Sensor Networks

    PubMed Central

    Zhu, Xuqi; Liu, Yu; Zhang, Lin

    2009-01-01

    Given that sensors in wireless sensor networks are energy-limited and operate over unreliable wireless channels, there is an urgent need for a low-complexity coding method with a high compression ratio and noise-resistant features. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing deployments, from theory to practice, of distributed joint source-channel coding over independent channels, multiple access channels and broadcast channels are introduced, respectively. To this end, we also present a practical scheme for compressing multiple correlated sources over independent channels. The simulation results demonstrate the desired efficiency. PMID:22408560

  14. An adaptable binary entropy coder

    NASA Technical Reports Server (NTRS)

    Kiely, A.; Klimesh, M.

    2001-01-01

    We present a novel entropy coding technique which is based on recursive interleaving of variable-to-variable length binary source codes. We discuss code design and performance estimation methods, as well as practical encoding and decoding algorithms.
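
    The abstract only summarizes the technique, so what follows is a hedged sketch of the underlying building block rather than the authors' coder: a variable-to-variable length binary code, in which a prefix-free, complete set of source phrases maps to a prefix-free set of codewords. The particular table is invented for illustration and favours inputs rich in zeros; the recursive interleaving layer of the actual method is not reproduced here.

```python
# A minimal variable-to-variable length binary code (illustrative table).
# Source phrases {000, 001, 01, 1} are prefix-free and complete, so the
# input parses uniquely; codewords {0, 100, 101, 11} are prefix-free,
# so the output decodes uniquely (Kraft sum = 1/2 + 1/8 + 1/8 + 1/4 = 1).

V2V = {"000": "0", "001": "100", "01": "101", "1": "11"}
INV = {v: k for k, v in V2V.items()}

def encode(bits):
    out, i = [], 0
    while i < len(bits):
        # Greedy parse: exactly one phrase matches at each position.
        for phrase, codeword in V2V.items():
            if bits.startswith(phrase, i):
                out.append(codeword)
                i += len(phrase)
                break
    return "".join(out)

def decode(code):
    out, buf = [], ""
    for bit in code:
        buf += bit
        if buf in INV:       # codewords are prefix-free: match is unambiguous
            out.append(INV[buf])
            buf = ""
    return "".join(out)

source = "0000010001"        # zero-rich input, 10 bits
print(encode(source))        # → 0100011 (7 bits)
print(decode(encode(source)) == source)  # → True
```

    On a source where zeros dominate, long runs of zeros compress three-to-one, which is the intuition behind designing the phrase set around the source statistics; the paper's contribution is a recursive interleaving of such codes, not reproduced in this sketch.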

  15. The impact of medical tourism and the code of medical ethics on advertisement in Nigeria

    PubMed Central

    Makinde, Olusesan Ayodeji; Brown, Brandon; Olaleye, Olalekan

    2014-01-01

    Advances in the management of clinical conditions are being made in several resource-poor countries, including Nigeria. Yet the code of medical ethics, which bars physicians and health practices from advertising the kinds of services they render, deters these practices. This is worsened by the incursion of medical tourism facilitators (MTFs), who continue to market healthcare services across countries over the internet and social media, thereby raising ethical questions. A significant review of the advertisement ban in the code of ethics is long overdue. Limited knowledge about advances in medical practice among physicians and the populace, the growing medical tourism industry and its attendant effects, and the possibility of driving brain gain provide evidence for repealing the code. Ethical issues, resistance to change and elitist ideas are mitigating factors working in the opposite direction. A repeal of the code of medical ethics against advertising would undoubtedly favor health facilities in the country that currently cannot advertise the kinds of services they render. A repeal or review of this code of medical ethics is necessary, with properly laid down guidelines on what advertising is and is not permitted. PMID:25722776

  16. The impact of medical tourism and the code of medical ethics on advertisement in Nigeria.

    PubMed

    Makinde, Olusesan Ayodeji; Brown, Brandon; Olaleye, Olalekan

    2014-01-01

    Advances in the management of clinical conditions are being made in several resource-poor countries, including Nigeria. Yet the code of medical ethics, which bars physicians and health practices from advertising the kinds of services they render, deters these practices. This is worsened by the incursion of medical tourism facilitators (MTFs), who continue to market healthcare services across countries over the internet and social media, thereby raising ethical questions. A significant review of the advertisement ban in the code of ethics is long overdue. Limited knowledge about advances in medical practice among physicians and the populace, the growing medical tourism industry and its attendant effects, and the possibility of driving brain gain provide evidence for repealing the code. Ethical issues, resistance to change and elitist ideas are mitigating factors working in the opposite direction. A repeal of the code of medical ethics against advertising would undoubtedly favor health facilities in the country that currently cannot advertise the kinds of services they render. A repeal or review of this code of medical ethics is necessary, with properly laid down guidelines on what advertising is and is not permitted.

  17. Comparison of kQ factors measured with a water calorimeter in flattening filter free (FFF) and conventional flattening filter (cFF) photon beams

    NASA Astrophysics Data System (ADS)

    de Prez, Leon; de Pooter, Jacco; Jansen, Bartel; Perik, Thijs; Wittkämper, Frits

    2018-02-01

    Recently, flattening filter free (FFF) beams became available for application in modern radiotherapy. FFF beams have several advantages over conventional flattening filtered (cFF) beams; however, differences in beam spectra at the point of interest in a phantom potentially affect ion chamber response. FFF beams are also non-uniform over the length of a typical reference ion chamber, and recombination is usually larger. Despite several studies describing FFF beam characteristics, only a limited number have investigated their effect on kQ factors. Some of those studies predicted significant discrepancies in kQ factors (0.4% up to 1.0%) if TPR20,10-based codes of practice (CoPs) were used. This study addresses the question of the extent to which kQ factors based on a TPR20,10 CoP can be applied in clinical reference dosimetry. It is the first study to compare kQ factors measured directly with an absorbed-dose-to-water primary standard in FFF-cFF pairs of clinical photon beams. This was done with a transportable water calorimeter described elsewhere. The measurements, corrected for recombination and beam radial non-uniformity, were performed in FFF-cFF beam pairs at 6 MV and 10 MV of an Elekta Versa HD for a selection of three different Farmer-type ion chambers (eight serial numbers). The ratios of measured kQ factors of the FFF-cFF beam pairs were compared with the TPR20,10 CoPs of the NCS and IAEA and the %dd(10)x CoP of the AAPM. For the TPR20,10-based CoPs, differences of less than 0.23% were found in kQ factors between the corresponding FFF-cFF beams, with standard uncertainties smaller than 0.35%, while for %dd(10)x these differences were smaller than 0.46% and within the expanded uncertainty of the measurements. Based on the measurements made with the equipment described in this study, the authors conclude that the kQ factors provided by the NCS-18 and IAEA TRS-398 codes of practice can be applied for flattening filter free beams without additional correction. However, existing codes of practice cannot be applied ignoring the significant volume averaging effect of the FFF beams over the ion chamber cavity; a corresponding volume averaging correction must be applied.

  18. XSEOS: An Open Software for Chemical Engineering Thermodynamics

    ERIC Educational Resources Information Center

    Castier, Marcelo

    2008-01-01

    An Excel add-in--XSEOS--that implements several excess Gibbs free energy models and equations of state has been developed for educational use. Several traditional and modern thermodynamic models are available in the package with a user-friendly interface. XSEOS has open code, is freely available, and should be useful for instructors and students…

  19. Learner Code-Switching in the Content-Based Foreign Language Classroom

    ERIC Educational Resources Information Center

    Liebscher, Grit; Dailey-O'Cain, Jennifer

    2005-01-01

    This article is republished from "The Canadian Modern Language Review," 60, 4, pp. 501-526. It is published as an article exchange between the "MLJ" and the "CMLR." The articles for the exchange were selected by committees from the Editorial Board of each journal according to the following criteria: articles of…

  20. Composeable Chat over Low-Bandwidth Intermittent Communication Links

    DTIC Science & Technology

    2007-04-01

    Small Text Compression (STC), introduced in this report, is a data compression algorithm intended to compress alphanumeric... Ziv-Lempel coding, the grandfather of most modern general-purpose file compression programs, watches for input symbol sequences that have previously... data. This section applies these techniques to create a new compression algorithm called Small Text Compression. Various sequence compression...
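
    The record is truncated, so STC itself cannot be reconstructed here. As a minimal runnable example of the Ziv-Lempel idea it builds on (learning previously seen symbol sequences and replacing repeats with dictionary references), here is a compact LZW encoder/decoder; LZW is a later dictionary variant of Ziv-Lempel coding, not the report's algorithm.

```python
# Compact LZW: the dictionary starts with all single bytes and grows
# as repeated sequences are observed, so repetitive text compresses.

def lzw_encode(text):
    table = {chr(i): i for i in range(256)}
    out, buf = [], ""
    for ch in text:
        if buf + ch in table:
            buf += ch                      # extend the current match
        else:
            out.append(table[buf])
            table[buf + ch] = len(table)   # learn the new sequence
            buf = ch
    if buf:
        out.append(table[buf])
    return out

def lzw_decode(codes):
    table = {i: chr(i) for i in range(256)}
    prev = table[codes[0]]
    out = [prev]
    for code in codes[1:]:
        entry = table.get(code, prev + prev[0])  # KwKwK corner case
        out.append(entry)
        table[len(table)] = prev + entry[0]      # mirror the encoder's table
        prev = entry
    return "".join(out)

text = "TOBEORNOTTOBEORTOBEORNOT"
codes = lzw_encode(text)
print(lzw_decode(codes) == text)  # → True
print(len(codes), "codes for", len(text), "characters")
```

    The decoder rebuilds the same dictionary the encoder used, one entry behind, which is why no dictionary needs to be transmitted; this shared-state trick is what makes the Ziv-Lempel family so widely used.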

  1. 42 CFR 414.904 - Average sales price as the basis for payment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subsection (c), the term billing unit means the identifiable quantity associated with a billing and payment code, as established by CMS. (c) Single source drugs—(1) Average sales price. The average sales price... report as required by section 623(c) of the Medicare Prescription Drug, Improvement, and Modernization...

  2. 42 CFR 414.904 - Average sales price as the basis for payment.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subsection (c), the term billing unit means the identifiable quantity associated with a billing and payment code, as established by CMS. (c) Single source drugs—(1) Average sales price. The average sales price... report as required by section 623(c) of the Medicare Prescription Drug, Improvement, and Modernization...

  3. Extra-Required Service: A Radical Shift in Frontline Geriatric Caregiving

    ERIC Educational Resources Information Center

    Clarke, Egerton

    2011-01-01

    Much research examines the professional nursing practices of traditional and modern caregivers, but it remains unclear whether the delivery of extra-required services is diminished as the caregiver moves from a traditional to a modern community. Building on the classical works of sociologists Ferdinand Tonnies, Max Weber, and Emile Durkheim, this…

  4. Thought Experiments in Physics Education: A Simple and Practical Example.

    ERIC Educational Resources Information Center

    Lattery, Mark J.

    2001-01-01

    Uses a Galilean thought experiment to enhance learning in a college-level physical science course. Presents both modern and historical perspectives of Galileo's work. As a final project, students explored Galileo's thought experiment in the laboratory using modern detectors with satisfying results. (Contains 25 references.) (Author/ASK)

  5. Statistics without Tears: Complex Statistics with Simple Arithmetic

    ERIC Educational Resources Information Center

    Smith, Brian

    2011-01-01

    One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…
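
    The time series decomposition the abstract alludes to really does reduce to simple arithmetic. The sketch below (data and period invented for illustration) estimates the trend with a centered moving average and the seasonal effects by averaging the detrended values season by season; an odd period keeps the moving average symmetric.

```python
# Additive time series decomposition with simple arithmetic:
# trend = centered moving average; seasonal = per-season mean of
# (series - trend). Data are synthetic: level 10 + t plus a
# repeating (-1, 0, 1) seasonal pattern of period 3.

def decompose(series, period):
    k = period // 2
    trend = [None] * len(series)
    for t in range(k, len(series) - k):
        window = series[t - k: t + k + 1]   # centered window (odd period)
        trend[t] = sum(window) / len(window)
    detrended = [x - tr for x, tr in zip(series, trend) if tr is not None]
    seasonal = [sum(detrended[i::period]) / len(detrended[i::period])
                for i in range(period)]
    return trend, seasonal

series = [10 + t + (-1, 0, 1)[t % 3] for t in range(12)]
trend, seasonal = decompose(series, 3)
# Recovers the seasonal effects {-1, 0, 1}; entries are phase-shifted
# by period // 2 because the series edges are trimmed.
print(seasonal)  # → [0.0, 1.0, -1.0]
```

    Nothing beyond averages and subtractions is needed, which is exactly the article's point about teaching such topics at the introductory level.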

  6. Performance of the fusion code GYRO on four generations of Cray computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fahey, Mark R

    2014-01-01

    GYRO is a code used for the direct numerical simulation of plasma microturbulence. It has been ported to a variety of modern MPP platforms, including several modern commodity clusters, IBM SPs, and Cray XC, XT, and XE series machines. We briefly describe the mathematical structure of the equations, the data layout, and the redistribution scheme. Also, while the performance and scaling of GYRO on many of these systems has been shown before, here we show the comparative performance and scaling on four generations of Cray supercomputers, including the newest addition, the Cray XC30. The more recently added hybrid OpenMP/MPI implementation also shows a great deal of promise on custom HPC systems that utilize fast CPUs and proprietary interconnects. Four machines of varying sizes were used in the experiment, all of which are located at the National Institute for Computational Sciences at the University of Tennessee at Knoxville and Oak Ridge National Laboratory. The advantages, limitations, and performance of using each system are discussed.

  7. CHARRON: Code for High Angular Resolution of Rotating Objects in Nature

    NASA Astrophysics Data System (ADS)

    Domiciano de Souza, A.; Zorec, J.; Vakili, F.

    2012-12-01

    Rotation is one of the fundamental physical parameters governing stellar physics and evolution. At the same time, spectrally resolved optical/IR long-baseline interferometry has proven to be an important observing tool to measure many physical effects linked to rotation, in particular stellar flattening, gravity darkening, and differential rotation. In order to interpret the high angular resolution observations from modern spectro-interferometers, such as VLTI/AMBER and VEGA/CHARA, we have developed an interferometry-oriented numerical model: CHARRON (Code for High Angular Resolution of Rotating Objects in Nature). We present here the characteristics of CHARRON, which is faster (≃10-30 s per model) and thus better adapted to model fitting than the first version of the code presented by Domiciano de Souza et al. (2002).

  8. Examples of Use of SINBAD Database for Nuclear Data and Code Validation

    NASA Astrophysics Data System (ADS)

    Kodeli, Ivan; Žerovnik, Gašper; Milocco, Alberto

    2017-09-01

    The SINBAD database currently contains compilations and evaluations of over 100 shielding benchmark experiments. The SINBAD database is widely used for code and data validation. Materials covered include: air, N, O, H2O, Al, Be, Cu, graphite, concrete, Fe, stainless steel, Pb, Li, Ni, Nb, SiC, Na, W, V, and mixtures thereof. Over 40 organisations from 14 countries and 2 international organisations have contributed data and work in support of SINBAD. Examples of the use of the database in the scope of different international projects, such as the Working Party on Evaluation Cooperation of the OECD and the European Fusion Programme, demonstrate the merit and possible usage of the database for the validation of modern nuclear data evaluations and new computer codes.

  9. High Speed Research Noise Prediction Code (HSRNOISE) User's and Theoretical Manual

    NASA Technical Reports Server (NTRS)

    Golub, Robert (Technical Monitor); Rawls, John W., Jr.; Yeager, Jessie C.

    2004-01-01

    This report describes a computer program, HSRNOISE, that predicts noise levels for a supersonic aircraft powered by mixed flow turbofan engines with rectangular mixer-ejector nozzles. It fully documents the noise prediction algorithms, provides instructions for executing the HSRNOISE code, and provides predicted noise levels for the High Speed Research (HSR) program Technology Concept (TC) aircraft. The component source noise prediction algorithms were developed jointly by Boeing, General Electric Aircraft Engines (GEAE), NASA and Pratt & Whitney during the course of the NASA HSR program. Modern Technologies Corporation developed an alternative mixer ejector jet noise prediction method under contract to GEAE that has also been incorporated into the HSRNOISE prediction code. Algorithms for determining propagation effects and calculating noise metrics were taken from the NASA Aircraft Noise Prediction Program.

  10. Atomic-scale Modeling of the Structure and Dynamics of Dislocations in Complex Alloys at High Temperatures

    NASA Technical Reports Server (NTRS)

    Daw, Murray S.; Mills, Michael J.

    2003-01-01

    We report on the progress made during the first year of the project. Most of the progress at this point has been on the theoretical and computational side. Here are the highlights: (1) A new code, tailored for high-end desktop computing, now combines modern Accelerated Dynamics (AD) with the well-tested Embedded Atom Method (EAM); (2) The new Accelerated Dynamics allows the study of relatively slow, thermally-activated processes, such as diffusion, which are much too slow for traditional Molecular Dynamics; (3) We have benchmarked the new AD code on a rather simple and well-known process: vacancy diffusion in copper; and (4) We have begun application of the AD code to the diffusion of vacancies in ordered intermetallics.

  11. Update on the Code Intercomparison and Benchmark for Muon Fluence and Absorbed Dose Induced by an 18 GeV Electron Beam After Massive Iron Shielding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fasso, A.; Ferrari, A.; Ferrari, A.

    In 1974, Nelson, Kase and Svensson published an experimental investigation on muon shielding around SLAC high-energy electron accelerators [1]. They measured muon fluence and absorbed dose induced by 14 and 18 GeV electron beams hitting a copper/water beam dump and attenuated in a thick steel shielding. In their paper, they compared the results with the theoretical models available at that time. In order to compare their experimental results with present model calculations, we use the modern transport Monte Carlo codes MARS15, FLUKA2011 and GEANT4 to model the experimental setup and run simulations. The results are then compared between the codes, and with the SLAC data.

  12. The Revised SEND Code of Practice 0-25: Effective Practice in Engaging Children and Young People in Decision-Making about Interventions for Social, Emotional and Mental Health Needs

    ERIC Educational Resources Information Center

    Kennedy, Emma-Kate

    2015-01-01

    A key principle upon which the Revised Special Educational Needs and Disability (SEND) Code of Practice 0-25 (2015) is based is children's involvement in decision-making that affects them, and a significant change is the removal of the term "behaviour" and an emphasis on social, emotional and mental health (SEMH) needs. To ensure that…

  13. Sustainable Rest Area Design and Operations

    DOT National Transportation Integrated Search

    2017-10-01

    One way in which State Departments of Transportation (DOTs) can modernize their rest areas while reducing operations and maintenance costs is by incorporating sustainable practices into rest area design and operations. Sustainability practices that D...

  14. [Prevalence of chronic fatigue syndrome in 4 family practices in Leiden].

    PubMed

    Versluis, R G; de Waal, M W; Opmeer, C; Petri, H; Springer, M P

    1997-08-02

    To determine the prevalence of chronic fatigue syndrome (CFS) in general practice. Descriptive. General practices and primary health care centres in the Leiden region, the Netherlands. RNUH-LEO is a computerized database which contains the anonymous patient information of one general practice (with two practitioners) and four primary health care centres. The fourteen participating general practitioners were asked which International Classification of Primary Care (ICPC) codes they used to indicate a patient with chronic fatigue or with CFS. With these codes, and with the code for depression, patients were selected from the database. It was then determined whether these patients met the criteria for CFS of Holmes et al. The general practitioners used 10 codes. Including the code for depression, a total of 601 patients were preselected from a total of 23,000 patients in the database. Based on the information from the patients' records in the database, 42 of the preselected patients were selected who might fulfil the Holmes criteria for CFS. According to the patients' own general practitioners, 25 of the 42 patients fulfilled the Holmes criteria. The men:women ratio was 1:5. The prevalence of CFS in the population surveyed was estimated to be at least 1.1 per 1,000 patients.
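
    The prevalence figure quoted in this record follows directly from the counts it reports; a minimal arithmetic check (using only numbers stated in the abstract):

    ```python
    # Prevalence check: 25 confirmed CFS patients out of ~23,000 registered patients,
    # expressed per 1,000 patients as in the abstract.
    cases = 25
    population = 23_000
    prevalence_per_1000 = cases / population * 1000
    print(round(prevalence_per_1000, 1))  # 1.1
    ```

    This matches the "at least 1.1 per 1,000 patients" estimate reported by the authors.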

  15. Emergency general surgery: definition and estimated burden of disease.

    PubMed

    Shafi, Shahid; Aboutanos, Michel B; Agarwal, Suresh; Brown, Carlos V R; Crandall, Marie; Feliciano, David V; Guillamondegui, Oscar; Haider, Adil; Inaba, Kenji; Osler, Turner M; Ross, Steven; Rozycki, Grace S; Tominaga, Gail T

    2013-04-01

    Acute care surgery encompasses trauma, surgical critical care, and emergency general surgery (EGS). While the first two components are well defined, the scope of EGS practice remains unclear. This article describes the work of the American Association for the Surgery of Trauma to define EGS. A total of 621 unique International Classification of Diseases-9th Rev. (ICD-9) diagnosis codes were identified using billing data (calendar year 2011) from seven large academic medical centers that practice EGS. A modified Delphi methodology was used by the American Association for the Surgery of Trauma Committee on Severity Assessment and Patient Outcomes to review these codes and achieve consensus on the definition of primary EGS diagnosis codes. National Inpatient Sample data from 2009 were used to develop a national estimate of EGS burden of disease. Several unique ICD-9 codes were identified as primary EGS diagnoses. These encompass a wide spectrum of general surgery practice, including upper and lower gastrointestinal tract, hepatobiliary and pancreatic disease, soft tissue infections, and hernias. National Inpatient Sample estimates revealed over 4 million inpatient encounters nationally in 2009 for EGS diseases. This article provides the first list of ICD-9 diagnoses codes that define the scope of EGS based on current clinical practices. These findings have wide implications for EGS workforce training, access to care, and research.

  16. Coding in Stroke and Other Cerebrovascular Diseases.

    PubMed

    Korb, Pearce J; Jones, William

    2017-02-01

    Accurate coding is critical for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of coding principles for patients with strokes and other cerebrovascular diseases and includes an illustrative case as a review of coding principles in a patient with acute stroke.

  17. Non-White, No More: Effect Coding as an Alternative to Dummy Coding with Implications for Higher Education Researchers

    ERIC Educational Resources Information Center

    Mayhew, Matthew J.; Simonoff, Jeffrey S.

    2015-01-01

    The purpose of this article is to describe effect coding as an alternative quantitative practice for analyzing and interpreting categorical, race-based independent variables in higher education research. Unlike indicator (dummy) codes that imply that one group will be a reference group, effect codes use average responses as a means for…
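
    The contrast the record describes can be sketched in code. This is a generic illustration with hypothetical levels ("A", "B", "C"), not data or code from the cited article: dummy (indicator) coding treats the dropped level as a reference group, while effect coding assigns that level -1 in every column so coefficients are interpreted against the average response:

    ```python
    # Hypothetical three-level categorical variable; "A" is the omitted level.
    levels = ["B", "C"]  # one column per non-omitted level

    def dummy_code(group):
        # Indicator coding: 1 for the matching column, 0 elsewhere; "A" -> all zeros,
        # making "A" the implicit reference group.
        return [1 if group == lvl else 0 for lvl in levels]

    def effect_code(group):
        # Effect (sum-to-zero) coding: identical to dummy coding except the omitted
        # level is coded -1 in every column, so each column sums to zero across levels.
        if group == "A":
            return [-1] * len(levels)
        return dummy_code(group)

    print(dummy_code("A"))   # [0, 0]
    print(effect_code("A"))  # [-1, -1]
    ```

    With effect codes, no single group serves as the baseline, which is the alternative practice the article advocates.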

  18. 76 FR 19971 - Notice of Proposed Changes to the National Handbook of Conservation Practices for the Natural...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-11

    ... 344), Silvopasture Establishment (Code 381), Tree/Shrub Establishment (Code 612), Waste Recycling... Criteria were added. Tree/Shrub Establishment (Code 612)--A new Purpose of ``Develop Renewable Energy...

  19. Cracking the Code: Synchronizing Policy and Practice for Performance-Based Learning

    ERIC Educational Resources Information Center

    Patrick, Susan; Sturgis, Chris

    2011-01-01

    Performance-based learning is one of the keys to cracking open the assumptions that undergird the current educational codes, structures, and practices. By finally moving beyond the traditions of a time-based system, greater customized educational services can flourish, preparing more and more students for college and careers. This proposed policy…

  20. Multilingual Practices in Contemporary and Historical Contexts: Interfaces between Code-Switching and Translation

    ERIC Educational Resources Information Center

    Kolehmainen, Leena; Skaffari, Janne

    2016-01-01

    This article serves as an introduction to a collection of four articles on multilingual practices in speech and writing, exploring both contemporary and historical sources. It not only introduces the articles but also discusses the scope and definitions of code-switching, attitudes towards multilingual interaction and, most pertinently, the…

  1. The Behavior Code Companion: Strategies, Tools, and Interventions for Supporting Students with Anxiety-Related and Oppositional Behaviors

    ERIC Educational Resources Information Center

    Minahan, Jessica

    2014-01-01

    Since its publication in 2012, "The Behavior Code: A Practical Guide to Understanding and Teaching the Most Challenging Students" has helped countless classroom teachers, special educators, and others implement an effective, new approach to teaching focused on skill-building, practical interventions, and purposeful, positive interactions…

  2. Learning by Doing: Teaching Decision Making through Building a Code of Ethics.

    ERIC Educational Resources Information Center

    Hawthorne, Mark D.

    2001-01-01

    Notes that applying abstract ethical principles to the practical business of building a code of applied ethics for a technical communication department teaches students that they share certain unarticulated or unconscious values that they can translate into ethical principles. Suggests that combining abstract theory with practical policy writing…

  3. Practical moral codes in the transgenic organism debate.

    PubMed

    Cooley, D R; Goreham, Gary; Youngs, George A

    2004-01-01

    In one study funded by the United States Department of Agriculture, people from North Dakota were interviewed to discover which moral principles they use in evaluating the morality of transgenic organisms and their introduction into markets. It was found that although the moral codes the human subjects employed were very similar, their views on transgenics were vastly different. In this paper, the codes that were used by the respondents are developed, compared to that of the academically composed Belmont Report, and then modified to create the more practical Common Moral Code. At the end, it is shown that the Common Moral Code has inherent inconsistency flaws that might be resolvable, but would require extensive work on the definition of terms and principles. However, the effort is worthwhile, especially if it results in a common moral code that all those involved in the debate are willing to use in negotiating a resolution to their differences.

  4. Autonomy, responsibility and the Italian Code of Deontology for Nurses.

    PubMed

    Barazzetti, Gaia; Radaelli, Stefania; Sala, Roberta

    2007-01-01

    This article is a first assessment of the Italian Code of deontology for nurses (revised in 1999) on the basis of data collected from focus groups with nurses taking part in the Ethical Codes in Nursing (ECN) project. We illustrate the professional context in which the Code was introduced and explain why the 1999 revision was necessary in the light of changes affecting the Italian nursing profession. The most remarkable findings concern professional autonomy and responsibility, and how the Code is thought of as a set of guidelines for nursing practice. We discuss these issues, underlining that the 1999 Code represents a valuable instrument for ethical reflection and examination, a stimulus for putting the moral sense of the nursing profession into action, and that it represents a new era for professional nursing practice in Italy. The results of the analysis also deserve further qualitative study and future consideration.

  5. Farm & Ranch Business Management. Second Edition.

    ERIC Educational Resources Information Center

    Steward, Jim; Jobes, Raleigh

    This practical guide for the agribusiness manager (the farmer, rancher, and other agribusiness people who work with agricultural commodities, supplies, and services) gives a basic understanding of modern management practices. It provides guidelines that can help them make practical business decisions. Chapter 1 is an introduction that highlights…

  6. Code Conversion Impact Factor and Cash Flow Impact of International Classification of Diseases, 10th Revision, on a Large Multihospital Radiology Practice.

    PubMed

    Jalilvand, Aryan; Fleming, Margaret; Moreno, Courtney; MacFarlane, Dan; Duszak, Richard

    2018-01-01

    The 2015 conversion of the International Classification of Diseases (ICD) system from the ninth revision (ICD-9) to the 10th revision (ICD-10) was widely projected to adversely impact physician practices. We aimed to assess code conversion impact factor (CCIF) projections and revenue delay impact to help radiology groups better prepare for the eventual conversion to the 11th revision (ICD-11). Studying 673,600 claims from 179 radiologists for the first year after ICD-10's implementation, we identified primary ICD-10 codes for the top 90th percentile of all examinations for the entire enterprise and each subspecialty division. Using established methodology, we calculated CCIFs (actual ICD-10 codes ÷ prior ICD-9 codes). To assess ICD-10's impact on cash flow, the average monthly days in accounts receivable status was compared for the 12 months before and after conversion. Of all 69,823 ICD-10 codes, only 7,075 were used to report primary diagnoses across the entire practice, and just 562 were used to report 90% of all claims, compared with 348 under ICD-9. This translates to an overall CCIF of 1.6 for the department (far less than the literature-predicted 6). By subspecialty division, CCIFs ranged from 0.7 (breast) to 3.5 (musculoskeletal). Monthly average days in accounts receivable for the 12 months before and after ICD-10 conversion did not increase. The operational impact of the ICD-10 transition on radiology practices appears far less than anticipated with respect to both CCIF and delays in cash flow. Predictive models should be refined to help practices better prepare for ICD-11. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
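
    The CCIF in this record is a simple ratio, and the department-level value follows from the code counts the abstract reports:

    ```python
    # CCIF = (distinct ICD-10 codes covering 90% of claims)
    #      / (distinct ICD-9 codes that previously covered 90% of claims).
    # Counts taken from the abstract above.
    icd10_codes = 562
    icd9_codes = 348
    ccif = icd10_codes / icd9_codes
    print(round(ccif, 1))  # 1.6
    ```

    This reproduces the reported department-wide CCIF of 1.6, well below the factor of 6 predicted in the earlier literature.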

  7. ENDF-6 Formats Manual Data Formats and Procedures for the Evaluated Nuclear Data File ENDF/B-VI and ENDF/B-VII

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herman, M.; Members of the Cross Sections Evaluation Working Group

    2009-06-01

    In December 2006, the Cross Section Evaluation Working Group (CSEWG) of the United States released the new ENDF/B-VII.0 library. This represented a considerable achievement, as it was the first major release since ENDF/B-VI was made publicly available in 1990. The two libraries were released in the same format, ENDF-6, which was originally developed for the ENDF/B-VI library. In the early stage of work on the seventh generation of the library, CSEWG made the important decision to retain the same format. This decision was adopted even though it was argued that it would be timely to modernize the formats, and several interesting ideas were proposed. After careful deliberation, CSEWG concluded that actual implementation would require considerable resources to modify processing codes and to guarantee the high quality of the files processed by these codes. In view of this, the idea of format modernization was postponed and the ENDF-6 format was adopted for the new ENDF/B-VII library. In several other areas related to ENDF we did our best to move beyond established tradition and achieve maximum modernization. Thus, the 'Big Paper' on ENDF/B-VII.0 was published, also in December 2006, as a Special Issue of Nuclear Data Sheets 107 (2006) 2931-3060. The new web retrieval and plotting system for ENDF-6 formatted data, Sigma, was developed by the NNDC and released in 2007. An extensive paper on the advanced tool for nuclear reaction data evaluation, EMPIRE, was published in 2007. This effort was complemented by the release of an updated set of ENDF checking codes in 2009. As the final item on this list, a major revision of the ENDF-6 Formats Manual was made. This work started in 2006 and came to fruition in 2009, as documented in the present report.

  8. The DoD's High Performance Computing Modernization Program - Ensuring the National Earth Systems Prediction Capability Becomes Operational

    NASA Astrophysics Data System (ADS)

    Burnett, W.

    2016-12-01

    The Department of Defense's (DoD) High Performance Computing Modernization Program (HPCMP) provides high performance computing to address the most significant challenges in computational resources, software application support and nationwide research and engineering networks. Today, the HPCMP has a critical role in ensuring the National Earth System Prediction Capability (N-ESPC) achieves initial operational status in 2019. A 2015 study commissioned by the HPCMP found that N-ESPC computational requirements will exceed interconnect bandwidth capacity due to the additional load from data assimilation and passing connecting data between ensemble codes. Memory bandwidth and I/O bandwidth will continue to be significant bottlenecks for the Navy's Hybrid Coordinate Ocean Model (HYCOM) scalability - by far the major driver of computing resource requirements in the N-ESPC. The study also found that few of the N-ESPC model developers have detailed plans to ensure their respective codes scale through 2024. Three HPCMP initiatives are designed to directly address and support these issues: Productivity Enhancement, Technology, Transfer and Training (PETTT), the HPCMP Applications Software Initiative (HASI), and Frontier Projects. PETTT supports code conversion by providing assistance, expertise and training in scalable and high-end computing architectures. HASI addresses the continuing need for modern application software that executes effectively and efficiently on next-generation high-performance computers. Frontier Projects enable research and development that could not be achieved using typical HPCMP resources by providing multi-disciplinary teams access to exceptional amounts of high performance computing resources. Finally, the Navy's DoD Supercomputing Resource Center (DSRC) currently operates a 6 Petabyte system, of which Naval Oceanography receives 15% of operational computational system use, or approximately 1 Petabyte of the processing capability. 
The DSRC will provide the DoD with future computing assets to initially operate the N-ESPC in 2019. This talk will further describe how DoD's HPCMP will ensure N-ESPC becomes operational, efficiently and effectively, using next-generation high performance computing.

  9. A Not-So-Gentle Refutation of the Defence of Homeopathy.

    PubMed

    Zawiła-Niedźwiecki, Jakub; Olender, Jacek

    2016-03-01

    In a recent paper, Levy, Gadd, Kerridge, and Komesaroff attempt to defend the ethicality of homeopathy by attacking the utilitarian ethical framework as a basis for medical ethics and by introducing a distinction between evidence-based medicine and modern science. This paper demonstrates that their argumentation is not only insufficient to achieve that goal but also incorrect. Utilitarianism is not required to show that homeopathic practice is unethical; indeed, any normative basis of medical ethics will make it unethical, as a defence of homeopathic practice requires the rejection of modern natural sciences, which are an integral part of medical ethics systems. This paper also points out that evidence-based medicine lies at the very core of modern science. Particular arguments made by Levy et al. within the principlist medical ethics normative system are also shown to be wrong.

  10. Concept and Practice of Teaching Technical University Students to Modern Technologies of 3d Data Acquisition and Processing: a Case Study of Close-Range Photogrammetry and Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Kravchenko, Iulia; Luhmann, Thomas; Shults, Roman

    2016-06-01

    For the preparation of modern specialists in the acquisition and processing of three-dimensional data, a broad and detailed study of the related modern methods and technologies is necessary. One of the most progressive and effective methods of acquiring and analyzing spatial data is terrestrial laser scanning. The study of methods and technologies for terrestrial laser scanning is of great importance not only for GIS specialists but also for surveying engineers who make decisions in traditional engineering tasks (monitoring, as-built surveys, etc.). Forming the right approach to preparing new professionals requires the development of a modern and flexible educational program. This educational program must provide effective practical and laboratory work and student coursework. The resulting knowledge should form the basis for the practical work or research of young engineers. In 2014, the Institute of Applied Sciences (Jade University Oldenburg, Germany) and Kyiv National University of Construction and Architecture (Kiev, Ukraine) launched a joint educational project to introduce terrestrial laser scanning technology for the collection and processing of spatial data. As a result of this project, practical recommendations have been developed for the organization of educational processes using terrestrial laser scanning. An advanced project-oriented educational program was developed, which is presented in this paper. To demonstrate the effectiveness of the program, a 3D model of the large and complex main campus of Kyiv National University of Construction and Architecture has been generated.

  11. Lightning Protection Certification for High Explosives Facilities at Lawrence Livermore National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clancy, T J; Brown, C G; Ong, M M

    2006-01-11

    Presented here is an innovation in lightning safety certification, and a description of its implementation for high explosives processing and storage facilities at Lawrence Livermore National Laboratory. Lightning rods have proven useful in the protection of wooden structures; however, modern structures made of rebar, concrete, and the like require fresh thinking. Our process involves a rigorous and unique approach to lightning safety for modern buildings, in which the internal voltages and currents are quantified and the risk assessed. To follow are the main technical aspects of lightning protection for modern structures; these methods comply with the requirements of the National Fire Protection Association, the National Electrical Code, and the Department of Energy [1][2]. At the date of this release, we have certified over 70 HE processing and storage cells at our Site 300 facility.

  12. Decelerated genome evolution in modern vertebrates revealed by analysis of multiple lancelet genomes

    PubMed Central

    Huang, Shengfeng; Chen, Zelin; Yan, Xinyu; Yu, Ting; Huang, Guangrui; Yan, Qingyu; Pontarotti, Pierre Antoine; Zhao, Hongchen; Li, Jie; Yang, Ping; Wang, Ruihua; Li, Rui; Tao, Xin; Deng, Ting; Wang, Yiquan; Li, Guang; Zhang, Qiujin; Zhou, Sisi; You, Leiming; Yuan, Shaochun; Fu, Yonggui; Wu, Fenfang; Dong, Meiling; Chen, Shangwu; Xu, Anlong

    2014-01-01

    Vertebrates diverged from other chordates ~500 Myr ago and experienced successful innovations and adaptations, but the genomic basis underlying vertebrate origins is not fully understood. Here we suggest, through comparison with multiple lancelet (amphioxus) genomes, that ancient vertebrates experienced high rates of protein evolution, genome rearrangement and domain shuffling and that these rates greatly slowed down after the divergence of jawed and jawless vertebrates. Compared with lancelets, modern vertebrates retain, at least relatively, less protein diversity, fewer nucleotide polymorphisms, domain combinations and conserved non-coding elements (CNE). Modern vertebrates also lost substantial transposable element (TE) diversity, whereas lancelets preserve high TE diversity that includes even the long-sought RAG transposon. Lancelets also exhibit rapid gene turnover, pervasive transcription, the fastest exon shuffling in metazoans and substantial TE methylation not observed in other invertebrates. These new lancelet genome sequences provide new insights into the chordate ancestral state and vertebrate evolution. PMID:25523484

  13. Decelerated genome evolution in modern vertebrates revealed by analysis of multiple lancelet genomes.

    PubMed

    Huang, Shengfeng; Chen, Zelin; Yan, Xinyu; Yu, Ting; Huang, Guangrui; Yan, Qingyu; Pontarotti, Pierre Antoine; Zhao, Hongchen; Li, Jie; Yang, Ping; Wang, Ruihua; Li, Rui; Tao, Xin; Deng, Ting; Wang, Yiquan; Li, Guang; Zhang, Qiujin; Zhou, Sisi; You, Leiming; Yuan, Shaochun; Fu, Yonggui; Wu, Fenfang; Dong, Meiling; Chen, Shangwu; Xu, Anlong

    2014-12-19

    Vertebrates diverged from other chordates ~500 Myr ago and experienced successful innovations and adaptations, but the genomic basis underlying vertebrate origins is not fully understood. Here we suggest, through comparison with multiple lancelet (amphioxus) genomes, that ancient vertebrates experienced high rates of protein evolution, genome rearrangement and domain shuffling and that these rates greatly slowed down after the divergence of jawed and jawless vertebrates. Compared with lancelets, modern vertebrates retain, at least relatively, less protein diversity, fewer nucleotide polymorphisms, domain combinations and conserved non-coding elements (CNE). Modern vertebrates also lost substantial transposable element (TE) diversity, whereas lancelets preserve high TE diversity that includes even the long-sought RAG transposon. Lancelets also exhibit rapid gene turnover, pervasive transcription, the fastest exon shuffling in metazoans and substantial TE methylation not observed in other invertebrates. These new lancelet genome sequences provide new insights into the chordate ancestral state and vertebrate evolution.

  14. Monitoring Sub-Saharan African Physician Migration and Recruitment Post-Adoption of the WHO Code of Practice: Temporal and Geographic Patterns in the United States

    PubMed Central

    Tankwanchi, Akhenaten Benjamin Siankam; Vermund, Sten H.; Perkins, Douglas D.

    2015-01-01

    Data monitoring is a key recommendation of the WHO Global Code of Practice on the International Recruitment of Health Personnel, a global framework adopted in May 2010 to address health workforce retention in resource-limited countries and the ethics of international migration. Using data on African-born and African-educated physicians in the 2013 American Medical Association Physician Masterfile (AMA Masterfile), we monitored Sub-Saharan African (SSA) physician recruitment into the physician workforce of the United States (US) post-adoption of the WHO Code of Practice. From the observed data, we projected to 2015 with linear regression, and we mapped migrant physicians’ locations using GPS Visualizer and ArcGIS. The 2013 AMA Masterfile identified 11,787 active SSA-origin physicians, representing barely 1.3% (11,787/940,456) of the 2013 US physician workforce, but exceeding the total number of physicians reported by WHO in 34 SSA countries (N = 11,519). We estimated that 15.7% (1,849/11,787) entered the US physician workforce after the Code of Practice was adopted. Compared to pre-Code estimates from 2002 (N = 7,830) and 2010 (N = 9,938), the annual admission rate of SSA émigrés into the US physician workforce is increasing. This increase is due in large part to the growing number of SSA-born physicians attending medical schools outside SSA, representing a trend towards younger migrants. Projection estimates suggest that there will be 12,846 SSA migrant physicians in the US physician workforce in 2015, and over 2,900 of them will be post-Code recruits. Most SSA migrant physicians are locating to large urban US areas where physician densities are already the highest. The Code of Practice has not slowed the SSA-to-US physician migration. To stem the physician “brain drain”, it is essential to incentivize professional practice in SSA and diminish the appeal of US migration with bolder interventions targeting primarily early-career (age ≤ 35) SSA physicians. 
PMID:25875010

  15. MDSplus quality improvement project

    DOE PAGES

    Fredian, Thomas W.; Stillerman, Joshua; Manduchi, Gabriele; ...

    2016-05-31

    MDSplus is a data acquisition and analysis system used worldwide, predominantly in the fusion research community. Development began 29 years ago on the OpenVMS operating system. Since that time many new features have been added and the code has been ported to many different operating systems. The fusion community has contributed to MDSplus development in the form of feature suggestions, feature implementations, documentation, and ports to different operating systems. The bulk of the development and support of MDSplus, however, has been provided by a relatively small core group of three or four developers. Given the size of the development team and the large number of users, much more effort was focused on providing new features for the community than on keeping the underlying code and documentation up to date with evolving software development standards. To ensure that MDSplus will continue to meet the needs of the community in the future, the MDSplus development team, along with other members of the MDSplus user community, has commenced a major quality improvement project. The planned improvements include changes to the software build scripts to make better use of the GNU Autoconf and Automake tools, refactoring of many source code modules using new language features available in modern compilers, use of GNU MinGW-w64 to create MS Windows distributions, migration to a more modern source code management system, improvement of the source documentation as well as of the www.mdsplus.org web site documentation and layout, and the addition of more comprehensive test suites applied to MDSplus code builds before installation kits are released to the community. This paper describes these efforts, which are either in progress or planned for the near future, and which should lead to a much more robust product and establish a framework for maintaining stability as more enhancements and features are added.
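A move to GNU Autoconf and Automake of the kind described above usually revolves around a top-level configure.ac plus per-directory Makefile.am files. The following is an illustrative sketch only; the project name, version, and file names are invented for the example and are not MDSplus's actual build configuration:

```
# configure.ac -- autoreconf turns this into a portable ./configure script
AC_INIT([example-daq], [1.0])
AM_INIT_AUTOMAKE([foreign subdir-objects])
AC_PROG_CC
AC_CONFIG_HEADERS([config.h])
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```

A matching Makefile.am would then list what to build (e.g. `bin_PROGRAMS` and `*_SOURCES` entries), and `autoreconf --install && ./configure && make` builds the tree on any platform with a POSIX shell and a C compiler.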

  16. Rapid algorithm prototyping and implementation for power quality measurement

    NASA Astrophysics Data System (ADS)

    Kołek, Krzysztof; Piątek, Krzysztof

    2015-12-01

    This article presents a Model-Based Design (MBD) approach to rapid implementation of power quality (PQ) metering algorithms. Power supply quality is a very important aspect of modern power systems and will become even more important in future smart grids, where maintaining the PQ parameters at the desired level will require efficient methods for implementing metering algorithms. Currently, the development of new, advanced PQ metering algorithms requires new hardware with adequate computational capability and time-intensive, cost-ineffective manual implementation. An alternative, considered here, is the MBD approach, which focuses on modelling and on validation of the model by simulation, both of which are well supported by Computer-Aided Engineering (CAE) packages. This paper presents two algorithms used in modern PQ meters: a phase-locked loop based on an Enhanced Phase-Locked Loop (EPLL), and flicker measurement according to the IEC 61000-4-15 standard. The algorithms were chosen because of their complexity and non-trivial development. They were first modelled in the MATLAB/Simulink package, then tested and validated in a simulation environment. The models, in the form of Simulink diagrams, were next used to automatically generate C code. The code was compiled and executed in real time on the Xilinx Zynq platform, which combines a reconfigurable Field Programmable Gate Array (FPGA) with a dual-core processor. MBD development of PQ algorithms, automatic code generation, and compilation form a rapid algorithm prototyping and implementation path for PQ measurements. The main advantage of this approach is the ability to focus on the design, validation, and testing stages while skipping over implementation issues. The code generation process yields production-ready code that can easily be used on the target hardware. This is especially important because standards for PQ measurement are under constant development, and PQ issues in emerging smart grids will require tools for the rapid development and implementation of such algorithms.
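The EPLL mentioned above is, in its simplest form, a three-state adaptive filter that tracks the amplitude, frequency, and phase of its input. A minimal discrete-time sketch follows, assuming the standard EPLL update equations with forward-Euler integration; the gains, sample rate, and function names are illustrative choices, not those of the paper's Simulink model:

```cpp
#include <cmath>

// Minimal discrete-time Enhanced Phase-Locked Loop (EPLL) sketch.
// State: estimated amplitude A, angular frequency w (rad/s), phase phi.
struct EpllState { double A; double w; double phi; };

// One forward-Euler step of the standard EPLL update equations.
// e is the tracking error between the input sample u and the
// reconstructed signal A*sin(phi).
inline void epll_step(EpllState& s, double u, double dt,
                      double k1, double k2, double k3) {
    const double e = u - s.A * std::sin(s.phi);
    s.A   += dt * k1 * e * std::sin(s.phi);          // amplitude adaptation
    s.w   += dt * k2 * e * std::cos(s.phi);          // frequency adaptation
    s.phi += dt * (s.w + k3 * e * std::cos(s.phi));  // phase tracking
}

// Track a clean 50 Hz unit-amplitude input for `seconds` seconds at a
// 10 kHz sample rate, starting from deliberately wrong initial estimates
// (half the true amplitude, 45 Hz instead of 50 Hz).
inline EpllState track_50hz(double seconds) {
    const double PI = 3.14159265358979323846;
    const double dt = 1e-4;
    EpllState s{0.5, 2.0 * PI * 45.0, 0.0};
    const long n = static_cast<long>(seconds / dt);
    for (long i = 0; i < n; ++i) {
        const double u = std::sin(2.0 * PI * 50.0 * i * dt);
        epll_step(s, u, dt, 100.0, 10000.0, 100.0);  // illustrative gains
    }
    return s;
}
```

With these illustrative gains the loop locks well within a one-second window on a clean input; a production meter would typically add input filtering and the per-standard post-processing on top of such a core.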

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    This document outlines the development of a high fidelity, best estimate nuclear power plant severe transient simulation capability that will complement or enhance the integral system codes historically used for licensing and analysis of severe accidents. As with other tools in the Risk Informed Safety Margin Characterization (RISMC) Toolkit, the ultimate user of the Enhanced Severe Transient Analysis and Prevention (ESTAP) capability is the plant decision-maker; the deliverable to that customer is a modern, simulation-based safety analysis capability, applicable to a much broader class of safety issues than is traditional Light Water Reactor (LWR) licensing analysis. Currently, the RISMC pathway’s major emphasis is placed on developing RELAP-7, a next-generation safety analysis code, and on showing how to use RELAP-7 to analyze margin from a modern point of view: that is, by characterizing margin in terms of the probabilistic spectra of the “loads” applied to systems, structures, and components (SSCs), and the “capacity” of those SSCs to resist those loads without failing. The first objective of the ESTAP task, and the focus of one task of this effort, is to augment RELAP-7 analyses with user-selected multi-dimensional, multi-phase models of specific plant components to simulate complex phenomena that may lead to, or exacerbate, severe transients and core damage. Such phenomena include: coolant crossflow between PWR assemblies during a severe reactivity transient, stratified single- or two-phase coolant flow in primary coolant piping, inhomogeneous mixing of emergency coolant water or boric acid with hot primary coolant, and water hammer. These are well-documented phenomena associated with plant transients, but they are generally not captured in system codes. They are, however, generally limited to specific components, structures, and operating conditions. 
The second ESTAP task is to similarly augment a severe (post-core damage) accident integral analysis code with high fidelity simulations that would allow investigation of multi-dimensional, multi-phase containment phenomena that are only treated approximately in established codes.

  18. A Method for Constructing a New Extensible Nomenclature for Clinical Coding Practices in Sub-Saharan Africa.

    PubMed

    Van Laere, Sven; Nyssen, Marc; Verbeke, Frank

    2017-01-01

    Clinical coding is a requirement to provide valuable data for billing, epidemiology and health care resource allocation. In sub-Saharan Africa, we observe a growing awareness of the need for coding of clinical data, not only among health insurers, but also in governments and hospitals. Presently, coding systems in sub-Saharan Africa are often used for billing purposes. In this paper we consider the use of a nomenclature that also has a clinical impact. Coding systems are often assumed to be too complex and extensive to be used in daily practice. Here, we present a method for constructing a new nomenclature based on existing coding systems, by considering a minimal subset for the sub-Saharan region. Completeness will be evaluated nationally against the requirements of national registries. The nomenclature requires an extension character for dealing with codes that have to be used in multiple registries. Hospitals will benefit most by using this extension character.

  19. The Divergence of Neandertal and Modern Human Y Chromosomes

    PubMed Central

    Mendez, Fernando L.; Poznik, G. David; Castellano, Sergi; Bustamante, Carlos D.

    2016-01-01

    Sequencing the genomes of extinct hominids has reshaped our understanding of modern human origins. Here, we analyze ∼120 kb of exome-captured Y-chromosome DNA from a Neandertal individual from El Sidrón, Spain. We investigate its divergence from orthologous chimpanzee and modern human sequences and find strong support for a model that places the Neandertal lineage as an outgroup to modern human Y chromosomes—including A00, the highly divergent basal haplogroup. We estimate that the time to the most recent common ancestor (TMRCA) of Neandertal and modern human Y chromosomes is ∼588 thousand years ago (kya) (95% confidence interval [CI]: 447–806 kya). This is ∼2.1 (95% CI: 1.7–2.9) times longer than the TMRCA of A00 and other extant modern human Y-chromosome lineages. This estimate suggests that the Y-chromosome divergence mirrors the population divergence of Neandertals and modern human ancestors, and it refutes alternative scenarios of a relatively recent or super-archaic origin of Neandertal Y chromosomes. The fact that the Neandertal Y we describe has never been observed in modern humans suggests that the lineage is most likely extinct. We identify protein-coding differences between Neandertal and modern human Y chromosomes, including potentially damaging changes to PCDH11Y, TMSB4Y, USP9Y, and KDM5D. Three of these changes are missense mutations in genes that produce male-specific minor histocompatibility (H-Y) antigens. Antigens derived from KDM5D, for example, are thought to elicit a maternal immune response during gestation. It is possible that incompatibilities at one or more of these genes played a role in the reproductive isolation of the two groups. PMID:27058445

  20. The Divergence of Neandertal and Modern Human Y Chromosomes.

    PubMed

    Mendez, Fernando L; Poznik, G David; Castellano, Sergi; Bustamante, Carlos D

    2016-04-07

    Sequencing the genomes of extinct hominids has reshaped our understanding of modern human origins. Here, we analyze ∼120 kb of exome-captured Y-chromosome DNA from a Neandertal individual from El Sidrón, Spain. We investigate its divergence from orthologous chimpanzee and modern human sequences and find strong support for a model that places the Neandertal lineage as an outgroup to modern human Y chromosomes-including A00, the highly divergent basal haplogroup. We estimate that the time to the most recent common ancestor (TMRCA) of Neandertal and modern human Y chromosomes is ∼588 thousand years ago (kya) (95% confidence interval [CI]: 447-806 kya). This is ∼2.1 (95% CI: 1.7-2.9) times longer than the TMRCA of A00 and other extant modern human Y-chromosome lineages. This estimate suggests that the Y-chromosome divergence mirrors the population divergence of Neandertals and modern human ancestors, and it refutes alternative scenarios of a relatively recent or super-archaic origin of Neandertal Y chromosomes. The fact that the Neandertal Y we describe has never been observed in modern humans suggests that the lineage is most likely extinct. We identify protein-coding differences between Neandertal and modern human Y chromosomes, including potentially damaging changes to PCDH11Y, TMSB4Y, USP9Y, and KDM5D. Three of these changes are missense mutations in genes that produce male-specific minor histocompatibility (H-Y) antigens. Antigens derived from KDM5D, for example, are thought to elicit a maternal immune response during gestation. It is possible that incompatibilities at one or more of these genes played a role in the reproductive isolation of the two groups. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  1. Coding of DNA samples and data in the pharmaceutical industry: current practices and future directions--perspective of the I-PWG.

    PubMed

    Franc, M A; Cohen, N; Warner, A W; Shaw, P M; Groenen, P; Snapir, A

    2011-04-01

    DNA samples collected in clinical trials and stored for future research are valuable to pharmaceutical drug development. Given the perceived higher risk associated with genetic research, industry has implemented complex coding methods for DNA. Following years of experience with these methods and with addressing questions from institutional review boards (IRBs), ethics committees (ECs) and health authorities, the industry has started reexamining the extent of the added value offered by these methods. With the goal of harmonization, the Industry Pharmacogenomics Working Group (I-PWG) conducted a survey to gain an understanding of company practices for DNA coding and to solicit opinions on their effectiveness at protecting privacy. The results of the survey and the limitations of the coding methods are described. The I-PWG recommends dialogue with key stakeholders regarding coding practices such that equal standards are applied to DNA and non-DNA samples. The I-PWG believes that industry standards for privacy protection should provide adequate safeguards for DNA and non-DNA samples/data and suggests a need for more universal standards for samples stored for future research.

  2. Modern trends in infection control practices in intensive care units.

    PubMed

    Gandra, Sumanth; Ellison, Richard T

    2014-01-01

    Hospital-acquired infections (HAIs) are common in intensive care unit (ICU) patients and are associated with increased morbidity and mortality. There has been an increasing effort to prevent HAIs, and infection control practices are paramount in avoiding these complications. In the last several years, numerous developments have been seen in the infection prevention strategies in various health care settings. This article reviews the modern trends in infection control practices to prevent HAIs in ICUs with a focus on methods for monitoring hand hygiene, updates in isolation precautions, new methods for environmental cleaning, antimicrobial bathing, prevention of ventilator-associated pneumonia, central line-associated bloodstream infections, catheter-associated urinary tract infections, and Clostridium difficile infection. © The Author(s) 2013.

  3. ImageJS: Personalized, participated, pervasive, and reproducible image bioinformatics in the web browser

    PubMed Central

    Almeida, Jonas S.; Iriabho, Egiebade E.; Gorrepati, Vijaya L.; Wilkinson, Sean R.; Grüneberg, Alexander; Robbins, David E.; Hackney, James R.

    2012-01-01

    Background: Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. Materials and Methods: ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Results: Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. Conclusions: The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. 
This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local “download and installation”. PMID:22934238

  4. SoAx: A generic C++ Structure of Arrays for handling particles in HPC codes

    NASA Astrophysics Data System (ADS)

    Homann, Holger; Laenen, Francois

    2018-03-01

    The numerical study of physical problems often requires integrating the dynamics of a large number of particles evolving according to a given set of equations. Particles are characterized by the information they carry, such as an identity, a position, and other attributes. Generally speaking, there are two different possibilities for handling particles in high performance computing (HPC) codes. The concept of an Array of Structures (AoS) is in the spirit of the object-oriented programming (OOP) paradigm in that the particle information is implemented as a structure. Here, an object (realization of the structure) represents one particle and a set of many particles is stored in an array. In contrast, using the concept of a Structure of Arrays (SoA), a single structure holds several arrays, each representing one property (such as the identity) of the whole set of particles. The AoS approach is often implemented in HPC codes due to its handiness and flexibility. For a class of problems, however, it is known that the performance of SoA is much better than that of AoS. We confirm this observation for our particle problem. Using a benchmark, we show that on modern Intel Xeon processors the SoA implementation is typically several times faster than the AoS one. On Intel's MIC co-processors the performance gap even attains a factor of ten. The same is true for GPU computing, using both computational and multi-purpose GPUs. Combining performance and handiness, we present the library SoAx, which achieves optimal performance (on CPUs, MICs, and GPUs) while providing the same handiness as AoS. For this, SoAx uses modern C++ design techniques, such as template metaprogramming, that allow it to automatically generate code for user-defined heterogeneous data structures.
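The AoS/SoA distinction described above can be made concrete in a few lines of C++. This is a generic illustration of the two layouts, not SoAx's actual API; SoAx's contribution is hiding the SoA layout behind an AoS-like interface via template metaprogramming, which this sketch does not attempt:

```cpp
#include <cstddef>
#include <vector>

// Array of Structures: one struct per particle; all properties of a
// given particle are adjacent in memory.
struct ParticleAoS { long id; double x; double vx; };

// Structure of Arrays: one contiguous array per property; the same
// property of all particles is adjacent in memory.
struct ParticlesSoA {
    std::vector<long>   id;
    std::vector<double> x;
    std::vector<double> vx;
};

// Advance positions by one Euler step. The SoA loop streams only the
// x and vx arrays, so no memory bandwidth is spent loading the unused
// id field and the compiler can vectorize the loop easily.
inline void step_aos(std::vector<ParticleAoS>& p, double dt) {
    for (auto& q : p) q.x += q.vx * dt;
}
inline void step_soa(ParticlesSoA& p, double dt) {
    for (std::size_t i = 0; i < p.x.size(); ++i) p.x[i] += p.vx[i] * dt;
}
```

Both loops compute identical trajectories; the performance gap the abstract reports comes purely from the memory layout and its interaction with caches and vector units.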

  5. ImageJS: Personalized, participated, pervasive, and reproducible image bioinformatics in the web browser.

    PubMed

    Almeida, Jonas S; Iriabho, Egiebade E; Gorrepati, Vijaya L; Wilkinson, Sean R; Grüneberg, Alexander; Robbins, David E; Hackney, James R

    2012-01-01

    Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local "download and installation".

  6. SAM Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui

    The System Analysis Module (SAM) is an advanced and modern system analysis tool being developed at Argonne National Laboratory under the U.S. DOE Office of Nuclear Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM development aims for advances in physical modeling, numerical methods, and software engineering to enhance its user experience and usability for reactor transient analyses. To facilitate the code development, SAM utilizes an object-oriented application framework (MOOSE), and its underlying meshing and finite-element library (libMesh) and linear and non-linear solvers (PETSc), to leverage modern advanced software environments and numerical methods. SAM focuses on modeling advanced reactor concepts such as SFRs (sodium fast reactors), LFRs (lead-cooled fast reactors), and FHRs (fluoride-salt-cooled high temperature reactors) or MSRs (molten salt reactors). These advanced concepts are distinguished from light-water reactors in their use of single-phase, low-pressure, high-temperature, and low Prandtl number (sodium and lead) coolants. As a new code development, the initial effort has been focused on modeling and simulation capabilities of heat transfer and single-phase fluid dynamics responses in Sodium-cooled Fast Reactor (SFR) systems. The system-level simulation capabilities of fluid flow and heat transfer in general engineering systems and typical SFRs have been verified and validated. This document provides the theoretical and technical basis of the code to help users understand the underlying physical models (such as governing equations, closure models, and component models), system modeling approaches, numerical discretization and solution methods, and the overall capabilities in SAM. As the code is still under ongoing development, this SAM Theory Manual will be updated periodically to keep it consistent with the state of the development.

  7. Empirical impact evaluation of the WHO Global Code of Practice on the International Recruitment of Health Personnel in Australia, Canada, UK and USA

    PubMed Central

    2013-01-01

    Background The active recruitment of health workers from developing countries to developed countries has become a major threat to global health. In an effort to manage this migration, the 63rd World Health Assembly adopted the World Health Organization (WHO) Global Code of Practice on the International Recruitment of Health Personnel in May 2010. While the Code has been lauded as the first globally-applicable regulatory framework for health worker recruitment, its impact has yet to be evaluated. We offer the first empirical evaluation of the Code’s impact on national and sub-national actors in Australia, Canada, United Kingdom and United States of America, which are the English-speaking developed countries with the greatest number of migrant health workers. Methods 42 key informants from across government, civil society and private sectors were surveyed to measure their awareness of the Code, knowledge of specific changes resulting from it, overall opinion on the effectiveness of non-binding codes, and suggestions to improve this Code’s implementation. Results 60% of respondents believed their colleagues were not aware of the Code, and 93% reported that no specific changes had been observed in their work as a result of the Code. 86% reported that the Code has not had any meaningful impact on policies, practices or regulations in their countries. Conclusions This suggests a gap between awareness of the Code among stakeholders at global forums and the awareness and behaviour of national and sub-national actors. Advocacy and technical guidance for implementing the Code are needed to improve its impact on national decision-makers. PMID:24228827

  8. High Resolution Near Real Time Image Processing and Support for MSSS Modernization

    NASA Astrophysics Data System (ADS)

    Duncan, R. B.; Sabol, C.; Borelli, K.; Spetka, S.; Addison, J.; Mallo, A.; Farnsworth, B.; Viloria, R.

    2012-09-01

    This paper describes image enhancement software applications engineering development work that has been performed in support of Maui Space Surveillance System (MSSS) Modernization. It also includes R&D and transition activity that has been performed over the past few years with the objective of providing increased space situational awareness (SSA) capabilities. We provide an introduction to image processing of electro-optical (EO) telescope sensor data, along with a summary status overview of high resolution image enhancement and near real time processing. We then describe recent image enhancement applications development and support for MSSS Modernization, results to date, and end with a discussion of desired future development work and conclusions. Significant improvements to image processing enhancement have been realized over the past several years, including a key application that has realized more than a 10,000-times speedup compared to the original R&D code -- and a greater than 72-times speedup over the past few years. The latest version of this code maintains software efficiency for post-mission processing while providing optimization for image processing of data from a new EO sensor at MSSS. Additional work has also been performed to develop low latency, near real time processing of data that is collected by the ground-based sensor during overhead passes of space objects.

  9. The Practice of Foreign Language Teaching.

    ERIC Educational Resources Information Center

    Cajkler, Wasyl; Addelman, Ron

    This book on aspects of modern foreign language teaching is written for trainee, new, and experienced teachers of students aged 11-16 and is intended as a practical source of information. The discussion of specific teaching issues includes implications for classroom practice. While not directly addressing Britain's new National Curriculum, it does…

  10. College Students' View of Biotechnology Products and Practices in Sustainable Agriculture Systems

    ERIC Educational Resources Information Center

    Anderson, William A.

    2008-01-01

    Sustainable agriculture implies the use of products and practices that sustain production, protect the environment, ensure economic viability, and maintain rural community viability. Disagreement exists as to whether or not the products and practices of modern biotechnological support agricultural sustainability. The purpose of this study was to…

  11. Diversity of faculty practice in workshop classrooms

    NASA Astrophysics Data System (ADS)

    Franklin, Scott V.; Chapman, Tricia

    2013-01-01

    We present a temporally fine-grained characterization of faculty practice in workshop-style introductory physics courses. Practice is binned in five minute intervals and coded through two complementary observational protocols: the Reform Teaching Observation Protocol (RTOP) provides a summative assessment of fidelity to reform-teaching principles, while the Teaching Dimensions Observation Protocol (TDOP) records direct practice. We find that the TDOP's direct coding of practice explains nuances in the holistic RTOP score, with higher RTOP scores corresponding to less lecture, but not necessarily more student-directed activities. Despite using similar materials, faculty show significant differences in practice that manifest in both TDOP and RTOP scores. We also find a significant dependence of practice on course subject, reflected in both RTOP and TDOP scores, with Electricity & Magnetism courses using more instructor-centered practices (lecture, illustration, etc.) than Mechanics courses.

  12. Lactobacillus backii and Pediococcus damnosus isolated from 170-year-old beer recovered from a shipwreck lack the metabolic activities required to grow in modern lager beer.

    PubMed

    Kajala, Ilkka; Bergsveinson, Jordyn; Friesen, Vanessa; Redekop, Anna; Juvonen, Riikka; Storgårds, Erna; Ziola, Barry

    2018-01-01

    In 2010, bottles of beer containing viable bacteria of the common beer-spoilage species Lactobacillus backii and Pediococcus damnosus were recovered from a shipwreck near the Åland Islands, Finland. The 170-year quiescent state maintained by the shipwreck bacteria presented a unique opportunity to study lactic acid bacteria (LAB) evolution vis-a-vis growth and survival in the beer environment. Three shipwreck bacteria (one L. backii strain and two P. damnosus strains) and modern-day beer-spoilage isolates of the same two species were genome sequenced, characterized for hop iso-α-acid tolerance, and growth in degassed lager and wheat beer. In addition, plasmid variants of the modern-day P. damnosus strain were analyzed for the effect of plasmid-encoded genes on growth in lager beer. Coding content on two plasmids was identified as essential for LAB growth in modern lager beer. Three chromosomal regions containing genes related to sugar transport and cell wall polysaccharides were shared by pediococci able to grow in beer. Our results show that the three shipwreck bacteria lack the necessary plasmid-located genetic content to grow in modern lager beer, but carry additional genes related to acid tolerance and biofilm formation compared to their modern counterparts. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  13. Coding for effective denial management.

    PubMed

    Miller, Jackie; Lineberry, Joe

    2004-01-01

    Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on the use of ABNs and the compliance risks associated with improper use. Finally, training programs should include routine audits to monitor coders for competence and precision. Constantly changing codes and guidelines mean that a coder's skills can quickly become obsolete if not reinforced by ongoing training and monitoring. Comprehensive reporting and routine analysis of claim denials is without a doubt one of the greatest assets to a practice that is suffering from excessive claim denials and should be considered an investment capable of providing both short- and long-term ROI. Some radiology practices may lack the funding or human resources needed to implement truly effective coding programs for their staff members. In these circumstances, radiology business managers should consider outsourcing their coding.
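    The medical-necessity screen described in this abstract can be sketched in a few lines: for each CPT procedure code, a payor publishes the set of ICD-9-CM diagnosis codes it accepts as medically necessary, and a claim whose diagnosis falls outside that set is flagged for an ABN or routed to the appeal queue before filing. The sketch below is illustrative only; the coverage table and the specific CPT/ICD-9 codes are made-up placeholders, not an actual payor policy.

    ```python
    # Illustrative pre-filing denial-management check: does the claim's
    # diagnosis code satisfy the payor's medical-necessity criteria for
    # the procedure? Codes below are examples, not real payor policy.

    COVERED_DIAGNOSES = {
        # CPT procedure code -> ICD-9-CM diagnosis codes accepted as necessary
        "71020": {"786.2", "786.05", "486"},   # hypothetical chest X-ray policy
        "74150": {"789.00", "560.9"},          # hypothetical abdominal CT policy
    }

    def screen_claim(cpt_code: str, icd9_code: str) -> str:
        """Return a routing decision for the claim before it is filed."""
        accepted = COVERED_DIAGNOSES.get(cpt_code)
        if accepted is None:
            return "hold: no coverage policy on file for this procedure"
        if icd9_code in accepted:
            return "file: meets medical-necessity criteria"
        # Likely denial: obtain an Advance Beneficiary Notice (ABN) up front
        return "warn: likely denial - obtain ABN or verify the diagnosis code"

    print(screen_claim("71020", "486"))      # file: meets medical-necessity criteria
    print(screen_claim("71020", "401.9"))    # warn: likely denial - obtain ABN ...
    ```

    Screening claims this way before submission is the "prevent" half of denial management; logging every `warn`/`hold` result also produces the denial-cause report the abstract recommends for spotting administrative weak points.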

  14. Gnotobiology in modern medicine

    NASA Technical Reports Server (NTRS)

    Podoprigora, G. I.

    1980-01-01

    A review is given of currently accepted theories and applications of gnotobiology. A brief history of gnotobiology is supplied. Problems involved in creating germ-free gnotobiota and the use of these animals in experimental biology are cited. Examples of how gnotobiology is used in modern medical practice illustrate the future prospects for this area of science.

  15. English 7-8: Modern Media of Communication.

    ERIC Educational Resources Information Center

    McGowan, Madelon

    This grade 7-8 level course guide covers aspects of media communication such as verbal and nonverbal communication theory, forms of modern media (newspapers, feature films, artistic films, music, advertising, etc.), and practice for the student in the various aspects of communication media. The guide is designed for a one-year course and enhances…

  16. Formation of Pedagogical System for Individual Self-Development by Means of Physical Culture and Sport

    ERIC Educational Resources Information Center

    Panachev, Valery D.; Zelenin, Leonid A.; Opletin, Anatoly A.; Verbytskyi, Sergei A.

    2017-01-01

    Problems of the formation, development and introduction of a modern pedagogical self-development system into the university educational process by means of physical culture and sport are considered in this article. The resulting pedagogical system reflects the practical implementation of a social mandate for the modern educational paradigm aimed at creation…

  17. The Fateful Rift: The San Andreas Fault in the Modern Mind.

    ERIC Educational Resources Information Center

    Percy, Walker

    1990-01-01

    Claims that modern science is radically incoherent and that this incoherence lies within the practice of science itself. Details the work of the scientist and philosopher Charles Sanders Peirce, expounding on the difference between Rene Descartes' dualistic philosophy and Peirce's triadic view. Concludes with a brief description of human existence.…

  18. Traditional Occupations in a Modern World: Implications for Career Guidance and Livelihood Planning

    ERIC Educational Resources Information Center

    Ratnam, Anita

    2011-01-01

    This article is an attempt to examine the place and significance of traditional occupations as careers in today's world. The areas of tension and compatibility between ideas and values that signify modernity and the practice of traditional occupations are reviewed. The meaning of "traditional occupations" is unravelled, the potential that…

  19. Mothers Roles in Traditional and Modern Korean Families: The Consequences for Parental Practices and Adolescent Socialization.

    ERIC Educational Resources Information Center

    Kim, Hye-On; Hoppe-Graff, Siegfried

    2001-01-01

    Compares mothers' roles in socializing their children in traditional South Korean families with those of mothers in modern families. While Confucian influence remains strong, significant changes in South Korean culture often create complex, ambiguous, and emotionally unstable relationships between mothers and their adolescent children. Discusses…

  20. A Socio-Technical Analysis of Knowledgeable Practice in Radiation Therapy

    ERIC Educational Resources Information Center

    Lozano, Reynaldo Garza

    2012-01-01

    The role of the modern radiation therapist is directed and driven by the organizational system, and changes affecting that role are implemented in response to changes in the industry. The operations of the modern cancer center, with its new and changing treatment technologies, raise questions about the learning process of radiation therapists at a time…
